Runtime error

Exit code: 1. Reason:

Loading weights: 100%|ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆ| 310/310 [00:49<00:00, 6.30it/s, Materializing param=model.norm.weight]
āœ… Model loaded!
šŸ“¦ Skipping LoRA (v9 test)...
āœ… No LoRA adapters to add!
šŸ“Š Generating 500 training samples...
āœ… Training data generated!
āœ… GRPO config created!
šŸ“¦ Creating GRPO trainer...
Traceback (most recent call last):
  File "/app/train_arithmetic_v9_no_lora.py", line 223, in <module>
    main()
  File "/app/train_arithmetic_v9_no_lora.py", line 196, in main
    trainer = GRPOTrainer(
              ^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/trl/trainer/grpo_trainer.py", line 544, in __init__
    super().__init__(
  File "/usr/local/lib/python3.11/site-packages/transformers/trainer.py", line 443, in __init__
    validate_quantization_for_training(model)
  File "/usr/local/lib/python3.11/site-packages/transformers/trainer_utils.py", line 131, in validate_quantization_for_training
    raise ValueError(
ValueError: You cannot perform fine-tuning on purely quantized models. Please attach trainable adapters on top of the quantized model to correctly perform fine-tuning. Please see: https://huggingface.co/docs/transformers/peft for more details
