runtime error
Exit code: 1. Reason: 'base_model.model.model.layers.9.self_attn.q_proj.lora_A.weight', 'base_model.model.model.layers.9.self_attn.q_proj.lora_B.weight'] did execute the if block

Downloading shards: 100%|██████████| 2/2 [00:12<00:00, 6.09s/it]
Loading checkpoint shards: 100%|██████████| 2/2 [00:04<00:00, 2.32s/it]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 128, in <module>
    current_model = load_model("Hugging face dataset")
  File "/home/user/app/app.py", line 119, in load_model
    model = AutoModelForCausalLM.from_pretrained("./", torch_dtype=torch.float16, device_map="auto")
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4330, in from_pretrained
    model.load_adapter(
  File "/usr/local/lib/python3.10/site-packages/transformers/integrations/peft.py", line 239, in load_adapter
    incompatible_keys = set_peft_model_state_dict(
  File "/usr/local/lib/python3.10/site-packages/peft/utils/save_and_load.py", line 451, in set_peft_model_state_dict
    load_result = model.load_state_dict(peft_model_state_dict, strict=False)
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 2581, in load_state_dict
    raise RuntimeError(
RuntimeError: Error(s) in loading state_dict for LlamaForCausalLM:
	size mismatch for lm_head.lora_B.default.weight: copying a param with shape torch.Size([68097, 64]) from checkpoint, the shape in current model is torch.Size([68096, 64]).
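The size mismatch (68097 vs 68096 rows in `lm_head.lora_B.default.weight`) indicates the base model's vocabulary is one token smaller than it was when the LoRA adapter was trained. Note that `strict=False` in `load_state_dict` only tolerates missing or unexpected keys; a shape mismatch on a shared key still raises. A minimal sketch reproducing the failure with a plain `torch.nn.Linear`, using the rank and row counts from the traceback (no real model or adapter is loaded):

```python
import torch
import torch.nn as nn

# Shapes taken from the traceback: the adapter's lora_B matrix has 68097
# output rows, while the freshly built model expects 68096 -- an off-by-one
# vocabulary-size mismatch.
ckpt_rows, model_rows, rank = 68097, 68096, 64

model_head = nn.Linear(rank, model_rows, bias=False)
checkpoint = {"weight": torch.zeros(ckpt_rows, rank)}

# strict=False skips missing/unexpected keys, but a *size* mismatch on a
# key present in both state dicts still raises RuntimeError, exactly as
# peft's set_peft_model_state_dict hits above.
try:
    model_head.load_state_dict(checkpoint, strict=False)
    raised = False
except RuntimeError:
    raised = True
print(raised)  # True
```

A common remedy, assuming a token was added to the tokenizer during fine-tuning (e.g. a pad token) but not at load time, is to grow the embeddings to the trained vocabulary size with `model.resize_token_embeddings(68097)` before attaching the adapter, or to add the same special token to the tokenizer and resize from `len(tokenizer)`. Which token is missing has to be checked against the fine-tuning setup.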