Not working.

#12
by vladlen32230 - opened

vLLM fails to load the model with:

KeyError: 'model.layers.18.self_attn.kv_a_proj_with_mqa.weight'

Try pip install vllm==0.9.2

Thanks! I also had to downgrade transformers to 4.52.0.
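Putting the two suggestions together, the working environment can be pinned in one step (these are the versions reported above; other combinations may also work):

```shell
# Pin the versions reported to work in this thread
pip install vllm==0.9.2 transformers==4.52.0
```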