Model type llama4 not supported.

by tibip

I get "ValueError: Model type llama4 not supported." when I try to run:

python -m mlx_vlm.generate --model mlx-community/Llama-4-Scout-17B-16E-Instruct-8bit --max-tokens 100 --temperature 0.0 --prompt "Describe this image." --image "some_image.jpg"

I have the latest version of mlx-vlm: 0.1.21.

What am I missing?

Try:

pip install -U mlx-vlm --force-reinstall

The forced reinstall did it for me; I guess pip was reusing a cached build before and not picking up the latest version.
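To confirm that the Python you run actually picks up the upgrade, you can print the installed version and the import path; these are standard pip/importlib checks, nothing mlx-vlm specific:

python -c "import mlx_vlm, importlib.metadata as m; print(m.version('mlx-vlm'), mlx_vlm.__file__)"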

It didn't help me, though. I still get the same "Model type llama4 not supported." error even after running
pip install -U mlx-vlm --force-reinstall
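
If a forced reinstall changes nothing, a common culprit is that pip and python resolve to different environments, so the upgrade lands somewhere the script never looks. Running pip through the same interpreter rules that out (again, standard pip/Python invocations, not anything mlx-vlm specific):

python -m pip install -U --force-reinstall mlx-vlm
python -c "import importlib.metadata as m; print(m.version('mlx-vlm'))"

If the printed version is current and the error persists, llama4 support may simply postdate the latest PyPI release; installing from the project's GitHub main branch (pip install git+https://github.com/Blaizzy/mlx-vlm.git) is worth a try in that case.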
