Run on vLLM
Got the error below:
ValueError: The checkpoint you are trying to load has model type deepseek_vl_v2
but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
Same error when deploying with vLLM. Did you manage to solve it? Thanks
Hi, have you solved the problem? Thanks
same
I also got the same error
Transformers' supported-model list does not include DeepSeek-VL2, so you have to use the model and processor classes shipped with DeepSeek's own source code (clone https://github.com/deepseek-ai/DeepSeek-VL2 and run pip install -e .):
import torch
from transformers import AutoModelForCausalLM

# These classes come from the DeepSeek-VL2 repository, not from Transformers.
# Note: older checkouts expose the package as deepseek_vl instead of deepseek_vl2.
from deepseek_vl2.models import DeepseekVLV2Processor, DeepseekVLV2ForCausalLM

# specify the path to the model
model_path = "deepseek-ai/deepseek-vl2-small"
vl_chat_processor: DeepseekVLV2Processor = DeepseekVLV2Processor.from_pretrained(model_path)
tokenizer = vl_chat_processor.tokenizer

# trust_remote_code=True lets AutoModelForCausalLM resolve the custom
# DeepseekVLV2ForCausalLM class bundled with the checkpoint.
model: DeepseekVLV2ForCausalLM = AutoModelForCausalLM.from_pretrained(model_path, trust_remote_code=True)
model = model.to(torch.bfloat16).cuda().eval()
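If the goal is to serve the model with vLLM itself rather than plain Transformers: recent vLLM releases support the deepseek_vl_v2 model type directly, and the vLLM docs say to pass the architecture explicitly via hf_overrides because the checkpoint config does not declare one. A minimal sketch under those assumptions (text-only prompt for brevity; image inputs go through vLLM's multi-modal API):

from vllm import LLM, SamplingParams

# Assumes a vLLM version with deepseek_vl_v2 support. The hf_overrides entry
# tells vLLM which architecture class to load, since the checkpoint config
# itself does not declare one.
llm = LLM(
    model="deepseek-ai/deepseek-vl2-small",
    trust_remote_code=True,
    hf_overrides={"architectures": ["DeepseekVLV2ForCausalLM"]},
)

outputs = llm.generate("Describe the DeepSeek-VL2 model in one sentence.",
                       SamplingParams(max_tokens=64))
print(outputs[0].outputs[0].text)

The same override works when launching the OpenAI-compatible server, e.g. vllm serve deepseek-ai/deepseek-vl2-small --trust-remote-code --hf-overrides '{"architectures": ["DeepseekVLV2ForCausalLM"]}'.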