vocabulary size mismatch - wrong tokenizer.model

#21
by mradermacher - opened

The model's vocabulary size is 32000, but tokenizer.model contains 32001 tokens, so the model cannot be converted to GGUF (at least not without fixing the mismatch first).
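For anyone wanting to confirm the mismatch on their own copy, here is a minimal sketch. It assumes the standard Hugging Face repo layout with a config.json next to the SentencePiece tokenizer.model, and that the sentencepiece package is installed:

```python
# Sketch: compare the model's declared vocab size against tokenizer.model.
# Assumes a standard HF layout (config.json + SentencePiece tokenizer.model).
import json
from sentencepiece import SentencePieceProcessor

sp = SentencePieceProcessor(model_file="tokenizer.model")
with open("config.json") as f:
    config = json.load(f)

model_vocab = config["vocab_size"]   # size of the model's embedding table
tokenizer_vocab = sp.vocab_size()    # number of pieces in tokenizer.model

if model_vocab != tokenizer_vocab:
    print(f"mismatch: model has {model_vocab} tokens, "
          f"tokenizer.model has {tokenizer_vocab}")
else:
    print(f"ok: both report {model_vocab} tokens")
```

If the counts differ by one, the usual cause is an extra token (e.g. a pad token) added to one side but not the other; either the tokenizer needs that piece removed or the model's embedding table needs to be resized to match before conversion.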
