Running into error: Asking to pad but the tokenizer does not have a padding token.
#29
by
smehta12
- opened
Hello. When I use this model to create embeddings with LangChain and ChromaDB, it throws the error below when add_texts() runs.
ValueError: Asking to pad but the tokenizer does not have a padding token. Please select a token to use as `pad_token` `(tokenizer.pad_token = tokenizer.eos_token e.g.)` or add a new pad token via `tokenizer.add_special_tokens({'pad_token': '[PAD]'})`.
Following is the code:
```python
from langchain_huggingface.embeddings import HuggingFaceEmbeddings
from langchain_chroma import Chroma

embeddings = HuggingFaceEmbeddings(model_name="codellama/CodeLlama-7b-hf")
vector_store = Chroma(embedding_function=embeddings)
vector_store.add_texts(<texts>)
```
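One possible workaround, following the suggestion in the error message itself, is to assign the EOS token as the pad token before embedding. This is a minimal sketch: the `ensure_pad_token` helper is hypothetical, and reaching the tokenizer through `embeddings.client.tokenizer` is an assumption about how `HuggingFaceEmbeddings` wraps the underlying SentenceTransformer.

```python
from types import SimpleNamespace

def ensure_pad_token(tokenizer):
    # LLaMA-family tokenizers (including CodeLlama) ship without a pad token;
    # reuse the EOS token as padding, as the error message suggests.
    if tokenizer.pad_token is None:
        tokenizer.pad_token = tokenizer.eos_token
    return tokenizer

# Hypothetical usage (attribute path is an assumption, not confirmed
# against the langchain_huggingface source):
# embeddings = HuggingFaceEmbeddings(model_name="codellama/CodeLlama-7b-hf")
# ensure_pad_token(embeddings.client.tokenizer)
```

If the wrapper does not expose the tokenizer, the same assignment can be done on a tokenizer loaded directly with `transformers.AutoTokenizer.from_pretrained` before building the embedding pipeline.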