Tags: Text Generation · Transformers · English · alpaca · bloom · LLM

Max prompt length

#6 opened by ak2023

What's the maximum prompt length the model can take as input?

The combined length of the instruction, the input (if there is one), and the output was capped at 256 tokens during training.
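A minimal sketch of how you might enforce that 256-token budget before sending a prompt to the model. The helper names (`fits_within_limit`, `truncate_to_limit`) are hypothetical, and plain lists of token ids stand in for the output of the model's real tokenizer:

```python
# Assumption: 256 is the combined cap on instruction + input + output tokens,
# as stated in this thread. Token ids here are plain ints standing in for
# whatever the model's tokenizer would produce.
MAX_LEN = 256

def fits_within_limit(instruction_ids, input_ids, output_ids, max_len=MAX_LEN):
    """Return True if the combined token count stays within max_len."""
    return len(instruction_ids) + len(input_ids) + len(output_ids) <= max_len

def truncate_to_limit(instruction_ids, input_ids, reserve_for_output, max_len=MAX_LEN):
    """Trim the input portion so instruction + input + reserved output fit.

    The instruction is kept intact; only the input is truncated.
    """
    budget = max_len - len(instruction_ids) - reserve_for_output
    if budget < 0:
        raise ValueError("instruction alone exceeds the token budget")
    return instruction_ids, input_ids[:budget]
```

In practice you would compute the id lists with the model's own tokenizer (e.g. `tokenizer(text)["input_ids"]` in `transformers`) so the counts match what the model actually sees.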
