Possible error in prompt format?

#2
by mukundtibrewala - opened

Hey there! Thanks for providing this dataset, as well as instructions for fine-tuning the Llama 2 chat models. I was looking through the data, and I noticed that you represent turn boundaries in multi-turn conversations like this: `{{ agent response }} </s>\ <s>[INST]`. In particular, there is a `\` between the `</s>` tag and the `<s>` tag.

Was this intentional? According to the HF prompt template guide, it should just look like `</s><s>[INST]`.
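For reference, here's what I'd expect a full multi-turn example to look like under that template (the system/user/agent text is just a placeholder):

```
<s>[INST] <<SYS>>
{{ system prompt }}
<</SYS>>

{{ user message 1 }} [/INST] {{ agent response 1 }} </s><s>[INST] {{ user message 2 }} [/INST] {{ agent response 2 }} </s>
```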

Thanks in advance!

That's actually a newline character I tried to escape, but I don't remember why... You're right, the prompt template doesn't mention it.

I changed the dataset and uploaded the fixed version. Thank you for noticing!

mlabonne changed discussion status to closed

Awesome, thanks for fixing! By the way, I used your tutorial to train a Llama 2 13B model via SageMaker, but I noticed that the model does not seem to output the end token `</s>`, so it keeps generating tokens until it reaches the max output length. The generated tokens even include `[INST]...[/INST]` tags, i.e. it's mimicking the user's side of the conversation! I must have messed something up with my training data, but let me know if you have any pointers/suggestions. Thanks!
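In the meantime I'm working around it by truncating the generated text at the first `[INST]` tag. A minimal sketch (the function name is mine, and `output_text` stands for whatever the endpoint returns):

```python
def truncate_at_inst(output_text: str) -> str:
    """Cut generation off at the first [INST] tag, since the model
    never emits </s> and starts mimicking the user's next turn."""
    cut = output_text.find("[INST]")
    return output_text if cut == -1 else output_text[:cut].rstrip()
```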

On a possibly related note: I was under the impression that the training dataset had to be tokenized before being passed to the Trainer? Your linked tutorial doesn't do so, and I'm a little confused.

mukundtibrewala changed discussion status to open

Sorry for the late reply. About your first question: yes, this is something I observed too. I've never investigated this weird behavior because I'm not really interested in chat models, but I would say it's possible that the model is trained on the instruction turns as well, which would explain why it reproduces them in its output. As for tokenization, the SFTTrainer takes care of it for us in this example, so no worries about that. I recommend using Axolotl on non-chat models, as described in this article, instead.
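To illustrate: with SFTTrainer you pass the raw text column directly, roughly like this. This is a minimal sketch, not the exact tutorial code; the model and dataset identifiers are placeholders, and it assumes a trl version where SFTTrainer accepts a `dataset_text_field` argument:

```python
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import SFTTrainer

# Dataset with a raw "text" column containing the full prompt strings.
dataset = load_dataset("username/some-llama2-dataset", split="train")  # placeholder name

model = AutoModelForCausalLM.from_pretrained("NousResearch/Llama-2-7b-hf")
tokenizer = AutoTokenizer.from_pretrained("NousResearch/Llama-2-7b-hf")
tokenizer.pad_token = tokenizer.eos_token

trainer = SFTTrainer(
    model=model,
    train_dataset=dataset,
    dataset_text_field="text",  # SFTTrainer tokenizes this column internally
    tokenizer=tokenizer,
    max_seq_length=512,
    args=TrainingArguments(output_dir="./results", num_train_epochs=1),
)
trainer.train()
```

So you pass raw strings, and no manual tokenization step is needed.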

Got it, thanks for the info! :)

mukundtibrewala changed discussion status to closed
