What was the lora_config used to create this fine-tuned version of falcon-7b?
#42 · opened by ecorro
I'm trying to fine-tune it with another instruct dataset and want to do it in bf16 instead of 4-bit QLoRA. Can anyone please point me to an appropriate lora_config setup? Particularly the 'target_modules' parameter.
ecorro changed discussion status to closed
Here is the answer to my question: https://gist.github.com/pacman100/1731b41f7a90a87b457e8c5415ff1c14
ecorro changed discussion status to open
ecorro changed discussion status to closed