Update README.md
README.md

```diff
@@ -82,4 +82,7 @@ So far, it seems like the strongest anti-refusal bias is at 0 ctx - the first pr
 - num_train_epochs: 1.4
 
 Script used for SFT training can be found here:
-https://huggingface.co/adamo1139/Yi-34B-200K-AEZAKMI-RAW-2301-LoRA/blob/main/yi-34b-aezakmi-sft-1-hf.py
+https://huggingface.co/adamo1139/Yi-34B-200K-AEZAKMI-RAW-2301-LoRA/blob/main/yi-34b-aezakmi-sft-1-hf.py
+
+### Credits
+Thanks to mlabonne, Daniel Han and Michael Han for providing open source code that was used for fine-tuning.
```
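One detail in the diff worth noting is the fractional epoch count (`num_train_epochs: 1.4`). Trainers typically realize a non-integer epoch value by converting it into an optimizer-step budget. A minimal sketch of that conversion (the helper name and the example sizes are hypothetical, not taken from the training script linked above):

```python
import math


def epochs_to_steps(num_train_epochs: float, dataset_size: int,
                    per_device_batch_size: int,
                    grad_accum_steps: int = 1) -> int:
    """Convert a fractional epoch count (e.g. 1.4) into a step budget.

    Illustrative helper only; real trainers perform an equivalent
    computation internally when num_train_epochs is not an integer.
    """
    # Effective batch per optimizer step = micro-batch * accumulation.
    steps_per_epoch = math.ceil(
        dataset_size / (per_device_batch_size * grad_accum_steps)
    )
    return math.ceil(num_train_epochs * steps_per_epoch)


# Hypothetical sizes: 10,000 samples, micro-batch 4, accumulation 8
# -> 313 steps per epoch, so 1.4 epochs -> 439 optimizer steps.
print(epochs_to_steps(1.4, 10_000, 4, 8))
```

The practical effect is that training stops partway through the second pass over the data, which is a common way to squeeze a bit more learning out of a small SFT set without fully repeating it.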