ThatOneShortGuy committed 93319e3 (parent: 507d680): Update README.md
---
library_name: peft
license: apache-2.0
datasets:
- ThatOneShortGuy/SongLyrics
language:
- en
tags:
- music
---

# Musical Falcon

[OpenAssistant/falcon-7b-sft-mix-2000](https://huggingface.co/OpenAssistant/falcon-7b-sft-mix-2000) fine-tuned using PEFT on
[Song Lyrics](https://huggingface.co/datasets/ThatOneShortGuy/SongLyrics) to write song lyrics.

## Model Details
- **Finetuned from**: [OpenAssistant/falcon-7b-sft-mix-2000](https://huggingface.co/OpenAssistant/falcon-7b-sft-mix-2000)
- **Model Type**: Causal decoder-only transformer language model
- **Language**: English (with limited capabilities in German, Spanish, French, Italian, Portuguese, Polish, Dutch, Romanian, Czech, and Swedish)
- **License**: Apache 2.0
- **Contact**: Lol don't. This is just for fun.

## Usage
The basic format for loading it is:
```python
from peft import PeftModel, PeftConfig
from transformers import AutoModelForCausalLM

# Load the adapter config, then the base model, then apply the adapter.
config = PeftConfig.from_pretrained("ThatOneShortGuy/MusicalFalcon")
model = AutoModelForCausalLM.from_pretrained("OpenAssistant/falcon-7b-sft-mix-2000")
model = PeftModel.from_pretrained(model, "ThatOneShortGuy/MusicalFalcon")
```
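
Once the base model and adapter are loaded as above, generation follows the usual `transformers` pattern. A minimal sketch, assuming the base model's tokenizer (e.g. `AutoTokenizer.from_pretrained("OpenAssistant/falcon-7b-sft-mix-2000")`); the helper names and the sampling settings (`max_new_tokens`, `do_sample`) are illustrative assumptions, not part of the original card:

```python
def build_inference_prompt(title: str) -> str:
    """Wrap a song title in the OpenAssistant turn format this model expects."""
    return f'<|prompter|>Write me a song titled "{title}"<|endoftext|><|assistant|>'


def generate_lyrics(model, tokenizer, title: str, max_new_tokens: int = 256) -> str:
    """Hypothetical helper: generate lyrics given the loaded PEFT model and a tokenizer."""
    inputs = tokenizer(build_inference_prompt(title), return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=True)
    # Decode only the tokens generated after the prompt.
    return tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
```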

## Prompting
Since this model comes from [OpenAssistant/falcon-7b-sft-mix-2000](https://huggingface.co/OpenAssistant/falcon-7b-sft-mix-2000), it uses the same prompt structure.
Two special tokens mark the beginning of user and assistant turns: `<|prompter|>` and `<|assistant|>`. Each turn ends with an `<|endoftext|>` token.
The training prompt used the structure:
```
<|prompter|>Come up with the lyrics for a song from "{artist}" {"from " + year if year else ""} titled "{title}".<|endoftext|>
<|assistant|>Sure! Here are the lyrics:
{lyrics}
<|endoftext|>
```
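
The training template above, including the optional year clause, can be sketched as a small helper. This reconstruction (the function name and signature are illustrative, not the original training code) mirrors the template literally, so an empty year leaves the same double space the template would:

```python
def build_training_prompt(artist: str, title: str, lyrics: str, year: str = "") -> str:
    """Assemble a training-style prompt; the year clause appears only when a year is given."""
    year_part = f"from {year}" if year else ""
    return (
        f'<|prompter|>Come up with the lyrics for a song from "{artist}" {year_part} '
        f'titled "{title}".<|endoftext|>\n'
        f"<|assistant|>Sure! Here are the lyrics:\n{lyrics}\n<|endoftext|>"
    )
```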

However, it still seems to work just fine with:
```
<|prompter|>Write me a song titled "{title}"<|endoftext|><|assistant|>
```
or anything of a similar nature. Feel free to add a description of the song in there too.

## Training procedure

### Framework versions

- PEFT 0.5.0.dev0