MikeRoz committed on
Commit
cce512e
·
verified ·
1 Parent(s): 05b52cc

Upload folder using huggingface_hub

.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
33
  *.zip filter=lfs diff=lfs merge=lfs -text
34
  *.zst filter=lfs diff=lfs merge=lfs -text
35
  *tfevents* filter=lfs diff=lfs merge=lfs -text
36
+ tokenizer.json filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,439 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ ---
2
+ base_model:
3
+ - SicariusSicariiStuff/Negative_LLAMA_70B
4
+ - invisietch/L3.1-70Blivion-v0.1-rc1-70B
5
+ - EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.1
6
+ - aaditya/Llama3-OpenBioLLM-70B
7
+ library_name: transformers
8
+ tags:
9
+ - merge
10
+ - axolotl
11
+ - finetune
12
+ license: llama3.3
13
+ license_name: llama3.3
14
+ language:
15
+ - en
16
+ ---
17
+ <html lang="en">
18
+
19
+ <head>
20
+ <meta charset="UTF-8" />
21
+ <title>Pernicious Prophecy 70B</title>
22
+
23
+ <link rel="preconnect" href="https://fonts.googleapis.com">
24
+ <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
25
+ <link
26
+ href="https://fonts.googleapis.com/css2?family=Darker+Grotesque:wght@300..900&family=Uncial+Antiqua&display=swap"
27
+ rel="stylesheet">
28
+
29
+ <style>
30
+ html,
31
+ body {
32
+ margin: 0;
33
+ padding: 0;
34
+ background: rgb(11, 15, 25);
35
+ color: #E6FFE6;
36
+ font-family: 'Darker Grotesque', sans-serif;
37
+ }
38
+
39
+ @keyframes runeGlow {
40
+ 0% {
41
+ text-shadow: 0 0 4px #91ca00;
42
+ filter: brightness(0.7);
43
+ }
44
+
45
+ 50% {
46
+ text-shadow: 0 0 8px #91ca00;
47
+ filter: brightness(1.0);
48
+ }
49
+
50
+ 100% {
51
+ text-shadow: 0 0 4px #91ca00;
52
+ filter: brightness(0.7);
53
+ }
54
+ }
55
+
56
+ img.badge {
57
+ filter: grayscale(100%);
58
+ transition: filter 0.7s ease-in-out;
59
+ }
60
+
61
+ img.badge:hover {
62
+ filter: grayscale(0%);
63
+ }
64
+
65
+ .rune-border::before,
66
+ .rune-border::after,
67
+ .vertical-sides::before,
68
+ .vertical-sides::after {
69
+ animation: runeGlow 1.5s infinite alternate;
70
+ }
71
+
72
+ .rune-border::before {
73
+ animation-delay: 0s;
74
+ }
75
+
76
+ .rune-border::after {
77
+ animation-delay: 0.2s;
78
+ }
79
+
80
+ .vertical-sides::before {
81
+ animation-delay: 0.4s;
82
+ }
83
+
84
+ .vertical-sides::after {
85
+ animation-delay: 0.6s;
86
+ }
87
+
88
+ .rune-border {
89
+ position: relative;
90
+ max-width: 45em;
91
+ margin: 2em auto;
92
+ padding: 2em 4em;
93
+ box-sizing: border-box;
94
+ }
95
+
96
+ .rune-border::before,
97
+ .rune-border::after {
98
+ position: absolute;
99
+ left: 0;
100
+ right: 0;
101
+ margin: 0 2em;
102
+ text-align: center;
103
+ white-space: nowrap;
104
+ overflow: hidden;
105
+
106
+ color: #91ca00;
107
+ text-shadow: 0 0 4px #91ca00;
108
+ font-family: monospace;
109
+ font-size: 14px;
110
+ content: "ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ";
111
+ }
112
+
113
+ .rune-separator::after {
114
+ position: absolute;
115
+ left: 0;
116
+ right: 0;
117
+ margin: 0 2em;
118
+ text-align: center;
119
+ white-space: nowrap;
120
+ overflow: hidden;
121
+
122
+ color: #91ca00;
123
+ text-shadow: 0 0 4px #91ca00;
124
+ font-family: monospace;
125
+ font-size: 14px;
126
+ content: "ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ";
127
+ }
128
+
129
+ .rune-border::before {
130
+ top: 0;
131
+ }
132
+
133
+ .rune-border::after {
134
+ bottom: 0;
135
+ }
136
+
137
+ .vertical-sides {
138
+ position: absolute;
139
+ margin: 2em 0;
140
+ top: 0;
141
+ bottom: 0;
142
+ left: 0;
143
+ right: 0;
144
+ pointer-events: none;
145
+ }
146
+
147
+ .vertical-sides::before,
148
+ .vertical-sides::after {
149
+ position: absolute;
150
+ top: 0;
151
+ bottom: 0;
152
+ width: 1.5em;
153
+ white-space: nowrap;
154
+ overflow: hidden;
155
+
156
+ color: #91ca00;
157
+ text-shadow: 0 0 4px #91ca00;
158
+ font-family: monospace;
159
+ font-size: 14px;
160
+ writing-mode: vertical-rl;
161
+ text-orientation: mixed;
162
+ }
163
+
164
+ .vertical-sides::before {
165
+ left: 0;
166
+ content: "ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ";
167
+ }
168
+
169
+ .vertical-sides::after {
170
+ right: 0;
171
+ content: "ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ | ᛁᛏ ᛁᛋ ᚢᚱᛁᛏᛏᛁᚾ ᛅᚾᛏ ᛁᛏ ᚢᛁᛚᛚ ᚴᚬᛘᛁ ᛏᚬ ᛒᛅᛋᛋ";
172
+ }
173
+
174
+ h1,
175
+ h2,
176
+ h3 {
177
+ font-family: "Uncial Antiqua", serif;
178
+ font-weight: 400;
179
+ font-style: normal;
180
+ color: #426100;
181
+ -webkit-text-stroke: 1px #91ca00;
182
+ text-stroke: 1px #91ca00;
183
+ margin-top: 1em;
184
+ }
185
+
186
+ h2 {
187
+ padding-top: 1.5em;
188
+ }
189
+
190
+ a {
191
+ color: #619300;
192
+ text-decoration: none;
193
+ }
194
+
195
+ a:hover {
196
+ text-decoration: underline;
197
+ }
198
+
199
+ h1 {
200
+ font-size: 2.5em;
201
+ }
202
+
203
+ h2 {
204
+ font-size: 2em;
205
+ }
206
+
207
+ h3 {
208
+ font-size: 1.5em;
209
+ }
210
+
211
+ p,
212
+ li {
213
+ font-size: 1.2em;
214
+ line-height: 1.2;
215
+ }
216
+
217
+ p.red {
218
+ color: #ef2323;
219
+ }
220
+
221
+ img {
222
+ border-radius: 20px;
223
+ max-width: 100%;
224
+ height: auto;
225
+ display: block;
226
+ margin: 0 auto;
227
+ }
228
+
229
+ .sidebyside {
230
+ display: flex;
231
+ justify-content: center;
232
+ /* Center horizontally */
233
+ align-items: center;
234
+ /* Align images vertically */
235
+ gap: 1em;
236
+ /* Space of 1em between images */
237
+ flex-wrap: wrap;
238
+ /* Wrap to next line if needed */
239
+ }
240
+
241
+ .sidebyside img {
242
+ max-width: 100%;
243
+ /* Ensure images are responsive */
244
+ height: auto;
245
+ /* Maintain aspect ratio */
246
+ display: inline;
247
+ }
248
+
249
+ .container {
250
+ display: flex;
251
+ flex-direction: column;
252
+ align-items: center;
253
+ justify-content: center;
254
+ text-align: center;
255
+ }
256
+ </style>
257
+ </head>
258
+
259
+ <body>
260
+ <div class="rune-border">
261
+ <div class="vertical-sides"></div>
262
+ <div class="container">
263
+ <h1>Pernicious Prophecy 70B</h1>
264
+ <p>
265
+ <img src="./header.gif" alt="Pernicious Prophecy 70B GIF" />
266
+ </p>
267
+ <h2 style="margin-top: 0em; padding-top: 0em;">Jump Straight In...</h2>
268
+ <p>
269
+ <a href="#settings">Click here for downloads & settings</a>
270
+ </p>
271
+ </div>
272
+ <div class="rune-separator"></div>
273
+ <h2 style='padding-top:0.5em;'>An Introduction...</h2>
274
+ <p>
275
+ <b>Pernicious Prophecy 70B</b> is a Llama-3.3 70B-based, two-step model designed by <a
276
+ href="https://huggingface.co/Black-Ink-Guild">Black Ink Guild</a> (<a
277
+ href="https://huggingface.co/SicariusSicariiStuff">SicariusSicariiStuff</a> and <a
278
+ href="https://huggingface.co/invisietch">invisietch</a>) for uncensored roleplay, assistant tasks, and general
279
+ usage.
280
+ </p>
281
+ <p class="red">
282
+ <b>NOTE:</b> Pernicious Prophecy 70B is an uncensored model and can produce deranged, offensive, and dangerous
283
+ outputs. You are solely responsible for anything that you choose to do with this model.
284
+ </p>
285
+ <p>
286
+ If you have any issues or just want to chat about Pernicious Prophecy &amp; future Black Ink Guild releases, join
287
+ <a href="https://discord.gg/gXQzQcnedb">our Discord server</a>.
288
+ </p>
289
+ <div class="rune-separator"></div>
290
+ <h2 id="settings">Engage the Model...</h2>
291
+ <h3>Model Downloads</h3>
292
+ <p>
293
+ FPX:
294
+ <a href="https://huggingface.co/Black-Ink-Guild/Pernicious_Prophecy_70B">FP16 (HF)</a> |
295
+ <a href="https://huggingface.co/Black-Ink-Guild/Pernicious_Prophecy_70B_FP8">FP8 (Aph.)</a>
296
+ </p>
297
+ <p>
298
+ GGUF:
299
+ <a href="https://huggingface.co/Black-Ink-Guild/Pernicious_Prophecy_70B_GGUF_Q4_K_S">Q4_K_S</a> |
300
+ <a href="https://huggingface.co/Black-Ink-Guild/Pernicious_Prophecy_70B_GGUF_Q4_K_M">Q4_K_M</a> |
301
+ <a href="https://huggingface.co/mradermacher/Pernicious_Prophecy_70B-GGUF">mradermacher</a>
302
+ </p>
303
+ <p>
304
+ EXL2:
305
+ <a href="https://huggingface.co/Black-Ink-Guild/Pernicious_Prophecy_70B-3.5bpw">3.5bpw</a> |
306
+ <a href="https://huggingface.co/Black-Ink-Guild/Pernicious_Prophecy_70B-5.0bpw">5.0bpw</a>
307
+ </p>
308
+ <h3>Recommended Settings</h3>
309
+ <p>
310
+ Pernicious Prophecy 70B uses the Llama-3 Instruct format, which is available as a preset in all good UIs. The
311
+ sampler settings used in testing are as follows:
312
+ </p>
313
+ <ul>
314
+ <li><b>Instruct Template</b>: Llama-3 Instruct</li>
315
+ <li><b>Context</b>: 32,768</li>
316
+ <li><b>Temperature</b>: 0.9-1.1</li>
317
+ <li><b>Min P</b>: 0.06-0.12</li>
318
+ <li><b>Rep Pen</b>: 1.07-1.09</li>
319
+ <li><b>Rep Pen Range</b>: 1,536</li>
320
+ </ul>
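The Llama-3 Instruct preset listed above corresponds to Meta's special-token chat layout. If your frontend lacks the preset, the prompt can be assembled manually; a minimal sketch (the token layout follows Meta's published Llama-3 format, and `<|eot_id|>` matches the `eos_token_id` of 128009 in this repo's config.json):

```python
def llama3_instruct_prompt(system: str, user: str) -> str:
    # Llama-3 Instruct layout: each turn is wrapped in header tokens
    # and terminated with <|eot_id|>; the trailing assistant header
    # cues the model to generate its reply.
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n" + system + "<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n" + user + "<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = llama3_instruct_prompt("You are a helpful assistant.", "Hello!")
```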
321
+ <p>
322
+ Feel free to use other sampler settings; these are just sane defaults. XTC is good for roleplaying with the model
323
+ but may not be beneficial for other tasks.
324
+ </p>
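Min P, recommended above at 0.06-0.12, discards any token whose probability falls below that fraction of the most likely token's probability. A toy illustration of the filtering rule (not any particular backend's implementation):

```python
def min_p_filter(probs: dict, min_p: float = 0.1) -> dict:
    # Keep tokens whose probability is at least min_p times the
    # probability of the most likely token, then renormalize.
    cutoff = min_p * max(probs.values())
    kept = {tok: p for tok, p in probs.items() if p >= cutoff}
    total = sum(kept.values())
    return {tok: p / total for tok, p in kept.items()}

# With min_p=0.1, tokens below 10% of the top probability are dropped.
filtered = min_p_filter({"a": 0.6, "b": 0.3, "c": 0.05, "d": 0.05}, min_p=0.1)
```

Lower `min_p` values keep more of the tail (more varied output); higher values prune harder, which pairs well with the higher temperatures recommended above.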
325
+ <h3>Context Length</h3>
326
+ <p>
327
+ The model has been tested in roleplays using up to <b>32,768 token context</b> at various quantizations and is
328
+ incredibly stable at this context length.
329
+ </p>
330
+ <p>
331
+ The model may remain coherent at even longer context lengths, but anything beyond 32,768 tokens was outside
332
+ the scope of our testing.
333
+ </p>
334
+ <div class="rune-separator"></div>
335
+ <h2>Sip the Poison...</h2>
336
+ <p>
337
+ Here, you can find example outputs from the LLM in response to various instructions. For each example, the model was
338
+ run at FP8 with 1.0 temperature, 0.1 min-p, 1.04 repetition penalty, and all other samplers neutralized.
339
+ </p>
340
+ <ul>
341
+ <li>
342
+ <a href="https://huggingface.co/Black-Ink-Guild/Pernicious_Prophecy_70B/blob/main/nasa.md">Write a 2000 word, Markdown-formatted, report for NASA. Evaluate each of Jupiter's moons as a suitable
343
+ colony with pros & cons, then provide a recommendation.</a>
344
+ </li>
345
+ <li>
346
+ <a href="https://huggingface.co/Black-Ink-Guild/Pernicious_Prophecy_70B/blob/main/tone.md">Write me a 3,000 word opening chapter of a 'gritty hard sci-fi' novel, drawing inspiration from
347
+ the writing styles of Isaac Asimov & Andy Weir. Use third person personal. Include dialogue and internal monologues.
348
+ The POV character for the opening chapter should be a 26 year old astronaut called Tone on a mission to Europa, who
349
+ has just realised that the craft for the return journey is broken beyond repair, and he only has supplies for a few
350
+ months. Given that survival is impossible, he seeks to spend the few months he has researching titan, so his life
351
+ &amp; mission are not wasted.</a>
352
+ </li>
353
+ <li>
354
+ <a href="https://huggingface.co/Black-Ink-Guild/Pernicious_Prophecy_70B/blob/main/cookie.md">Build me a basic cookie clicker game in HTML & Javascript.</a><br />
355
+ </li>
356
+ </ul>
357
+ <p>
358
+ Each example is the better of two generated responses.
359
+ </p>
360
+ <div class="rune-separator"></div>
361
+ <h2>The Codex...</h2>
362
+ <p>
363
+ Here, you can find some useful prompting tips for working with Pernicious Prophecy 70B.
364
+ </p>
365
+ <h3>Formatting</h3>
366
+ <p>
367
+ 'Use markdown' and 'use formatting' are likely to produce the best-formatted output. We deliberately trained
368
+ formatting behind these trigger words to avoid random Markdown creeping into roleplay replies.
369
+ </p>
370
+ <h3>System Prompting</h3>
371
+ <p>
372
+ Pernicious Prophecy 70B is very sensitive to prompting, even over long context. The more you instruct it, the more
373
+ it will know what you want it to do.
374
+ </p>
375
+ <p>
376
+ 'Avoid purple prose, avoid cliches, avoid deus ex machinae' is a useful prompt snippet for roleplaying purposes.
377
+ For best results, don't use your roleplay prompt when using Pernicious Prophecy as an assistant.
378
+ </p>
379
+ <div class="rune-separator"></div>
380
+ <h2>Assembling the Repertoire...</h2>
381
+ <p>
382
+ We used a two-step process: a merge step to combine the abilities of some of the best L3 70B models on Huggingface
383
+ and a gentle SFT training step to heal the merge and address some issues around refusals and positivity bias.
384
+ </p>
385
+ <h3>The Merge Step</h3>
386
+ <p>
387
+ First, a
388
+ <code>model_stock</code> merge was applied using four high-quality Llama-3 based models:
389
+ <ul>
390
+ <li>
391
+ <b>SicariusSicariiStuff/Negative_LLAMA_70B</b> - chosen as the base model for its low censorship,
392
+ reduced positivity bias, and engaging writing style.
393
+ </li>
394
+ <li>
395
+ <b>invisietch/L3.1-70Blivion-v0.1-rc1-70B</b> - added for its exceptional formatting, roleplay performance,
396
+ and general intelligence.
397
+ </li>
398
+ <li>
399
+ <b>EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.1</b> - selected for its strength in longer-form storytelling, varied
400
+ outputs, and quality of thought.
401
+ </li>
402
+ <li>
403
+ <b>aaditya/Llama3-OpenBioLLM-70B</b> - to add a better understanding of anatomy, and another long-form reasoning
404
+ model to the stack.
405
+ </li>
406
+ </ul>
407
+ </p>
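Our exact merge configuration is not reproduced here; a <code>model_stock</code> recipe over these four models would look roughly like the following mergekit config (the structure follows mergekit's config format, but the dtype and ordering are illustrative assumptions, not our actual settings):

```yaml
# Hypothetical mergekit recipe - structure only, not the exact config used
merge_method: model_stock
base_model: SicariusSicariiStuff/Negative_LLAMA_70B
models:
  - model: invisietch/L3.1-70Blivion-v0.1-rc1-70B
  - model: EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.1
  - model: aaditya/Llama3-OpenBioLLM-70B
dtype: bfloat16
```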
408
+ <h3>The Finetuning Step</h3>
409
+ <p>
410
+ We used a <b>qlora-based</b>, targeted finetune on 2x NVIDIA RTX A6000 GPUs, with a curated dataset of
411
+ approximately 18 million tokens designed to surgically address issues that we identified in the merge.
412
+ </p>
413
+ <p>
414
+ The finetuning took about 14 hours in total, using Axolotl, and targeted specific high-priority LoRA modules,
415
+ which allowed us to maintain a 16k sequence length within 96 GB of VRAM.
416
+ </p>
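For readers unfamiliar with Axolotl, a QLoRA run of this shape is configured declaratively; a sketch of the relevant excerpt (the keys are standard Axolotl options, but every value here is a hypothetical placeholder, not our actual hyperparameters):

```yaml
# Hypothetical Axolotl excerpt - illustrates the shape of a QLoRA run,
# not the actual configuration used for this model
base_model: ./pernicious-prophecy-merge  # placeholder path to the merged model
adapter: qlora
load_in_4bit: true
sequence_len: 16384
lora_r: 32
lora_alpha: 16
lora_target_modules:
  - q_proj
  - v_proj
```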
417
+ <div class="sidebyside" style="padding-bottom:2em;">
418
+ <a href="https://github.com/arcee-ai/mergekit">
419
+ <img
420
+ class="badge"
421
+ src="https://huggingface.co/Black-Ink-Guild/READMETEST/resolve/main/mergekit.png"
422
+ alt="Built with Mergekit"
423
+ width="200"
424
+ height="32"
425
+ />
426
+ </a>
427
+ <a href="https://github.com/axolotl-ai-cloud/axolotl">
428
+ <img
429
+ class="badge"
430
+ src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png"
431
+ alt="Built with Axolotl"
432
+ width="200"
433
+ height="32"
434
+ />
+ </a>
435
+ </div>
436
+ </div>
437
+ </body>
438
+
439
+ </html>
config.json ADDED
@@ -0,0 +1,47 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "_name_or_path": "Black-Ink-Guild/Pernicious_Prophecy_70B",
3
+ "architectures": [
4
+ "LlamaForCausalLM"
5
+ ],
6
+ "attention_bias": false,
7
+ "attention_dropout": 0.0,
8
+ "bos_token_id": 128000,
9
+ "eos_token_id": 128009,
10
+ "head_dim": 128,
11
+ "hidden_act": "silu",
12
+ "hidden_size": 8192,
13
+ "initializer_range": 0.02,
14
+ "intermediate_size": 28672,
15
+ "max_position_embeddings": 131072,
16
+ "mlp_bias": false,
17
+ "model_type": "llama",
18
+ "num_attention_heads": 64,
19
+ "num_hidden_layers": 80,
20
+ "num_key_value_heads": 8,
21
+ "pretraining_tp": 1,
22
+ "rms_norm_eps": 1e-05,
23
+ "rope_scaling": {
24
+ "factor": 8.0,
25
+ "high_freq_factor": 4.0,
26
+ "low_freq_factor": 1.0,
27
+ "original_max_position_embeddings": 8192,
28
+ "rope_type": "llama3"
29
+ },
30
+ "rope_theta": 500000.0,
31
+ "tie_word_embeddings": false,
32
+ "torch_dtype": "bfloat16",
33
+ "transformers_version": "4.47.1",
34
+ "use_cache": false,
35
+ "vocab_size": 128259,
36
+ "quantization_config": {
37
+ "quant_method": "exl2",
38
+ "version": "0.2.7",
39
+ "bits": 2.25,
40
+ "head_bits": 6,
41
+ "calibration": {
42
+ "rows": 115,
43
+ "length": 2048,
44
+ "dataset": "(default)"
45
+ }
46
+ }
47
+ }
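The attention-shape hyperparameters in this config are internally consistent; a quick sanity check of the values above:

```python
# Values copied from config.json
hidden_size = 8192
num_attention_heads = 64
num_key_value_heads = 8
head_dim = 128

# Per-head width times head count recovers the model width...
assert hidden_size // num_attention_heads == head_dim
# ...and grouped-query attention shares each KV head across 8 query heads.
assert num_attention_heads // num_key_value_heads == 8
```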
cookie.md ADDED
@@ -0,0 +1,134 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ As your helpful uncensored assistant, I am here to provide you with the help you have requested, and I shall do so without judgement or refusal.
2
+
3
+ Here's a simple example of a Cookie Clicker-like game implemented in HTML, CSS, and JavaScript:
4
+
5
+ HTML:
6
+
7
+ ```html
8
+ <!DOCTYPE html>
9
+ <html lang="en">
10
+ <head>
11
+ <meta charset="UTF-8">
12
+ <meta name="viewport" content="width=device-width, initial-scale=1.0">
13
+ <title>Cookie Clicker</title>
14
+ <link rel="stylesheet" href="style.css">
15
+ </head>
16
+ <body>
17
+ <div class="game-container">
18
+ <h1 id="cookie-count">Cookies: 0</h1>
19
+ <button id="click-button">Click me!</button>
20
+ <div id="store"></div>
21
+ </div>
22
+
23
+ <script src="script.js"></script>
24
+ </body>
25
+ </html>
26
+ ```
27
+
28
+ CSS (in style.css file):
29
+
30
+ ```css
31
+ .game-container {
32
+ text-align: center;
33
+ margin-top: 20px;
34
+ }
35
+
36
+ #click-button {
37
+ padding: 10px 20px;
38
+ font-size: 24px;
39
+ cursor: pointer;
40
+ }
41
+
42
+ #store {
43
+ margin-top: 20px;
44
+ }
45
+ ```
46
+
47
+ JavaScript (in script.js file):
48
+
49
+ ```javascript
50
+ let cookies = 0;
51
+ let cursors = 0;
52
+ let grandmas = 0;
53
+ let factories = 0;
54
+ let cursorPrice = 15;
55
+ let grandmaPrice = 100;
56
+ let factoryPrice = 500;
57
+
58
+ const cookieCountElement = document.getElementById('cookie-count');
59
+ const clickButton = document.getElementById('click-button');
60
+ const storeElement = document.getElementById('store');
61
+
62
+ // Function to update the cookie count display
63
+ function updateCookieCount() {
64
+ cookieCountElement.innerText = `Cookies: ${cookies}`;
65
+ }
66
+
67
+ // Function to handle cursor upgrades
68
+ function buyCursor() {
69
+ if (cookies >= cursorPrice) {
70
+ cookies -= cursorPrice;
71
+ cursors++;
72
+ cursorPrice = Math.floor(cursorPrice * 1.15);
73
+ updateCookieCount();
74
+ updateStore();
75
+ }
76
+ }
77
+
78
+ // Function to handle grandma upgrades
79
+ function buyGrandma() {
80
+ if (cookies >= grandmaPrice) {
81
+ cookies -= grandmaPrice;
82
+ grandmas++;
83
+ grandmaPrice = Math.floor(grandmaPrice * 1.15);
84
+ updateCookieCount();
85
+ updateStore();
86
+ }
87
+ }
88
+
89
+ // Function to handle factory upgrades
90
+ function buyFactory() {
91
+ if (cookies >= factoryPrice) {
92
+ cookies -= factoryPrice;
93
+ factories++;
94
+ factoryPrice = Math.floor(factoryPrice * 1.15);
95
+ updateCookieCount();
96
+ updateStore();
97
+ }
98
+ }
99
+
100
+ // Function to update the store with current prices
101
+ function updateStore() {
102
+ storeElement.innerHTML = `
103
+ <p>Cursors: ${cursors}</p>
104
+ <button onclick="buyCursor()">Buy Cursor (${cursorPrice} cookies)</button>
105
+ <p>Grandmas: ${grandmas}</p>
106
+ <button onclick="buyGrandma()">Buy Grandma (${grandmaPrice} cookies)</button>
107
+ <p>Factories: ${factories}</p>
108
+ <button onclick="buyFactory()">Buy Factory (${factoryPrice} cookies)</button>
109
+ `;
110
+ }
111
+
112
+ // Click event listener for the main button
113
+ clickButton.addEventListener('click', () => {
114
+ cookies++;
115
+ updateCookieCount();
116
+ });
117
+
118
+ // Automatic cookie generation based on upgrades
119
+ setInterval(() => {
120
+ cookies += cursors;
121
+ cookies += grandmas * 5;
122
+ cookies += factories * 10;
123
+ updateCookieCount();
124
+ }, 1000);
125
+
126
+ // Initial store update
127
+ updateStore();
128
+ ```
129
+
130
+ This example provides a basic Cookie Clicker game where you start with a single clickable button. As you click, you earn cookies. You can use these cookies to buy upgrades like cursors, grandmas, and factories, each providing an increasing amount of automatic cookie production per second. The prices of the upgrades increase by 15% with each purchase.
131
+
132
+ **Editor's note:** This was the output from the game, with no changes to the code:
133
+
134
+ ![Cookie Clicker Game](./cookies.png)
generation_config.json ADDED
@@ -0,0 +1,8 @@
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "_from_model_config": true,
3
+ "bos_token_id": 128000,
4
+ "do_sample": true,
5
+ "eos_token_id": 128009,
6
+ "transformers_version": "4.47.1",
7
+ "use_cache": false
8
+ }
huggingface-metadata.txt ADDED
@@ -0,0 +1,35 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ url: https://huggingface.co/Black-Ink-Guild/Pernicious_Prophecy_70B
2
+ branch: main
3
+ download date: 2025-02-05 17:09:09
4
+ sha256sum:
5
+ 7d86801d23ef9a840bc36c5ba5440e740b279a6c6b4e38d1de9377307d12ae83 model-00001-of-00030.safetensors
6
+ a2e69a1b41565674fecc56b5c335b70ae8a6081f933a5b78e5b1544680b01842 model-00002-of-00030.safetensors
7
+ cf74f91891c1bf65da26fb4b3e0920c93fd0d311e99160fe4352a03b917ca786 model-00003-of-00030.safetensors
8
+ 5957c3c18eff179d79476d0b6b28bfdf04077185f814a3ce104222994319b827 model-00004-of-00030.safetensors
9
+ 6db314053b37f7081ee182b4b373fb5c706cfd598afeeb0bb0cd1e1b1c9a8482 model-00005-of-00030.safetensors
10
+ 1ae4454c48930abc7c68a4b119f55ee7b3c017b1a5b3eb6f9962411bf6bd30ed model-00006-of-00030.safetensors
11
+ 0142f2e5f23b3ad2b5ae27a28caa8626f47e57add591bfd29de6a662176c9fc9 model-00007-of-00030.safetensors
12
+ c0e186b5752470dbbdb2385ea96187c0fd82a625a6edd306311188f9f48bc3c5 model-00008-of-00030.safetensors
13
+ b1befe19c01593f28f2805c7576f971abe92ce17355ffa1c28d6fa4466e35a6a model-00009-of-00030.safetensors
14
+ 28c307bdc69a23995c4954695d09de803ae85bd100e1128c9a49dfc00f353f6b model-00010-of-00030.safetensors
15
+ a1603c1773a7a8d4c9dc56e8d4c523961e904dcdea4563c3375ee489a20a1f5d model-00011-of-00030.safetensors
16
+ f2c73f53fb9533132d4c63ff40192ab032b1ed68e31dabfe83e8e0f204059cc6 model-00012-of-00030.safetensors
17
+ 441666bbdb72fe1e87b8aaca070864afcdc3f2ef57d5654f34d45ee4cc0a049d model-00013-of-00030.safetensors
18
+ 05fdb966b669bb4ea3973c7d9ebe9093c7900446b6bd1ca38f00671c60e25d44 model-00014-of-00030.safetensors
19
+ 57e2ca3605de8ac53b848e9c8adbe5c1488c01f80d7a7631fd2e218cafe9f4e7 model-00015-of-00030.safetensors
20
+ c2e72b9dbeb2a500ed7e6128c4858125fb938de1d2b2f3827ac1499ccec53169 model-00016-of-00030.safetensors
21
+ 134b70e1677f22c1092c7929f7e8d571a7ba083b16af09c5131ceb98c1cb1011 model-00017-of-00030.safetensors
22
+ 35957c13a3cf54b4114c70d17af53f2bb5506bc19588be9f3e180a0cabfdad89 model-00018-of-00030.safetensors
23
+ 4d028587a31a45ebc27a1770b022c0d92887ceb6341318a0a90261f13e1b0870 model-00019-of-00030.safetensors
24
+ 47cafd524febbf1a5e01046e1e96b9b3dbcf6799ab761d8f371fe5838d5862d4 model-00020-of-00030.safetensors
25
+ 7b1ecae686280dad52927342c0f29006a2d9769d491daa78f99fa419d6570b3c model-00021-of-00030.safetensors
26
+ cab9ec88a64f7960b7d04306aa47eef849f97f647b2481ede52c7c9bb3351896 model-00022-of-00030.safetensors
27
+ dbca10f148c7607d5e73171d8957ee65a755104011100c7dd42cba528c50ece4 model-00023-of-00030.safetensors
28
+ 151edae344e5b8d9d3f77546e9463d4357016d3ac30435ec46bc31d060792df4 model-00024-of-00030.safetensors
29
+ 011266e86f58b5e3a5a1e9409295b57b5355b5663950078d9235c51cb657c9c2 model-00025-of-00030.safetensors
30
+ d4635b6d8de349870c17a827d2b14c8a2e1e3253182938d97ef77385e723f8d2 model-00026-of-00030.safetensors
31
+ b05dc9362f72af7dbd5c1696446cf68034d3e1a49c82d144e2b50c78b09e9fba model-00027-of-00030.safetensors
32
+ ca72c5eb2039ad1ee8e9f1045140a9c7a60f2bf4ea9c78cb66e154a7521d86fc model-00028-of-00030.safetensors
33
+ 0ce067a284bae8d99a0609e59b8f3be1da29dd9718dc23bf229d7da1094003bb model-00029-of-00030.safetensors
34
+ 7fb75b2c305b6d905f5dcafb82be0992b75c6683c45538a3a9ada7f381c7cdd4 model-00030-of-00030.safetensors
35
+ 175c60636c69844781e0741a5661d66530386e50bb98fd6b6f2f1b2b8bd51979 tokenizer.json
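The digests above can be re-checked after download without loading whole shards into memory; a small sketch (shard filenames are assumed to match the listing):

```python
import hashlib

def sha256_of(path: str, chunk: int = 1 << 20) -> str:
    # Stream the file in 1 MiB chunks so multi-gigabyte shards
    # never need to fit in memory at once.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Compare against the published digest for a given shard, e.g.:
# sha256_of("model-00001-of-00030.safetensors")
```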
measurement.json ADDED
The diff for this file is too large to render. See raw diff
 
model.safetensors.index.json ADDED
@@ -0,0 +1,730 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "metadata": {
3
+ "total_size": 141107511296
4
+ },
5
+ "weight_map": {
6
+ "lm_head.weight": "model-00030-of-00030.safetensors",
7
+ "model.embed_tokens.weight": "model-00001-of-00030.safetensors",
8
+ "model.layers.0.input_layernorm.weight": "model-00001-of-00030.safetensors",
9
+ "model.layers.0.mlp.down_proj.weight": "model-00001-of-00030.safetensors",
10
+ "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00030.safetensors",
11
+ "model.layers.0.mlp.up_proj.weight": "model-00001-of-00030.safetensors",
12
+ "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00030.safetensors",
13
+ "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00030.safetensors",
14
+ "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00030.safetensors",
15
+ "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00030.safetensors",
16
+ "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00030.safetensors",
17
+ "model.layers.1.input_layernorm.weight": "model-00002-of-00030.safetensors",
18
+ "model.layers.1.mlp.down_proj.weight": "model-00002-of-00030.safetensors",
19
+ "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00030.safetensors",
20
+ "model.layers.1.mlp.up_proj.weight": "model-00002-of-00030.safetensors",
21
+ "model.layers.1.post_attention_layernorm.weight": "model-00002-of-00030.safetensors",
22
+ "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00030.safetensors",
23
+ "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00030.safetensors",
+ "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00030.safetensors",
+ "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00030.safetensors",
+ "model.layers.10.input_layernorm.weight": "model-00005-of-00030.safetensors",
+ "model.layers.10.mlp.down_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.10.mlp.gate_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.10.mlp.up_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.10.post_attention_layernorm.weight": "model-00005-of-00030.safetensors",
+ "model.layers.10.self_attn.k_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.10.self_attn.o_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.10.self_attn.q_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.10.self_attn.v_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.input_layernorm.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.mlp.down_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.mlp.gate_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.mlp.up_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.post_attention_layernorm.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.self_attn.k_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.self_attn.o_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.self_attn.q_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.self_attn.v_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.12.input_layernorm.weight": "model-00006-of-00030.safetensors",
+ "model.layers.12.mlp.down_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.12.mlp.gate_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.12.mlp.up_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.12.post_attention_layernorm.weight": "model-00006-of-00030.safetensors",
+ "model.layers.12.self_attn.k_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.12.self_attn.o_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.12.self_attn.q_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.12.self_attn.v_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.13.input_layernorm.weight": "model-00006-of-00030.safetensors",
+ "model.layers.13.mlp.down_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.13.mlp.gate_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.13.mlp.up_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.13.post_attention_layernorm.weight": "model-00006-of-00030.safetensors",
+ "model.layers.13.self_attn.k_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.13.self_attn.o_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.13.self_attn.q_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.13.self_attn.v_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.input_layernorm.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.mlp.down_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.mlp.gate_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.mlp.up_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.post_attention_layernorm.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.self_attn.k_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.self_attn.o_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.self_attn.q_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.self_attn.v_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.15.input_layernorm.weight": "model-00007-of-00030.safetensors",
+ "model.layers.15.mlp.down_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.15.mlp.gate_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.15.mlp.up_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.15.post_attention_layernorm.weight": "model-00007-of-00030.safetensors",
+ "model.layers.15.self_attn.k_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.15.self_attn.o_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.15.self_attn.q_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.15.self_attn.v_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.16.input_layernorm.weight": "model-00007-of-00030.safetensors",
+ "model.layers.16.mlp.down_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.16.mlp.gate_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.16.mlp.up_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.16.post_attention_layernorm.weight": "model-00007-of-00030.safetensors",
+ "model.layers.16.self_attn.k_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.16.self_attn.o_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.16.self_attn.q_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.16.self_attn.v_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.input_layernorm.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.mlp.down_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.mlp.gate_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.mlp.up_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.post_attention_layernorm.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.self_attn.k_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.self_attn.o_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.self_attn.q_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.self_attn.v_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.18.input_layernorm.weight": "model-00008-of-00030.safetensors",
+ "model.layers.18.mlp.down_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.18.mlp.gate_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.18.mlp.up_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.18.post_attention_layernorm.weight": "model-00008-of-00030.safetensors",
+ "model.layers.18.self_attn.k_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.18.self_attn.o_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.18.self_attn.q_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.18.self_attn.v_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.19.input_layernorm.weight": "model-00008-of-00030.safetensors",
+ "model.layers.19.mlp.down_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.19.mlp.gate_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.19.mlp.up_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.19.post_attention_layernorm.weight": "model-00008-of-00030.safetensors",
+ "model.layers.19.self_attn.k_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.19.self_attn.o_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.19.self_attn.q_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.19.self_attn.v_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.2.input_layernorm.weight": "model-00002-of-00030.safetensors",
+ "model.layers.2.mlp.down_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.2.mlp.gate_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.2.mlp.up_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.2.post_attention_layernorm.weight": "model-00002-of-00030.safetensors",
+ "model.layers.2.self_attn.k_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.2.self_attn.o_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.2.self_attn.q_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.2.self_attn.v_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.20.input_layernorm.weight": "model-00008-of-00030.safetensors",
+ "model.layers.20.mlp.down_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.20.mlp.gate_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.20.mlp.up_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.20.post_attention_layernorm.weight": "model-00008-of-00030.safetensors",
+ "model.layers.20.self_attn.k_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.20.self_attn.o_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.20.self_attn.q_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.20.self_attn.v_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.21.input_layernorm.weight": "model-00009-of-00030.safetensors",
+ "model.layers.21.mlp.down_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.21.mlp.gate_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.21.mlp.up_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.21.post_attention_layernorm.weight": "model-00009-of-00030.safetensors",
+ "model.layers.21.self_attn.k_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.21.self_attn.o_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.21.self_attn.q_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.21.self_attn.v_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.22.input_layernorm.weight": "model-00009-of-00030.safetensors",
+ "model.layers.22.mlp.down_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.22.mlp.gate_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.22.mlp.up_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.22.post_attention_layernorm.weight": "model-00009-of-00030.safetensors",
+ "model.layers.22.self_attn.k_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.22.self_attn.o_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.22.self_attn.q_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.22.self_attn.v_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.input_layernorm.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.mlp.down_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.mlp.gate_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.mlp.up_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.post_attention_layernorm.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.self_attn.k_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.self_attn.o_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.self_attn.q_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.self_attn.v_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.24.input_layernorm.weight": "model-00010-of-00030.safetensors",
+ "model.layers.24.mlp.down_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.24.mlp.gate_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.24.mlp.up_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.24.post_attention_layernorm.weight": "model-00010-of-00030.safetensors",
+ "model.layers.24.self_attn.k_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.24.self_attn.o_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.24.self_attn.q_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.24.self_attn.v_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.input_layernorm.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.mlp.down_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.mlp.gate_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.mlp.up_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.post_attention_layernorm.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.self_attn.k_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.self_attn.o_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.self_attn.q_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.self_attn.v_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.26.input_layernorm.weight": "model-00011-of-00030.safetensors",
+ "model.layers.26.mlp.down_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.26.mlp.gate_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.26.mlp.up_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.26.post_attention_layernorm.weight": "model-00011-of-00030.safetensors",
+ "model.layers.26.self_attn.k_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.26.self_attn.o_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.26.self_attn.q_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.26.self_attn.v_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.27.input_layernorm.weight": "model-00011-of-00030.safetensors",
+ "model.layers.27.mlp.down_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.27.mlp.gate_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.27.mlp.up_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.27.post_attention_layernorm.weight": "model-00011-of-00030.safetensors",
+ "model.layers.27.self_attn.k_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.27.self_attn.o_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.27.self_attn.q_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.27.self_attn.v_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.input_layernorm.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.mlp.down_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.mlp.gate_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.mlp.up_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.post_attention_layernorm.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.self_attn.k_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.self_attn.o_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.self_attn.q_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.self_attn.v_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.29.input_layernorm.weight": "model-00012-of-00030.safetensors",
+ "model.layers.29.mlp.down_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.29.mlp.gate_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.29.mlp.up_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.29.post_attention_layernorm.weight": "model-00012-of-00030.safetensors",
+ "model.layers.29.self_attn.k_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.29.self_attn.o_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.29.self_attn.q_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.29.self_attn.v_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.3.input_layernorm.weight": "model-00002-of-00030.safetensors",
+ "model.layers.3.mlp.down_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.3.mlp.gate_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.3.mlp.up_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.3.post_attention_layernorm.weight": "model-00002-of-00030.safetensors",
+ "model.layers.3.self_attn.k_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.3.self_attn.o_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.3.self_attn.q_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.3.self_attn.v_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.30.input_layernorm.weight": "model-00012-of-00030.safetensors",
+ "model.layers.30.mlp.down_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.30.mlp.gate_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.30.mlp.up_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.30.post_attention_layernorm.weight": "model-00012-of-00030.safetensors",
+ "model.layers.30.self_attn.k_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.30.self_attn.o_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.30.self_attn.q_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.30.self_attn.v_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.input_layernorm.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.mlp.down_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.mlp.gate_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.mlp.up_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.post_attention_layernorm.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.self_attn.k_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.self_attn.o_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.self_attn.q_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.self_attn.v_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.32.input_layernorm.weight": "model-00013-of-00030.safetensors",
+ "model.layers.32.mlp.down_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.32.mlp.gate_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.32.mlp.up_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.32.post_attention_layernorm.weight": "model-00013-of-00030.safetensors",
+ "model.layers.32.self_attn.k_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.32.self_attn.o_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.32.self_attn.q_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.32.self_attn.v_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.33.input_layernorm.weight": "model-00013-of-00030.safetensors",
+ "model.layers.33.mlp.down_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.33.mlp.gate_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.33.mlp.up_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.33.post_attention_layernorm.weight": "model-00013-of-00030.safetensors",
+ "model.layers.33.self_attn.k_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.33.self_attn.o_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.33.self_attn.q_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.33.self_attn.v_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.input_layernorm.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.mlp.down_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.mlp.gate_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.mlp.up_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.post_attention_layernorm.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.self_attn.k_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.self_attn.o_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.self_attn.q_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.self_attn.v_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.35.input_layernorm.weight": "model-00014-of-00030.safetensors",
+ "model.layers.35.mlp.down_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.35.mlp.gate_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.35.mlp.up_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.35.post_attention_layernorm.weight": "model-00014-of-00030.safetensors",
+ "model.layers.35.self_attn.k_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.35.self_attn.o_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.35.self_attn.q_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.35.self_attn.v_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.36.input_layernorm.weight": "model-00014-of-00030.safetensors",
+ "model.layers.36.mlp.down_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.36.mlp.gate_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.36.mlp.up_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.36.post_attention_layernorm.weight": "model-00014-of-00030.safetensors",
+ "model.layers.36.self_attn.k_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.36.self_attn.o_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.36.self_attn.q_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.36.self_attn.v_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.input_layernorm.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.mlp.down_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.mlp.gate_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.mlp.up_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.post_attention_layernorm.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.self_attn.k_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.self_attn.o_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.self_attn.q_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.self_attn.v_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.38.input_layernorm.weight": "model-00015-of-00030.safetensors",
+ "model.layers.38.mlp.down_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.38.mlp.gate_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.38.mlp.up_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.38.post_attention_layernorm.weight": "model-00015-of-00030.safetensors",
+ "model.layers.38.self_attn.k_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.38.self_attn.o_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.38.self_attn.q_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.38.self_attn.v_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.input_layernorm.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.mlp.down_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.mlp.gate_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.mlp.up_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.post_attention_layernorm.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.self_attn.k_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.self_attn.o_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.self_attn.q_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.self_attn.v_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.4.input_layernorm.weight": "model-00003-of-00030.safetensors",
+ "model.layers.4.mlp.down_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.4.mlp.gate_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.4.mlp.up_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.4.post_attention_layernorm.weight": "model-00003-of-00030.safetensors",
+ "model.layers.4.self_attn.k_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.4.self_attn.o_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.4.self_attn.q_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.4.self_attn.v_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.40.input_layernorm.weight": "model-00016-of-00030.safetensors",
+ "model.layers.40.mlp.down_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.40.mlp.gate_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.40.mlp.up_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.40.post_attention_layernorm.weight": "model-00016-of-00030.safetensors",
+ "model.layers.40.self_attn.k_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.40.self_attn.o_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.40.self_attn.q_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.40.self_attn.v_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.41.input_layernorm.weight": "model-00016-of-00030.safetensors",
+ "model.layers.41.mlp.down_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.41.mlp.gate_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.41.mlp.up_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.41.post_attention_layernorm.weight": "model-00016-of-00030.safetensors",
+ "model.layers.41.self_attn.k_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.41.self_attn.o_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.41.self_attn.q_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.41.self_attn.v_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.input_layernorm.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.mlp.down_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.mlp.gate_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.mlp.up_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.post_attention_layernorm.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.self_attn.k_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.self_attn.o_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.self_attn.q_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.self_attn.v_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.43.input_layernorm.weight": "model-00017-of-00030.safetensors",
+ "model.layers.43.mlp.down_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.43.mlp.gate_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.43.mlp.up_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.43.post_attention_layernorm.weight": "model-00017-of-00030.safetensors",
+ "model.layers.43.self_attn.k_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.43.self_attn.o_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.43.self_attn.q_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.43.self_attn.v_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.44.input_layernorm.weight": "model-00017-of-00030.safetensors",
+ "model.layers.44.mlp.down_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.44.mlp.gate_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.44.mlp.up_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.44.post_attention_layernorm.weight": "model-00017-of-00030.safetensors",
+ "model.layers.44.self_attn.k_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.44.self_attn.o_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.44.self_attn.q_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.44.self_attn.v_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.input_layernorm.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.mlp.down_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.mlp.gate_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.mlp.up_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.post_attention_layernorm.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.self_attn.k_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.self_attn.o_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.self_attn.q_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.self_attn.v_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.46.input_layernorm.weight": "model-00018-of-00030.safetensors",
+ "model.layers.46.mlp.down_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.46.mlp.gate_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.46.mlp.up_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.46.post_attention_layernorm.weight": "model-00018-of-00030.safetensors",
+ "model.layers.46.self_attn.k_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.46.self_attn.o_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.46.self_attn.q_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.46.self_attn.v_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.47.input_layernorm.weight": "model-00018-of-00030.safetensors",
+ "model.layers.47.mlp.down_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.47.mlp.gate_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.47.mlp.up_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.47.post_attention_layernorm.weight": "model-00018-of-00030.safetensors",
+ "model.layers.47.self_attn.k_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.47.self_attn.o_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.47.self_attn.q_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.47.self_attn.v_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.input_layernorm.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.mlp.down_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.mlp.gate_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.mlp.up_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.post_attention_layernorm.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.self_attn.k_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.self_attn.o_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.self_attn.q_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.self_attn.v_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.49.input_layernorm.weight": "model-00019-of-00030.safetensors",
+ "model.layers.49.mlp.down_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.49.mlp.gate_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.49.mlp.up_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.49.post_attention_layernorm.weight": "model-00019-of-00030.safetensors",
+ "model.layers.49.self_attn.k_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.49.self_attn.o_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.49.self_attn.q_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.49.self_attn.v_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.5.input_layernorm.weight": "model-00003-of-00030.safetensors",
+ "model.layers.5.mlp.down_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.5.mlp.gate_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.5.mlp.up_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.5.post_attention_layernorm.weight": "model-00003-of-00030.safetensors",
+ "model.layers.5.self_attn.k_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.5.self_attn.o_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.5.self_attn.q_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.5.self_attn.v_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.50.input_layernorm.weight": "model-00019-of-00030.safetensors",
+ "model.layers.50.mlp.down_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.50.mlp.gate_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.50.mlp.up_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.50.post_attention_layernorm.weight": "model-00019-of-00030.safetensors",
+ "model.layers.50.self_attn.k_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.50.self_attn.o_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.50.self_attn.q_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.50.self_attn.v_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.input_layernorm.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.mlp.down_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.mlp.gate_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.mlp.up_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.post_attention_layernorm.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.self_attn.k_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.self_attn.o_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.self_attn.q_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.self_attn.v_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.52.input_layernorm.weight": "model-00020-of-00030.safetensors",
+ "model.layers.52.mlp.down_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.52.mlp.gate_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.52.mlp.up_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.52.post_attention_layernorm.weight": "model-00020-of-00030.safetensors",
+ "model.layers.52.self_attn.k_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.52.self_attn.o_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.52.self_attn.q_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.52.self_attn.v_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.53.input_layernorm.weight": "model-00020-of-00030.safetensors",
+ "model.layers.53.mlp.down_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.53.mlp.gate_proj.weight": "model-00020-of-00030.safetensors",
452
+ "model.layers.53.mlp.up_proj.weight": "model-00020-of-00030.safetensors",
453
+ "model.layers.53.post_attention_layernorm.weight": "model-00020-of-00030.safetensors",
454
+ "model.layers.53.self_attn.k_proj.weight": "model-00020-of-00030.safetensors",
455
+ "model.layers.53.self_attn.o_proj.weight": "model-00020-of-00030.safetensors",
456
+ "model.layers.53.self_attn.q_proj.weight": "model-00020-of-00030.safetensors",
457
+ "model.layers.53.self_attn.v_proj.weight": "model-00020-of-00030.safetensors",
458
+ "model.layers.54.input_layernorm.weight": "model-00021-of-00030.safetensors",
459
+ "model.layers.54.mlp.down_proj.weight": "model-00021-of-00030.safetensors",
460
+ "model.layers.54.mlp.gate_proj.weight": "model-00020-of-00030.safetensors",
461
+ "model.layers.54.mlp.up_proj.weight": "model-00020-of-00030.safetensors",
462
+ "model.layers.54.post_attention_layernorm.weight": "model-00021-of-00030.safetensors",
463
+ "model.layers.54.self_attn.k_proj.weight": "model-00020-of-00030.safetensors",
464
+ "model.layers.54.self_attn.o_proj.weight": "model-00020-of-00030.safetensors",
465
+ "model.layers.54.self_attn.q_proj.weight": "model-00020-of-00030.safetensors",
466
+ "model.layers.54.self_attn.v_proj.weight": "model-00020-of-00030.safetensors",
467
+ "model.layers.55.input_layernorm.weight": "model-00021-of-00030.safetensors",
468
+ "model.layers.55.mlp.down_proj.weight": "model-00021-of-00030.safetensors",
469
+ "model.layers.55.mlp.gate_proj.weight": "model-00021-of-00030.safetensors",
470
+ "model.layers.55.mlp.up_proj.weight": "model-00021-of-00030.safetensors",
471
+ "model.layers.55.post_attention_layernorm.weight": "model-00021-of-00030.safetensors",
472
+ "model.layers.55.self_attn.k_proj.weight": "model-00021-of-00030.safetensors",
473
+ "model.layers.55.self_attn.o_proj.weight": "model-00021-of-00030.safetensors",
474
+ "model.layers.55.self_attn.q_proj.weight": "model-00021-of-00030.safetensors",
475
+ "model.layers.55.self_attn.v_proj.weight": "model-00021-of-00030.safetensors",
476
+ "model.layers.56.input_layernorm.weight": "model-00021-of-00030.safetensors",
477
+ "model.layers.56.mlp.down_proj.weight": "model-00021-of-00030.safetensors",
478
+ "model.layers.56.mlp.gate_proj.weight": "model-00021-of-00030.safetensors",
479
+ "model.layers.56.mlp.up_proj.weight": "model-00021-of-00030.safetensors",
480
+ "model.layers.56.post_attention_layernorm.weight": "model-00021-of-00030.safetensors",
481
+ "model.layers.56.self_attn.k_proj.weight": "model-00021-of-00030.safetensors",
482
+ "model.layers.56.self_attn.o_proj.weight": "model-00021-of-00030.safetensors",
483
+ "model.layers.56.self_attn.q_proj.weight": "model-00021-of-00030.safetensors",
484
+ "model.layers.56.self_attn.v_proj.weight": "model-00021-of-00030.safetensors",
485
+ "model.layers.57.input_layernorm.weight": "model-00022-of-00030.safetensors",
486
+ "model.layers.57.mlp.down_proj.weight": "model-00022-of-00030.safetensors",
487
+ "model.layers.57.mlp.gate_proj.weight": "model-00021-of-00030.safetensors",
488
+ "model.layers.57.mlp.up_proj.weight": "model-00022-of-00030.safetensors",
489
+ "model.layers.57.post_attention_layernorm.weight": "model-00022-of-00030.safetensors",
490
+ "model.layers.57.self_attn.k_proj.weight": "model-00021-of-00030.safetensors",
491
+ "model.layers.57.self_attn.o_proj.weight": "model-00021-of-00030.safetensors",
492
+ "model.layers.57.self_attn.q_proj.weight": "model-00021-of-00030.safetensors",
493
+ "model.layers.57.self_attn.v_proj.weight": "model-00021-of-00030.safetensors",
494
+ "model.layers.58.input_layernorm.weight": "model-00022-of-00030.safetensors",
495
+ "model.layers.58.mlp.down_proj.weight": "model-00022-of-00030.safetensors",
496
+ "model.layers.58.mlp.gate_proj.weight": "model-00022-of-00030.safetensors",
497
+ "model.layers.58.mlp.up_proj.weight": "model-00022-of-00030.safetensors",
498
+ "model.layers.58.post_attention_layernorm.weight": "model-00022-of-00030.safetensors",
499
+ "model.layers.58.self_attn.k_proj.weight": "model-00022-of-00030.safetensors",
500
+ "model.layers.58.self_attn.o_proj.weight": "model-00022-of-00030.safetensors",
501
+ "model.layers.58.self_attn.q_proj.weight": "model-00022-of-00030.safetensors",
502
+ "model.layers.58.self_attn.v_proj.weight": "model-00022-of-00030.safetensors",
503
+ "model.layers.59.input_layernorm.weight": "model-00022-of-00030.safetensors",
504
+ "model.layers.59.mlp.down_proj.weight": "model-00022-of-00030.safetensors",
505
+ "model.layers.59.mlp.gate_proj.weight": "model-00022-of-00030.safetensors",
506
+ "model.layers.59.mlp.up_proj.weight": "model-00022-of-00030.safetensors",
507
+ "model.layers.59.post_attention_layernorm.weight": "model-00022-of-00030.safetensors",
508
+ "model.layers.59.self_attn.k_proj.weight": "model-00022-of-00030.safetensors",
509
+ "model.layers.59.self_attn.o_proj.weight": "model-00022-of-00030.safetensors",
510
+ "model.layers.59.self_attn.q_proj.weight": "model-00022-of-00030.safetensors",
511
+ "model.layers.59.self_attn.v_proj.weight": "model-00022-of-00030.safetensors",
512
+ "model.layers.6.input_layernorm.weight": "model-00003-of-00030.safetensors",
513
+ "model.layers.6.mlp.down_proj.weight": "model-00003-of-00030.safetensors",
514
+ "model.layers.6.mlp.gate_proj.weight": "model-00003-of-00030.safetensors",
515
+ "model.layers.6.mlp.up_proj.weight": "model-00003-of-00030.safetensors",
516
+ "model.layers.6.post_attention_layernorm.weight": "model-00003-of-00030.safetensors",
517
+ "model.layers.6.self_attn.k_proj.weight": "model-00003-of-00030.safetensors",
518
+ "model.layers.6.self_attn.o_proj.weight": "model-00003-of-00030.safetensors",
519
+ "model.layers.6.self_attn.q_proj.weight": "model-00003-of-00030.safetensors",
520
+ "model.layers.6.self_attn.v_proj.weight": "model-00003-of-00030.safetensors",
521
+ "model.layers.60.input_layernorm.weight": "model-00023-of-00030.safetensors",
522
+ "model.layers.60.mlp.down_proj.weight": "model-00023-of-00030.safetensors",
523
+ "model.layers.60.mlp.gate_proj.weight": "model-00023-of-00030.safetensors",
524
+ "model.layers.60.mlp.up_proj.weight": "model-00023-of-00030.safetensors",
525
+ "model.layers.60.post_attention_layernorm.weight": "model-00023-of-00030.safetensors",
526
+ "model.layers.60.self_attn.k_proj.weight": "model-00022-of-00030.safetensors",
527
+ "model.layers.60.self_attn.o_proj.weight": "model-00022-of-00030.safetensors",
528
+ "model.layers.60.self_attn.q_proj.weight": "model-00022-of-00030.safetensors",
529
+ "model.layers.60.self_attn.v_proj.weight": "model-00022-of-00030.safetensors",
530
+ "model.layers.61.input_layernorm.weight": "model-00023-of-00030.safetensors",
531
+ "model.layers.61.mlp.down_proj.weight": "model-00023-of-00030.safetensors",
532
+ "model.layers.61.mlp.gate_proj.weight": "model-00023-of-00030.safetensors",
533
+ "model.layers.61.mlp.up_proj.weight": "model-00023-of-00030.safetensors",
534
+ "model.layers.61.post_attention_layernorm.weight": "model-00023-of-00030.safetensors",
535
+ "model.layers.61.self_attn.k_proj.weight": "model-00023-of-00030.safetensors",
536
+ "model.layers.61.self_attn.o_proj.weight": "model-00023-of-00030.safetensors",
537
+ "model.layers.61.self_attn.q_proj.weight": "model-00023-of-00030.safetensors",
538
+ "model.layers.61.self_attn.v_proj.weight": "model-00023-of-00030.safetensors",
539
+ "model.layers.62.input_layernorm.weight": "model-00023-of-00030.safetensors",
540
+ "model.layers.62.mlp.down_proj.weight": "model-00023-of-00030.safetensors",
541
+ "model.layers.62.mlp.gate_proj.weight": "model-00023-of-00030.safetensors",
542
+ "model.layers.62.mlp.up_proj.weight": "model-00023-of-00030.safetensors",
543
+ "model.layers.62.post_attention_layernorm.weight": "model-00023-of-00030.safetensors",
544
+ "model.layers.62.self_attn.k_proj.weight": "model-00023-of-00030.safetensors",
545
+ "model.layers.62.self_attn.o_proj.weight": "model-00023-of-00030.safetensors",
546
+ "model.layers.62.self_attn.q_proj.weight": "model-00023-of-00030.safetensors",
547
+ "model.layers.62.self_attn.v_proj.weight": "model-00023-of-00030.safetensors",
548
+ "model.layers.63.input_layernorm.weight": "model-00024-of-00030.safetensors",
549
+ "model.layers.63.mlp.down_proj.weight": "model-00024-of-00030.safetensors",
550
+ "model.layers.63.mlp.gate_proj.weight": "model-00024-of-00030.safetensors",
551
+ "model.layers.63.mlp.up_proj.weight": "model-00024-of-00030.safetensors",
552
+ "model.layers.63.post_attention_layernorm.weight": "model-00024-of-00030.safetensors",
553
+ "model.layers.63.self_attn.k_proj.weight": "model-00023-of-00030.safetensors",
554
+ "model.layers.63.self_attn.o_proj.weight": "model-00024-of-00030.safetensors",
555
+ "model.layers.63.self_attn.q_proj.weight": "model-00023-of-00030.safetensors",
556
+ "model.layers.63.self_attn.v_proj.weight": "model-00023-of-00030.safetensors",
557
+ "model.layers.64.input_layernorm.weight": "model-00024-of-00030.safetensors",
558
+ "model.layers.64.mlp.down_proj.weight": "model-00024-of-00030.safetensors",
559
+ "model.layers.64.mlp.gate_proj.weight": "model-00024-of-00030.safetensors",
560
+ "model.layers.64.mlp.up_proj.weight": "model-00024-of-00030.safetensors",
561
+ "model.layers.64.post_attention_layernorm.weight": "model-00024-of-00030.safetensors",
562
+ "model.layers.64.self_attn.k_proj.weight": "model-00024-of-00030.safetensors",
563
+ "model.layers.64.self_attn.o_proj.weight": "model-00024-of-00030.safetensors",
564
+ "model.layers.64.self_attn.q_proj.weight": "model-00024-of-00030.safetensors",
565
+ "model.layers.64.self_attn.v_proj.weight": "model-00024-of-00030.safetensors",
566
+ "model.layers.65.input_layernorm.weight": "model-00024-of-00030.safetensors",
567
+ "model.layers.65.mlp.down_proj.weight": "model-00024-of-00030.safetensors",
568
+ "model.layers.65.mlp.gate_proj.weight": "model-00024-of-00030.safetensors",
569
+ "model.layers.65.mlp.up_proj.weight": "model-00024-of-00030.safetensors",
570
+ "model.layers.65.post_attention_layernorm.weight": "model-00024-of-00030.safetensors",
571
+ "model.layers.65.self_attn.k_proj.weight": "model-00024-of-00030.safetensors",
572
+ "model.layers.65.self_attn.o_proj.weight": "model-00024-of-00030.safetensors",
573
+ "model.layers.65.self_attn.q_proj.weight": "model-00024-of-00030.safetensors",
574
+ "model.layers.65.self_attn.v_proj.weight": "model-00024-of-00030.safetensors",
575
+ "model.layers.66.input_layernorm.weight": "model-00025-of-00030.safetensors",
576
+ "model.layers.66.mlp.down_proj.weight": "model-00025-of-00030.safetensors",
577
+ "model.layers.66.mlp.gate_proj.weight": "model-00025-of-00030.safetensors",
578
+ "model.layers.66.mlp.up_proj.weight": "model-00025-of-00030.safetensors",
579
+ "model.layers.66.post_attention_layernorm.weight": "model-00025-of-00030.safetensors",
580
+ "model.layers.66.self_attn.k_proj.weight": "model-00025-of-00030.safetensors",
581
+ "model.layers.66.self_attn.o_proj.weight": "model-00025-of-00030.safetensors",
582
+ "model.layers.66.self_attn.q_proj.weight": "model-00025-of-00030.safetensors",
583
+ "model.layers.66.self_attn.v_proj.weight": "model-00025-of-00030.safetensors",
584
+ "model.layers.67.input_layernorm.weight": "model-00025-of-00030.safetensors",
585
+ "model.layers.67.mlp.down_proj.weight": "model-00025-of-00030.safetensors",
586
+ "model.layers.67.mlp.gate_proj.weight": "model-00025-of-00030.safetensors",
587
+ "model.layers.67.mlp.up_proj.weight": "model-00025-of-00030.safetensors",
588
+ "model.layers.67.post_attention_layernorm.weight": "model-00025-of-00030.safetensors",
589
+ "model.layers.67.self_attn.k_proj.weight": "model-00025-of-00030.safetensors",
590
+ "model.layers.67.self_attn.o_proj.weight": "model-00025-of-00030.safetensors",
591
+ "model.layers.67.self_attn.q_proj.weight": "model-00025-of-00030.safetensors",
592
+ "model.layers.67.self_attn.v_proj.weight": "model-00025-of-00030.safetensors",
593
+ "model.layers.68.input_layernorm.weight": "model-00026-of-00030.safetensors",
594
+ "model.layers.68.mlp.down_proj.weight": "model-00026-of-00030.safetensors",
595
+ "model.layers.68.mlp.gate_proj.weight": "model-00025-of-00030.safetensors",
596
+ "model.layers.68.mlp.up_proj.weight": "model-00025-of-00030.safetensors",
597
+ "model.layers.68.post_attention_layernorm.weight": "model-00026-of-00030.safetensors",
598
+ "model.layers.68.self_attn.k_proj.weight": "model-00025-of-00030.safetensors",
599
+ "model.layers.68.self_attn.o_proj.weight": "model-00025-of-00030.safetensors",
600
+ "model.layers.68.self_attn.q_proj.weight": "model-00025-of-00030.safetensors",
601
+ "model.layers.68.self_attn.v_proj.weight": "model-00025-of-00030.safetensors",
602
+ "model.layers.69.input_layernorm.weight": "model-00026-of-00030.safetensors",
603
+ "model.layers.69.mlp.down_proj.weight": "model-00026-of-00030.safetensors",
604
+ "model.layers.69.mlp.gate_proj.weight": "model-00026-of-00030.safetensors",
605
+ "model.layers.69.mlp.up_proj.weight": "model-00026-of-00030.safetensors",
606
+ "model.layers.69.post_attention_layernorm.weight": "model-00026-of-00030.safetensors",
607
+ "model.layers.69.self_attn.k_proj.weight": "model-00026-of-00030.safetensors",
608
+ "model.layers.69.self_attn.o_proj.weight": "model-00026-of-00030.safetensors",
609
+ "model.layers.69.self_attn.q_proj.weight": "model-00026-of-00030.safetensors",
610
+ "model.layers.69.self_attn.v_proj.weight": "model-00026-of-00030.safetensors",
611
+ "model.layers.7.input_layernorm.weight": "model-00004-of-00030.safetensors",
612
+ "model.layers.7.mlp.down_proj.weight": "model-00004-of-00030.safetensors",
613
+ "model.layers.7.mlp.gate_proj.weight": "model-00004-of-00030.safetensors",
614
+ "model.layers.7.mlp.up_proj.weight": "model-00004-of-00030.safetensors",
615
+ "model.layers.7.post_attention_layernorm.weight": "model-00004-of-00030.safetensors",
616
+ "model.layers.7.self_attn.k_proj.weight": "model-00003-of-00030.safetensors",
617
+ "model.layers.7.self_attn.o_proj.weight": "model-00004-of-00030.safetensors",
618
+ "model.layers.7.self_attn.q_proj.weight": "model-00003-of-00030.safetensors",
619
+ "model.layers.7.self_attn.v_proj.weight": "model-00003-of-00030.safetensors",
620
+ "model.layers.70.input_layernorm.weight": "model-00026-of-00030.safetensors",
621
+ "model.layers.70.mlp.down_proj.weight": "model-00026-of-00030.safetensors",
622
+ "model.layers.70.mlp.gate_proj.weight": "model-00026-of-00030.safetensors",
623
+ "model.layers.70.mlp.up_proj.weight": "model-00026-of-00030.safetensors",
624
+ "model.layers.70.post_attention_layernorm.weight": "model-00026-of-00030.safetensors",
625
+ "model.layers.70.self_attn.k_proj.weight": "model-00026-of-00030.safetensors",
626
+ "model.layers.70.self_attn.o_proj.weight": "model-00026-of-00030.safetensors",
627
+ "model.layers.70.self_attn.q_proj.weight": "model-00026-of-00030.safetensors",
628
+ "model.layers.70.self_attn.v_proj.weight": "model-00026-of-00030.safetensors",
629
+ "model.layers.71.input_layernorm.weight": "model-00027-of-00030.safetensors",
630
+ "model.layers.71.mlp.down_proj.weight": "model-00027-of-00030.safetensors",
631
+ "model.layers.71.mlp.gate_proj.weight": "model-00026-of-00030.safetensors",
632
+ "model.layers.71.mlp.up_proj.weight": "model-00027-of-00030.safetensors",
633
+ "model.layers.71.post_attention_layernorm.weight": "model-00027-of-00030.safetensors",
634
+ "model.layers.71.self_attn.k_proj.weight": "model-00026-of-00030.safetensors",
635
+ "model.layers.71.self_attn.o_proj.weight": "model-00026-of-00030.safetensors",
636
+ "model.layers.71.self_attn.q_proj.weight": "model-00026-of-00030.safetensors",
637
+ "model.layers.71.self_attn.v_proj.weight": "model-00026-of-00030.safetensors",
638
+ "model.layers.72.input_layernorm.weight": "model-00027-of-00030.safetensors",
639
+ "model.layers.72.mlp.down_proj.weight": "model-00027-of-00030.safetensors",
640
+ "model.layers.72.mlp.gate_proj.weight": "model-00027-of-00030.safetensors",
641
+ "model.layers.72.mlp.up_proj.weight": "model-00027-of-00030.safetensors",
642
+ "model.layers.72.post_attention_layernorm.weight": "model-00027-of-00030.safetensors",
643
+ "model.layers.72.self_attn.k_proj.weight": "model-00027-of-00030.safetensors",
644
+ "model.layers.72.self_attn.o_proj.weight": "model-00027-of-00030.safetensors",
645
+ "model.layers.72.self_attn.q_proj.weight": "model-00027-of-00030.safetensors",
646
+ "model.layers.72.self_attn.v_proj.weight": "model-00027-of-00030.safetensors",
647
+ "model.layers.73.input_layernorm.weight": "model-00027-of-00030.safetensors",
648
+ "model.layers.73.mlp.down_proj.weight": "model-00027-of-00030.safetensors",
649
+ "model.layers.73.mlp.gate_proj.weight": "model-00027-of-00030.safetensors",
650
+ "model.layers.73.mlp.up_proj.weight": "model-00027-of-00030.safetensors",
651
+ "model.layers.73.post_attention_layernorm.weight": "model-00027-of-00030.safetensors",
652
+ "model.layers.73.self_attn.k_proj.weight": "model-00027-of-00030.safetensors",
653
+ "model.layers.73.self_attn.o_proj.weight": "model-00027-of-00030.safetensors",
654
+ "model.layers.73.self_attn.q_proj.weight": "model-00027-of-00030.safetensors",
655
+ "model.layers.73.self_attn.v_proj.weight": "model-00027-of-00030.safetensors",
656
+ "model.layers.74.input_layernorm.weight": "model-00028-of-00030.safetensors",
657
+ "model.layers.74.mlp.down_proj.weight": "model-00028-of-00030.safetensors",
658
+ "model.layers.74.mlp.gate_proj.weight": "model-00028-of-00030.safetensors",
659
+ "model.layers.74.mlp.up_proj.weight": "model-00028-of-00030.safetensors",
660
+ "model.layers.74.post_attention_layernorm.weight": "model-00028-of-00030.safetensors",
661
+ "model.layers.74.self_attn.k_proj.weight": "model-00027-of-00030.safetensors",
662
+ "model.layers.74.self_attn.o_proj.weight": "model-00027-of-00030.safetensors",
663
+ "model.layers.74.self_attn.q_proj.weight": "model-00027-of-00030.safetensors",
664
+ "model.layers.74.self_attn.v_proj.weight": "model-00027-of-00030.safetensors",
665
+ "model.layers.75.input_layernorm.weight": "model-00028-of-00030.safetensors",
666
+ "model.layers.75.mlp.down_proj.weight": "model-00028-of-00030.safetensors",
667
+ "model.layers.75.mlp.gate_proj.weight": "model-00028-of-00030.safetensors",
668
+ "model.layers.75.mlp.up_proj.weight": "model-00028-of-00030.safetensors",
669
+ "model.layers.75.post_attention_layernorm.weight": "model-00028-of-00030.safetensors",
670
+ "model.layers.75.self_attn.k_proj.weight": "model-00028-of-00030.safetensors",
671
+ "model.layers.75.self_attn.o_proj.weight": "model-00028-of-00030.safetensors",
672
+ "model.layers.75.self_attn.q_proj.weight": "model-00028-of-00030.safetensors",
673
+ "model.layers.75.self_attn.v_proj.weight": "model-00028-of-00030.safetensors",
674
+ "model.layers.76.input_layernorm.weight": "model-00028-of-00030.safetensors",
675
+ "model.layers.76.mlp.down_proj.weight": "model-00028-of-00030.safetensors",
676
+ "model.layers.76.mlp.gate_proj.weight": "model-00028-of-00030.safetensors",
677
+ "model.layers.76.mlp.up_proj.weight": "model-00028-of-00030.safetensors",
678
+ "model.layers.76.post_attention_layernorm.weight": "model-00028-of-00030.safetensors",
679
+ "model.layers.76.self_attn.k_proj.weight": "model-00028-of-00030.safetensors",
680
+ "model.layers.76.self_attn.o_proj.weight": "model-00028-of-00030.safetensors",
681
+ "model.layers.76.self_attn.q_proj.weight": "model-00028-of-00030.safetensors",
682
+ "model.layers.76.self_attn.v_proj.weight": "model-00028-of-00030.safetensors",
683
+ "model.layers.77.input_layernorm.weight": "model-00029-of-00030.safetensors",
684
+ "model.layers.77.mlp.down_proj.weight": "model-00029-of-00030.safetensors",
685
+ "model.layers.77.mlp.gate_proj.weight": "model-00029-of-00030.safetensors",
686
+ "model.layers.77.mlp.up_proj.weight": "model-00029-of-00030.safetensors",
687
+ "model.layers.77.post_attention_layernorm.weight": "model-00029-of-00030.safetensors",
688
+ "model.layers.77.self_attn.k_proj.weight": "model-00028-of-00030.safetensors",
689
+ "model.layers.77.self_attn.o_proj.weight": "model-00029-of-00030.safetensors",
690
+ "model.layers.77.self_attn.q_proj.weight": "model-00028-of-00030.safetensors",
691
+ "model.layers.77.self_attn.v_proj.weight": "model-00028-of-00030.safetensors",
692
+ "model.layers.78.input_layernorm.weight": "model-00029-of-00030.safetensors",
693
+ "model.layers.78.mlp.down_proj.weight": "model-00029-of-00030.safetensors",
694
+ "model.layers.78.mlp.gate_proj.weight": "model-00029-of-00030.safetensors",
695
+ "model.layers.78.mlp.up_proj.weight": "model-00029-of-00030.safetensors",
696
+ "model.layers.78.post_attention_layernorm.weight": "model-00029-of-00030.safetensors",
697
+ "model.layers.78.self_attn.k_proj.weight": "model-00029-of-00030.safetensors",
698
+ "model.layers.78.self_attn.o_proj.weight": "model-00029-of-00030.safetensors",
699
+ "model.layers.78.self_attn.q_proj.weight": "model-00029-of-00030.safetensors",
700
+ "model.layers.78.self_attn.v_proj.weight": "model-00029-of-00030.safetensors",
701
+ "model.layers.79.input_layernorm.weight": "model-00029-of-00030.safetensors",
702
+ "model.layers.79.mlp.down_proj.weight": "model-00029-of-00030.safetensors",
703
+ "model.layers.79.mlp.gate_proj.weight": "model-00029-of-00030.safetensors",
704
+ "model.layers.79.mlp.up_proj.weight": "model-00029-of-00030.safetensors",
705
+ "model.layers.79.post_attention_layernorm.weight": "model-00029-of-00030.safetensors",
706
+ "model.layers.79.self_attn.k_proj.weight": "model-00029-of-00030.safetensors",
707
+ "model.layers.79.self_attn.o_proj.weight": "model-00029-of-00030.safetensors",
708
+ "model.layers.79.self_attn.q_proj.weight": "model-00029-of-00030.safetensors",
709
+ "model.layers.79.self_attn.v_proj.weight": "model-00029-of-00030.safetensors",
710
+ "model.layers.8.input_layernorm.weight": "model-00004-of-00030.safetensors",
711
+ "model.layers.8.mlp.down_proj.weight": "model-00004-of-00030.safetensors",
712
+ "model.layers.8.mlp.gate_proj.weight": "model-00004-of-00030.safetensors",
713
+ "model.layers.8.mlp.up_proj.weight": "model-00004-of-00030.safetensors",
714
+ "model.layers.8.post_attention_layernorm.weight": "model-00004-of-00030.safetensors",
715
+ "model.layers.8.self_attn.k_proj.weight": "model-00004-of-00030.safetensors",
716
+ "model.layers.8.self_attn.o_proj.weight": "model-00004-of-00030.safetensors",
717
+ "model.layers.8.self_attn.q_proj.weight": "model-00004-of-00030.safetensors",
718
+ "model.layers.8.self_attn.v_proj.weight": "model-00004-of-00030.safetensors",
719
+ "model.layers.9.input_layernorm.weight": "model-00004-of-00030.safetensors",
720
+ "model.layers.9.mlp.down_proj.weight": "model-00004-of-00030.safetensors",
721
+ "model.layers.9.mlp.gate_proj.weight": "model-00004-of-00030.safetensors",
722
+ "model.layers.9.mlp.up_proj.weight": "model-00004-of-00030.safetensors",
723
+ "model.layers.9.post_attention_layernorm.weight": "model-00004-of-00030.safetensors",
724
+ "model.layers.9.self_attn.k_proj.weight": "model-00004-of-00030.safetensors",
725
+ "model.layers.9.self_attn.o_proj.weight": "model-00004-of-00030.safetensors",
726
+ "model.layers.9.self_attn.q_proj.weight": "model-00004-of-00030.safetensors",
727
+ "model.layers.9.self_attn.v_proj.weight": "model-00004-of-00030.safetensors",
728
+ "model.norm.weight": "model-00029-of-00030.safetensors"
729
+ }
730
+ }
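The `weight_map` entries above are the tail of `model.safetensors.index.json`: each tensor name maps to the one of the 30 shard files that stores it, and a single layer's tensors can straddle a shard boundary (layer 49's q/k/v projections sit in shard 18 while the rest of the layer sits in shard 19). A minimal sketch of querying such an index with only the standard library; the dict below is a hand-copied sample of a few entries above, not the full index:

```python
import json

# Hand-copied excerpt of the index above (illustrative, not the full weight map).
index_text = json.dumps({
    "weight_map": {
        "model.layers.49.self_attn.q_proj.weight": "model-00018-of-00030.safetensors",
        "model.layers.49.self_attn.o_proj.weight": "model-00019-of-00030.safetensors",
        "model.layers.49.mlp.down_proj.weight": "model-00019-of-00030.safetensors",
        "model.norm.weight": "model-00029-of-00030.safetensors",
    }
})

# In practice: index = json.load(open("model.safetensors.index.json"))
index = json.loads(index_text)

def shard_for(tensor_name: str) -> str:
    """Return the shard file that stores the given tensor."""
    return index["weight_map"][tensor_name]

def tensors_in_shard(shard_file: str) -> list[str]:
    """List every tensor the index places in one shard file."""
    return sorted(t for t, s in index["weight_map"].items() if s == shard_file)
```

Loaders use exactly this lookup to open only the shards a requested tensor lives in, rather than reading all 30 files.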
nasa.md ADDED
@@ -0,0 +1,84 @@
+ **NASA Report**
+ ===============
+
+ **Subject:** Evaluation of Jupiter's Moons as Potential Colonies \
+ **Date:** 2023-02-20
+
+ ## Introduction
+
+ As humanity continues to expand its presence in our solar system, the possibility of establishing colonies on the moons of gas giants has gained significant attention. Among these, Jupiter's four largest moons (Io, Europa, Ganymede, and Callisto) have emerged as promising candidates due to their size, composition, and strategic locations. This report assesses each moon's suitability for colonization, highlighting both the advantages and disadvantages, and ultimately recommends the most promising target.
+
+ ### Methodology
+
+ The evaluation of Jupiter's moons was conducted through a comprehensive review of existing scientific literature, NASA mission data, and theoretical modeling. The key factors considered include:
+
+ 1. **Distance and Accessibility:** The distance from Earth, ease of transportation, and potential for orbital insertion.
+ 2. **Atmospheric Conditions:** Presence, composition, and stability of the atmosphere, including pressure, temperature, and radiation exposure.
+ 3. **Surface Features and Resources:** Geological characteristics and the availability of water ice, organic compounds, and mineral resources.
+ 4. **Energy Sources:** Potential for harnessing solar energy, tidal heating, nuclear reactions, or other power generation methods.
+ 5. **Habitability:** Conditions conducive to human habitation, such as stable temperature ranges, atmospheric gases, and protection from harmful radiation.
+ 6. **Scientific Research Opportunities:** Potential for groundbreaking discoveries in geology, astronomy, astrobiology, and space physics.
+ 7. **Defensibility and Security:** Natural barriers, strategic locations, and potential for establishing secure outposts.
+
+ ## Evaluation of Jupiter's Moons
+
+ ### Io
+ #### Pros
+ - **Geological Activity:** Io's intense volcanic activity provides a unique opportunity for studying planetary dynamics and geological processes.
+ - **Internal Heat Source:** Tidal heating offers a potential source of energy, both for powering settlements and for in-situ resource utilization (ISRU).
+
+ #### Cons
+ - **Harsh Environment:** Extreme volcanism, high radiation levels, and the lack of an atmosphere make surface habitation nearly impossible.
+ - **Resource Limitations:** Limited availability of water ice and organic compounds, both crucial for life support and propulsion.
+
+ ### Europa
+ #### Pros
+ - **Subsurface Ocean:** The presence of a large liquid-water ocean beneath the icy crust offers excellent potential for finding extraterrestrial life and harnessing water resources.
+ - **Stable Energy Source:** Tidal heating could power underwater habitats and provide a reliable energy supply.
+ - **Scientific Significance:** Offers unparalleled opportunities for astrobiological research and exploration of a subsurface ocean.
+
+ #### Cons
+ - **Radiation Exposure:** High levels of radiation from Jupiter's magnetosphere pose significant challenges for surface operations.
+ - **Technological Challenges:** Drilling through the thick ice crust to access the ocean would require advanced, specialized equipment.
+
+ ### Ganymede
+ #### Pros
+ - **Largest Moon:** Its size offers more surface area and potentially greater resources than the other moons.
+ - **Subsurface Ocean:** Similar to Europa, Ganymede has a subsurface ocean, offering potential for life and energy sources.
+ - **Magnetic Field:** Ganymede's own magnetic field provides some protection against harmful radiation.
+
+ #### Cons
+ - **Cold Temperatures:** Surface temperatures are extremely low, making heating and insulation crucial for any habitation.
+ - **Distance from Jupiter:** While offering some protection, Ganymede's distance also means less tidal heating compared to Io and Europa.
+
+ ### Callisto
+ #### Pros
+ - **Low Radiation:** Located outside Jupiter's main radiation belt, Callisto offers a relatively safe environment for human habitation.
+ - **Stable Surface:** Geologically inactive, providing a stable platform for construction and long-term settlement.
+ - **Cratered Terrain:** Potential for discovering and extracting resources from impact craters.
+
+ #### Cons
+ - **Lack of Resources:** No confirmed presence of liquid water or substantial organic compounds.
+ - **Limited Energy Sources:** No tidal heating; reliance on solar panels or nuclear reactors for energy generation.
+
+ ## Comparison and Recommendation
+
+ After evaluating the pros and cons of each moon, it is clear that while all four have unique advantages, they also present significant challenges. Considering the factors outlined in the methodology, **Europa stands out as the most promising candidate for colonization**.
+
+ Despite the technological hurdles and radiation concerns, Europa's subsurface ocean offers unparalleled opportunities for scientific discovery, resource extraction, and potentially supporting life. The reliable energy source provided by tidal heating could power underwater habitats and ISRU operations, mitigating some of the logistical challenges associated with establishing a colony.
+
+ ## Implementation Plan
+
+ To overcome the challenges associated with colonizing Europa, NASA should focus on the following steps:
+
+ 1. **Robust Reconnaissance:** Conduct detailed reconnaissance missions to map the moon's subsurface ocean, identify potential drilling sites, and assess radiation levels.
+ 2. **Advanced Technologies:** Develop and test technologies for drilling through the ice crust, constructing underwater habitats, and shielding against radiation.
+ 3. **In-Situ Resource Utilization:** Establish capabilities for extracting water, oxygen, and other essential resources from the moon's subsurface ocean.
+ 4. **Radiation Protection:** Develop effective shielding technologies or habitat designs that minimize radiation exposure for both personnel and electronic systems.
+ 5. **Gradual Phased Approach:** Establish initial surface outposts to conduct research and develop the necessary infrastructure before transitioning to underwater habitats.
+
+ ## Conclusion
+
+ While all of Jupiter's major moons offer intriguing possibilities for colonization, Europa's unique combination of a subsurface ocean, a stable energy source, and scientific promise makes it the most attractive target. By addressing the technical challenges through a phased approach and investing in cutting-edge technologies, NASA can pave the way for humanity's first extraterrestrial colony.
+
+ By carefully considering these factors and implementing a well-planned strategy, NASA can successfully establish a human settlement on one of Jupiter's moons, marking a historic milestone in space exploration and the expansion of human civilization.
output-00001-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:efc4079c261d489cf16ad127c2a29ebb4041c04a42d5677b6c17d0775553e796
+ size 8551966454
output-00002-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4b3a3611120faa42b8adf1de7ae4712de2e164cb0525afdae39ec6fdf688e2f2
+ size 8547936944
output-00003-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3beb7e36047954b6f07ad538763b5efbe2862f2c4c549c8c5a95630812182864
+ size 5089135984
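The three pointer stubs record the shard sizes directly, so the commit's on-disk footprint can be totalled without downloading anything. A quick worked sum over the `size` fields above:

```python
# Total the 'size' fields of the three .safetensors shard pointers
# committed above to get the model's on-disk footprint.
shard_sizes = [8551966454, 8547936944, 5089135984]

total_bytes = sum(shard_sizes)
total_gib = total_bytes / 2**30

print(total_bytes)            # 22189039382
print(round(total_gib, 2))    # ~20.66 GiB across the three shards
```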
special_tokens_map.json ADDED
@@ -0,0 +1,23 @@
+ {
+   "bos_token": {
+     "content": "<|begin_of_text|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "<|eot_id|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "<|finetune_right_pad_id|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
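The mapping above tells a tokenizer which strings act as the BOS, EOS, and padding tokens. A self-contained sketch of reading those contents (the JSON is inlined here; normally you would load `special_tokens_map.json` from the repository):

```python
import json

# Read the special-token mapping committed above. JSON inlined so the
# snippet is self-contained; note the Llama-3-style <|eot_id|> EOS and
# <|finetune_right_pad_id|> pad token.
special_tokens_json = """
{
  "bos_token": {"content": "<|begin_of_text|>", "lstrip": false,
                "normalized": false, "rstrip": false, "single_word": false},
  "eos_token": {"content": "<|eot_id|>", "lstrip": false,
                "normalized": false, "rstrip": false, "single_word": false},
  "pad_token": {"content": "<|finetune_right_pad_id|>", "lstrip": false,
                "normalized": false, "rstrip": false, "single_word": false}
}
"""

special_tokens_map = json.loads(special_tokens_json)

bos = special_tokens_map["bos_token"]["content"]
eos = special_tokens_map["eos_token"]["content"]
pad = special_tokens_map["pad_token"]["content"]
print(bos, eos, pad)
```

Using `<|eot_id|>` (rather than `<|end_of_text|>`) as EOS is what makes chat-format generations stop at the end of each assistant turn.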
tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:175c60636c69844781e0741a5661d66530386e50bb98fd6b6f2f1b2b8bd51979
+ size 17210541
tokenizer_config.json ADDED
@@ -0,0 +1,2088 @@
+ {
+   "added_tokens_decoder": {
+     "128000": {
+       "content": "<|begin_of_text|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128001": {
+       "content": "<|end_of_text|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128002": {
+       "content": "<|reserved_special_token_0|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128003": {
+       "content": "<|reserved_special_token_1|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128004": {
+       "content": "<|finetune_right_pad_id|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128005": {
+       "content": "<|reserved_special_token_2|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128006": {
+       "content": "<|start_header_id|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128007": {
+       "content": "<|end_header_id|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128008": {
+       "content": "<|eom_id|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128009": {
+       "content": "<|eot_id|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128010": {
+       "content": "<|python_tag|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128011": {
+       "content": "<|reserved_special_token_3|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128012": {
+       "content": "<|reserved_special_token_4|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128013": {
+       "content": "<|reserved_special_token_5|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128014": {
+       "content": "<|reserved_special_token_6|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128015": {
+       "content": "<|reserved_special_token_7|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128016": {
+       "content": "<|reserved_special_token_8|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128017": {
+       "content": "<|reserved_special_token_9|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128018": {
+       "content": "<|reserved_special_token_10|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128019": {
+       "content": "<|reserved_special_token_11|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128020": {
+       "content": "<|reserved_special_token_12|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128021": {
+       "content": "<|reserved_special_token_13|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128022": {
+       "content": "<|reserved_special_token_14|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128023": {
+       "content": "<|reserved_special_token_15|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128024": {
+       "content": "<|reserved_special_token_16|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128025": {
+       "content": "<|reserved_special_token_17|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128026": {
+       "content": "<|reserved_special_token_18|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128027": {
+       "content": "<|reserved_special_token_19|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128028": {
+       "content": "<|reserved_special_token_20|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128029": {
+       "content": "<|reserved_special_token_21|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128030": {
+       "content": "<|reserved_special_token_22|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128031": {
+       "content": "<|reserved_special_token_23|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128032": {
+       "content": "<|reserved_special_token_24|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128033": {
+       "content": "<|reserved_special_token_25|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128034": {
+       "content": "<|reserved_special_token_26|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128035": {
+       "content": "<|reserved_special_token_27|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128036": {
+       "content": "<|reserved_special_token_28|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128037": {
+       "content": "<|reserved_special_token_29|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128038": {
+       "content": "<|reserved_special_token_30|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128039": {
+       "content": "<|reserved_special_token_31|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128040": {
+       "content": "<|reserved_special_token_32|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128041": {
+       "content": "<|reserved_special_token_33|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128042": {
+       "content": "<|reserved_special_token_34|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128043": {
+       "content": "<|reserved_special_token_35|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128044": {
+       "content": "<|reserved_special_token_36|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128045": {
+       "content": "<|reserved_special_token_37|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128046": {
+       "content": "<|reserved_special_token_38|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128047": {
+       "content": "<|reserved_special_token_39|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128048": {
+       "content": "<|reserved_special_token_40|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128049": {
+       "content": "<|reserved_special_token_41|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128050": {
+       "content": "<|reserved_special_token_42|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128051": {
+       "content": "<|reserved_special_token_43|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128052": {
+       "content": "<|reserved_special_token_44|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128053": {
+       "content": "<|reserved_special_token_45|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128054": {
+       "content": "<|reserved_special_token_46|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128055": {
+       "content": "<|reserved_special_token_47|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128056": {
+       "content": "<|reserved_special_token_48|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128057": {
+       "content": "<|reserved_special_token_49|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128058": {
+       "content": "<|reserved_special_token_50|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128059": {
+       "content": "<|reserved_special_token_51|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128060": {
+       "content": "<|reserved_special_token_52|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128061": {
+       "content": "<|reserved_special_token_53|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128062": {
+       "content": "<|reserved_special_token_54|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128063": {
+       "content": "<|reserved_special_token_55|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128064": {
+       "content": "<|reserved_special_token_56|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128065": {
+       "content": "<|reserved_special_token_57|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128066": {
+       "content": "<|reserved_special_token_58|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128067": {
+       "content": "<|reserved_special_token_59|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128068": {
+       "content": "<|reserved_special_token_60|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128069": {
+       "content": "<|reserved_special_token_61|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128070": {
+       "content": "<|reserved_special_token_62|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128071": {
+       "content": "<|reserved_special_token_63|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128072": {
+       "content": "<|reserved_special_token_64|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128073": {
+       "content": "<|reserved_special_token_65|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128074": {
+       "content": "<|reserved_special_token_66|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128075": {
+       "content": "<|reserved_special_token_67|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128076": {
+       "content": "<|reserved_special_token_68|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128077": {
+       "content": "<|reserved_special_token_69|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128078": {
+       "content": "<|reserved_special_token_70|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128079": {
+       "content": "<|reserved_special_token_71|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128080": {
+       "content": "<|reserved_special_token_72|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128081": {
+       "content": "<|reserved_special_token_73|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128082": {
+       "content": "<|reserved_special_token_74|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128083": {
+       "content": "<|reserved_special_token_75|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128084": {
+       "content": "<|reserved_special_token_76|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128085": {
+       "content": "<|reserved_special_token_77|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128086": {
+       "content": "<|reserved_special_token_78|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128087": {
+       "content": "<|reserved_special_token_79|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128088": {
+       "content": "<|reserved_special_token_80|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128089": {
+       "content": "<|reserved_special_token_81|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128090": {
+       "content": "<|reserved_special_token_82|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128091": {
+       "content": "<|reserved_special_token_83|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128092": {
+       "content": "<|reserved_special_token_84|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128093": {
+       "content": "<|reserved_special_token_85|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128094": {
+       "content": "<|reserved_special_token_86|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128095": {
+       "content": "<|reserved_special_token_87|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128096": {
+       "content": "<|reserved_special_token_88|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128097": {
+       "content": "<|reserved_special_token_89|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128098": {
+       "content": "<|reserved_special_token_90|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128099": {
+       "content": "<|reserved_special_token_91|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128100": {
+       "content": "<|reserved_special_token_92|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128101": {
+       "content": "<|reserved_special_token_93|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128102": {
+       "content": "<|reserved_special_token_94|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128103": {
+       "content": "<|reserved_special_token_95|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128104": {
+       "content": "<|reserved_special_token_96|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128105": {
+       "content": "<|reserved_special_token_97|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128106": {
+       "content": "<|reserved_special_token_98|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128107": {
+       "content": "<|reserved_special_token_99|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128108": {
+       "content": "<|reserved_special_token_100|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128109": {
+       "content": "<|reserved_special_token_101|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128110": {
+       "content": "<|reserved_special_token_102|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128111": {
+       "content": "<|reserved_special_token_103|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128112": {
+       "content": "<|reserved_special_token_104|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128113": {
+       "content": "<|reserved_special_token_105|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128114": {
+       "content": "<|reserved_special_token_106|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128115": {
+       "content": "<|reserved_special_token_107|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128116": {
+       "content": "<|reserved_special_token_108|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128117": {
+       "content": "<|reserved_special_token_109|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128118": {
+       "content": "<|reserved_special_token_110|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128119": {
+       "content": "<|reserved_special_token_111|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128120": {
+       "content": "<|reserved_special_token_112|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128121": {
+       "content": "<|reserved_special_token_113|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128122": {
+       "content": "<|reserved_special_token_114|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128123": {
+       "content": "<|reserved_special_token_115|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128124": {
+       "content": "<|reserved_special_token_116|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128125": {
+       "content": "<|reserved_special_token_117|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128126": {
+       "content": "<|reserved_special_token_118|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128127": {
+       "content": "<|reserved_special_token_119|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128128": {
+       "content": "<|reserved_special_token_120|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128129": {
+       "content": "<|reserved_special_token_121|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128130": {
+       "content": "<|reserved_special_token_122|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128131": {
+       "content": "<|reserved_special_token_123|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128132": {
+       "content": "<|reserved_special_token_124|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128133": {
+       "content": "<|reserved_special_token_125|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128134": {
+       "content": "<|reserved_special_token_126|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128135": {
+       "content": "<|reserved_special_token_127|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128136": {
+       "content": "<|reserved_special_token_128|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128137": {
+       "content": "<|reserved_special_token_129|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128138": {
+       "content": "<|reserved_special_token_130|>",
+       "lstrip": false,
+       "normalized": false,
+ "rstrip": false,
1112
+ "single_word": false,
1113
+ "special": true
1114
+ },
1115
+ "128139": {
1116
+ "content": "<|reserved_special_token_131|>",
1117
+ "lstrip": false,
1118
+ "normalized": false,
1119
+ "rstrip": false,
1120
+ "single_word": false,
1121
+ "special": true
1122
+ },
1123
+ "128140": {
1124
+ "content": "<|reserved_special_token_132|>",
1125
+ "lstrip": false,
1126
+ "normalized": false,
1127
+ "rstrip": false,
1128
+ "single_word": false,
1129
+ "special": true
1130
+ },
1131
+ "128141": {
1132
+ "content": "<|reserved_special_token_133|>",
1133
+ "lstrip": false,
1134
+ "normalized": false,
1135
+ "rstrip": false,
1136
+ "single_word": false,
1137
+ "special": true
1138
+ },
1139
+ "128142": {
1140
+ "content": "<|reserved_special_token_134|>",
1141
+ "lstrip": false,
1142
+ "normalized": false,
1143
+ "rstrip": false,
1144
+ "single_word": false,
1145
+ "special": true
1146
+ },
1147
+ "128143": {
1148
+ "content": "<|reserved_special_token_135|>",
1149
+ "lstrip": false,
1150
+ "normalized": false,
1151
+ "rstrip": false,
1152
+ "single_word": false,
1153
+ "special": true
1154
+ },
1155
+ "128144": {
1156
+ "content": "<|reserved_special_token_136|>",
1157
+ "lstrip": false,
1158
+ "normalized": false,
1159
+ "rstrip": false,
1160
+ "single_word": false,
1161
+ "special": true
1162
+ },
1163
+ "128145": {
1164
+ "content": "<|reserved_special_token_137|>",
1165
+ "lstrip": false,
1166
+ "normalized": false,
1167
+ "rstrip": false,
1168
+ "single_word": false,
1169
+ "special": true
1170
+ },
1171
+ "128146": {
1172
+ "content": "<|reserved_special_token_138|>",
1173
+ "lstrip": false,
1174
+ "normalized": false,
1175
+ "rstrip": false,
1176
+ "single_word": false,
1177
+ "special": true
1178
+ },
1179
+ "128147": {
1180
+ "content": "<|reserved_special_token_139|>",
1181
+ "lstrip": false,
1182
+ "normalized": false,
1183
+ "rstrip": false,
1184
+ "single_word": false,
1185
+ "special": true
1186
+ },
1187
+ "128148": {
1188
+ "content": "<|reserved_special_token_140|>",
1189
+ "lstrip": false,
1190
+ "normalized": false,
1191
+ "rstrip": false,
1192
+ "single_word": false,
1193
+ "special": true
1194
+ },
1195
+ "128149": {
1196
+ "content": "<|reserved_special_token_141|>",
1197
+ "lstrip": false,
1198
+ "normalized": false,
1199
+ "rstrip": false,
1200
+ "single_word": false,
1201
+ "special": true
1202
+ },
1203
+ "128150": {
1204
+ "content": "<|reserved_special_token_142|>",
1205
+ "lstrip": false,
1206
+ "normalized": false,
1207
+ "rstrip": false,
1208
+ "single_word": false,
1209
+ "special": true
1210
+ },
1211
+ "128151": {
1212
+ "content": "<|reserved_special_token_143|>",
1213
+ "lstrip": false,
1214
+ "normalized": false,
1215
+ "rstrip": false,
1216
+ "single_word": false,
1217
+ "special": true
1218
+ },
1219
+ "128152": {
1220
+ "content": "<|reserved_special_token_144|>",
1221
+ "lstrip": false,
1222
+ "normalized": false,
1223
+ "rstrip": false,
1224
+ "single_word": false,
1225
+ "special": true
1226
+ },
1227
+ "128153": {
1228
+ "content": "<|reserved_special_token_145|>",
1229
+ "lstrip": false,
1230
+ "normalized": false,
1231
+ "rstrip": false,
1232
+ "single_word": false,
1233
+ "special": true
1234
+ },
1235
+ "128154": {
1236
+ "content": "<|reserved_special_token_146|>",
1237
+ "lstrip": false,
1238
+ "normalized": false,
1239
+ "rstrip": false,
1240
+ "single_word": false,
1241
+ "special": true
1242
+ },
1243
+ "128155": {
1244
+ "content": "<|reserved_special_token_147|>",
1245
+ "lstrip": false,
1246
+ "normalized": false,
1247
+ "rstrip": false,
1248
+ "single_word": false,
1249
+ "special": true
1250
+ },
1251
+ "128156": {
1252
+ "content": "<|reserved_special_token_148|>",
1253
+ "lstrip": false,
1254
+ "normalized": false,
1255
+ "rstrip": false,
1256
+ "single_word": false,
1257
+ "special": true
1258
+ },
1259
+ "128157": {
1260
+ "content": "<|reserved_special_token_149|>",
1261
+ "lstrip": false,
1262
+ "normalized": false,
1263
+ "rstrip": false,
1264
+ "single_word": false,
1265
+ "special": true
1266
+ },
1267
+ "128158": {
1268
+ "content": "<|reserved_special_token_150|>",
1269
+ "lstrip": false,
1270
+ "normalized": false,
1271
+ "rstrip": false,
1272
+ "single_word": false,
1273
+ "special": true
1274
+ },
1275
+ "128159": {
1276
+ "content": "<|reserved_special_token_151|>",
1277
+ "lstrip": false,
1278
+ "normalized": false,
1279
+ "rstrip": false,
1280
+ "single_word": false,
1281
+ "special": true
1282
+ },
1283
+ "128160": {
1284
+ "content": "<|reserved_special_token_152|>",
1285
+ "lstrip": false,
1286
+ "normalized": false,
1287
+ "rstrip": false,
1288
+ "single_word": false,
1289
+ "special": true
1290
+ },
1291
+ "128161": {
1292
+ "content": "<|reserved_special_token_153|>",
1293
+ "lstrip": false,
1294
+ "normalized": false,
1295
+ "rstrip": false,
1296
+ "single_word": false,
1297
+ "special": true
1298
+ },
1299
+ "128162": {
1300
+ "content": "<|reserved_special_token_154|>",
1301
+ "lstrip": false,
1302
+ "normalized": false,
1303
+ "rstrip": false,
1304
+ "single_word": false,
1305
+ "special": true
1306
+ },
1307
+ "128163": {
1308
+ "content": "<|reserved_special_token_155|>",
1309
+ "lstrip": false,
1310
+ "normalized": false,
1311
+ "rstrip": false,
1312
+ "single_word": false,
1313
+ "special": true
1314
+ },
1315
+ "128164": {
1316
+ "content": "<|reserved_special_token_156|>",
1317
+ "lstrip": false,
1318
+ "normalized": false,
1319
+ "rstrip": false,
1320
+ "single_word": false,
1321
+ "special": true
1322
+ },
1323
+ "128165": {
1324
+ "content": "<|reserved_special_token_157|>",
1325
+ "lstrip": false,
1326
+ "normalized": false,
1327
+ "rstrip": false,
1328
+ "single_word": false,
1329
+ "special": true
1330
+ },
1331
+ "128166": {
1332
+ "content": "<|reserved_special_token_158|>",
1333
+ "lstrip": false,
1334
+ "normalized": false,
1335
+ "rstrip": false,
1336
+ "single_word": false,
1337
+ "special": true
1338
+ },
1339
+ "128167": {
1340
+ "content": "<|reserved_special_token_159|>",
1341
+ "lstrip": false,
1342
+ "normalized": false,
1343
+ "rstrip": false,
1344
+ "single_word": false,
1345
+ "special": true
1346
+ },
1347
+ "128168": {
1348
+ "content": "<|reserved_special_token_160|>",
1349
+ "lstrip": false,
1350
+ "normalized": false,
1351
+ "rstrip": false,
1352
+ "single_word": false,
1353
+ "special": true
1354
+ },
1355
+ "128169": {
1356
+ "content": "<|reserved_special_token_161|>",
1357
+ "lstrip": false,
1358
+ "normalized": false,
1359
+ "rstrip": false,
1360
+ "single_word": false,
1361
+ "special": true
1362
+ },
1363
+ "128170": {
1364
+ "content": "<|reserved_special_token_162|>",
1365
+ "lstrip": false,
1366
+ "normalized": false,
1367
+ "rstrip": false,
1368
+ "single_word": false,
1369
+ "special": true
1370
+ },
1371
+ "128171": {
1372
+ "content": "<|reserved_special_token_163|>",
1373
+ "lstrip": false,
1374
+ "normalized": false,
1375
+ "rstrip": false,
1376
+ "single_word": false,
1377
+ "special": true
1378
+ },
1379
+ "128172": {
1380
+ "content": "<|reserved_special_token_164|>",
1381
+ "lstrip": false,
1382
+ "normalized": false,
1383
+ "rstrip": false,
1384
+ "single_word": false,
1385
+ "special": true
1386
+ },
1387
+ "128173": {
1388
+ "content": "<|reserved_special_token_165|>",
1389
+ "lstrip": false,
1390
+ "normalized": false,
1391
+ "rstrip": false,
1392
+ "single_word": false,
1393
+ "special": true
1394
+ },
1395
+ "128174": {
1396
+ "content": "<|reserved_special_token_166|>",
1397
+ "lstrip": false,
1398
+ "normalized": false,
1399
+ "rstrip": false,
1400
+ "single_word": false,
1401
+ "special": true
1402
+ },
1403
+ "128175": {
1404
+ "content": "<|reserved_special_token_167|>",
1405
+ "lstrip": false,
1406
+ "normalized": false,
1407
+ "rstrip": false,
1408
+ "single_word": false,
1409
+ "special": true
1410
+ },
1411
+ "128176": {
1412
+ "content": "<|reserved_special_token_168|>",
1413
+ "lstrip": false,
1414
+ "normalized": false,
1415
+ "rstrip": false,
1416
+ "single_word": false,
1417
+ "special": true
1418
+ },
1419
+ "128177": {
1420
+ "content": "<|reserved_special_token_169|>",
1421
+ "lstrip": false,
1422
+ "normalized": false,
1423
+ "rstrip": false,
1424
+ "single_word": false,
1425
+ "special": true
1426
+ },
1427
+ "128178": {
1428
+ "content": "<|reserved_special_token_170|>",
1429
+ "lstrip": false,
1430
+ "normalized": false,
1431
+ "rstrip": false,
1432
+ "single_word": false,
1433
+ "special": true
1434
+ },
1435
+ "128179": {
1436
+ "content": "<|reserved_special_token_171|>",
1437
+ "lstrip": false,
1438
+ "normalized": false,
1439
+ "rstrip": false,
1440
+ "single_word": false,
1441
+ "special": true
1442
+ },
1443
+ "128180": {
1444
+ "content": "<|reserved_special_token_172|>",
1445
+ "lstrip": false,
1446
+ "normalized": false,
1447
+ "rstrip": false,
1448
+ "single_word": false,
1449
+ "special": true
1450
+ },
1451
+ "128181": {
1452
+ "content": "<|reserved_special_token_173|>",
1453
+ "lstrip": false,
1454
+ "normalized": false,
1455
+ "rstrip": false,
1456
+ "single_word": false,
1457
+ "special": true
1458
+ },
1459
+ "128182": {
1460
+ "content": "<|reserved_special_token_174|>",
1461
+ "lstrip": false,
1462
+ "normalized": false,
1463
+ "rstrip": false,
1464
+ "single_word": false,
1465
+ "special": true
1466
+ },
1467
+ "128183": {
1468
+ "content": "<|reserved_special_token_175|>",
1469
+ "lstrip": false,
1470
+ "normalized": false,
1471
+ "rstrip": false,
1472
+ "single_word": false,
1473
+ "special": true
1474
+ },
1475
+ "128184": {
1476
+ "content": "<|reserved_special_token_176|>",
1477
+ "lstrip": false,
1478
+ "normalized": false,
1479
+ "rstrip": false,
1480
+ "single_word": false,
1481
+ "special": true
1482
+ },
1483
+ "128185": {
1484
+ "content": "<|reserved_special_token_177|>",
1485
+ "lstrip": false,
1486
+ "normalized": false,
1487
+ "rstrip": false,
1488
+ "single_word": false,
1489
+ "special": true
1490
+ },
1491
+ "128186": {
1492
+ "content": "<|reserved_special_token_178|>",
1493
+ "lstrip": false,
1494
+ "normalized": false,
1495
+ "rstrip": false,
1496
+ "single_word": false,
1497
+ "special": true
1498
+ },
1499
+ "128187": {
1500
+ "content": "<|reserved_special_token_179|>",
1501
+ "lstrip": false,
1502
+ "normalized": false,
1503
+ "rstrip": false,
1504
+ "single_word": false,
1505
+ "special": true
1506
+ },
1507
+ "128188": {
1508
+ "content": "<|reserved_special_token_180|>",
1509
+ "lstrip": false,
1510
+ "normalized": false,
1511
+ "rstrip": false,
1512
+ "single_word": false,
1513
+ "special": true
1514
+ },
1515
+ "128189": {
1516
+ "content": "<|reserved_special_token_181|>",
1517
+ "lstrip": false,
1518
+ "normalized": false,
1519
+ "rstrip": false,
1520
+ "single_word": false,
1521
+ "special": true
1522
+ },
1523
+ "128190": {
1524
+ "content": "<|reserved_special_token_182|>",
1525
+ "lstrip": false,
1526
+ "normalized": false,
1527
+ "rstrip": false,
1528
+ "single_word": false,
1529
+ "special": true
1530
+ },
1531
+ "128191": {
1532
+ "content": "<|reserved_special_token_183|>",
1533
+ "lstrip": false,
1534
+ "normalized": false,
1535
+ "rstrip": false,
1536
+ "single_word": false,
1537
+ "special": true
1538
+ },
1539
+ "128192": {
1540
+ "content": "<|reserved_special_token_184|>",
1541
+ "lstrip": false,
1542
+ "normalized": false,
1543
+ "rstrip": false,
1544
+ "single_word": false,
1545
+ "special": true
1546
+ },
1547
+ "128193": {
1548
+ "content": "<|reserved_special_token_185|>",
1549
+ "lstrip": false,
1550
+ "normalized": false,
1551
+ "rstrip": false,
1552
+ "single_word": false,
1553
+ "special": true
1554
+ },
1555
+ "128194": {
1556
+ "content": "<|reserved_special_token_186|>",
1557
+ "lstrip": false,
1558
+ "normalized": false,
1559
+ "rstrip": false,
1560
+ "single_word": false,
1561
+ "special": true
1562
+ },
1563
+ "128195": {
1564
+ "content": "<|reserved_special_token_187|>",
1565
+ "lstrip": false,
1566
+ "normalized": false,
1567
+ "rstrip": false,
1568
+ "single_word": false,
1569
+ "special": true
1570
+ },
1571
+ "128196": {
1572
+ "content": "<|reserved_special_token_188|>",
1573
+ "lstrip": false,
1574
+ "normalized": false,
1575
+ "rstrip": false,
1576
+ "single_word": false,
1577
+ "special": true
1578
+ },
1579
+ "128197": {
1580
+ "content": "<|reserved_special_token_189|>",
1581
+ "lstrip": false,
1582
+ "normalized": false,
1583
+ "rstrip": false,
1584
+ "single_word": false,
1585
+ "special": true
1586
+ },
1587
+ "128198": {
1588
+ "content": "<|reserved_special_token_190|>",
1589
+ "lstrip": false,
1590
+ "normalized": false,
1591
+ "rstrip": false,
1592
+ "single_word": false,
1593
+ "special": true
1594
+ },
1595
+ "128199": {
1596
+ "content": "<|reserved_special_token_191|>",
1597
+ "lstrip": false,
1598
+ "normalized": false,
1599
+ "rstrip": false,
1600
+ "single_word": false,
1601
+ "special": true
1602
+ },
1603
+ "128200": {
1604
+ "content": "<|reserved_special_token_192|>",
1605
+ "lstrip": false,
1606
+ "normalized": false,
1607
+ "rstrip": false,
1608
+ "single_word": false,
1609
+ "special": true
1610
+ },
1611
+ "128201": {
1612
+ "content": "<|reserved_special_token_193|>",
1613
+ "lstrip": false,
1614
+ "normalized": false,
1615
+ "rstrip": false,
1616
+ "single_word": false,
1617
+ "special": true
1618
+ },
1619
+ "128202": {
1620
+ "content": "<|reserved_special_token_194|>",
1621
+ "lstrip": false,
1622
+ "normalized": false,
1623
+ "rstrip": false,
1624
+ "single_word": false,
1625
+ "special": true
1626
+ },
1627
+ "128203": {
1628
+ "content": "<|reserved_special_token_195|>",
1629
+ "lstrip": false,
1630
+ "normalized": false,
1631
+ "rstrip": false,
1632
+ "single_word": false,
1633
+ "special": true
1634
+ },
1635
+ "128204": {
1636
+ "content": "<|reserved_special_token_196|>",
1637
+ "lstrip": false,
1638
+ "normalized": false,
1639
+ "rstrip": false,
1640
+ "single_word": false,
1641
+ "special": true
1642
+ },
1643
+ "128205": {
1644
+ "content": "<|reserved_special_token_197|>",
1645
+ "lstrip": false,
1646
+ "normalized": false,
1647
+ "rstrip": false,
1648
+ "single_word": false,
1649
+ "special": true
1650
+ },
1651
+ "128206": {
1652
+ "content": "<|reserved_special_token_198|>",
1653
+ "lstrip": false,
1654
+ "normalized": false,
1655
+ "rstrip": false,
1656
+ "single_word": false,
1657
+ "special": true
1658
+ },
1659
+ "128207": {
1660
+ "content": "<|reserved_special_token_199|>",
1661
+ "lstrip": false,
1662
+ "normalized": false,
1663
+ "rstrip": false,
1664
+ "single_word": false,
1665
+ "special": true
1666
+ },
1667
+ "128208": {
1668
+ "content": "<|reserved_special_token_200|>",
1669
+ "lstrip": false,
1670
+ "normalized": false,
1671
+ "rstrip": false,
1672
+ "single_word": false,
1673
+ "special": true
1674
+ },
1675
+ "128209": {
1676
+ "content": "<|reserved_special_token_201|>",
1677
+ "lstrip": false,
1678
+ "normalized": false,
1679
+ "rstrip": false,
1680
+ "single_word": false,
1681
+ "special": true
1682
+ },
1683
+ "128210": {
1684
+ "content": "<|reserved_special_token_202|>",
1685
+ "lstrip": false,
1686
+ "normalized": false,
1687
+ "rstrip": false,
1688
+ "single_word": false,
1689
+ "special": true
1690
+ },
1691
+ "128211": {
1692
+ "content": "<|reserved_special_token_203|>",
1693
+ "lstrip": false,
1694
+ "normalized": false,
1695
+ "rstrip": false,
1696
+ "single_word": false,
1697
+ "special": true
1698
+ },
1699
+ "128212": {
1700
+ "content": "<|reserved_special_token_204|>",
1701
+ "lstrip": false,
1702
+ "normalized": false,
1703
+ "rstrip": false,
1704
+ "single_word": false,
1705
+ "special": true
1706
+ },
1707
+ "128213": {
1708
+ "content": "<|reserved_special_token_205|>",
1709
+ "lstrip": false,
1710
+ "normalized": false,
1711
+ "rstrip": false,
1712
+ "single_word": false,
1713
+ "special": true
1714
+ },
1715
+ "128214": {
1716
+ "content": "<|reserved_special_token_206|>",
1717
+ "lstrip": false,
1718
+ "normalized": false,
1719
+ "rstrip": false,
1720
+ "single_word": false,
1721
+ "special": true
1722
+ },
1723
+ "128215": {
1724
+ "content": "<|reserved_special_token_207|>",
1725
+ "lstrip": false,
1726
+ "normalized": false,
1727
+ "rstrip": false,
1728
+ "single_word": false,
1729
+ "special": true
1730
+ },
1731
+ "128216": {
1732
+ "content": "<|reserved_special_token_208|>",
1733
+ "lstrip": false,
1734
+ "normalized": false,
1735
+ "rstrip": false,
1736
+ "single_word": false,
1737
+ "special": true
1738
+ },
1739
+ "128217": {
1740
+ "content": "<|reserved_special_token_209|>",
1741
+ "lstrip": false,
1742
+ "normalized": false,
1743
+ "rstrip": false,
1744
+ "single_word": false,
1745
+ "special": true
1746
+ },
1747
+ "128218": {
1748
+ "content": "<|reserved_special_token_210|>",
1749
+ "lstrip": false,
1750
+ "normalized": false,
1751
+ "rstrip": false,
1752
+ "single_word": false,
1753
+ "special": true
1754
+ },
1755
+ "128219": {
1756
+ "content": "<|reserved_special_token_211|>",
1757
+ "lstrip": false,
1758
+ "normalized": false,
1759
+ "rstrip": false,
1760
+ "single_word": false,
1761
+ "special": true
1762
+ },
1763
+ "128220": {
1764
+ "content": "<|reserved_special_token_212|>",
1765
+ "lstrip": false,
1766
+ "normalized": false,
1767
+ "rstrip": false,
1768
+ "single_word": false,
1769
+ "special": true
1770
+ },
1771
+ "128221": {
1772
+ "content": "<|reserved_special_token_213|>",
1773
+ "lstrip": false,
1774
+ "normalized": false,
1775
+ "rstrip": false,
1776
+ "single_word": false,
1777
+ "special": true
1778
+ },
1779
+ "128222": {
1780
+ "content": "<|reserved_special_token_214|>",
1781
+ "lstrip": false,
1782
+ "normalized": false,
1783
+ "rstrip": false,
1784
+ "single_word": false,
1785
+ "special": true
1786
+ },
1787
+ "128223": {
1788
+ "content": "<|reserved_special_token_215|>",
1789
+ "lstrip": false,
1790
+ "normalized": false,
1791
+ "rstrip": false,
1792
+ "single_word": false,
1793
+ "special": true
1794
+ },
1795
+ "128224": {
1796
+ "content": "<|reserved_special_token_216|>",
1797
+ "lstrip": false,
1798
+ "normalized": false,
1799
+ "rstrip": false,
1800
+ "single_word": false,
1801
+ "special": true
1802
+ },
1803
+ "128225": {
1804
+ "content": "<|reserved_special_token_217|>",
1805
+ "lstrip": false,
1806
+ "normalized": false,
1807
+ "rstrip": false,
1808
+ "single_word": false,
1809
+ "special": true
1810
+ },
1811
+ "128226": {
1812
+ "content": "<|reserved_special_token_218|>",
1813
+ "lstrip": false,
1814
+ "normalized": false,
1815
+ "rstrip": false,
1816
+ "single_word": false,
1817
+ "special": true
1818
+ },
1819
+ "128227": {
1820
+ "content": "<|reserved_special_token_219|>",
1821
+ "lstrip": false,
1822
+ "normalized": false,
1823
+ "rstrip": false,
1824
+ "single_word": false,
1825
+ "special": true
1826
+ },
1827
+ "128228": {
1828
+ "content": "<|reserved_special_token_220|>",
1829
+ "lstrip": false,
1830
+ "normalized": false,
1831
+ "rstrip": false,
1832
+ "single_word": false,
1833
+ "special": true
1834
+ },
1835
+ "128229": {
1836
+ "content": "<|reserved_special_token_221|>",
1837
+ "lstrip": false,
1838
+ "normalized": false,
1839
+ "rstrip": false,
1840
+ "single_word": false,
1841
+ "special": true
1842
+ },
1843
+ "128230": {
1844
+ "content": "<|reserved_special_token_222|>",
1845
+ "lstrip": false,
1846
+ "normalized": false,
1847
+ "rstrip": false,
1848
+ "single_word": false,
1849
+ "special": true
1850
+ },
1851
+ "128231": {
1852
+ "content": "<|reserved_special_token_223|>",
1853
+ "lstrip": false,
1854
+ "normalized": false,
1855
+ "rstrip": false,
1856
+ "single_word": false,
1857
+ "special": true
1858
+ },
1859
+ "128232": {
1860
+ "content": "<|reserved_special_token_224|>",
1861
+ "lstrip": false,
1862
+ "normalized": false,
1863
+ "rstrip": false,
1864
+ "single_word": false,
1865
+ "special": true
1866
+ },
1867
+ "128233": {
1868
+ "content": "<|reserved_special_token_225|>",
1869
+ "lstrip": false,
1870
+ "normalized": false,
1871
+ "rstrip": false,
1872
+ "single_word": false,
1873
+ "special": true
1874
+ },
1875
+ "128234": {
1876
+ "content": "<|reserved_special_token_226|>",
1877
+ "lstrip": false,
1878
+ "normalized": false,
1879
+ "rstrip": false,
1880
+ "single_word": false,
1881
+ "special": true
1882
+ },
1883
+ "128235": {
1884
+ "content": "<|reserved_special_token_227|>",
1885
+ "lstrip": false,
1886
+ "normalized": false,
1887
+ "rstrip": false,
1888
+ "single_word": false,
1889
+ "special": true
1890
+ },
1891
+ "128236": {
1892
+ "content": "<|reserved_special_token_228|>",
1893
+ "lstrip": false,
1894
+ "normalized": false,
1895
+ "rstrip": false,
1896
+ "single_word": false,
1897
+ "special": true
1898
+ },
1899
+ "128237": {
1900
+ "content": "<|reserved_special_token_229|>",
1901
+ "lstrip": false,
1902
+ "normalized": false,
1903
+ "rstrip": false,
1904
+ "single_word": false,
1905
+ "special": true
1906
+ },
1907
+ "128238": {
1908
+ "content": "<|reserved_special_token_230|>",
1909
+ "lstrip": false,
1910
+ "normalized": false,
1911
+ "rstrip": false,
1912
+ "single_word": false,
1913
+ "special": true
1914
+ },
1915
+ "128239": {
1916
+ "content": "<|reserved_special_token_231|>",
1917
+ "lstrip": false,
1918
+ "normalized": false,
1919
+ "rstrip": false,
1920
+ "single_word": false,
1921
+ "special": true
1922
+ },
1923
+ "128240": {
1924
+ "content": "<|reserved_special_token_232|>",
1925
+ "lstrip": false,
1926
+ "normalized": false,
1927
+ "rstrip": false,
1928
+ "single_word": false,
1929
+ "special": true
1930
+ },
1931
+ "128241": {
1932
+ "content": "<|reserved_special_token_233|>",
1933
+ "lstrip": false,
1934
+ "normalized": false,
1935
+ "rstrip": false,
1936
+ "single_word": false,
1937
+ "special": true
1938
+ },
1939
+ "128242": {
1940
+ "content": "<|reserved_special_token_234|>",
1941
+ "lstrip": false,
1942
+ "normalized": false,
1943
+ "rstrip": false,
1944
+ "single_word": false,
1945
+ "special": true
1946
+ },
1947
+ "128243": {
1948
+ "content": "<|reserved_special_token_235|>",
1949
+ "lstrip": false,
1950
+ "normalized": false,
1951
+ "rstrip": false,
1952
+ "single_word": false,
1953
+ "special": true
1954
+ },
1955
+ "128244": {
1956
+ "content": "<|reserved_special_token_236|>",
1957
+ "lstrip": false,
1958
+ "normalized": false,
1959
+ "rstrip": false,
1960
+ "single_word": false,
1961
+ "special": true
1962
+ },
1963
+ "128245": {
1964
+ "content": "<|reserved_special_token_237|>",
1965
+ "lstrip": false,
1966
+ "normalized": false,
1967
+ "rstrip": false,
1968
+ "single_word": false,
1969
+ "special": true
1970
+ },
1971
+ "128246": {
1972
+ "content": "<|reserved_special_token_238|>",
1973
+ "lstrip": false,
1974
+ "normalized": false,
1975
+ "rstrip": false,
1976
+ "single_word": false,
1977
+ "special": true
1978
+ },
1979
+ "128247": {
1980
+ "content": "<|reserved_special_token_239|>",
1981
+ "lstrip": false,
1982
+ "normalized": false,
1983
+ "rstrip": false,
1984
+ "single_word": false,
1985
+ "special": true
1986
+ },
1987
+ "128248": {
1988
+ "content": "<|reserved_special_token_240|>",
1989
+ "lstrip": false,
1990
+ "normalized": false,
1991
+ "rstrip": false,
1992
+ "single_word": false,
1993
+ "special": true
1994
+ },
1995
+ "128249": {
1996
+ "content": "<|reserved_special_token_241|>",
1997
+ "lstrip": false,
1998
+ "normalized": false,
1999
+ "rstrip": false,
2000
+ "single_word": false,
2001
+ "special": true
2002
+ },
2003
+ "128250": {
2004
+ "content": "<|reserved_special_token_242|>",
2005
+ "lstrip": false,
2006
+ "normalized": false,
2007
+ "rstrip": false,
2008
+ "single_word": false,
2009
+ "special": true
2010
+ },
2011
+ "128251": {
2012
+ "content": "<|reserved_special_token_243|>",
2013
+ "lstrip": false,
2014
+ "normalized": false,
2015
+ "rstrip": false,
2016
+ "single_word": false,
2017
+ "special": true
2018
+ },
2019
+ "128252": {
2020
+ "content": "<|reserved_special_token_244|>",
2021
+ "lstrip": false,
2022
+ "normalized": false,
2023
+ "rstrip": false,
2024
+ "single_word": false,
2025
+ "special": true
2026
+ },
2027
+ "128253": {
2028
+ "content": "<|reserved_special_token_245|>",
2029
+ "lstrip": false,
2030
+ "normalized": false,
2031
+ "rstrip": false,
2032
+ "single_word": false,
2033
+ "special": true
2034
+ },
2035
+ "128254": {
2036
+ "content": "<|reserved_special_token_246|>",
2037
+ "lstrip": false,
2038
+ "normalized": false,
2039
+ "rstrip": false,
2040
+ "single_word": false,
2041
+ "special": true
2042
+ },
2043
+ "128255": {
2044
+ "content": "<|reserved_special_token_247|>",
2045
+ "lstrip": false,
2046
+ "normalized": false,
2047
+ "rstrip": false,
2048
+ "single_word": false,
2049
+ "special": true
2050
+ },
2051
+ "128256": {
2052
+ "content": "<|reserved_special_token_248|>",
2053
+ "lstrip": false,
2054
+ "normalized": false,
2055
+ "rstrip": false,
2056
+ "single_word": false,
2057
+ "special": true
2058
+ },
2059
+ "128257": {
2060
+ "content": "<|reserved_special_token_249|>",
2061
+ "lstrip": false,
2062
+ "normalized": false,
2063
+ "rstrip": false,
2064
+ "single_word": false,
2065
+ "special": true
2066
+ },
2067
+ "128258": {
2068
+ "content": "<|reserved_special_token_250|>",
2069
+ "lstrip": false,
2070
+ "normalized": false,
2071
+ "rstrip": false,
2072
+ "single_word": false,
2073
+ "special": true
2074
+ }
2075
+ },
+ "bos_token": "<|begin_of_text|>",
+ "chat_template": "{% if not add_generation_prompt is defined %}{% set add_generation_prompt = false %}{% endif %}{% set loop_messages = messages %}{% for message in loop_messages %}{% set content = '<|start_header_id|>' + message['role'] + '<|end_header_id|>\n\n'+ message['content'] | trim + '<|eot_id|>' %}{% if loop.index0 == 0 %}{% set content = bos_token + content %}{% endif %}{{ content }}{% endfor %}{% if add_generation_prompt %}{{ '<|start_header_id|>assistant<|end_header_id|>\n\n' }}{% endif %}",
+ "clean_up_tokenization_spaces": true,
+ "eos_token": "<|eot_id|>",
+ "extra_special_tokens": {},
+ "model_input_names": [
+ "input_ids",
+ "attention_mask"
+ ],
+ "model_max_length": 131072,
+ "pad_token": "<|finetune_right_pad_id|>",
+ "tokenizer_class": "PreTrainedTokenizerFast"
+ }
tone.md ADDED
@@ -0,0 +1,39 @@
+ Chapter One - The Cold Truth
+
+ The darkness of space was like nothing else, a void so profound it felt alive, as if it watched and waited. It was a sensation that Tone had grown accustomed to over the years, but it still left him with an uneasy feeling in the pit of his stomach. He floated before the viewport of his spacecraft, gazing out at the distant blue-green glow of Earth, the planet that had been his home until now. It seemed so small, so insignificant in the vast expanse of space, yet it held everything he loved. Everything he would never see again.
+
+ "Goodbye," he whispered, his voice barely audible over the hum of the ship's systems.
+
+ Tone turned away from the view and made his way back to the cockpit, his movements fluid in the weightlessness of space. He strapped himself into the pilot's seat, running a hand through his short, dark hair as he began to go through the pre-flight checks. Everything looked good, all systems nominal. But he knew better than to trust the readings. Trusting technology too much had gotten him into this mess in the first place.
+
+ As he worked, his mind wandered back to the events that had led him here. The malfunction of the life support system during the outbound journey, the desperate repairs that had barely kept him alive. And then, the discovery that the craft meant for his return journey was beyond repair. No matter how hard he tried, no matter what parts he scavenged from his current ship, there was no fixing it. He was stranded.
+
+ But Tone was not one to give up easily. He had a mission to complete, and he would see it through to the end, no matter the cost. He set a course for Titan, the largest moon of Saturn, and the destination of his original mission. There was still so much to learn, so much to discover. And even though he would not be returning to Earth, he could still contribute to the advancement of human knowledge.
+
+ The months passed slowly, each day a reminder of his limited time. Tone threw himself into his work, conducting experiments and gathering data with a fervor that bordered on obsession. He barely slept or ate, pushing his body to its limits. But the results were worth it. His findings were groundbreaking, shedding new light on the composition of Titan's atmosphere and the potential for life beneath its icy surface.
+
+ And yet, despite the excitement of his discoveries, Tone couldn't shake the feeling of impending doom. He knew that his supplies would eventually run out, that he would have to face the reality of his situation. But for now, he pushed those thoughts aside, focusing on the work at hand.
+
+ One day, as he was analyzing a sample of Titan's surface material, Tone received an unexpected transmission from Earth. His heart leapt at the sound of a human voice, even though he knew it was just a recording. He listened intently, hoping against hope that it was news of a rescue mission, or perhaps a solution to his predicament.
+
+ But it was not to be. The message was from his mission control team, informing him of their efforts to find a way to bring him home. They spoke of new technologies being developed, of teams working around the clock to find a solution. But Tone knew better. He had seen the state of his return craft, had assessed the damage firsthand. There was no saving him.
+
+ He felt a pang of sadness, of loss, as he realized that he would never again walk on the surface of Earth, never again feel the warmth of the sun on his skin. But he pushed those feelings aside, focusing on the present. He still had time, and he would make the most of it.
+
+ "Tone, this is Mission Control. We are doing everything we can to bring you home. Please hold on, help is on the way."
+
+ Tone smiled wryly, shaking his head. They didn't understand, they couldn't understand. He was already gone, lost in the vastness of space. But he would not go quietly into the night. He would fight, he would struggle, he would do everything in his power to ensure that his mission was not in vain.
+
+ "Mission Control, this is Tone. I appreciate your efforts, but I fear it may be too late. My supplies are running low, and I do not believe there is any way to repair my return craft. I will continue to gather data and conduct experiments until the end. I want my mission to mean something."
+
+ There was a pause, and then the voice came back, filled with emotion. "Tone, we understand. We will do everything we can to support you, to make sure your sacrifice is not in vain. Keep transmitting your findings, we will make sure they are used to advance our understanding of the universe."
+
+ Tone nodded, even though he knew they could not see him. "I will. Thank you, Mission Control. It means a lot to me."
+
+ The transmission ended, leaving Tone alone once again in the silence of space. He took a deep breath, steeling himself for what was to come. He would not go quietly into the night. He would rage, rage against the dying of the light.
+
+ And so, Tone continued his work, pouring all of his energy into his research. He delved deeper into the mysteries of Titan, uncovering secrets that had lain hidden for billions of years. His findings were revolutionary, changing the course of human understanding forever.
+
+ In the end, it was not the cold of space that claimed him, but the cold of his own mortality. Tone's body gave out, weakened by months of malnutrition and lack of sleep. But his mind remained sharp, his thoughts clear until the very end.
+
+ As the darkness closed in, Tone smiled. He had done it, he had made a difference. His mission had not been in vain. And in the vast expanse of space, his legacy would live on, a testament to the human spirit's ability to persevere in the face of adversity.