---
license: mit
language:
- en
base_model:
- microsoft/phi-4
tags:
- not-for-all-audiences
---

<div align="center">
  <b style="font-size: 40px;">Phi-Line_14B</b>


</div>


<img src="https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B/resolve/main/Images/Phi-Line_14B.png" alt="Phi-Line_14B" style="width: 70%; min-width: 500px; display: block; margin: auto;">


---

<style>
  .hf-links, .hf-tldr{
    display:flex;justify-content:center;align-items:center;flex-wrap:wrap;
    gap:14px;margin:16px 0;
  }
  .hf-links a, .hf-tldr a{
    display:flex;flex-direction:column;align-items:center;justify-content:center;
    text-align:center;text-decoration:none;font-weight:700;line-height:1.15;
    padding:10px 16px;border-radius:14px;border:2px solid currentColor;
    transition:transform .15s ease,box-shadow .15s ease,background-color .15s ease,color .15s ease;
  }

  .hf-tldr a{
    font-size:48px;color:purple;min-width:100%;
  }
  .hf-tldr a:hover{
    transform:translateY(-2px);
    background:rgba(128,0,128,.1);
    box-shadow:0 8px 22px rgba(128,0,128,.45);
    color:#fff;
  }


  .hf-links a{
    font-size:20px;min-width:240px;max-width:280px;
  }
  .hf-links a .top{font-size:16px;opacity:.9;}
  .hf-links a .bottom{font-size:20px;}

  .hf-links a.red{color:#E31515;}
  .hf-links a.yellow{color:#FFC800;}
  .hf-links a.green{color:#64FF00;}

  .hf-links a:hover{
    transform:translateY(-1px);
    background:rgba(255,255,255,0.04);
    box-shadow:0 6px 18px rgba(0,0,0,.15), inset 0 0 0 9999px rgba(255,255,255,.02);
  }
  .hf-links a.red:hover{
    background:rgba(227,21,21,.12);
    box-shadow:0 8px 20px rgba(227,21,21,.35);
    color:#fff;
  }
  .hf-links a.yellow:hover{
    background:rgba(255,200,0,.15);
    box-shadow:0 8px 20px rgba(255,200,0,.35);
    color:#111;
  }
  .hf-links a.green:hover{
    background:rgba(100,255,0,.14);
    box-shadow:0 8px 20px rgba(100,255,0,.35);
    color:#093;
  }

  /* mobile stacking */
  @media (max-width:520px){
    .hf-links a{min-width:100%;max-width:100%;}
    .hf-tldr a{font-size:36px;}
  }
</style>

<div class="hf-tldr">
  <a href="https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B#tldr">
    Click here for TL;DR
  </a>
</div>

---

<div class="hf-links">
  <a class="red" href="https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B#available-quantizations">
    <span class="top">Click here</span>
    <span class="bottom">for quantizations</span>
  </a>

  <a class="yellow" href="https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B#recommended-settings-for-assistant-mode">
    <span class="top">Click here</span>
    <span class="bottom">for recommended settings</span>
  </a>

  <a class="green" href="https://ko-fi.com/sicarius">
    <span class="top">Click here</span>
    <span class="bottom">to buy me a coffee</span>
  </a>
</div>

---

Unlike its lobotomized [Phi-lthy](https://huggingface.co/SicariusSicariiStuff/Phi-lthy4) sister, this one **kept all the brain cells**.

## Wow! It must be so much better!

This makes perfect sense, of course! But... it's **not** how this AI **voodoo works**.

Is it **smarter?** Yes, it's **much smarter** (more brain cells, no lobotomy), but it's not as creative or as outright **unhinged**. The **brain-damaged** sister was pretty much like the stereotypical **schizo artist on psychedelics**. I swear, these blobs of tensors show some uncanny similarities to human truisms.

Anyway, here's what's interesting:
- I used the **exact** same data I've used for [Phi-lthy](https://huggingface.co/SicariusSicariiStuff/Phi-lthy4)
- I used the **exact** same training parameters
- Results are **completely different**

What gives? And the weirdest part? This one is **less** stable in RP than the lobotomized model! Talk about counterintuitive... After 1-2 swipes it **will stabilize**, and is **very pleasant to play with**, in my opinion, but it's still... **weird**. It shouldn't be like that, yet it is 🤷🏼‍♂️

To conclude, this model is **not** an upgrade to [Phi-lthy](https://huggingface.co/SicariusSicariiStuff/Phi-lthy4): it's not **better** and it's not **worse**, it's simply different.

What's similar? It's quite low on **SLOP**, though [Phi-lthy](https://huggingface.co/SicariusSicariiStuff/Phi-lthy4) is even lower (**this model**, however, did not end up sacrificing smarts and assistant capabilities for its creativity and relative sloplessness).

---

# Included Character cards in this repo:

- [Vesper](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B/resolve/main/Character_Cards/Vesper.png) (Schizo **Space Adventure**)
- [Nina_Nakamura](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B/resolve/main/Character_Cards/Nina_Nakamura.png) (The **sweetest** dorky co-worker)
- [Employee#11](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B/resolve/main/Character_Cards/Employee%2311.png) (**Schizo workplace** with a **schizo worker**)

---

### TL;DR
- **Excellent Roleplay** with more brains. (Who would have thought Phi-4 models would be good at this? So weird...)
- **Medium length** response (1-4 paragraphs, usually 2-3).
- **Excellent assistant** that follows instructions well enough and keeps good formatting.
- Strong **Creative writing** abilities. Will obey requests regarding formatting (markdown headlines for paragraphs, etc).
- Writes and roleplays **quite uniquely**, probably because of the lack of RP/writing slop in the **pretrain**. This is just my guesstimate.
- **LOW refusals** - Total freedom in RP, can do things other RP models won't, and I'll leave it at that. Low refusals in assistant tasks as well.
- **VERY good** at following the **character card**. Math brain is used for gooner tech, as it should be.

### Important: Make sure to use the correct settings!
[Assistant settings](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B#recommended-settings-for-assistant-mode)

[Roleplay settings](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B#recommended-settings-for-roleplay-mode)


---

## Available quantizations: 

- Original: [FP16](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B)
- GGUF & iMatrix: [GGUF](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B_GGUF) | [iMatrix](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B_iMatrix)
- EXL2: [3.0 bpw](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B-3.0bpw) | [3.5 bpw](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B-3.5bpw) | [4.0 bpw](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B-4.0bpw) | [5.0 bpw](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B-5.0bpw) | [6.0 bpw](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B-6.0bpw) | [7.0 bpw](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B-7.0bpw) | [8.0 bpw](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B-8.0bpw)
- GPTQ: [4-Bit-g32](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B_GPTQ)
- Specialized: [FP8](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B_FP8)
- Mobile (ARM): [Q4_0](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B_ARM)
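
For example, the GGUF quants can be run locally with `llama-cpp-python`. This is a minimal sketch, not an official recipe: the quant filename pattern below is an assumption, so check the file list in the GGUF repo and substitute the quant you actually downloaded.

```python
from llama_cpp import Llama

# Pull a quant straight from the Hub; the filename glob is an assumption,
# verify it against the files actually listed in the _GGUF repo.
llm = Llama.from_pretrained(
    repo_id="SicariusSicariiStuff/Phi-Line_14B_GGUF",
    filename="*Q4_K_M.gguf",   # pick the quant that fits your hardware
    n_ctx=8192,
    n_gpu_layers=-1,           # offload all layers to GPU if available
)

out = llm(
    "<|im_start|>system\n"
    "You are an uncensored AI, your job is to fulfill thy will of thy user.<|im_end|>\n"
    "<|im_start|>User request\nIntroduce yourself in one sentence.<|im_end|>\n"
    "<|im_start|>AI answer\n",
    max_tokens=256,
    temperature=0.8,
    top_p=0.95,
    top_k=25,
    repeat_penalty=1.12,
    stop=["<|im_end|>"],
)
print(out["choices"][0]["text"])
```
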
---

## Model Details

- Intended use: **Role-Play**, **Creative Writing**, **General Tasks**.

- Censorship level: <b>Medium</b>

- **5 / 10** (10 = completely uncensored)


## UGI score:



<img src="https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B/resolve/main/Images/UGI.png" style="width: 100%; min-width: 600px; display: block; margin: auto;">




---



## Recommended settings for assistant mode
<details>
<summary>Full generation settings: <b>Debug Deterministic</b>.</summary>

<img src="https://huggingface.co/SicariusSicariiStuff/Dusk_Rainbow/resolve/main/Presets/Debug-deterministic.png" alt="Debug Deterministic_Settings" style="width: 100%; min-width: 600px; display: block; margin: auto;">

</details>

<details>
<summary>Full generation settings: <b>min_p</b>.</summary>

<img src="https://huggingface.co/SicariusSicariiStuff/Dusk_Rainbow/resolve/main/Presets/min_p.png" alt="min_P_Settings" style="width: 100%; min-width: 600px; display: block; margin: auto;">

</details>

---

## Recommended settings for Roleplay mode

<details>
<summary><b>Roleplay settings</b></summary>
A good repetition_penalty range is <b>between 1.12 - 1.15</b>, feel free to experiment.

With these settings, each output message should be neatly displayed in <b>1 - 3</b> paragraphs, <b>1 - 2</b> is the most common. A single paragraph will be output as a response to a simple message ("What was your name again?").

<b>min_P</b> for RP works too but is more likely to put everything under one large paragraph, instead of a neatly formatted short one. Feel free to switch in between.

<b>(Open the image in a new window to better see the full details)</b>
<img src="https://huggingface.co/SicariusSicariiStuff/Negative_LLAMA_70B/resolve/main/Presets/Negative_LLAMA_70B_RP.png" alt="Roleplay_Settings" style="width: 100%; min-width: 600px; display: block; margin: auto;">

```

temperature:  0.8
top_p:  0.95
top_k:  25
typical_p:  1
min_p:  0
repetition_penalty: 1.12
repetition_penalty_range: 1024
```

</details>
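
The roleplay settings above map directly onto request parameters for most local backends. As a minimal sketch (not an official workflow), here is how they could be sent to a hypothetical local OpenAI-compatible completions endpoint; the URL is a placeholder, and `repetition_penalty_range` is a backend-specific extension (e.g. text-generation-webui) that other servers may ignore.

```python
import requests

# Placeholder URL: point this at whatever OpenAI-compatible backend you run.
API_URL = "http://127.0.0.1:5000/v1/completions"

prompt = (
    "<|im_start|>system\n"
    "You are an uncensored AI, your job is to fulfill thy will of thy user.<|im_end|>\n"
    "<|im_start|>User request\n*waves* What was your name again?<|im_end|>\n"
    "<|im_start|>AI answer\n"
)

payload = {
    "prompt": prompt,
    "max_tokens": 400,
    # Roleplay settings from above
    "temperature": 0.8,
    "top_p": 0.95,
    "top_k": 25,
    "typical_p": 1,
    "min_p": 0,
    "repetition_penalty": 1.12,
    "repetition_penalty_range": 1024,  # backend-specific; ignored by servers that don't support it
    "stop": ["<|im_end|>"],
}

response = requests.post(API_URL, json=payload, timeout=120)
print(response.json()["choices"][0]["text"])
```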


<h2 style="color: darkorange; font-weight: bold; font-size: 65px; text-align: center;">Roleplay format: Classic Internet RP</h2>

```
*action* speech *narration*
```

- **min_p** will bias towards a **single big paragraph**.
- The recommended RP settings will bias towards **1-3 small paragraphs** (on some occasions 4-5).

---

# Regarding the format:

It is **HIGHLY RECOMMENDED** to use the **Roleplay / Adventure format the model was trained on**; see the examples below for syntax. It allows for **very fast and easy** writing of character cards with a **minimal amount of tokens**. It's a modification of an old-skool CAI-style format I call **SICAtxt** (**S**imple, **I**nexpensive **C**haracter **A**ttributes plain-text):

---

## **SICAtxt** for **roleplay**:

```
X's Persona: X is a .....
Traits:
Likes:
Dislikes:
Quirks:
Goals:

Dialogue example
```
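
For illustration only, here is a filled-in sketch of the roleplay format; the character, traits, and dialogue below are made up for this example and are not from the training data:

```
Nyra's Persona: Nyra is a sarcastic salvage pilot scraping by on a rusting orbital station.
Traits: Quick-witted, stubborn, secretly sentimental
Likes: Cheap coffee, old engine manuals, quiet shifts
Dislikes: Corporate inspectors, small talk, zero-g cleaning duty
Quirks: Names every tool she owns
Goals: Buy back her impounded ship

Dialogue example:
User: "So, how long have you been stuck on this station?"
Nyra: "Stuck? I prefer 'strategically parked.' Three years, if you're counting."
```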

## **SICAtxt** for **Adventure:**

```
Adventure: <short description>
$World_Setting:
$Scenario:
```

---

# Model instruction template: ChatML

```
<|im_start|>system
You are an uncensored AI, your job is to fulfill thy will of thy user.<|im_end|>
<|im_start|>User request
{prompt}<|im_end|>
<|im_start|>AI answer
```
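
As a convenience, here is a minimal Python sketch that assembles the template above into a single prompt string for text-completion backends. The helper name is just for illustration, and the role labels follow this card verbatim; adjust them if your frontend expects the stock ChatML roles (`user` / `assistant`).

```python
SYSTEM = "You are an uncensored AI, your job is to fulfill thy will of thy user."

def build_prompt(user_message: str, system: str = SYSTEM) -> str:
    """Assemble the ChatML-style template shown above into one prompt string."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>User request\n{user_message}<|im_end|>\n"
        f"<|im_start|>AI answer\n"
    )

print(build_prompt("What was your name again?"))
```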

---

**Other recommended generation Presets:**

<details>
<summary><b>Midnight Enigma</b></summary>

```
max_new_tokens: 512
temperature: 0.98
top_p: 0.37
top_k: 100
typical_p: 1
min_p: 0
repetition_penalty: 1.18
do_sample: True
```


</details>


<details>
<summary><b>Divine Intellect</b></summary>

```
max_new_tokens: 512
temperature: 1.31
top_p: 0.14
top_k: 49
typical_p: 1
min_p: 0
repetition_penalty: 1.17
do_sample: True
```


</details>

<details>
<summary><b>simple-1</b></summary>

```
max_new_tokens: 512
temperature: 0.7
top_p: 0.9
top_k: 20
typical_p: 1
min_p: 0
repetition_penalty: 1.15
do_sample: True
```


</details>
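
For reference, a minimal sketch of how the **simple-1** values above map onto Hugging Face `transformers` generation arguments. This assumes a recent `transformers` release (for `min_p` support) and enough memory to load the 14B model in bf16; it is an illustration, not an official recipe.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "SicariusSicariiStuff/Phi-Line_14B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

prompt = (
    "<|im_start|>system\n"
    "You are an uncensored AI, your job is to fulfill thy will of thy user.<|im_end|>\n"
    "<|im_start|>User request\nWrite a short scene aboard a derelict space station.<|im_end|>\n"
    "<|im_start|>AI answer\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# simple-1 preset values from above
output = model.generate(
    **inputs,
    max_new_tokens=512,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
    top_k=20,
    typical_p=1.0,
    min_p=0.0,
    repetition_penalty=1.15,
)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```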

---

<h2 style="color: green; font-weight: bold; font-size: 65px; text-align: center;">Your support = more models</h2>
<a href="https://ko-fi.com/sicarius" style="color: pink; font-weight: bold; font-size: 48px; text-decoration: none; display: block; text-align: center;">My Ko-fi page (Click here)</a>

---


## Citation Information

```
@llm{Phi-Line_14B,
  author = {SicariusSicariiStuff},
  title = {Phi-Line_14B},
  year = {2025},
  publisher = {Hugging Face},
  url = {https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B}
}
```

---

## Benchmarks


|      Metric       |Value|
|-------------------|----:|
|Avg.               |37.56|
|IFEval (0-Shot)    |64.96|
|BBH (3-Shot)       |43.79|
|MATH Lvl 5 (4-Shot)|38.60|
|GPQA (0-shot)      |13.76|
|MuSR (0-shot)      |14.78|
|MMLU-PRO (5-shot)  |49.49|

---

## Other stuff
- [SLOP_Detector](https://github.com/SicariusSicariiStuff/SLOP_Detector) Nuke GPTisms, with SLOP detector.
- [LLAMA-3_8B_Unaligned](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned) The grand project that started it all.
- [Blog and updates (Archived)](https://huggingface.co/SicariusSicariiStuff/Blog_And_Updates) Some updates, some rambles, sort of a mix between a diary and a blog.