---
base_model:
- PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
- SicariusSicariiStuff/Redemption_Wind_24B
tags:
- merge
- mergekit
- lazymergekit
- PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
- SicariusSicariiStuff/Redemption_Wind_24B
---

# WindEngine-24B-Instruct
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
</head>
<div class="winter-container">
  <div class="winter-case">
    <div class="winter-inner-case">
      <div class="winter-bezel">
<div class="terminal-screen">
  <div style="text-align: center;">
        <h2 style="color: #8ecae6; font-size: 32px;">WinterEngine-24B-Instruct</h2>
    <pre class="code-block" style="display: inline-block; text-align: center; white-space: pre; color: #ffffff;">
       ❄       ❄         
        ❄     ❄          
  ❄      ❄❄❄      ❄     
     ❄❄❄❄❄❄❄❄❄      
  ❄      ❄❄❄      ❄     
        ❄     ❄          
       ❄       ❄        
    </pre>
  </div>
  <h3 style="color: #8ecae6;">Key Details</h3>
  <pre class="code-block" style="color: #ffffff; background: linear-gradient(135deg, #219ebc, #8ecae6);">
BASE MODEL: mistralai/Mistral-Small-24B-Base-2501
LICENSE: apache-2.0
LANGUAGE: English
CONTEXT LENGTH: 32768 tokens</pre>
  <h3 style="color: #8ecae6;">Recommended Settings</h3>
  <pre class="code-block" style="color: #ffffff; background: linear-gradient(135deg, #219ebc, #8ecae6);">
TEMPERATURE: 1.2
MIN_P: 0.05
(Leave everything else neutral, including the "meme" samplers.)
</pre>
  <h3 style="color: #8ecae6;">Prompting Format</h3>
  <pre class="code-block" style="color: #ffffff; background: linear-gradient(135deg, #219ebc, #8ecae6);">
<|im_start|>system
system prompt<|im_end|>
<|im_start|>user
Hello, WinterEngine!<|im_end|>
<|im_start|>assistant
Hello! How can I help you today?<|im_end|></pre>
<h3 style="color: #8ecae6;">Quants</h3>
<pre class="code-block" style="color: #ffffff; background: linear-gradient(135deg, #219ebc, #8ecae6);">
I-mat: https://huggingface.co/mradermacher/WindEngine-24B-Instruct-i1-GGUF
Normal: https://huggingface.co/mradermacher/WindEngine-24B-Instruct-GGUF
<h2>Big Thanks to mradermacher for the Quants.</h2></pre>
<h3 style="color: #8ecae6;">Story</h3>
<pre class="code-block" style="color: #ffffff; background: linear-gradient(135deg, #219ebc, #8ecae6);">
You can ignore this if you want, but I just wanted to share something.
I was trying to create a model that follows prompts well, stays uncensored, and brings a lot of creativity, especially for roleplay.
I started out with the stock 24B Instruct model; it was decent, but felt a bit dry and overly censored.
So I began testing and merging different models.
I found PersonalityEngine 24B, which followed instructions well and had solid roleplay potential, though it felt a little bland.
Then I discovered Redemption Wind: much better at roleplay, but not as strong at following instructions. After trying three different model merges, this pairing turned out to be the best combination.
[The result? A model that follows instructions, excels at roleplay, and, for my single folks out there, works great for AI girlfriend roleplay too.] </pre>
  
</div>
      </div>
    </div>
  </div>
</div>
<style>
@import url('https://fonts.googleapis.com/css2?family=Fira+Code&display=swap');
.winter-container { background-color: #edf6f9; padding: 20px; border-radius: 20px; }
.winter-case { border: 2px solid #8ecae6; padding: 10px; }
.terminal-screen { background-color: #023047; color: #ffb703; padding: 15px; border-radius: 15px; font-family: 'Fira Code', monospace; }
.code-block { background: #219ebc; padding: 10px; border-radius: 10px; }
</style>
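
Below is a minimal sketch of how the prompting format and recommended samplers above might be wired up with Hugging Face `transformers`. The repository id is a placeholder for wherever the merged weights are hosted, and `min_p` sampling assumes a reasonably recent `transformers` release; adjust to your setup.

```python
# Minimal sketch: ChatML-style prompting plus the recommended samplers.
# "your-namespace/WindEngine-24B-Instruct" is a placeholder repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-namespace/WindEngine-24B-Instruct"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Conversation in the <|im_start|>/<|im_end|> format shown above.
# This assumes the tokenizer ships a ChatML chat template; otherwise,
# build the prompt string manually following the format section.
messages = [
    {"role": "system", "content": "You are WinterEngine, a creative assistant."},
    {"role": "user", "content": "Hello, WinterEngine!"},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Recommended settings from the card: temperature 1.2, min_p 0.05,
# everything else left neutral.
output = model.generate(
    input_ids,
    max_new_tokens=256,
    do_sample=True,
    temperature=1.2,
    min_p=0.05,
)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```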




WindEngine-24B-Instruct is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [PocketDoc/Dans-PersonalityEngine-V1.2.0-24b](https://huggingface.co/PocketDoc/Dans-PersonalityEngine-V1.2.0-24b)
* [SicariusSicariiStuff/Redemption_Wind_24B](https://huggingface.co/SicariusSicariiStuff/Redemption_Wind_24B)

## 🧩 Configuration

```yaml
slices:
  - sources:
      - model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
        layer_range: [0, 40]
      - model: SicariusSicariiStuff/Redemption_Wind_24B
        layer_range: [0, 40]
merge_method: slerp
base_model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
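
If you want to reproduce the merge outside the LazyMergekit notebook, the YAML above can be fed to mergekit directly, for example via its `mergekit-yaml` command line tool. The sketch below uses what I understand to be mergekit's Python entry points (`MergeConfiguration`, `MergeOptions`, `run_merge`); treat the exact names, options, and paths as assumptions and check them against the mergekit version you have installed.

```python
# Hedged sketch: re-running the merge above with mergekit's Python API.
# The entry points (MergeConfiguration, MergeOptions, run_merge) are assumed
# from mergekit's documentation; verify against your installed version.
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "windengine-slerp.yaml"        # the YAML block above, saved to disk
OUTPUT_PATH = "./WindEngine-24B-Instruct"   # placeholder output directory

with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # carry the base tokenizer into the output
        lazy_unpickle=True,              # lower peak RAM while loading shards
        low_cpu_memory=False,
    ),
)
```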