Darkknight535 committed · commit 4bf9f91 · verified · 1 parent: 1135e11

Update README.md

Files changed (1): README.md (+90 −25)
README.md CHANGED
@@ -16,6 +16,96 @@ WindEngine-24B-Instruct is a merge of the following models using [LazyMergekit](
  * [PocketDoc/Dans-PersonalityEngine-V1.2.0-24b](https://huggingface.co/PocketDoc/Dans-PersonalityEngine-V1.2.0-24b)
  * [SicariusSicariiStuff/Redemption_Wind_24B](https://huggingface.co/SicariusSicariiStuff/Redemption_Wind_24B)
 
+
+ <html lang="en">
+ <head>
+   <meta charset="UTF-8">
+   <meta name="viewport" content="width=device-width, initial-scale=1.0">
+   <style>
+     body {
+       background: linear-gradient(to bottom, #e0f7fa, #ffffff);
+       font-family: Arial, sans-serif;
+       color: #004d40;
+       text-align: center;
+       padding: 20px;
+     }
+     .title {
+       font-size: 2.5rem;
+       font-weight: bold;
+       margin-bottom: 10px;
+     }
+     .logo {
+       width: 100px;
+       height: 100px;
+       background: #e0f7fa;
+       border-radius: 50%;
+       display: flex;
+       align-items: center;
+       justify-content: center;
+       font-size: 3rem;
+       color: #004d40;
+       box-shadow: 0 0 10px #b2ebf2;
+     }
+     .description {
+       font-size: 1.2rem;
+       margin-top: 20px;
+     }
+     .details, .settings, .prompt-format {
+       text-align: left;
+       background: #ffffff;
+       padding: 10px;
+       margin: 10px auto;
+       max-width: 600px;
+       border-radius: 10px;
+       box-shadow: 0 2px 8px #b2ebf2;
+     }
+     pre {
+       background: #e0f7fa;
+       padding: 10px;
+       border-radius: 8px;
+     }
+   </style>
+ </head>
+ <body>
+   <div class="logo">❄️🧠</div>
+   <div class="title">WinterEngine-24B-Instruct</div>
+   <p class="description">A versatile and powerful model designed for general-purpose text generation, roleplay, storywriting, scientific exploration, and more.</p>
+
+   <div class="details">
+     <h3>Key Details</h3>
+     <pre>
+ BASE MODEL: mistralai/Mistral-Small-24B-Base-2501
+ LICENSE: apache-2.0
+ LANGUAGE: English
+ CONTEXT LENGTH: 32768 tokens
+     </pre>
+   </div>
+
+   <div class="settings">
+     <h3>Recommended Settings</h3>
+     <pre>
+ TEMPERATURE: 1.0
+ TOP_P: 0.95
+ MIN_P: 0.05
+     </pre>
+   </div>
+
+   <div class="prompt-format">
+     <h3>Prompting Format</h3>
+     <pre>
+ <|im_start|>system
+ system prompt<|im_end|>
+ <|im_start|>user
+ Hi there!<|im_end|>
+ <|im_start|>assistant
+ Nice to meet you!<|im_end|>
+     </pre>
+   </div>
+ </body>
+ </html>
+
+
+
  ## 🧩 Configuration
 
  ```yaml
@@ -35,29 +125,4 @@ parameters:
  value: [1, 0.5, 0.7, 0.3, 0]
  - value: 0.5
  dtype: bfloat16
- ```
-
- ## 💻 Usage
-
- ```python
- !pip install -qU transformers accelerate
-
- from transformers import AutoTokenizer
- import transformers
- import torch
-
- model = "Darkknight535/WindEngine-24B-Instruct"
- messages = [{"role": "user", "content": "What is a large language model?"}]
-
- tokenizer = AutoTokenizer.from_pretrained(model)
- prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
- pipeline = transformers.pipeline(
-     "text-generation",
-     model=model,
-     torch_dtype=torch.float16,
-     device_map="auto",
- )
-
- outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
- print(outputs[0]["generated_text"])
  ```
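
For reference, the Recommended Settings and Prompting Format added above map directly onto a transformers generation call. The sketch below is not from the model card: it assumes the merged model's tokenizer ships a ChatML-style chat template (as the <|im_start|> format suggests) and a transformers version recent enough to support the min_p sampling parameter; the system and user messages are the card's own placeholders.

```python
# Minimal sketch (not from the card): chatting with the merge using the
# Recommended Settings above. Assumes the tokenizer carries a ChatML-style
# chat template matching the <|im_start|> format, and that the installed
# transformers version supports the `min_p` generation parameter.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Darkknight535/WindEngine-24B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge's dtype
    device_map="auto",
)

messages = [
    {"role": "system", "content": "system prompt"},  # placeholder from the card
    {"role": "user", "content": "Hi there!"},
]
# Renders the <|im_start|>...<|im_end|> turns shown under "Prompting Format".
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(
    input_ids,
    max_new_tokens=256,
    do_sample=True,
    temperature=1.0,  # TEMPERATURE: 1.0
    top_p=0.95,       # TOP_P: 0.95
    min_p=0.05,       # MIN_P: 0.05
)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```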
 
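
The yaml above shows only the tail of the slerp configuration: the value list is an interpolation gradient across layer groups between the two parent models, the bare value: 0.5 is the default weight for all remaining tensors, and dtype sets the precision of the merged weights. To rerun such a merge outside the LazyMergekit notebook, the full config can be handed to mergekit's Python API. A minimal sketch, assuming mergekit is installed and the complete config is saved as config.yaml (file name hypothetical):

```python
# Minimal sketch of executing the merge config with mergekit (the library
# that LazyMergekit wraps). Assumes `pip install mergekit` and that the full
# slerp config (only its tail is visible in the diff) is saved as
# config.yaml, a hypothetical file name.
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yaml", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    merge_config,
    out_path="./WindEngine-24B-Instruct",  # output directory for merged weights
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU when available
        copy_tokenizer=True,             # carry the tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```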