---
base_model:
- PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
- SicariusSicariiStuff/Redemption_Wind_24B
tags:
- merge
- mergekit
- lazymergekit
- PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
- SicariusSicariiStuff/Redemption_Wind_24B
---
# WindEngine-24B-Instruct

WindEngine-24B-Instruct is a merge of the following models using LazyMergekit:
* [PocketDoc/Dans-PersonalityEngine-V1.2.0-24b](https://huggingface.co/PocketDoc/Dans-PersonalityEngine-V1.2.0-24b)
* [SicariusSicariiStuff/Redemption_Wind_24B](https://huggingface.co/SicariusSicariiStuff/Redemption_Wind_24B)
## WinterEngine-24B-Instruct

WinterEngine is a capable model for creative writing, roleplay, coding, and general-purpose inference. Inspired by the calm and depth of winter, it excels at nuanced storytelling and problem-solving.
## Key Details

- **Base model:** mistralai/Mistral-Small-24B-Base-2501
- **License:** apache-2.0
- **Language:** English
- **Context length:** 32768 tokens
## Recommended Settings

- **Temperature:** 0.8
- **Top-p:** 0.9
- **Min-p:** 0.05
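As a minimal sketch, these values can be expressed as a Hugging Face `GenerationConfig`; note that `min_p` sampling is only available in recent `transformers` releases, and `max_new_tokens` below is an illustrative value, not part of the recommended settings.

```python
# Minimal sketch: the recommended sampling settings as a transformers
# GenerationConfig. min_p support requires a recent transformers release.
from transformers import GenerationConfig

generation_config = GenerationConfig(
    do_sample=True,
    temperature=0.8,
    top_p=0.9,
    min_p=0.05,
    max_new_tokens=512,  # illustrative value, not part of the recommended settings
)
```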
## Prompting Format

The model uses the ChatML format:

```
<|im_start|>system
system prompt<|im_end|>
<|im_start|>user
Hello, WinterEngine!<|im_end|>
<|im_start|>assistant
Hello! How can I help you today?<|im_end|>
```
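For reference, the sketch below assembles a prompt string in this layout by hand; `build_prompt` is an illustrative helper, not part of the model card, and a tokenizer that ships a ChatML chat template would produce the same layout via `apply_chat_template`.

```python
# Illustrative helper (not part of the model card) that builds a prompt
# in the ChatML layout shown above, ending with an open assistant turn.
def build_prompt(system_prompt: str, user_message: str) -> str:
    return (
        f"<|im_start|>system\n{system_prompt}<|im_end|>\n"
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

print(build_prompt("You are a helpful assistant.", "Hello, WinterEngine!"))
```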
## 🧩 Configuration

```yaml
slices:
  - sources:
      - model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
        layer_range: [0, 40]
      - model: SicariusSicariiStuff/Redemption_Wind_24B
        layer_range: [0, 40]
merge_method: slerp
base_model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
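Below is a usage sketch with `transformers`, assuming the merged model has been exported and published somewhere; the repo id is a placeholder, and the chat template and `min_p` behavior depend on the tokenizer files and installed `transformers` version.

```python
# Usage sketch (assumptions: placeholder repo id, tokenizer ships a ChatML
# chat template, installed transformers version supports min_p).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/WindEngine-24B-Instruct"  # placeholder, not an official repo

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello, WinterEngine!"},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(
    input_ids,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.8,
    top_p=0.9,
    min_p=0.05,
)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```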