---
base_model:
  - PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
  - SicariusSicariiStuff/Redemption_Wind_24B
tags:
  - merge
  - mergekit
  - lazymergekit
  - PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
  - SicariusSicariiStuff/Redemption_Wind_24B
---

# WinterEngine-24B-Instruct

  

## Key Details

- **Base model:** mistralai/Mistral-Small-24B-Base-2501
- **License:** apache-2.0
- **Language:** English
- **Context length:** 32768 tokens

## Recommended Settings

- **Temperature:** 1.2
- **Min-P:** 0.05
- Leave all other samplers neutral (the "meme" samplers included).
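
As a rough usage sketch with these settings (the repo id below is an assumption, and `min_p` requires a reasonably recent `transformers` release):

```python
# Minimal sampling sketch for the recommended settings; the repo id is assumed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Darkknight535/WinterEngine-24B-Instruct"  # assumed repo id, adjust as needed
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, torch_dtype=torch.bfloat16, device_map="auto"
)

prompt = "<|im_start|>user\nHello, WinterEngine!<|im_end|>\n<|im_start|>assistant\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=1.2,  # recommended temperature
    min_p=0.05,       # recommended min_p; needs a transformers version with min_p support
)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```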

## Prompting Format

```
<|im_start|>system
system prompt<|im_end|>
<|im_start|>user
Hello, WinterEngine!<|im_end|>
<|im_start|>assistant
Hello! How can I help you today?<|im_end|>
```
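
If the bundled tokenizer ships a ChatML chat template (an assumption here), `apply_chat_template` can build this prompt for you; otherwise, format the string manually as shown above:

```python
# Prompt-building sketch; assumes the tokenizer includes a ChatML chat template.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Darkknight535/WinterEngine-24B-Instruct")  # assumed repo id

messages = [
    {"role": "system", "content": "system prompt"},
    {"role": "user", "content": "Hello, WinterEngine!"},
]

# add_generation_prompt=True appends the opening <|im_start|>assistant tag
# so the model continues as the assistant.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
```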

WinterEngine-24B-Instruct is a merge of PocketDoc/Dans-PersonalityEngine-V1.2.0-24b and SicariusSicariiStuff/Redemption_Wind_24B, made with LazyMergekit.

## 🧩 Configuration

```yaml
slices:
  - sources:
      - model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
        layer_range: [0, 40]
      - model: SicariusSicariiStuff/Redemption_Wind_24B
        layer_range: [0, 40]
merge_method: slerp
base_model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
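
To reproduce the merge locally, a minimal sketch assuming mergekit is installed (`pip install mergekit`) and the YAML above is saved as `config.yaml`:

```python
# Reproduction sketch; assumes `pip install mergekit` and that the
# configuration above has been written to config.yaml.
import subprocess

subprocess.run(
    ["mergekit-yaml", "config.yaml", "./WinterEngine-24B-Instruct", "--cuda"],
    check=True,  # raise if the merge step fails
)
```

The output directory then holds the merged bfloat16 weights, ready to load with `transformers` as shown earlier.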