# MS-MagpantheonselRP-22B-13.0

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged with the DARE TIES merge method, with unsloth/Mistral-Small-Instruct-2409 as the base model.
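At a high level, DARE TIES combines two ideas: DARE randomly drops a fraction of each model's delta from the base (keeping a proportion equal to `density`) and rescales the survivors, while TIES resolves sign conflicts between models so that deltas pulling in opposite directions don't cancel into noise. The toy sketch below (plain Python on flat lists, a simplified illustration and not mergekit's actual implementation) shows how `weight`, `density`, and the top-level `lambda` from the config interact:

```python
import random

def dare_ties_merge(base, task_vectors, weights, density, lam=1.0, seed=0):
    """Toy DARE-TIES sketch.

    base:         flat list of base-model parameters
    task_vectors: per-model deltas from the base (model - base)
    weights:      per-model merge weights
    density:      fraction of each delta to keep (DARE)
    lam:          final scale on the merged delta (the config's `lambda`)
    """
    rng = random.Random(seed)
    # DARE: randomly drop (1 - density) of each delta, rescale survivors
    # by 1/density so the expected contribution is unchanged, then weight.
    pruned = []
    for tv, w in zip(task_vectors, weights):
        kept = [(d / density if rng.random() < density else 0.0) for d in tv]
        pruned.append([w * d for d in kept])
    merged = []
    for i in range(len(base)):
        # TIES-style sign consensus: keep only contributions that agree
        # with the dominant sign at this parameter index.
        total = sum(p[i] for p in pruned)
        sign = 1.0 if total >= 0 else -1.0
        delta = sum(p[i] for p in pruned if p[i] * sign > 0)
        merged.append(base[i] + lam * delta)
    return merged
```

With `density=1.0` nothing is dropped and a single model reduces to a plain weighted task-vector addition; lower densities sparsify each contribution before the sign vote.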

### Models Merged

The following models were included in the merge:

* Kaoeiri/MS_Moingooistral-2409-22B
* Kaoeiri/MS-Magpantheonsel-lark-v4x1.6.2-Cydonia-vXXX-22B-5
* Kaoeiri/MS-Inky-2409-22B
* Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small
* DigitalSouls/BlackSheep-DigitalSoul-22B
* InferenceIllusionist/SorcererLM-22B
* Envoid/Mistral-Small-NovusKyver
* TheDrummer/Cydonia-22B-v1.1
* crestf411/MS-sunfall-v0.7.0
* Kaoeiri/MS_a-coolyte-2409-22B
* Kaoeiri/MS_fujin-2409-22B
* Kaoeiri/MS_dampf-2409-22B
* hf-100/Mistral-Small-Spellbound-StoryWriter-22B-instruct-0.2-chkpt-200-16-bit
* ArliAI/Mistral-Small-22B-ArliAI-RPMax-v1.1
* Darkknight535/MS-Moonlight-22B-v3
* concedo/Beepo-22B
* Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V1-22B
* TroyDoesAI/BlackSheep-MermaidMistral-22B

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  # Core Fiction and Character Detail Models
  - model: Kaoeiri/MS_Moingooistral-2409-22B
    parameters:
      weight: 0.36  # Balanced for nuanced character and story details
      density: 1.15 # Slightly reduced for better interaction with other models

  - model: Kaoeiri/MS-Magpantheonsel-lark-v4x1.6.2-Cydonia-vXXX-22B-5
    parameters:
      weight: 1.0   # Primary engine remains untouched
      density: 0.85 # Retained for depth and coherence

  # World Building & Character Interaction
  - model: Kaoeiri/MS-Inky-2409-22B
    parameters:
      weight: 0.38  # Balanced to allow subtle world-building
      density: 0.78 # Reduced slightly to integrate smoother interaction

  - model: Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small
    parameters:
      weight: 0.42  # Optimized for character interaction
      density: 0.80 # Elevated slightly for smoother roleplay context

  # Character Development Core
  - model: DigitalSouls/BlackSheep-DigitalSoul-22B
    parameters:
      weight: 0.30  # Balanced for character dynamics
      density: 0.78 # Slightly enhanced for deeper conflicts and resolutions

  # Magical Elements
  - model: InferenceIllusionist/SorcererLM-22B
    parameters:
      weight: 0.14  # Retained for magical character depth
      density: 0.76 # Balanced to prevent overpowering influence

  - model: Envoid/Mistral-Small-NovusKyver
    parameters:
      weight: 0.14  # Balanced synergy with SorcererLM
      density: 0.76 # Synchronized with magical elements

  # Secondary Character Enhancement
  - model: TheDrummer/Cydonia-22B-v1.1
    parameters:
      weight: 0.16  # Balanced for supporting depth
      density: 0.70 # Slightly enhanced for richer secondary character narratives

  - model: crestf411/MS-sunfall-v0.7.0
    parameters:
      weight: 0.17  # Fine-tuned for precision
      density: 0.72 # Optimized for secondary character arcs

  - model: Kaoeiri/MS_a-coolyte-2409-22B
    parameters:
      weight: 0.16  # Reduced to reflect 1024-rank finetuning
      density: 0.65 # Limited to prevent excessive influence

  # Enhanced Personality Dynamics
  - model: Kaoeiri/MS_fujin-2409-22B
    parameters:
      weight: 0.09  # Introduced to replace Quadrosiac subtly
      density: 0.65 # Contributes to nuanced personality traits

  - model: Kaoeiri/MS_dampf-2409-22B
    parameters:
      weight: 0.12  # Enhances character complexity
      density: 0.70 # Integrated for personality interaction

  # Enhanced Story and Character Building
  - model: hf-100/Mistral-Small-Spellbound-StoryWriter-22B-instruct-0.2-chkpt-200-16-bit
    parameters:
      weight: 0.25  # Slightly enhanced for narrative engagement
      density: 0.75 # Elevated for richer storytelling

  - model: ArliAI/Mistral-Small-22B-ArliAI-RPMax-v1.1
    parameters:
      weight: 0.16  # Balanced for dynamic roleplay
      density: 0.68 # Elevated for smoother character interactions

  - model: Darkknight535/MS-Moonlight-22B-v3
    parameters:
      weight: 0.25  # Slightly enhanced for detail
      density: 0.68 # Adjusted for depth

  - model: concedo/Beepo-22B
    parameters:
      weight: 0.35  # Refined to prevent overpowering other layers
      density: 0.50 # Contributes subtle narrative shifts

  # Cultural and Character Depth
  - model: Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V1-22B
    parameters:
      weight: 0.22  # Optimized for cultural enrichment
      density: 0.65 # Balanced for depth

  # Mermaid and Fantasy Personality
  - model: TroyDoesAI/BlackSheep-MermaidMistral-22B
    parameters:
      weight: 0.24  # Slightly enhanced for personality depth
      density: 0.74 # Elevated for immersive fantasy traits

merge_method: dare_ties
base_model: unsloth/Mistral-Small-Instruct-2409
parameters:
  density: 0.95    # Balanced for rich, cohesive character narratives
  epsilon: 0.035   # Optimized for smoother transitions
  lambda: 1.50     # Elevated for creative and coherent synergy
dtype: bfloat16
tokenizer_source: union
```
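To reproduce the merge, the configuration above can be passed to mergekit's `mergekit-yaml` CLI. A typical invocation, assuming mergekit is installed and the config is saved as `config.yaml` (the output directory name here is only a placeholder):

```shell
pip install mergekit
mergekit-yaml config.yaml ./merged-model --cuda
```

The `--cuda` flag offloads tensor arithmetic to the GPU; omit it to merge on CPU, which is slower but needs no GPU memory.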