---
base_model:
- allura-org/Mistral-Small-Sisyphus-24b-2503
- PocketDoc/Dans-DangerousWinds-V1.1.1-24b
- ReadyArt/Forgotten-Safeword-24B
library_name: transformers
tags:
- mergekit
- merge
---
# eidolon-v3
This is a merge of pre-trained language models created using mergekit.
## Merge Details
### Merge Method
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with allura-org/Mistral-Small-Sisyphus-24b-2503 as the base.
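For readers unfamiliar with TIES, the sketch below illustrates its trim / elect-sign / disjoint-merge steps on a single tensor, where `weight` scales each model's task vector and `density` controls how much of it is kept. It is a simplified illustration written for this card, not mergekit's actual implementation.

```python
# Illustrative TIES merge for one tensor: trim each task vector to its largest
# entries, elect a per-parameter sign, and merge only agreeing updates.
# Function name and normalization details are assumptions for demonstration.
import torch

def ties_merge(base: torch.Tensor,
               finetuned: list[torch.Tensor],
               weights: list[float],
               densities: list[float]) -> torch.Tensor:
    deltas = []
    for ft, w, d in zip(finetuned, weights, densities):
        delta = ft - base                              # task vector
        k = max(1, int(d * delta.numel()))             # keep top `density` fraction
        threshold = delta.abs().flatten().kthvalue(delta.numel() - k + 1).values
        trimmed = torch.where(delta.abs() >= threshold, delta, torch.zeros_like(delta))
        deltas.append(w * trimmed)                     # scale by merge weight
    stacked = torch.stack(deltas)
    sign = torch.sign(stacked.sum(dim=0))              # elect a sign per parameter
    agree = torch.sign(stacked) == sign                # keep only agreeing updates
    merged = (stacked * agree).sum(dim=0)
    count = agree.sum(dim=0).clamp(min=1)              # average over contributors
    return base + merged / count

# Toy usage on random tensors, mirroring the weights/densities in this config
base = torch.randn(4, 4)
merged = ties_merge(base,
                    [base + torch.randn(4, 4), base + torch.randn(4, 4)],
                    weights=[0.9, 0.7], densities=[0.95, 0.8])
```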
### Models Merged

The following models were included in the merge:
* PocketDoc/Dans-DangerousWinds-V1.1.1-24b
* ReadyArt/Forgotten-Safeword-24B
### Configuration

The following YAML configuration was used to produce this model:
```yaml
# removing Thinker for now
models:
  - model: ReadyArt/Forgotten-Safeword-24B
    parameters:
      weight: 0.9
      density: 0.95
  - model: PocketDoc/Dans-DangerousWinds-V1.1.1-24b
    parameters:
      weight: 0.7
      density: 0.8
merge_method: ties
base_model: allura-org/Mistral-Small-Sisyphus-24b-2503
parameters:
  normalize: true
tokenizer:
  source: base
dtype: bfloat16
```
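The merge can be reproduced by passing this configuration to mergekit's `mergekit-yaml` command. The snippet below is a minimal loading sketch using the Hugging Face `transformers` library; the repository id is a placeholder and should be replaced with wherever this merge is actually hosted.

```python
# Minimal inference sketch; the repo id below is a placeholder, not a real path.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/eidolon-v3"  # placeholder repository id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,   # matches the merge's dtype: bfloat16
    device_map="auto",
)

prompt = "Hello"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```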