# merge
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method
This model was merged using the TIES merge method, with SillyTilly/google-gemma-2-9b-it as the base.
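TIES merging roughly proceeds in three steps: trim each model's task vector (its delta from the base) to the largest-magnitude fraction given by `density`, elect a per-parameter sign, and then average only the deltas that agree with that sign, scaled by `weight`. The snippet below is a simplified toy illustration of that idea, not mergekit's actual implementation; the function and tensors are made up for this example.

```python
# Toy sketch of the TIES idea (trim -> elect sign -> disjoint merge).
# NOT mergekit's implementation; "weight" and "density" only mirror the config below.
import torch

def ties_merge(base: torch.Tensor, finetuned: list[torch.Tensor],
               weights: list[float], densities: list[float]) -> torch.Tensor:
    deltas = []
    for ft, w, d in zip(finetuned, weights, densities):
        delta = ft - base                          # task vector
        k = max(1, int(d * delta.numel()))         # keep top-d fraction by magnitude
        threshold = delta.abs().flatten().kthvalue(delta.numel() - k + 1).values
        trimmed = torch.where(delta.abs() >= threshold, delta, torch.zeros_like(delta))
        deltas.append(w * trimmed)

    stacked = torch.stack(deltas)
    sign = torch.sign(stacked.sum(dim=0))          # elect a sign per parameter
    agree = (torch.sign(stacked) == sign) & (stacked != 0)
    merged_delta = torch.where(agree, stacked, torch.zeros_like(stacked)).sum(dim=0)
    counts = agree.sum(dim=0).clamp(min=1)
    return base + merged_delta / counts            # average the agreeing deltas

# Toy usage on small random tensors
base = torch.zeros(10)
models = [torch.randn(10) for _ in range(3)]
print(ties_merge(base, models, weights=[0.5, 0.3, 0.2], densities=[0.6, 0.4, 0.3]))
```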
### Models Merged
The following models were included in the merge:
- inflatebot/G2-9B-Blackout-R1
- anthracite-org/magnum-v3-9b-customgemma2
- nbeerbower/Gemma2-Gutenberg-Doppel-9B
- lemon07r/Gemma-2-Ataraxy-9B
- sam-paech/Delirium-v1
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: sam-paech/Delirium-v1  # Main model
    parameters:
      weight: 0.5   # Higher weight for the Gutenberg dataset
      density: 0.6  # Keeps 60% of its unique parameters
  - model: nbeerbower/Gemma2-Gutenberg-Doppel-9B
    parameters:
      weight: 0.2
      density: 0.4
  - model: lemon07r/Gemma-2-Ataraxy-9B
    parameters:
      weight: 0.15
      density: 0.3
  - model: inflatebot/G2-9B-Blackout-R1
    parameters:
      weight: 0.1
      density: 0.3
  - model: anthracite-org/magnum-v3-9b-customgemma2
    parameters:
      weight: 0.05
      density: 0.2
merge_method: ties
base_model: SillyTilly/google-gemma-2-9b-it
tokenizer_source: sam-paech/Delirium-v1  # Use the main model's tokenizer
dtype: float16
parameters:
  normalize: true
  alpha: 0.35  # Balance between task vectors and the base model
```
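Once merged (for example with mergekit's `mergekit-yaml` command pointed at the configuration above), the result can be loaded like any Gemma-2 checkpoint. The sketch below is a minimal usage example with transformers; the repo id is a hypothetical placeholder, and `device_map="auto"` assumes accelerate is installed.

```python
# Minimal loading/generation sketch; replace the placeholder repo id with the
# actual path of the merged model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/your-merge"  # hypothetical placeholder
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, torch_dtype=torch.float16, device_map="auto"
)

# Gemma-2 instruct models expect the chat template.
messages = [{"role": "user", "content": "Write a short scene set in a rain-soaked city."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=200)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```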