# merge

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the DARE TIES merge method, with KidIkaros/Llama-3.2-1B-Instruct-abliterated as the base model.
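The sketch below illustrates the idea behind DARE TIES on a single weight tensor; it is a simplified, hypothetical example (function name, tensor shapes, and the exact sign-election rule are illustrative), not mergekit's implementation. Each fine-tuned model contributes a delta from the base; DARE randomly keeps a `density` fraction of each delta and rescales the survivors, then a TIES-style sign election keeps only deltas that agree with the majority sign before they are weighted and added back onto the base.

```python
# Simplified, conceptual sketch of DARE TIES on one weight tensor.
# Not mergekit's actual code; `density` and `weight` mirror the YAML config below.
import torch

def dare_ties_merge(base, finetuned, density=0.5, weight=0.4, seed=0):
    torch.manual_seed(seed)
    deltas = []
    for ft in finetuned:
        delta = ft - base
        # DARE: keep each element with probability `density`, rescale by 1/density
        mask = torch.rand_like(delta) < density
        deltas.append(delta * mask / density)
    stacked = torch.stack(deltas)
    # TIES (simplified): elect the majority sign per parameter,
    # then keep only the deltas that agree with it
    elected_sign = torch.sign(stacked.sum(dim=0))
    agree = torch.sign(stacked) == elected_sign
    merged_delta = (stacked * agree).sum(dim=0) * weight
    return base + merged_delta

# Toy usage on random tensors standing in for one layer's weights
base = torch.randn(4, 4)
models = [base + 0.1 * torch.randn(4, 4) for _ in range(2)]
print(dare_ties_merge(base, models))
```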

### Models Merged

The following models were included in the merge:

- passing2961/Thanos-1B
- DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1.1

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: KidIkaros/Llama-3.2-1B-Instruct-abliterated
        layer_range: [0, 16]
      - model: passing2961/Thanos-1B
        layer_range: [0, 16]
      - model: DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1.1
        layer_range: [0, 16]
merge_method: dare_ties
base_model: KidIkaros/Llama-3.2-1B-Instruct-abliterated
parameters:
  density: 0.5
  weight: 0.4
dtype: bfloat16
```
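The merge can be reproduced by saving the YAML above to a file and running mergekit's CLI, e.g. `mergekit-yaml config.yaml ./merged-model`. If the repository is public, the resulting checkpoint should load like any Llama model with transformers; the snippet below is an illustrative sketch (the prompt and generation settings are arbitrary).

```python
# Illustrative usage: load the merged checkpoint with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mergekit-community/mergekit-dare_ties-ajgjgea"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

prompt = "Explain model merging in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```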