# Lotus-Magpic

Another merge that seems promising to me. My goal was to create a roleplay (RP) model with a natural conversational flow.

## Merge Details

### Merge Method

This model was merged using the linear merge method, as implemented in [mergekit](https://github.com/arcee-ai/mergekit).
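For intuition, here is a minimal Python sketch of what a linear merge does: every tensor in the output model is a weighted average of the corresponding tensors from the source models. This is an illustration only, not mergekit's actual implementation, which also handles details like sharded checkpoints and optional weight normalization.

```python
import torch

def linear_merge(state_a, state_b, weight_a=0.4, weight_b=0.6):
    """Per-tensor weighted average of two state dicts with matching keys and shapes."""
    merged = {}
    for name, a in state_a.items():
        b = state_b[name]
        # Blend the two source tensors and cast to fp16, per `dtype: float16` below.
        merged[name] = (weight_a * a + weight_b * b).to(torch.float16)
    return merged

# Toy usage with random tensors standing in for real model weights:
a = {"layer.weight": torch.randn(4, 4)}
b = {"layer.weight": torch.randn(4, 4)}
out = linear_merge(a, b)  # 0.4 * Magnum-Picaro + 0.6 * MN-Violet-Lotus
```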

### Models Merged

The following models were included in the merge:

- MN-Violet-Lotus-12B
- Magnum-Picaro-0.7-v2-12b

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Magnum-Picaro-0.7-v2-12b
    parameters:
      weight: 0.4
  - model: MN-Violet-Lotus-12B
    parameters:
      weight: 0.6
merge_method: linear
dtype: float16
```
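To reproduce the merge, save the YAML above (for example as `config.yaml`) and run mergekit's CLI, e.g. `mergekit-yaml config.yaml ./Lotus-Magpic`; the output path here is just a placeholder.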