Inferor

My first merge yay!

This was made thanks to infermatic.ai.

Recommended settings can be found in the Infermatic/MN 12B Inferor v0.0 article.

Thanks to everyone who is using it and providing feedback. ily - svak

Our Discord server is open for discussion of this model.

This is a merge of pre-trained language models created using mergekit.
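
If you just want to try the model, below is a minimal inference sketch using the standard Hugging Face transformers API. The prompt and generation parameters are illustrative placeholders, not the recommended settings from the article above.

# Minimal inference sketch; prompt and generation values are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Infermatic/MN-12B-Inferor-v0.0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Write a short scene set at dawn."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))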

Merge Details

Merge Method

This model was merged using the Model Stock merge method, with anthracite-org/magnum-v4-12b as the base.

Models Merged

The following models were included in the merge:

- Fizzarolli/MN-12b-Sunrose
- nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2
- nothingiisreal/MN-12B-Starcannon-v3

Configuration

The following YAML configuration was used to produce this model:

base_model: anthracite-org/magnum-v4-12b
dtype: bfloat16
merge_method: model_stock
slices:
- sources:
  - layer_range: [0, 40]
    model: Fizzarolli/MN-12b-Sunrose
  - layer_range: [0, 40]
    model: nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2
  - layer_range: [0, 40]
    model: nothingiisreal/MN-12B-Starcannon-v3
  - layer_range: [0, 40]
    model: anthracite-org/magnum-v4-12b
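
To reproduce the merge, save the configuration above as config.yaml, install mergekit (see the mergekit repository for install instructions), and run its mergekit-yaml command. A minimal sketch, invoked here via Python's subprocess; the config and output paths are placeholders.

# Runs mergekit's CLI on the config above; paths are placeholders.
import subprocess

subprocess.run(
    ["mergekit-yaml", "config.yaml", "./MN-12B-Inferor-v0.0"],
    check=True,
)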