magnum-v2-123b-fp8-dynamic

This is the sixth in a series of models designed to replicate the prose quality of the Claude 3 models, specifically Sonnet and Opus.
This model is fine-tuned on top of [Mistral-Large-Instruct-2407](https://huggingface.co/mistralai/Mistral-Large-Instruct-2407).

Converted to FP8 dynamic quantization by leafspark; original model: [anthracite-org/magnum-v2-123b](https://huggingface.co/anthracite-org/magnum-v2-123b).

Quantization was run on 6x L40 GPUs in a RunPod Ubuntu container and took about 30 minutes.
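The card does not describe the conversion beyond the hardware, but FP8-dynamic checkpoints of this kind are commonly produced with tools such as AutoFP8 or `llm-compressor`. A minimal recipe sketch, assuming `llm-compressor`'s YAML recipe format (the stage name and exact field values here are illustrative, not taken from this repo):

```yaml
# Illustrative llm-compressor recipe: FP8 weights with dynamic per-token
# FP8 activation scales; the output head is left in higher precision.
quant_stage:
  quant_modifiers:
    QuantizationModifier:
      targets: ["Linear"]      # quantize all Linear layers
      scheme: "FP8_DYNAMIC"    # FP8 (E4M3) weights + dynamic FP8 activations
      ignore: ["lm_head"]      # keep the output head unquantized
```

"Dynamic" here means activation scales are computed at inference time per token, so no calibration dataset is required; checkpoints produced this way can be served directly by vLLM on FP8-capable GPUs.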

Format: Safetensors · 123B params · tensor types BF16 and F8_E4M3
