cloudyu/Mixtral_7Bx5_MoE_30B_DPO
Text Generation · Transformers · Safetensors · mixtral · Mixture of Experts · text-generation-inference
License: MIT
This is a DPO-improved version of cloudyu/Mixtral_7Bx5_MoE_30B.
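A minimal text-generation sketch (not part of the original card), assuming the standard transformers pipeline API and that the BF16 checkpoint fits across the available devices:

```python
# Hypothetical usage sketch; prompt and generation settings are illustrative.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="cloudyu/Mixtral_7Bx5_MoE_30B_DPO",
    torch_dtype=torch.bfloat16,  # weights are stored in BF16
    device_map="auto",           # spread the 30B parameters across available GPUs
)

output = generator(
    "Explain what a Mixture of Experts model is.",
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
)
print(output[0]["generated_text"])
```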
Training was done with the DPO Trainer; benchmark metrics have not been tested yet.
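The card does not include the training script. The sketch below only illustrates a typical TRL DPO Trainer workflow; the dataset, hyperparameters, and exact TRL version used for this model are not documented and are assumptions here:

```python
# Illustrative DPO fine-tuning sketch (TRL); dataset name and hyperparameters
# are placeholders, not the settings actually used for this model.
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

base_id = "cloudyu/Mixtral_7Bx5_MoE_30B"  # the non-DPO base model
model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(base_id)

# A preference dataset with "prompt", "chosen", and "rejected" columns (placeholder).
train_dataset = load_dataset("trl-lib/ultrafeedback_binarized", split="train")

config = DPOConfig(
    output_dir="Mixtral_7Bx5_MoE_30B_DPO",
    beta=0.1,                      # strength of the DPO preference penalty
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    learning_rate=5e-7,
)

trainer = DPOTrainer(
    model=model,
    args=config,
    train_dataset=train_dataset,
    processing_class=tokenizer,    # older TRL releases take tokenizer= instead
)
trainer.train()
```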
Model size: 30B parameters · Tensor type: BF16 (Safetensors)