ZHLiu627 / zephyr-7b-gemma-rpo-avg
Safetensors · Dataset: argilla/dpo-mix-7k · gemma · arxiv: 2405.16436 · License: apache-2.0
zephyr-7b-gemma-rpo-avg / tokenizer.json (branch: main)
Commit History
Upload folder using huggingface_hub
16c0fcd · verified · ZHLiu627 · committed 17 days ago
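
The tokenizer.json file listed above can be fetched programmatically from the Hub. Below is a minimal sketch, assuming the huggingface_hub and transformers packages are installed; it downloads the single file from the ZHLiu627/zephyr-7b-gemma-rpo-avg repository and, alternatively, loads the full tokenizer directly.

# Minimal sketch: fetch tokenizer.json from ZHLiu627/zephyr-7b-gemma-rpo-avg
# and load the repo's tokenizer. Assumes `huggingface_hub` and `transformers`
# are installed; the repo id and branch ("main") are taken from this page.
from huggingface_hub import hf_hub_download
from transformers import AutoTokenizer

repo_id = "ZHLiu627/zephyr-7b-gemma-rpo-avg"

# Download only the tokenizer.json file shown on this page.
tokenizer_path = hf_hub_download(repo_id=repo_id, filename="tokenizer.json", revision="main")
print(f"tokenizer.json cached at: {tokenizer_path}")

# Or load the tokenizer directly from the Hub.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
print(tokenizer("Hello from zephyr-7b-gemma-rpo-avg"))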