Xiaodong/Next-DPO-iter2
Safetensors
Dataset: Xiaodong/DPO-iter2-data-8k
Commit History: Next-DPO-iter2/README.md (main branch)
Update README.md
c27162b (verified), Xiaodong committed on Oct 13, 2024
Update README.md
8a53b49 (verified), Xiaodong committed on Oct 13, 2024
Create README.md
a82d0ff (verified), Xiaodong committed on Oct 13, 2024