SalimBou5/sft_dpo_model
PEFT · Safetensors · arxiv:1910.09700
sft_dpo_model/README.md
Commit History
Update README.md
fffdb04 (verified) · SalimBou5 committed on Jun 3, 2024
Upload folder using huggingface_hub
2323254 (verified) · SalimBou5 committed on Jun 3, 2024
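
The second commit message indicates the repository contents were pushed with the huggingface_hub library. A minimal sketch of such an upload, assuming HfApi.upload_folder was used and with a hypothetical local folder path:

from huggingface_hub import HfApi

# Sketch only: the local folder path below is a hypothetical placeholder,
# not taken from the repository itself.
api = HfApi()
api.upload_folder(
    folder_path="./sft_dpo_model",          # hypothetical local folder holding the PEFT adapter files
    repo_id="SalimBou5/sft_dpo_model",      # target model repo on the Hub
    repo_type="model",
    commit_message="Upload folder using huggingface_hub",
)

Running this with a valid Hub token would produce a single commit containing every file in the folder, matching the "Upload folder using huggingface_hub" entry above.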