Nemo-DPO-v11 / mergekit_config.yml
models:
  - model: ./Nemo-DPO-v10
    parameters:
      weight: 0.5
  - model: ./Nemo-DPO-v6
    parameters:
      weight: 1.0
merge_method: linear
dtype: float16
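
For context, merge_method: linear takes a weighted average of every parameter tensor across the listed models; mergekit normalizes the weights by default, so this config effectively mixes Nemo-DPO-v10 at 0.5/1.5 and Nemo-DPO-v6 at 1.0/1.5. Below is a minimal Python sketch of that operation over two in-memory state dicts; linear_merge is an illustrative name, not mergekit's API, and the real tool additionally handles sharded checkpoints and tokenizer merging.

import torch

def linear_merge(state_dicts, weights, normalize=True):
    # Weighted average of each parameter tensor across the models.
    total = sum(weights) if normalize else 1.0
    merged = {}
    for name in state_dicts[0]:
        avg = sum(
            w * sd[name].to(torch.float32)  # accumulate in fp32 for stability
            for w, sd in zip(weights, state_dicts)
        ) / total
        merged[name] = avg.to(torch.float16)  # matches dtype: float16
    return merged

In practice this file is consumed directly by mergekit's CLI, e.g. mergekit-yaml mergekit_config.yml ./output-model-directory.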