# Mistral-CyberLawyer-7B / mergekit_config.yml
models:
  - model: shibiyaj/lawGPT-chat
    parameters:
      density: 0.5   # fraction of this model's delta weights to retain (TIES trimming)
      weight: 0.5    # relative weighting of this model's task vector in the merge
merge_method: ties
base_model: AdityaXPV/Mistral-7B-law-sage-v0.3
parameters:
  normalize: false   # do not rescale the merge weights to sum to 1
  int8_mask: true    # store intermediate sign masks in int8 to save memory
dtype: float16
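
# Reproducing the merge (a sketch, not part of the original config): with
# mergekit installed (pip install mergekit), its mergekit-yaml CLI takes this
# file plus an output directory; the directory name below is illustrative.
#
#   mergekit-yaml mergekit_config.yml ./Mistral-CyberLawyer-7B --cuda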