arcee-lite-Qwen2-2B / mergekit_config.yml
slices:
- sources:
  - layer_range: [0, 14]
    model: arcee-ai/arcee-lite
- sources:
  - layer_range: [7, 21]
    model: arcee-ai/arcee-lite
- sources:
  - layer_range: [14, 28]
    model: arcee-ai/arcee-lite
merge_method: passthrough
dtype: bfloat16
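
This is a passthrough self-merge (depth upscaling): three overlapping 14-layer slices of arcee-ai/arcee-lite (layers 0-14, 7-21, 14-28) are stacked into a single 42-layer model, kept in bfloat16. The sketch below shows one way such a config is typically applied with mergekit's Python entry points; the module paths, run_merge signature, and MergeOptions fields follow mergekit's README and may differ between versions, and the output directory is a placeholder.

# Sketch: applying this config via mergekit's Python API (names per the
# mergekit README; exact signatures can vary by version).
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

OUTPUT_PATH = "./arcee-lite-Qwen2-2B"  # placeholder output directory

with open("mergekit_config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU when one is present
        copy_tokenizer=True,             # carry the source tokenizer along
    ),
)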