---
base_model:
- PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
- lars1234/Mistral-Small-24B-Instruct-2501-writer
- mistralai/Mistral-Small-24B-Instruct-2501
- trashpanda-org/Llama3-24B-Mullein-v1
- unsloth/Mistral-Small-24B-Base-2501
- arcee-ai/Arcee-Blitz
- allura-org/Mistral-Small-24b-Sertraline-0304
library_name: transformers
tags:
- mergekit
- mergekitty
- merge
---
# v0a

This is a merge of pre-trained language models created using [mergekitty](https://github.com/allura-org/mergekitty).

## Merge Details

### Merge Method

This model was merged using the [SCE](https://arxiv.org/abs/2408.07990) merge method, with [unsloth/Mistral-Small-24B-Base-2501](https://huggingface.co/unsloth/Mistral-Small-24B-Base-2501) as the base model.

### Models Merged

The following models were included in the merge:

* [PocketDoc/Dans-PersonalityEngine-V1.2.0-24b](https://huggingface.co/PocketDoc/Dans-PersonalityEngine-V1.2.0-24b)
* [lars1234/Mistral-Small-24B-Instruct-2501-writer](https://huggingface.co/lars1234/Mistral-Small-24B-Instruct-2501-writer)
* [mistralai/Mistral-Small-24B-Instruct-2501](https://huggingface.co/mistralai/Mistral-Small-24B-Instruct-2501)
* [trashpanda-org/Llama3-24B-Mullein-v1](https://huggingface.co/trashpanda-org/Llama3-24B-Mullein-v1)
* [arcee-ai/Arcee-Blitz](https://huggingface.co/arcee-ai/Arcee-Blitz)
* [allura-org/Mistral-Small-24b-Sertraline-0304](https://huggingface.co/allura-org/Mistral-Small-24b-Sertraline-0304)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: unsloth/Mistral-Small-24B-Base-2501
merge_method: sce
dtype: float32
out_dtype: bfloat16
models:
  - model: allura-org/Mistral-Small-24b-Sertraline-0304
    parameters:
      select_topk: 0.50
  - model: lars1234/Mistral-Small-24B-Instruct-2501-writer
    parameters:
      select_topk: 0.20
  - model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
    parameters:
      select_topk: 0.20
  - model: trashpanda-org/Llama3-24B-Mullein-v1
    parameters:
      select_topk: 0.175
  - model: arcee-ai/Arcee-Blitz
    parameters:
      select_topk: 0.15
  - model: mistralai/Mistral-Small-24B-Instruct-2501
    parameters:
      select_topk: 0.15
```

The merge was run with the following commands:

```sh
apt install git nano -y
uv tool install mergekitty --with hf_transfer
uv tool install https://github.com/aphrodite-engine/aphrodite-engine/releases/download/v0.6.7/aphrodite_engine-0.6.7-cp38-abi3-manylinux1_x86_64.whl --with aphrodite-engine --with setuptools --with hf_transfer
uv tool install huggingface_hub
huggingface-cli login
nano merge.yml
mergekitty-yaml --cuda --lazy-unpickle merge.yml v0a
```
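
As a rough illustration of what `select_topk` controls, here is a toy NumPy sketch of SCE merging for a single weight tensor, following the Select/Calculate/Erase steps described in the paper linked above. This is a simplified assumption-laden sketch for intuition only, not mergekitty's actual implementation (which handles per-model `select_topk`, dtypes, and sharded tensors, among other details).

```python
import numpy as np

def sce_merge(base, deltas, topk=0.5):
    # Toy sketch of SCE merging for one weight tensor (NOT mergekitty's code).
    # base:   the base model's tensor
    # deltas: list of task vectors (finetuned - base), one per source model
    # topk:   fraction of highest-variance elements kept (cf. `select_topk`)
    D = np.stack(deltas)  # shape: (n_models, *tensor_shape)

    # Select: keep only the top-k fraction of elements by cross-model variance.
    var = D.var(axis=0)
    k = max(1, int(round(topk * var.size)))
    thresh = np.sort(var.ravel())[::-1][k - 1]
    D = D * (var >= thresh)

    # Erase: zero out elements whose sign disagrees with the majority sign.
    majority = np.sign(np.sign(D).sum(axis=0))
    D = np.where(np.sign(D) == majority, D, 0.0)

    # Calculate: weight each model by the squared magnitude of its surviving delta.
    sq = (D ** 2).reshape(len(deltas), -1).sum(axis=1)
    w = sq / sq.sum() if sq.sum() > 0 else np.full(len(deltas), 1 / len(deltas))

    # Merge: add the weighted sum of deltas back onto the base tensor.
    return base + np.tensordot(w, D, axes=1)
```

With a higher `select_topk` (as used for Sertraline-0304 above), more of a model's high-variance parameter deltas survive the selection step, giving it a larger footprint in the merged weights.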