---
license: apache-2.0
base_model:
- paulml/NeuralOmniWestBeaglake-7B
- paulml/OmniBeagleSquaredMBX-v3-7B
- yam-peleg/Experiment21-7B
- yam-peleg/Experiment26-7B
- Kukedlc/NeuralMaths-Experiment-7b
- Gille/StrangeMerges_16-7B-slerp
- vanillaOVO/correction_1
library_name: transformers
tags:
- mergekit
- merge
---

![image/png](bophades.png)

# bophades-mistral-7B

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [yam-peleg/Experiment26-7B](https://huggingface.co/yam-peleg/Experiment26-7B) as the base.

### Models Merged

The following models were included in the merge:

* [paulml/NeuralOmniWestBeaglake-7B](https://huggingface.co/paulml/NeuralOmniWestBeaglake-7B)
* [paulml/OmniBeagleSquaredMBX-v3-7B](https://huggingface.co/paulml/OmniBeagleSquaredMBX-v3-7B)
* [yam-peleg/Experiment21-7B](https://huggingface.co/yam-peleg/Experiment21-7B)
* [Kukedlc/NeuralMaths-Experiment-7b](https://huggingface.co/Kukedlc/NeuralMaths-Experiment-7b)
* [Gille/StrangeMerges_16-7B-slerp](https://huggingface.co/Gille/StrangeMerges_16-7B-slerp)
* [vanillaOVO/correction_1](https://huggingface.co/vanillaOVO/correction_1)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: paulml/OmniBeagleSquaredMBX-v3-7B
    parameters:
      density: 0.5
      weight: 0.5
  - model: paulml/NeuralOmniWestBeaglake-7B
    parameters:
      density: 0.5
      weight: 0.5
  - model: Gille/StrangeMerges_16-7B-slerp
    parameters:
      density: 0.5
      weight: 0.5
  - model: yam-peleg/Experiment21-7B
    parameters:
      density: 0.5
      weight: 0.5
  - model: vanillaOVO/correction_1
    parameters:
      density: 0.5
      weight: 0.5
  - model: Kukedlc/NeuralMaths-Experiment-7b
    parameters:
      density: 0.5
      weight: 0.5
merge_method: dare_ties
base_model: yam-peleg/Experiment26-7B
parameters:
  normalize: true
dtype: bfloat16
```
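To reproduce the merge, this configuration can be saved to a file and passed to mergekit's command-line entry point, e.g. `mergekit-yaml config.yaml ./bophades-mistral-7B`.

For intuition about what `dare_ties` does with the `density` and `weight` values above, here is a minimal per-tensor sketch in PyTorch. It illustrates the published DARE step (random drop-and-rescale of task vectors) and TIES step (per-parameter sign election); it is a simplified assumption for illustration, not mergekit's actual implementation, and the handling of `normalize: true` in particular is a guess at the behavior.

```python
import torch

def dare_ties_tensor(base, finetuned, weights, density=0.5, normalize=True):
    """Toy per-tensor DARE-TIES sketch -- an illustration, not mergekit's code."""
    deltas = []
    for ft, w in zip(finetuned, weights):
        delta = ft - base                          # task vector relative to the base model
        keep = torch.rand_like(delta) < density    # DARE: randomly keep `density` of the entries
        deltas.append(w * keep * delta / density)  # rescale survivors to preserve the expectation
    stacked = torch.stack(deltas)
    sign = torch.sign(stacked.sum(dim=0))          # TIES: elect a majority sign per parameter
    agree = torch.sign(stacked) == sign            # drop deltas that fight the elected sign
    merged = (stacked * agree).sum(dim=0)
    if normalize:
        # assumed reading of `normalize: true`: rescale by the weight mass that contributed
        w_mass = torch.tensor(weights).view(-1, *([1] * base.dim()))
        merged = merged / (agree * w_mass).sum(dim=0).clamp(min=1e-8)
    return base + merged
```

Applied independently to every weight tensor of the six listed models against the Experiment26-7B base, this is the rough shape of the operation the configuration describes.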
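Since the card declares `library_name: transformers`, the merged model loads like any other Mistral-7B checkpoint. A minimal sketch, assuming the weights are published to the Hub; the repo id below is a placeholder, not the actual location:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/bophades-mistral-7B"  # placeholder: substitute the real repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the `dtype: bfloat16` used for the merge
    device_map="auto",
)

prompt = "The derivative of x^2 with respect to x is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```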