---
base_model:
- Sao10K/L3-8B-Lunaris-v1
- kromeurus/L3.1-Clouded-Uchtave-v0.1-8B
- kromeurus/L3.1-Aglow-Vulca-v0.1-8B
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [DARE TIES](https://arxiv.org/abs/2311.03099) merge method, with [Sao10K/L3-8B-Lunaris-v1](https://huggingface.co/Sao10K/L3-8B-Lunaris-v1) as the base model.

### Models Merged

The following models were included in the merge:
* [kromeurus/L3.1-Clouded-Uchtave-v0.1-8B](https://huggingface.co/kromeurus/L3.1-Clouded-Uchtave-v0.1-8B)
* [kromeurus/L3.1-Aglow-Vulca-v0.1-8B](https://huggingface.co/kromeurus/L3.1-Aglow-Vulca-v0.1-8B)

### Configuration

The following YAML configuration was used to produce this model. The list-valued `density` and `weight` parameters are mergekit gradients, interpolated across the model's layers, so each donor model contributes more strongly in some layer ranges than in others:

```yaml
models:
  - model: Sao10K/L3-8B-Lunaris-v1
  - model: kromeurus/L3.1-Clouded-Uchtave-v0.1-8B
    parameters:
      density: [0.35, 0.45, 0.5, 0.55, 0.65, 0.55, 0.5, 0.45, 0.35]
      weight: [0.495, 0.165, 0.165, 0.495, 0.495, 0.165, 0.165, 0.495]
  - model: kromeurus/L3.1-Aglow-Vulca-v0.1-8B
    parameters:
      density: [0.65, 0.55, 0.5, 0.45, 0.35, 0.45, 0.5, 0.55, 0.65]
      weight: [0.165, 0.495, 0.495, 0.165, 0.165, 0.495, 0.495, 0.165]
merge_method: dare_ties
base_model: Sao10K/L3-8B-Lunaris-v1
parameters:
  normalize: false
  int8_mask: true
dtype: float32
out_dtype: bfloat16
```
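To reproduce the merge, the configuration above can be saved to a file and passed to mergekit's CLI (e.g. `mergekit-yaml config.yaml ./merged-model`). Below is a minimal sketch for loading the resulting weights with 🤗 Transformers; the path `./merged-model` is a placeholder for whatever output directory the merge was written to, and `device_map="auto"` assumes the `accelerate` package is installed.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder path: point this at the directory mergekit wrote the merge to,
# or at a Hugging Face repo id if the merged model has been uploaded.
model_path = "./merged-model"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches out_dtype in the merge config
    device_map="auto",           # requires the `accelerate` package
)

prompt = "Write a short scene set on a cloudy mountainside."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```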