---
base_model:
- Elfrino/XwinXtended-20B
- Undi95/PsyMedRP-v1-20B
- Elfrino/PsyMedLewdPass
- Undi95/MXLewd-L2-20B
library_name: transformers
tags:
- mergekit
- merge
---
# merge

Trying a new merge technique (SCE). Early results are interesting.

![BubbleBot3](https://huggingface.co/Elfrino/RhizomaticReverie-20B-Q5_K_M-GGUF/resolve/main/BubbleBot3.png)

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [SCE](https://arxiv.org/abs/2408.07990) merge method, with [Undi95/PsyMedRP-v1-20B](https://huggingface.co/Undi95/PsyMedRP-v1-20B) as the base model.

### Models Merged

The following models were included in the merge:

* [Elfrino/XwinXtended-20B](https://huggingface.co/Elfrino/XwinXtended-20B)
* [Elfrino/PsyMedLewdPass](https://huggingface.co/Elfrino/PsyMedLewdPass)
* [Undi95/MXLewd-L2-20B](https://huggingface.co/Undi95/MXLewd-L2-20B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: sce
models:
- model: Undi95/MXLewd-L2-20B
- model: Elfrino/PsyMedLewdPass
- model: Elfrino/XwinXtended-20B
base_model: Undi95/PsyMedRP-v1-20B
tokenizer:
  source: base
parameters:
  select_topk: 0.8
dtype: float32
out_dtype: bfloat16
normalize: true
```
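For intuition, SCE merges task vectors (each model's delta from the base) in roughly three steps: select the elements with the highest variance across the task vectors (`select_topk: 0.8` above keeps the top 80%), calculate per-tensor merging weights from the retained deltas, and erase elements whose sign disagrees with the majority direction. A config like the one above is typically run with `mergekit-yaml config.yaml ./output`. The sketch below is a toy, single-tensor illustration of that idea, not mergekit's implementation; the function name `sce_merge_tensor` and the exact weighting scheme are assumptions.

```python
# Toy sketch of the SCE (Select, Calculate, Erase) idea on one weight
# tensor. Hypothetical illustration only, NOT mergekit's actual code.
import torch

def sce_merge_tensor(base: torch.Tensor, models: list[torch.Tensor],
                     select_topk: float = 0.8) -> torch.Tensor:
    # Task vectors: each fine-tuned model's delta from the base weights.
    deltas = torch.stack([m - base for m in models])  # shape (n, ...)

    # Select: keep only the fraction of elements with the highest
    # variance across the task vectors (cf. `select_topk: 0.8`).
    var = deltas.var(dim=0)
    k = max(1, int(select_topk * var.numel()))
    thresh = var.flatten().topk(k).values.min()
    deltas = deltas * (var >= thresh).to(deltas.dtype)

    # Calculate: weight each model by the energy (sum of squares) of
    # its selected delta for this tensor (assumed weighting scheme).
    energy = deltas.pow(2).sum(dim=tuple(range(1, deltas.dim())))
    weights = energy / energy.sum().clamp_min(1e-12)

    # Erase: drop elements whose sign disagrees with the majority
    # direction, then take the weighted sum of the surviving deltas.
    majority = deltas.sum(dim=0).sign()
    agree = (deltas.sign() == majority).to(deltas.dtype)
    shape = (-1,) + (1,) * (deltas.dim() - 1)
    merged_delta = (weights.view(shape) * deltas * agree).sum(dim=0)
    return base + merged_delta
```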
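The merged weights are written in bfloat16 (`out_dtype` above) and should load through the standard transformers API. A minimal sketch follows; the repo id `Elfrino/RhizomaticReverie-20B` is an assumption based on this card's asset URL, so adjust it to wherever the merged weights are actually published.

```python
# Minimal sketch of loading and sampling from the merged model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "Elfrino/RhizomaticReverie-20B"  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,  # matches the merge's out_dtype
    device_map="auto",           # requires the `accelerate` package
)

prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```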