---
base_model:
- Fizzarolli/clite-500m
- h2oai/h2o-danube3-500m-base
library_name: transformers
tags:
- mergekit
- mergekitty
- merge
---
# tmpzvufsy8v

This is a merge of pre-trained language models created using [mergekitty](https://github.com/allura-org/mergekitty).

## Merge Details
### Merge Method

This model was merged using the [Model Breadcrumbs](https://arxiv.org/abs/2312.06795) merge method, with [h2oai/h2o-danube3-500m-base](https://huggingface.co/h2oai/h2o-danube3-500m-base) as the base model.

### Models Merged

The following models were included in the merge:

* [Fizzarolli/clite-500m](https://huggingface.co/Fizzarolli/clite-500m)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: h2oai/h2o-danube3-500m-base
merge_method: breadcrumbs
parameters:
  density: 0.95
  gamma: 0.01
slices:
- sources:
  - layer_range: [0, 16]
    model: Fizzarolli/clite-500m
    parameters:
      weight: 1.0
  - layer_range: [0, 16]
    model: h2oai/h2o-danube3-500m-base
```
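
## Usage

The merged checkpoint can be loaded like any other `transformers` causal language model. The snippet below is a minimal sketch; the repository id is a placeholder (this card was generated under a temporary name), so substitute whatever id the merge is actually published under.

```python
# Minimal sketch of loading the merged model with transformers.
# NOTE: the repo id below is a placeholder, not the real published name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-namespace/tmpzvufsy8v"  # placeholder repo id (assumption)

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.bfloat16)

prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the merge is applied layer-by-layer on top of the danube3 base, the result keeps the base model's architecture and tokenizer, so no custom loading code is required.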