---
tags:
- merge
- mergekit
- lazymergekit
- mistral
- ResplendentAI/Datura_7B
- Epiculous/Mika-7B
base_model:
- ResplendentAI/Datura_7B
- Epiculous/Mika-7B
language:
- en
library_name: transformers
license: other
---
# <img src="https://cdn-icons-png.flaticon.com/512/1531/1531037.png" alt="favicon" style="display: inline-block; vertical-align: middle; width: 20px; height: 20px; margin-right: 10px;"> Foxglove_7B
Foxglove_7B is a merge of the following models, created with [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [ResplendentAI/Datura_7B](https://huggingface.co/ResplendentAI/Datura_7B)
* [Epiculous/Mika-7B](https://huggingface.co/Epiculous/Mika-7B)
## Configuration
- **Slices:**
  - **Sources:**
    - Model: ResplendentAI/Datura_7B
    - Model: Epiculous/Mika-7B
- **Merge method:** SLERP
- **Base model:** ResplendentAI/Datura_7B
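In mergekit's own YAML format, the configuration above corresponds roughly to the following. This is a minimal sketch reconstructed from the list above: the `layer_range` entries and the SLERP interpolation parameter `t` used for the actual merge are not recorded in this card, so they are omitted here.

```yaml
slices:
  - sources:
      - model: ResplendentAI/Datura_7B
      - model: Epiculous/Mika-7B
merge_method: slerp
base_model: ResplendentAI/Datura_7B
```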
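For intuition about the merge method: SLERP (spherical linear interpolation) blends two weight tensors along the arc between them rather than along the straight line, which preserves the interpolated vector's norm when the endpoints have equal norms. A small NumPy sketch of the idea (this is an illustration of the general SLERP formula, not mergekit's internal implementation):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between weight vectors v0 and v1 at fraction t."""
    # Work with unit vectors to find the angle between the endpoints.
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    # Standard SLERP: sin-weighted combination of the original (unnormalized) vectors.
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1
```

At `t = 0` this returns the first model's weights, at `t = 1` the second's, and intermediate values trace the arc between them; mergekit applies the same interpolation tensor by tensor across the two checkpoints.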