---
base_model:
- tiiuae/Falcon3-10B-Base
library_name: transformers
tags:
- mergekit
- merge
---

# Eyas 17B

## Overview

Eyas 17B is a frankenmerge of [tiiuae/Falcon3-10B-Base](https://huggingface.co/tiiuae/Falcon3-10B-Base). Built with the [mergekit](https://github.com/cg123/mergekit) library, it stacks overlapping slices of the base model's layers (see the configuration below) to grow the 10B base to roughly 17B parameters, and it is intended for general natural language processing tasks.
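
Because the merge leaves the base architecture unchanged, the merged model loads like any other Falcon causal language model. A minimal usage sketch follows; the repository id is a placeholder, since this card does not state where the merged weights are published:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/Eyas-17B"  # placeholder: substitute the actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the merge's dtype
    device_map="auto",
)

inputs = tokenizer("The peregrine falcon is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```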

## Merge Details

### Merge Method

This model was created with the **passthrough** merge method. Passthrough copies the selected layer ranges from the source model unchanged and concatenates them into a deeper network, so the result keeps the base architecture and remains compatible with the Hugging Face `transformers` library. Here, seven overlapping 10-layer windows (stride 5) over the base model's 40 layers produce a 70-layer model, which is how the 10B base grows to roughly 17B parameters; the arithmetic is sketched below.
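
As a quick check on that claim, here is a small self-contained sketch (the slice boundaries are copied from the configuration later in this card):

```python
# Layer ranges from the merge configuration; each [start, end) slice is
# copied verbatim by the passthrough method, so the merged depth is the
# sum of the slice widths.
slices = [(0, 10), (5, 15), (10, 20), (15, 25), (20, 30), (25, 35), (30, 40)]

merged_layers = sum(end - start for start, end in slices)
print(merged_layers)  # 70 layers, vs. 40 in Falcon3-10B-Base
```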

### Models Merged

The following model was included in the merge (Eyas 17B is a self-merge, so a single source model appears multiple times in the slice list):

* [tiiuae/Falcon3-10B-Base](https://huggingface.co/tiiuae/Falcon3-10B-Base)

### Configuration

The following YAML configuration was used to produce Eyas 17B:

```yaml
slices:
- sources:
  - layer_range: [0, 10]
    model: tiiuae/Falcon3-10B-Base
- sources:
  - layer_range: [5, 15]
    model: tiiuae/Falcon3-10B-Base
- sources:
  - layer_range: [10, 20]
    model: tiiuae/Falcon3-10B-Base
- sources:
  - layer_range: [15, 25]
    model: tiiuae/Falcon3-10B-Base
- sources:
  - layer_range: [20, 30]
    model: tiiuae/Falcon3-10B-Base
- sources:
  - layer_range: [25, 35]
    model: tiiuae/Falcon3-10B-Base
- sources:
  - layer_range: [30, 40]
    model: tiiuae/Falcon3-10B-Base
merge_method: passthrough
dtype: float16
```
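
To reproduce the merge, save the configuration above to a file and hand it to mergekit. The sketch below uses mergekit's Python entry point as shown in the project's examples; treat the exact `MergeOptions` fields as assumptions and check them against your installed version. The `mergekit-yaml` command-line tool accepts the same file.

```python
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration from this card (saved as eyas17b.yaml).
with open("eyas17b.yaml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

# Run the passthrough merge and write the merged model to ./Eyas-17B.
run_merge(
    merge_config,
    out_path="./Eyas-17B",
    options=MergeOptions(copy_tokenizer=True, lazy_unpickle=True),
)
```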