tags:
- merge
---

# Eyas 17B

## Overview

Eyas 17B is a frankenmerge based on the Falcon3-10B architecture. Built using the [mergekit](https://github.com/cg123/mergekit) library, Eyas 17B is optimized for a range of natural language processing tasks.

## Merge Details

### Merge Method

This model was created using the **passthrough merge method**, which stitches together selected layer slices from the source model without retraining, producing a larger model that remains fully compatible with the Hugging Face `transformers` library.
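
Because the merge stays `transformers`-compatible, the model loads like any other causal language model. A minimal sketch (the repo id below is a placeholder, not the published name):

```python
# Minimal loading sketch; "your-org/Eyas-17B" is a placeholder repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/Eyas-17B"  # placeholder: substitute the actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # the merge was produced in float16
    device_map="auto",          # requires the `accelerate` package
)

inputs = tokenizer("The falcon is a bird of prey that", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```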

### Models Merged

The following models were included in the merge:

* [tiiuae/Falcon3-10B-Base](https://huggingface.co/tiiuae/Falcon3-10B-Base)
|
28 |
### Configuration
|
29 |
|
30 |
+
The following YAML configuration was used to produce Eyas 17B:
|
31 |
|
32 |
```yaml
|
33 |
slices:
|
|
|
54 |
model: tiiuae/Falcon3-10B-Base
|
55 |
merge_method: passthrough
|
56 |
dtype: float16
|
|
|
|
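
To reproduce the merge, this configuration can be fed to mergekit. The sketch below uses mergekit's Python entry point as documented in its README; the config and output paths are placeholders, and the `mergekit-yaml` CLI performs the same job.

```python
# Sketch of reproducing the merge with mergekit's Python API; paths are placeholders.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("eyas-17b.yaml", "r", encoding="utf-8") as fp:  # the YAML shown above
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Eyas-17B",        # placeholder output directory
    options=MergeOptions(
        cuda=False,               # set True to run the merge on GPU
        copy_tokenizer=True,      # copy the base model's tokenizer into the output
        lazy_unpickle=True,       # lower peak memory when loading checkpoint shards
    ),
)
```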