Update README.md
README.md CHANGED
@@ -10,7 +10,7 @@ tags:
 ---
 # merge
 
-
+The merits of multi-stage arcee_fusion merges are clearly shown in [sometimesanotion/Lamarck-14B-v0.7-Fusion](https://huggingface.co/sometimesanotion/Lamarck-14B-v0.7-Fusion), which has a valuable uptick in GPQA over its predecessors. Will its gains be maintained with a modified version of the SLERP recipe from [suayptalha/Lamarckvergence-14B](https://huggingface.co/suayptalha/Lamarckvergence-14B)? Clearly, self-attention and perceptrons can unlock a lot of power in this kind of merge.
 
 ## Merge Details
 ### Merge Method
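
For context on what such a recipe looks like, here is a minimal mergekit SLERP sketch in the spirit of the paragraph above. The model pairing, layer range, and `t` curves are illustrative assumptions, not necessarily the configuration documented in this card.

```yaml
# Illustrative sketch only: the model pairing, layer range, and t-curves
# below are placeholders, not the recipe actually used for this merge.
merge_method: slerp
base_model: sometimesanotion/Lamarck-14B-v0.7-Fusion
slices:
  - sources:
      - model: sometimesanotion/Lamarck-14B-v0.7-Fusion
        layer_range: [0, 48]
      - model: suayptalha/Lamarckvergence-14B
        layer_range: [0, 48]
parameters:
  t:
    - filter: self_attn            # interpolation curve for attention weights
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp                  # a different curve for perceptron (MLP) weights
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5                   # default for all remaining tensors
dtype: bfloat16
```

Giving `self_attn` and `mlp` separate `t` curves is what lets a SLERP recipe weight the attention and perceptron layers differently across the depth of the model.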