Update README.md
README.md CHANGED
@@ -23,11 +23,11 @@ New base model, this one actually expects you to use Llama 3 Instruct format. Th
 This is evolution 1. Yes, I know it makes no sense. I explain this in my rant down below. I'm going to list the recipe for the model now, but know the reality is more complex than just this:
 
 
-Stock for the "True Merge"
+Stock for the "True Merge" -- This was a TIES merge; the reasoning for using TIES over Model Stock this time is explained below, although Model Stock was also used.
 - PKU-Baichuan-MLSystemLab/Llama3-PBM-Nova-70B
 - yentinglin/Llama-3-Taiwan-70B-Instruct
 - Sao10K/L3.3-70B-Euryale-v2.3
-- (Custom Base Model-Stock Soup)
+- (Custom Base Model-Stock Soup -- Recipe Below)
 
 
 # Why a different approach?
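For reference, the "True Merge" step listed in the diff above is the kind of thing mergekit expresses as a YAML config. The sketch below is only illustrative: the weights, densities, and the `./custom-model-stock-soup` base-model path are assumed placeholders, not the actual recipe (the real base is the custom Model Stock soup whose recipe is given below).

```yaml
# Illustrative mergekit config for a TIES merge over the three stock models.
# Weights, densities, and the base_model path are placeholders, not the real recipe.
merge_method: ties
base_model: ./custom-model-stock-soup   # placeholder for the custom Model Stock soup
models:
  - model: PKU-Baichuan-MLSystemLab/Llama3-PBM-Nova-70B
    parameters:
      weight: 0.33
      density: 0.5
  - model: yentinglin/Llama-3-Taiwan-70B-Instruct
    parameters:
      weight: 0.33
      density: 0.5
  - model: Sao10K/L3.3-70B-Euryale-v2.3
    parameters:
      weight: 0.33
      density: 0.5
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
```

A config like this would be run with `mergekit-yaml config.yaml ./output-model-directory`.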