Update README.md
README.md
@@ -13,18 +13,12 @@ here we go again. multi-step merge, various models involved at various ratios wi
 
 this thing came to me in a fever dream when I was hung over, but after slightly tweaking the recipe it turned out surprisingly decent. use it with the settings included.
 
-## Quantized versions:
-- GGUF iMat: [Quant-Cartel/0x01-8x7b-iMat-GGUF](https://huggingface.co/Quant-Cartel/0x01-8x7b-iMat-GGUF)
-- exl2 rpcal: [Quant-Cartel/0x01-8x7b-exl2-rpcal](https://huggingface.co/Quant-Cartel/0x01-8x7b-exl2-rpcal)
-
 ## Update:
 The following settings have proved to work well too:
 - Context: https://files.catbox.moe/q91rca.json
 - Instruct: https://files.catbox.moe/2w8ja2.json
 - Textgen: https://files.catbox.moe/s25rad.json
 
-
-
 ## Constituent parts
 ```yaml
 # primordial_slop_a:
@@ -48,34 +42,6 @@
 - model: Envoid/Mixtral-Instruct-ITR-DADA-8x7B
 ```
 
-
-
-
-
-## Merge Details
-### Merge Method
-
-This model was merged using the SLERP merge method.
-
-### Models Merged
-
-The following models were included in the merge:
-* ./primordial_slop_d
-* ./primordial_slop_c
-
-### Configuration
-
-The following YAML configuration was used to produce this model:
-
-```yaml
-models:
-  - model: ./primordial_slop_c
-  - model: ./primordial_slop_d
-merge_method: slerp
-base_model: ./primordial_slop_c
-parameters:
-  t:
-    - value: 0.33
-dtype: float16
-
-```
+## Quantized versions:
+- GGUF iMat: [Quant-Cartel/0x01-8x7b-iMat-GGUF](https://huggingface.co/Quant-Cartel/0x01-8x7b-iMat-GGUF)
+- exl2 rpcal: [Quant-Cartel/0x01-8x7b-exl2-rpcal](https://huggingface.co/Quant-Cartel/0x01-8x7b-exl2-rpcal)
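A note on the section removed above: it still documents the final step of the recipe, a SLERP merge of ./primordial_slop_c and ./primordial_slop_d with primordial_slop_c as the base at t=0.33 (a mergekit-style config; saved to a file, it would typically be run through mergekit's mergekit-yaml command). For intuition, here is a minimal sketch of the spherical interpolation SLERP performs per tensor; it is an illustration of the method, not mergekit's actual code:

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherically interpolate between two flattened weight tensors."""
    v0_u = v0 / (np.linalg.norm(v0) + eps)  # unit direction of v0
    v1_u = v1 / (np.linalg.norm(v1) + eps)  # unit direction of v1
    dot = float(np.clip(np.dot(v0_u, v1_u), -1.0, 1.0))
    if abs(dot) > 0.9995:
        # nearly parallel tensors: fall back to plain linear interpolation
        return (1.0 - t) * v0 + t * v1
    omega = np.arccos(dot)  # angle between the two weight directions
    so = np.sin(omega)
    # weights trace the great-circle arc from v0 (t=0) to v1 (t=1)
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# t=0.33, as in the removed config, keeps the merge closer to the base model
merged = slerp(0.33, np.random.randn(16), np.random.randn(16))
```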
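And a quick usage note for the quantized versions linked above: the GGUF quants load with llama.cpp-compatible runtimes. A minimal sketch using llama-cpp-python, assuming one of the quant files from Quant-Cartel/0x01-8x7b-iMat-GGUF has been downloaded locally (the filename below is hypothetical):

```python
from llama_cpp import Llama

llm = Llama(
    model_path="./0x01-8x7b.IQ4_XS.gguf",  # hypothetical local quant filename
    n_ctx=8192,                            # context window; size to your RAM/VRAM
    n_gpu_layers=-1,                       # offload all layers if built with GPU support
)

out = llm("Hello there,", max_tokens=48)
print(out["choices"][0]["text"])
```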