Update README.md
README.md
@@ -4,8 +4,6 @@ library_name: transformers
 ---
 # Laser-Dolphin-Mixtral-4x7b-dpo
 
-*New version is coming because of chat template issues. The other MoE models in my collection do not have this issue and have been tested more*
-
 
 
 Credit to Fernando Fernandes and Eric Hartford for their project [laserRMT](https://github.com/cognitivecomputations/laserRMT)
@@ -83,29 +81,11 @@ It recursively sorts the left and right sub-arrays and concatenates the results
 
 Q4_K_M and Q5_K_M quants are available [here](https://huggingface.co/macadeliccc/laser-dolphin-mixtral-4x7b-dpo-GGUF)
 
-
+
 ## Eval
 
+**New evaluations in progress**
 
 ## Citations
 