Update README.md
README.md CHANGED

@@ -59,7 +59,7 @@ The six LLaMa models trained in (1) and (2) are merged into mixtral blocks using
 Read the paper for further details.
 
 ### Sources
-[1] https://tdcommons.ai/single_pred_tasks/overview
+[1] https://tdcommons.ai/single_pred_tasks/overview <br>
 [2] https://github.com/arcee-ai/mergekit
 
 <!--
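For context on the hunk header above — it references merging six trained LLaMA models into Mixtral-style blocks with mergekit [2]. A merge of that shape is driven by a `mergekit-moe` YAML config; the sketch below is illustrative only, with placeholder model paths and prompts (not this repository's actual settings):

```yaml
# Hypothetical mergekit-moe config: combine fine-tuned LLaMA experts
# into a Mixtral-style sparse MoE model. All paths/prompts are placeholders.
base_model: path/to/base-llama        # donor for shared (non-expert) weights
gate_mode: hidden                     # route via hidden-state similarity to prompts
dtype: bfloat16
experts:
  - source_model: path/to/expert-1    # e.g. a model fine-tuned on one TDC task
    positive_prompts:
      - "example prompt for expert 1's domain"
  - source_model: path/to/expert-2
    positive_prompts:
      - "example prompt for expert 2's domain"
  # ...remaining experts follow the same pattern
```

Such a config would typically be run with mergekit's `mergekit-moe` command, pointing it at the YAML file and an output directory.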