sydonayrex committed commit ec06719 (verified; parent: 21b1a70)

Update README.md

README.md CHANGED
@@ -14,6 +14,9 @@ pipeline_tag: text-generation
 library_name: transformers
 ---
 <img src="llama-blackjack.jpeg" width="512" height="512">
+
+The provided model is a multi-layer folded model that reuses layers of the Llama 3 8B Instruct base to grow it to 21B parameters using mergekit. Rather than a plain passthrough merge, task arithmetic was used, and further fine-tuning was performed to rebaseline the model's weights and inference.
+
 # Uploaded model
 
 - **Developed by:** sydonayrex
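
As a rough illustration only, layer folding in mergekit is typically expressed as overlapping `layer_range` slices. The sketch below uses the passthrough method for clarity; the model card states that task arithmetic, not plain passthrough, was actually used, and the layer ranges here are arbitrary assumptions, not the author's recipe.

```yaml
# Hypothetical mergekit config sketch: duplicate overlapping layer ranges
# of the base model to produce a deeper ("folded") model.
# NOT the actual configuration used for this 21B model.
slices:
  - sources:
      - model: meta-llama/Meta-Llama-3-8B-Instruct
        layer_range: [0, 24]      # assumed range, for illustration
  - sources:
      - model: meta-llama/Meta-Llama-3-8B-Instruct
        layer_range: [8, 32]      # overlaps the first slice, folding layers
merge_method: passthrough          # author reports using task arithmetic instead
dtype: bfloat16
```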