---
base_model: sydonayrex/AI-Llama3-21B
language:
  - en
license: llama3
tags:
  - text-generation-inference
  - transformers
  - unsloth
  - llama
  - trl
  - sft
pipeline_tag: text-generation
library_name: transformers
---

This is a multi-layered folded model: layers from the Llama 3 8B Instruct base were duplicated with mergekit to grow the model to 21B parameters. Rather than a plain passthrough merge, task arithmetic was used. The merged model was then further fine-tuned to rebaseline its weights and inference behavior.
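A layer-folding merge of this kind can be sketched with a mergekit configuration along the following lines. Note this is only an illustrative sketch: the actual layer ranges, weights, and method parameters used for this model are not published in this card.

```yaml
# Illustrative sketch only -- not the actual recipe used for this model.
# Overlapping layer ranges duplicate middle layers of the 8B base to
# grow the depth; task_arithmetic merges rather than simply passing through.
merge_method: task_arithmetic
base_model: meta-llama/Meta-Llama-3-8B-Instruct
slices:
  - sources:
      - model: meta-llama/Meta-Llama-3-8B-Instruct
        layer_range: [0, 24]
  - sources:
      - model: meta-llama/Meta-Llama-3-8B-Instruct
        layer_range: [8, 32]
dtype: bfloat16
```

A config like this would be run with `mergekit-yaml config.yml ./output-model`, after which the merged checkpoint is fine-tuned as described above.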

- q3_k_s GGUF: https://huggingface.co/sydonayrex/Blackjack-Llama3-21B-Q3_K_S-GGUF
- q4_k_m GGUF: https://huggingface.co/sydonayrex/Blackjack-Llama3-21B-Q4_K_M-GGUF

## Uploaded model

- **Developed by:** sydonayrex
- **License:** Llama 3
- **Finetuned from model:** sydonayrex/AI-Llama3-21B

This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.

Llama image generated by Meta AI.