Update README.md
README.md (CHANGED)
@@ -13,7 +13,10 @@ language:
 
 - **Developed by:** Tuan Pham (FPTU HCM Student)
 - **Model type:** Llama2-7B Decoder-only
-- **Finetuned from model :**
+- **Finetuned from model :**
+  * meta-llama/Llama-2-7b
+  * bkai-foundation-models/vietnamese-llama2-7b-120GB
+  * yeen214/llama2_7b_merge_orcafamily.
 - **Bilingual support :** English and Vietnamese
 
 ### Model Sources

@@ -49,7 +52,7 @@ Use the code below to get started with the model.
 from torch.cuda.amp import autocast
 from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer, pipeline
 
-model_name = "1TuanPham/
+model_name = "1TuanPham/T-Llama-v1.1"
 model = AutoModelForCausalLM.from_pretrained(model_name,
                                              torch_dtype=torch.bfloat16,
                                              use_cache=True,