Updated release history and overview
README.md
CHANGED
@@ -25,7 +25,7 @@ Krutrim Large Language Model (LLM) is a 2 trillion token multilingual foundation
 - 7B parameter dense transformer model comparable similarly sized LLama-2 model;
 - Natively multilingual delivering best-in-class performance for a 7B mdoel on Indic benchmarks;
 - Exceeds performance of similar sized models on multilingual Indic generation tasks including creative writing, summarization, and translation;
-- Available in
+- Available in instruction-tuned version

 ## Model Developer
 - OLA Krutrim Team
@@ -37,7 +37,8 @@ Krutrim Large Language Model (LLM) is a 2 trillion token multilingual foundation

 | Model Name | Release Date |Release Note | Reference|
 |------------|-------------|-------------|-------------|
-| Krutrim-1-
+| Krutrim-1-Base | 2024-01-31 | Trained from scratch | |
+| Krutrim-1-Instruct | 2024-01-31 | SFT on Krutrim-1 Base |[Here](https://huggingface.co/krutrim-ai-labs/Krutrim-1-instruct)|


 ## Data Freshness
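The release table added by this commit points at the Krutrim-1-Instruct checkpoint on the Hugging Face Hub. A minimal loading sketch with the `transformers` library follows; the repo id comes from the table, while the helper names and generation settings are illustrative assumptions, not part of the release:

```python
# Illustrative sketch only: loading the Krutrim-1-Instruct checkpoint with
# Hugging Face `transformers`. The repo id is taken from the release table;
# helper names and generation parameters below are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "krutrim-ai-labs/Krutrim-1-instruct"  # from the release table


def load_krutrim(model_id: str = MODEL_ID):
    """Fetch (or reuse a local cache of) the tokenizer and model weights."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    return tokenizer, model


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Greedy-decode a short completion for `prompt` (hypothetical helper)."""
    tokenizer, model = load_krutrim()
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Since the instruct model is SFT on Krutrim-1-Base, prompts phrased as instructions (rather than raw continuation text) would be the expected usage.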