Update README.md
README.md CHANGED

@@ -299,11 +299,19 @@ print(results)
 
 ### Results
 
-#### Open LLM Leaderboard
-
+#### [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
+Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_lamhieu__ghost-7b-v0.9.0)
+
+| Metric                          |Value|
+|---------------------------------|----:|
+|Avg.                             |56.89|
+|AI2 Reasoning Challenge (25-Shot)|53.07|
+|HellaSwag (10-Shot)              |77.93|
+|MMLU (5-Shot)                    |55.09|
+|TruthfulQA (0-shot)              |47.79|
+|Winogrande (5-shot)              |73.72|
+|GSM8k (5-shot)                   |33.74|
 
-
 
 #### VMLU
 
@@ -403,16 +411,3 @@ Many thanks for
 ## Model Card Contact
 
 **Lam H** ([email protected])
-
-# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
-Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_lamhieu__ghost-7b-v0.9.0)
-
-| Metric                          |Value|
-|---------------------------------|----:|
-|Avg.                             |56.89|
-|AI2 Reasoning Challenge (25-Shot)|53.07|
-|HellaSwag (10-Shot)              |77.93|
-|MMLU (5-Shot)                    |55.09|
-|TruthfulQA (0-shot)              |47.79|
-|Winogrande (5-shot)              |73.72|
-|GSM8k (5-shot)                   |33.74|
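The Avg. row in the leaderboard table is the unweighted arithmetic mean of the six benchmark scores. A minimal sketch checking that (the `scores` dict and its labels are illustrative, not part of the model card):

```python
# Open LLM Leaderboard "Avg." = arithmetic mean of the six benchmark scores.
# Values copied from the results table above; labels are illustrative.
scores = {
    "AI2 Reasoning Challenge (25-shot)": 53.07,
    "HellaSwag (10-shot)": 77.93,
    "MMLU (5-shot)": 55.09,
    "TruthfulQA (0-shot)": 47.79,
    "Winogrande (5-shot)": 73.72,
    "GSM8k (5-shot)": 33.74,
}
avg = round(sum(scores.values()) / len(scores), 2)
print(avg)  # 56.89, matching the Avg. row
```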