---
license: agpl-3.0
---
LLaMA-13B merged with Instruct-13B weights; it just works.

Prompt format:
```
user instruction here
optional additional user input
generated output
```
Example prompt:
```
Does this tweet have negative or positive sentiment?
i hate my life!!!!
negative
```
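For reference, here is a minimal sketch of using this prompt format with the Hugging Face `transformers` library. The model id `llama-anon/instruct-13b` is inferred from the leaderboard results link below, and the generation settings are illustrative assumptions, not recommended values:

```python
# Minimal sketch, assuming the model id from the results link below
# and a GPU-capable setup with transformers + accelerate installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "llama-anon/instruct-13b"  # assumed; check the repo name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Prompt format: instruction on the first line, optional input on the next.
prompt = (
    "Does this tweet have negative or positive sentiment?\n"
    "i hate my life!!!!\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=32)

# Drop the prompt tokens so only the generated continuation is printed.
generated = output[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(generated, skip_special_tokens=True))
```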
Feel free to donate:
XMR: `86Z8nLSVPx3SZ5z7iWugeK5JruAeGPUJyExD9e3wdTSxUvFMhGXNG9ucPqCm8M29y1AxP6ta56GBQ4GiEUMzeew9MfX1yct`
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_llama-anon__instruct-13b).
| Metric               | Value |
|----------------------|-------|
| Avg.                 | 47.83 |
| ARC (25-shot)        | 56.14 |
| HellaSwag (10-shot)  | 80.27 |
| MMLU (5-shot)        | 47.89 |
| TruthfulQA (0-shot)  | 36.97 |
| Winogrande (5-shot)  | 73.56 |
| GSM8K (5-shot)       | 2.27  |
| DROP (3-shot)        | 37.7  |