# Model Card
## Model Description
This is a Large Language Model (LLM) trained on the DIBT_10k_prompts dataset. It is a test model, published to verify that the model-release workflow works.
## Evaluation Results
### Hellaswag
| Tasks |Version|Filter|n-shot| Metric | |Value | |Stderr|
|---------|------:|------|-----:|--------|---|-----:|---|-----:|
|hellaswag| 1|none | 0|acc |↑ |0.2872|± |0.0045|
| | |none | 0|acc_norm|↑ |0.3082|± |0.0046|
## How to Use
To use this model, download the checkpoint and load it with your preferred deep-learning framework.
That said, this model is not recommended for practical use; prefer EleutherAI/pythia-160m instead.
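As a minimal sketch of the loading step with the Hugging Face `transformers` library: the repo id of this test model is not given in the card, so the example loads the recommended EleutherAI/pythia-160m instead.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the recommended substitute model and its tokenizer from the Hub.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/pythia-160m")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/pythia-160m")

# Tokenize a prompt and generate a short continuation.
inputs = tokenizer("Hello, world", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

The same two `from_pretrained` calls work for any causal LM on the Hub; only the repo id changes.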