---
license: apache-2.0
datasets:
- Skylion007/openwebtext
language:
- en
pipeline_tag: text-generation
---
A pretrained language model based on the Mistral 7B architecture, shrunk to about 248 million parameters. The model required minimal training to converge: 250,000 examples over 125,000 steps.
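The card does not list the shrunken architecture's hyperparameters. As an illustration of how a Mistral-style decoder lands near 248M parameters, the sketch below counts parameters for one hypothetical configuration (all dimensions are assumptions for this example, not the model's actual config):

```python
def mistral_style_param_count(vocab, d_model, n_layers, n_heads, n_kv_heads, d_ff):
    """Rough parameter count for a Mistral-style decoder
    (grouped-query attention, SwiGLU MLP, untied LM head)."""
    head_dim = d_model // n_heads
    # Attention: q and o projections are d_model x d_model;
    # k and v are smaller under grouped-query attention.
    attn = 2 * d_model * d_model + 2 * d_model * (n_kv_heads * head_dim)
    # SwiGLU MLP: gate, up, and down projections.
    mlp = 3 * d_model * d_ff
    # Two RMSNorm weight vectors per layer.
    norms = 2 * d_model
    per_layer = attn + mlp + norms
    # Input embeddings plus an untied output head, plus the final norm.
    return 2 * vocab * d_model + n_layers * per_layer + d_model

# Hypothetical shrunken config: Mistral's 32k vocab, but with
# d_model=768, 23 layers, 12 heads (4 KV heads), and d_ff=3072.
total = mistral_style_param_count(
    vocab=32000, d_model=768, n_layers=23,
    n_heads=12, n_kv_heads=4, d_ff=3072,
)
print(total)  # ~248M parameters
```

This is only a back-of-the-envelope check; the actual model may tie its embeddings, use different head counts, or differ in other details.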