---
license: apache-2.0
datasets:
- HuggingFaceTB/smollm-corpus
language:
- en
base_model:
- mittagessen/bytellama_random
---

This is a ByteLlama 101M model pretrained for 2 epochs on the Cosmopedia v2 portion of the SmolLM corpus.
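
A minimal usage sketch, assuming the checkpoint loads through the standard `transformers` causal-LM classes and ships with its own (byte-level) tokenizer; the repo id below is a placeholder, not confirmed by this card:

```python
# Minimal sketch: load the model and generate text with the standard
# transformers causal-LM API. Repo id is a placeholder -- substitute
# the actual model id for this checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "mittagessen/bytellama"  # hypothetical id; replace as needed

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Encode a prompt and sample a short continuation.
inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```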