Pre-training 11M-parameter models on 29k, 5M, 10M, 20M, and 205M rows of the Dolma dataset using pico-lm
Thomas Nguyen
ThomasTheMaker
AI & ML interests: Building the world's fastest CPU LLM inference layer
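The five Dolma row counts named in the title can be drawn by streaming the corpus and taking a fixed number of rows from the front of the stream. Below is a minimal sketch of that subsampling logic; the stand-in generator replaces a real streamed split (in practice something like `datasets.load_dataset("allenai/dolma", split="train", streaming=True)` — the Hub ID and exact loading call are assumptions, not confirmed by this page).

```python
from itertools import islice

# Subset sizes from the title: rows of Dolma used for pre-training.
SUBSET_SIZES = [29_000, 5_000_000, 10_000_000, 20_000_000, 205_000_000]

def take_rows(stream, n):
    """Take the first n rows from a (possibly infinite) streaming iterator."""
    return islice(stream, n)

# Stand-in for a streamed Dolma split; a real run would iterate the
# Hub dataset instead of this synthetic generator (an assumption for
# illustration only).
def fake_stream():
    i = 0
    while True:
        yield {"text": f"document {i}"}
        i += 1

# Draw a tiny subset to show the pattern; the real sizes above would
# be passed in the same way.
subset = list(take_rows(fake_stream(), 5))
print(len(subset))  # 5
```

The same `take_rows` call scales to each entry of `SUBSET_SIZES` without materializing the full corpus, which is the usual reason to stream rather than download Dolma outright.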