Pre-training 11M parameter models on 29k, 5M, 10M, 20M, & 205M rows of the Dolma dataset using pico-lm
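As a rough sanity check on the "11M parameter" figure, here is a minimal sketch of how a decoder-only transformer's parameter count can be estimated from its config. The config values below (vocab size, width, depth, feed-forward size) are illustrative assumptions, not the actual pico-lm settings:

```python
# Rough parameter count for a decoder-only transformer with tied
# input/output embeddings. All config numbers are assumptions for
# illustration, not the real pico-lm configuration.
def param_count(vocab_size: int, d_model: int, n_layers: int, d_ff: int) -> int:
    embed = vocab_size * d_model               # token embeddings (tied with output head)
    per_layer = 4 * d_model * d_model          # attention projections (Q, K, V, O)
    per_layer += 2 * d_model * d_ff            # feed-forward up/down projections
    per_layer += 2 * d_model                   # two layer norms per block (scale params)
    return embed + n_layers * per_layer + d_model  # plus final layer norm

# A hypothetical config that lands near the 11M range:
total = param_count(vocab_size=32000, d_model=256, n_layers=4, d_ff=1024)
print(f"{total:,} parameters")  # roughly 11.3M with these assumed values
```

Note that at this scale the embedding table dominates: with a 32k vocabulary and width 256, embeddings alone account for over 8M of the total, which is why small models often tie input and output embeddings.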
Thomas Nguyen
ThomasTheMaker
AI & ML interests
Building the world's fastest CPU LLM inference layer