SmoLLMv2 🐢
Text generation using the smollmv2-135M model
A GPT model pre-training step on the Shakespeare dataset
Training and inference on the MNIST dataset
Text Tokenization using Byte-Pair Encoding (BPE)
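To illustrate the last item, here is a minimal sketch of the core BPE training loop: repeatedly count adjacent symbol pairs and merge the most frequent one into a new token. This is an illustrative stand-alone example, not the tokenizer used by the app; the function names are invented for this sketch.

```python
from collections import Counter

def get_pair_counts(tokens):
    """Count adjacent symbol pairs in the token sequence."""
    return Counter(zip(tokens, tokens[1:]))

def merge_pair(tokens, pair, new_token):
    """Replace every occurrence of `pair` with `new_token`, left to right."""
    out, i = [], 0
    while i < len(tokens):
        if i < len(tokens) - 1 and (tokens[i], tokens[i + 1]) == pair:
            out.append(new_token)
            i += 2
        else:
            out.append(tokens[i])
            i += 1
    return out

def bpe_train(text, num_merges):
    """Learn up to `num_merges` BPE merges, starting from characters."""
    tokens = list(text)
    merges = []
    for _ in range(num_merges):
        counts = get_pair_counts(tokens)
        if not counts:
            break
        # Most frequent pair becomes a new vocabulary entry.
        pair = max(counts, key=counts.get)
        new_token = pair[0] + pair[1]
        merges.append(pair)
        tokens = merge_pair(tokens, pair, new_token)
    return tokens, merges
```

For example, two merges over the string `"aaabdaaabac"` first fuse `('a', 'a')` into `aa`, then `('aa', 'a')` into `aaa`. A production tokenizer additionally stores the learned merge list so new text can be encoded with the same rules.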