SmolLM2 135M Text Generator
SmolLM2 implementation trained on the Cosmopedia-10k dataset
GPT-2 model trained on Shakespeare's play Julius Caesar
BPE-based tokenizer for Devanagari-script texts
ResNet-50 model trained on ImageNet-1k data
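The BPE tokenizer project above does not show its implementation here; as a rough illustration of the idea, a minimal byte-pair-encoding merge loop operating on Unicode codepoints (which works for Devanagari characters as well as Latin ones) might look like the following sketch. All function names are hypothetical, not taken from the repo:

```python
from collections import Counter

def bpe_train(text, num_merges):
    """Toy BPE trainer: start from individual codepoints and
    repeatedly merge the most frequent adjacent pair."""
    tokens = list(text)  # Devanagari codepoints are handled like any others
    merges = []
    for _ in range(num_merges):
        pairs = Counter(zip(tokens, tokens[1:]))
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # most frequent adjacent pair
        merges.append(best)
        merged, i = [], 0
        while i < len(tokens):
            # Replace every occurrence of the best pair with its concatenation
            if i < len(tokens) - 1 and (tokens[i], tokens[i + 1]) == best:
                merged.append(tokens[i] + tokens[i + 1])
                i += 2
            else:
                merged.append(tokens[i])
                i += 1
        tokens = merged
    return tokens, merges
```

A real Devanagari tokenizer would also need to decide how to treat combining vowel signs and conjuncts, which this codepoint-level sketch glosses over.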