Granite 3.1 Language Models Collection A series of language models with 128K context length, trained by IBM and licensed under the Apache 2.0 license. • 9 items • Updated 17 days ago • 59
D_AU- MOE/Mixture of Experts Models (see also "source" coll) Collection Mixture of Experts models by me. These leverage multiple models simultaneously during generation for next-level performance (see the routing sketch below). • 25 items • Updated 3 days ago • 6
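To make the "multiple models at the same time" idea concrete, here is a minimal sketch of standard token-level Mixture-of-Experts routing, assuming PyTorch. It is illustrative only: `SimpleMoE`, `n_experts`, and `top_k` are hypothetical names, not taken from this collection, and actual MoE merges may combine whole fine-tuned models as experts rather than per-layer feed-forward blocks.

```python
# A minimal Mixture-of-Experts sketch: a gate scores experts per token,
# the top-k experts run, and their outputs are mixed by the gate weights.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMoE(nn.Module):
    def __init__(self, d_model: int, n_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # One small feed-forward network per expert.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, 4 * d_model),
                nn.GELU(),
                nn.Linear(4 * d_model, d_model),
            )
            for _ in range(n_experts)
        )
        # Gating network: one score per expert for every token.
        self.gate = nn.Linear(d_model, n_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model)
        scores = self.gate(x)                            # (B, S, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)   # keep the k best experts
        weights = F.softmax(weights, dim=-1)             # renormalize over kept experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., k] == e                  # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out

moe = SimpleMoE(d_model=64)
tokens = torch.randn(2, 10, 64)
print(moe(tokens).shape)  # torch.Size([2, 10, 64])
```

With `top_k=2`, each token pays the compute cost of only two experts while the model's total parameter count scales with all of them, which is the appeal of MoE over a single dense model.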
MobileLLM Collection Optimizing Sub-billion Parameter Language Models for On-Device Use Cases (ICML 2024) https://arxiv.org/abs/2402.14905 • 9 items • Updated Nov 27, 2024 • 111
LLM in a flash: Efficient Large Language Model Inference with Limited Memory Paper • 2312.11514 • Published Dec 12, 2023 • 259