TinyLlama-1.1B-Chat-v1.0-llamafile
llamafile lets you distribute and run LLMs with a single file. See the announcement blog post for details.
Downloads
- tinyllama-1.1b-chat-v1.0.Q3_K_M-server.llamafile
- tinyllama-1.1b-chat-v1.0.Q4_K_M-server.llamafile
- tinyllama-1.1b-chat-v1.0.Q5_0-server.llamafile
- tinyllama-1.1b-chat-v1.0.Q5_K_M-server.llamafile
- tinyllama-1.1b-chat-v1.0.Q8_0-server.llamafile
This repository was created using the llamafile-builder.
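Each download above is a self-contained executable. A minimal usage sketch, assuming the Q4_K_M build has been downloaded into the current directory (on Windows, rename the file to add a `.exe` extension before running):

```shell
# Mark the downloaded llamafile as executable (Linux/macOS/BSD).
chmod +x tinyllama-1.1b-chat-v1.0.Q4_K_M-server.llamafile

# Run it. The -server builds start a local llama.cpp web server
# with a chat UI, by default at http://localhost:8080
./tinyllama-1.1b-chat-v1.0.Q4_K_M-server.llamafile
```

The same steps apply to any of the other quantization levels; smaller quants (Q3_K_M) trade answer quality for a smaller download and lower memory use, while Q8_0 is closest to the original weights.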
Model tree for rabil/TinyLlama-1.1B-Chat-v1.0-llamafile
- Base model: TinyLlama/TinyLlama-1.1B-Chat-v1.0
- Quantized from: TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF