
Unofficial Mistral Community
AI & ML interests
Unofficial org for community uploads of Mistral's open-source models.
Recent Activity
mistral-community's activity
Post
2367
Nice new space to see how fast your personal or organization followers are growing on HF:
julien-c/follow-history
As you can see, I still have more followers than @julien-c, even though he's trying to change this by building such cool Spaces!

mrfakename posted an update 6 days ago
Post
2308
GGUF quants (text-only) for the new Mistral Small 3.1 24B are now live:
mrfakename/mistral-small-3.1-24b-instruct-2503-gguf

mrfakename posted an update 7 days ago
Post
2173
Converted the new Mistral Small 3.1 models to HF format (currently text-only, no vision):
Instruct: mrfakename/mistral-small-3.1-24b-instruct-2503-hf
Base: mrfakename/mistral-small-3.1-24b-base-2503-hf
GGUF quants coming soon!
Cannot apply chat template from tokenizer (#31, opened 11 days ago by DarkLight1337)
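The discussion title above points at a common failure mode with converted checkpoints: the tokenizer ships without a `chat_template` entry, so `tokenizer.apply_chat_template` raises. A minimal workaround, assuming the standard Mistral instruct format (`[INST] ... [/INST]` wrapped in BOS/EOS), could look like this; the function name and template are illustrative, not the repo's actual fix, so check the model card for the exact template before relying on it:

```python
# Fallback prompt builder for tokenizers missing a chat template.
# Assumes the Mistral instruct convention: <s>[INST] ... [/INST] ... </s>
# (illustrative only; verify against the model card).

def apply_mistral_template(messages, bos="<s>", eos="</s>"):
    """Render a list of {"role", "content"} dicts into a Mistral-style prompt."""
    prompt = bos
    for msg in messages:
        if msg["role"] == "user":
            prompt += f"[INST] {msg['content']} [/INST]"
        elif msg["role"] == "assistant":
            prompt += f" {msg['content']}{eos}"
        else:
            raise ValueError(f"unsupported role: {msg['role']}")
    return prompt

messages = [
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi there."},
    {"role": "user", "content": "What is GGUF?"},
]
print(apply_mistral_template(messages))
# → <s>[INST] Hello! [/INST] Hi there.</s>[INST] What is GGUF? [/INST]
```

The rendered string can then be passed to a plain `tokenizer(...)` call; once the repo gains a proper `chat_template`, `apply_chat_template` should be preferred over any manual formatting.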
Post
7225
I was chatting with @peakji, one of the cofounders of Manus AI, who told me he was on Hugging Face (very cool!).
He shared an interesting insight which is that agentic capabilities might be more of an alignment problem rather than a foundational capability issue. Similar to the difference between GPT-3 and InstructGPT, some open-source foundation models are simply trained to 'answer everything in one response regardless of the complexity of the question' - after all, that's the user preference in chatbot use cases. Just a bit of post-training on agentic trajectories can make an immediate and dramatic difference.
As a thank you to the community, he shared 100 invite codes, first come, first served: just use "HUGGINGFACE" to get access!
Post
4669
10,000+ models based on DeepSeek R1 have been publicly shared on Hugging Face! Which ones are your favorites: https://huggingface.co/models?sort=trending&search=r1. Truly a game changer!

JustinLin610 authored a paper 18 days ago
prefill assistant responses (#16, opened 20 days ago by alexsafayan)

Post
5899
Super happy to welcome Nvidia as our latest enterprise hub customer. They have almost 2,000 team members using Hugging Face, and close to 20,000 followers of their org. Can't wait to see what they'll open-source for all of us in the coming months!
Nvidia's org: https://huggingface.co/nvidia
Enterprise hub: https://huggingface.co/enterprise

JustinLin610 authored a paper 24 days ago

JustinLin610 authored a paper about 1 month ago
Post
2829
What are the best organizations to follow on @huggingface?
Off the top of my head:
- Deepseek (35,000 followers): https://huggingface.co/deepseek-ai
- Meta Llama (27,000 followers): https://huggingface.co/meta-llama
- Black Forest Labs (11,000 followers): https://huggingface.co/black-forest-labs
- OpenAI (5,000 followers): https://huggingface.co/openai
- Nvidia (16,000 followers): https://huggingface.co/nvidia
- Microsoft (9,000 followers): https://huggingface.co/microsoft
- AllenAI (2,000 followers): https://huggingface.co/allenai
- Mistral (5,000 followers): https://huggingface.co/mistralai
- XAI (600 followers): https://huggingface.co/xai-org
- Stability AI (16,000 followers): https://huggingface.co/stabilityai
- Qwen (16,000 followers): https://huggingface.co/Qwen
- GoogleAI (8,000 followers): https://huggingface.co/google
- Unsloth (3,000 followers): https://huggingface.co/unsloth
- Bria AI (4,000 followers): https://huggingface.co/briaai
- NousResearch (1,300 followers): https://huggingface.co/NousResearch
Bonus, the agent course org with 17,000 followers: https://huggingface.co/agents-course
Post
3488
We crossed 1B+ tokens routed to inference provider partners on HF, a feature we released just a few days ago.
Just getting started, of course, but early users seem to like it, and we're always happy to partner with cool startups in the ecosystem.
Have you been using any integration and how can we make it better?
https://huggingface.co/blog/inference-providers
Support for Flash Attention 2? (#29, opened about 1 month ago by LuciusLan)
Fastest way for inference? (#28, opened about 2 months ago by psycy)

reach-vb authored a paper about 2 months ago
Getting shape mismatch while loading saved Pixtral model (#24, opened about 2 months ago by ss007)