
TempestTeam
AI & ML interests
Our goal is to train the largest possible State Space Model (SSM) while minimizing infrastructure requirements. This approach reduces both economic and environmental impact without significantly compromising the model's linguistic performance.
BrestStormTeam
Mission:
We aim to train large-scale State Space Models (SSMs) efficiently while significantly reducing infrastructure usage, minimizing economic and environmental impact without substantially compromising linguistic performance.
Model:
Tempest-LLM – an efficient language model based on Mamba2, leveraging low-bit weight compression to achieve an encoding efficiency of 1.58 bits per parameter (the information content of ternary weights: log₂ 3 ≈ 1.58).
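The 1.58 bits-per-parameter figure corresponds to ternary weights {-1, 0, +1}. The card does not describe the exact quantization scheme, so the sketch below is illustrative only: it shows a common absmean-style ternarization (the `ternary_quantize` helper and its scaling choice are assumptions, not the team's actual method).

```python
import numpy as np

def ternary_quantize(w, eps=1e-8):
    # Illustrative absmean ternarization (assumed, not Tempest-LLM's actual scheme):
    # scale each weight by the mean absolute value, then round and clip to {-1, 0, +1}.
    scale = np.abs(w).mean() + eps
    q = np.clip(np.round(w / scale), -1, 1)
    return q, scale

# Each quantized weight carries log2(3) ≈ 1.58 bits of information.
w = np.array([0.4, -1.2, 0.05, 0.9])
q, s = ternary_quantize(w)
```

Dequantization is then simply `q * s`, so inference only needs additions and sign flips per ternary weight plus one multiply per tensor.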
Training Approach:
Our model benefits from a balanced multilingual training strategy, targeting comparable proficiency in:
- 🇫🇷 French
- 🇬🇧 English
- 🇪🇸 Spanish
This multilingual training enhances linguistic versatility and cultural adaptability across different languages and contexts.
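One common way to balance languages of unequal corpus size is temperature-based sampling, where the probability of drawing from a language scales with its corpus size raised to an exponent α. The card does not state how balancing is done, so this is a hedged sketch; the corpus sizes and the α value are made up for illustration.

```python
def language_sampling_probs(token_counts, alpha=0.3):
    # Temperature-based language rebalancing (assumed technique, not the
    # team's documented method): alpha=1 samples proportionally to corpus
    # size, while alpha -> 0 approaches uniform sampling across languages.
    weights = {lang: n ** alpha for lang, n in token_counts.items()}
    total = sum(weights.values())
    return {lang: w / total for lang, w in weights.items()}

# Hypothetical corpus sizes in tokens (for illustration only).
probs = language_sampling_probs({"fr": 80e9, "en": 500e9, "es": 60e9}, alpha=0.3)
```

Lower α boosts the under-resourced languages (French, Spanish) relative to English, which is one way to pursue the "comparable proficiency" goal above.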
Impact:
- Economic: Reduced computational infrastructure leads to lower operational costs.
- Ecological: Lower power consumption and minimal infrastructure requirements reduce the environmental footprint.
- Performance: Maintains robust linguistic accuracy and fluency despite compression and optimization.
Vision:
BrestStormTeam is committed to showing that linguistic AI technologies can be both powerful and sustainable, contributing responsibly to AI innovation.