Cyrile committed · Commit 98e4a69 · verified · 1 Parent(s): 4688c85

Update README.md

Files changed (1)
  1. README.md +7 -9
README.md CHANGED
@@ -10,10 +10,10 @@ pinned: false
 **BrestStormTeam**
 
 **Mission:**
-To push the boundaries of efficient AI model training by developing the largest-scale State Space Model (SSM) with the smallest infrastructure footprint possible. We are committed to minimizing both economic costs and ecological impact without compromising linguistic performance.
+We aim to efficiently train large-scale State Space Models (SSM) while significantly reducing infrastructure usage. Our goal is to minimize economic and environmental impacts without substantially compromising linguistic performance.
 
 **Model:**
-**Tempest-LLM** – a state-of-the-art model based on **Mamba2**, utilizing advanced compression techniques achieving an encoding efficiency of **1.58 bits per parameter**.
+**Tempest-LLM** – an efficient language model based on **Mamba2**, leveraging advanced compression methods to achieve an encoding efficiency of **1.58 bits per parameter**.
 
 **Training Approach:**
 Our model benefits from a balanced multilingual training strategy, ensuring equal proficiency in:
@@ -21,14 +21,12 @@ Our model benefits from a balanced multilingual training strategy, ensuring equal proficiency in:
 - 🇬🇧 **English**
 - 🇪🇸 **Spanish**
 
-This equidistributed multilingual training enhances linguistic versatility and cultural adaptability across different languages and contexts.
+This multilingual training enhances linguistic versatility and cultural adaptability across different languages and contexts.
 
 **Impact:**
-- **Economic:** Reduced computational infrastructure leads to significantly lower operational costs.
-- **Ecological:** Lower power consumption and infrastructure needs substantially decrease environmental footprint.
-- **Performance:** Maintains high linguistic accuracy and fluency despite compression and optimization.
+- **Economic:** Reduced computational infrastructure leads to lower operational costs.
+- **Ecological:** Lower power consumption and minimal infrastructure requirements decrease environmental footprint.
+- **Performance:** Maintains robust linguistic accuracy and fluency despite compression and optimization.
 
 **Vision:**
-BrestStormTeam aims to demonstrate that cutting-edge linguistic AI technologies can be powerful, accessible, and sustainable, setting a new standard in responsible AI innovation.
-
-
+BrestStormTeam is committed to showing that linguistic AI technologies can be both powerful and sustainable, contributing responsibly to AI innovation.
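As context for the "1.58 bits per parameter" figure in the README: that number matches ternary weight quantization, where each parameter takes a value in {-1, 0, +1} and thus carries log2(3) ≈ 1.585 bits of information (the scheme popularized by BitNet b1.58). The commit does not specify Tempest-LLM's actual compression method, so the sketch below is purely illustrative, and `quantize_ternary` is a hypothetical helper, not part of the project's code.

```python
import math
import random

# Illustrative only: ternary ("1.58-bit") weight quantization.
# Each weight maps to {-1, 0, +1}, so the information content per
# parameter is log2(3) ~= 1.585 bits -- the figure quoted above.
BITS_PER_TERNARY_WEIGHT = math.log2(3)  # ~1.585

def quantize_ternary(weights, scale=None):
    """Round-to-nearest ternary quantization with absmean scaling
    (the scaling rule used by BitNet b1.58)."""
    if scale is None:
        # absmean scale; guard against an all-zero weight list
        scale = sum(abs(w) for w in weights) / len(weights) or 1.0
    # Scale, round to nearest integer, then clamp into {-1, 0, +1}
    return [max(-1, min(1, round(w / scale))) for w in weights], scale

random.seed(0)
w = [random.gauss(0.0, 1.0) for _ in range(8)]
q, s = quantize_ternary(w)
assert set(q) <= {-1, 0, 1}
print(round(BITS_PER_TERNARY_WEIGHT, 3))  # 1.585
```

The full-precision scale factor `s` is kept alongside the ternary values so activations can be rescaled at inference time; storage cost per weight stays near log2(3) bits regardless.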