Update README.md
README.md CHANGED

@@ -18,7 +18,10 @@ library_name: transformers
 
 # Model Summary
 
-
+> MolmoE-1B is a multimodal Mixture-of-Experts LLM with 1.5B active and 7.2B total parameters released in September 2024 (0924) based on [OLMoE-1B-7B-0924](https://huggingface.co/allenai/OLMoE-1B-7B-0924). It yields state-of-the-art performance among multimodal models with a similar size while being fully open-source.
+
+- **Paper:** WIP
+- **Code:** WIP
 
 # Use
 
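For the "# Use" section the diff leaves empty, a minimal loading sketch with `transformers` might look like the block below. The repository id (`allenai/MolmoE-1B-0924`) and the remote-code entry points (`processor.process`, `model.generate_from_batch`) are assumptions not shown in this diff; the model card's actual usage instructions may differ.

```python
# Minimal sketch: loading a MolmoE-1B-style multimodal checkpoint with transformers.
# The repo id and the remote-code methods below are assumptions, not taken from this diff.
import requests
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor, GenerationConfig

repo_id = "allenai/MolmoE-1B-0924"  # assumed repository id

# Molmo-style checkpoints ship custom modeling code, hence trust_remote_code=True.
processor = AutoProcessor.from_pretrained(
    repo_id, trust_remote_code=True, torch_dtype="auto", device_map="auto"
)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, trust_remote_code=True, torch_dtype="auto", device_map="auto"
)

# Prepare one image + prompt pair (processor.process is part of the assumed remote code).
image = Image.open(requests.get("https://picsum.photos/id/237/536/354", stream=True).raw)
inputs = processor.process(images=[image], text="Describe this image.")
inputs = {k: v.to(model.device).unsqueeze(0) for k, v in inputs.items()}

# Generate, then decode only the newly produced tokens.
output = model.generate_from_batch(
    inputs,
    GenerationConfig(max_new_tokens=200, stop_strings="<|endoftext|>"),
    tokenizer=processor.tokenizer,
)
new_tokens = output[0, inputs["input_ids"].size(1):]
print(processor.tokenizer.decode(new_tokens, skip_special_tokens=True))
```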