Update README.md
README.md CHANGED
@@ -117,7 +117,7 @@ language:
 nomic-embed-text-v2-moe is a SoTA multilingual MoE text embedding model:
 
 - **High Performance**: SoTA multilingual performance compared to ~300M parameter models, competitive with models 2x its size
-- **Multilinguality**: Supports 100
+- **Multilinguality**: Supports ~100 languages and trained on over 1.6B pairs
 - **Flexible Embedding Dimension**: Trained with [Matryoshka Embeddings](https://arxiv.org/abs/2205.13147) for a 3x reduction in storage cost with minimal performance degradation
 - **Fully Open Source**: Model weights, [code](https://github.com/nomic-ai/contrastors), and training data (see code repo) released
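The Matryoshka bullet in this hunk implies a concrete usage pattern: the model is trained so a truncated prefix of the full 768-dim vector remains useful, and cutting to 256 dims is where the ~3x storage saving comes from. Below is a minimal sketch, assuming the sentence-transformers loading path commonly shown for Nomic models (`trust_remote_code=True`, plus `truncate_dim`, available in sentence-transformers >= 2.7) and the `search_document: ` prefix convention from earlier nomic-embed releases; the sample text is hypothetical, so check the model card for the exact prompts.

```python
from sentence_transformers import SentenceTransformer

# Hypothetical input; the "search_document: " prefix is an assumption
# carried over from earlier nomic-embed releases.
docs = ["search_document: nomic-embed-text-v2-moe supports ~100 languages."]

# Full 768-dim embeddings vs. a Matryoshka-truncated 256-dim variant.
full = SentenceTransformer(
    "nomic-ai/nomic-embed-text-v2-moe", trust_remote_code=True
)
small = SentenceTransformer(
    "nomic-ai/nomic-embed-text-v2-moe", trust_remote_code=True, truncate_dim=256
)

print(full.encode(docs).shape)   # (1, 768) -> full storage cost
print(small.encode(docs).shape)  # (1, 256) -> roughly 3x smaller per vector
```

Per the bullet, retrieval quality at 256 dims should stay close to the full 768, which is exactly the trade-off Matryoshka training is meant to enable.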