Update README.md
README.md CHANGED
@@ -123,7 +123,7 @@ PowerMoE-3B is a 3B sparse Mixture-of-Experts (sMoE) language model trained with
 Paper: https://arxiv.org/abs/2408.13359
 
 ## Usage
-Note:
+Note: Requires installing HF transformers from source.
 
 ### Generation
 This is a simple example of how to use **PowerMoE-3b** model.
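The generation example itself is not shown in this diff. As a minimal sketch of what the usage section likely describes, the snippet below loads the model with HF transformers and generates a completion; the Hub repo id `ibm/PowerMoE-3b` and the prompt are illustrative assumptions, not taken from the README.

```python
# Sketch only: repo id and prompt are assumptions, not from the README.
# Per the note above, transformers must be installed from source, e.g.:
#   pip install git+https://github.com/huggingface/transformers.git
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "ibm/PowerMoE-3b"  # assumed Hub repo id
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path).to(device)
model.eval()

prompt = "Write a function that returns the maximum value in a list of numbers."
inputs = tokenizer(prompt, return_tensors="pt").to(device)

# Greedy decoding with a bounded number of new tokens.
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=100)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The install-from-source note suggests the model's architecture had not yet shipped in a released transformers version at the time of this commit; once it has, a regular `pip install transformers` should suffice.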