Update model card: Correct pipeline tag and add library name
#1 by nielsr (HF staff) - opened

README.md CHANGED
@@ -1,15 +1,16 @@
 ---
 license: apache-2.0
-pipeline_tag:
+pipeline_tag: text-generation
+library_name: transformers
 ---
 
 <h2 align="center"> <a href="https://arxiv.org/abs/2405.14297">Dynamic Mixture of Experts: An Auto-Tuning Approach for Efficient Transformer Models</a></h2>
-<h5 align="center"> If our project helps you, please give us a star ⭐ on <a href="https://github.com/LINs-lab/DynMoE">GitHub</a> and cite our paper!</
+<h5 align="center"> If our project helps you, please give us a star ⭐ on <a href="https://github.com/LINs-lab/DynMoE">GitHub</a> and cite our paper!</h5>
 <h5 align="center">
 
 ## 📰 News
 
-- **[
+- **[2025.01.23]**: 🎉 Our paper is accepted to ICLR 2025!
 - **[2024.05.25]** 🔥 Our **checkpoints** are available now!
 - **[2024.05.23]** 🔥 Our [paper](https://arxiv.org/abs/2405.14297) is released!
 
@@ -57,4 +58,8 @@ This project is released under the Apache-2.0 license as found in the [LICENSE](
   archivePrefix={arXiv},
   primaryClass={cs.LG}
 }
-```
+```
+
+## Star History
+
+[](https://star-history.com/#LINs-lab/DynMoE&Date)
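
For context (not part of the diff itself): the added front-matter fields, `pipeline_tag: text-generation` and `library_name: transformers`, are what let the Hub surface a text-generation widget and a transformers code snippet for this model. Below is a minimal sketch of what that implies for users, assuming a hypothetical checkpoint id `LINs-lab/DynMoE-example` and assuming the released checkpoints ship custom MoE modeling code (hence `trust_remote_code=True`); substitute the actual repo id of this model.

```python
# Minimal sketch: load a checkpoint through the transformers text-generation
# pipeline, as implied by the pipeline_tag/library_name metadata added in this PR.
# The repo id is a placeholder, and trust_remote_code=True is an assumption
# about custom MoE layers shipped with the checkpoint.
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="LINs-lab/DynMoE-example",  # hypothetical repo id
    trust_remote_code=True,
)
print(pipe("Dynamic Mixture of Experts is", max_new_tokens=32)[0]["generated_text"])
```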