Update model card: Correct pipeline tag and add library name

#1
by nielsr (HF staff) - opened
Files changed (1)
  1. README.md +9 -4
README.md CHANGED
@@ -1,15 +1,16 @@
  ---
  license: apache-2.0
- pipeline_tag: image-text-to-text
+ pipeline_tag: text-generation
+ library_name: transformers
  ---

  <h2 align="center"> <a href="https://arxiv.org/abs/2405.14297">Dynamic Mixture of Experts: An Auto-Tuning Approach for Efficient Transformer Models</a></h2>
- <h5 align="center"> If our project helps you, please give us a star ⭐ on <a href="https://github.com/LINs-lab/DynMoE">GitHub</a> and cite our paper!</h2>
+ <h5 align="center"> If our project helps you, please give us a star ⭐ on <a href="https://github.com/LINs-lab/DynMoE">GitHub</a> and cite our paper!</h5>
  <h5 align="center">

  ## 📰 News

- - **[2024.05.31]** 🔥 Our [code](https://github.com/LINs-lab/DynMoE/) is released!
+ - **[2025.01.23]**: 🎉 Our paper is accepted to ICLR 2025!
  - **[2024.05.25]** 🔥 Our **checkpoints** are available now!
  - **[2024.05.23]** 🔥 Our [paper](https://arxiv.org/abs/2405.14297) is released!

@@ -57,4 +58,8 @@ This project is released under the Apache-2.0 license as found in the [LICENSE](
  archivePrefix={arXiv},
  primaryClass={cs.LG}
  }
- ```
+ ```
+
+ ## Star History
+
+ [![Star History Chart](https://api.star-history.com/svg?repos=LINs-lab/DynMoE&type=Date)](https://star-history.com/#LINs-lab/DynMoE&Date)
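
For context, the corrected metadata (`pipeline_tag: text-generation`, `library_name: transformers`) tells the Hub to surface this checkpoint as a transformers text-generation model. A minimal sketch of the usage that implies, assuming a hypothetical repo id and that the custom DynMoE architecture is loaded via `trust_remote_code` (neither detail is stated in this PR):

```python
# Minimal sketch of the usage implied by the corrected metadata:
# load the checkpoint with transformers and run plain text generation.
# NOTE: the repo id below is a hypothetical placeholder, and
# trust_remote_code=True is an assumption for a custom MoE architecture;
# check the model card for the actual repo id and loading instructions.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "LINs-lab/DynMoE-checkpoint"  # hypothetical placeholder

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

inputs = tokenizer("Dynamic Mixture of Experts models", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```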