Update README.md

README.md
CHANGED

@@ -9,7 +9,7 @@ Skimformer is a two-stage Transformer that replaces self-attention with Skim-Attention
 [Skim-Attention: Learning to Focus via Document Layout](https://arxiv.org/abs/2109.01078)
 Laura Nguyen, Thomas Scialom, Jacopo Staiano, Benjamin Piwowarski, [EMNLP 2021](https://2021.emnlp.org/papers)

-A collaboration
+A collaboration between [reciTAL](https://recital.ai/en/) & [MLIA](https://mlia.lip6.fr/) (ISIR, Sorbonne Université)

 ## Citation