Adapters for the paper "M2QA: Multi-domain Multilingual Question Answering".
We evaluate two setups: MAD-X+Domain and MAD-X².
AI & ML interests
Parameter-Efficient Fine-Tuning
Adapters from the paper "What to Pre-Train on? Efficient Intermediate Task Selection" (Poth et al., 2021); a minimal loading sketch follows the list.
- Paper: What to Pre-Train on? Efficient Intermediate Task Selection (arXiv:2104.08247)
- AdapterHub/roberta-base-pf-imdb (Text Classification)
- AdapterHub/roberta-base-pf-conll2003 (Token Classification)
- AdapterHub/bert-base-uncased-pf-anli_r3 (Text Classification)
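These task adapters ship with a prediction head and can be attached to their base model at inference time. Below is a minimal sketch, assuming the `adapters` library (the successor to `adapter-transformers`) is installed; the checkpoint name is taken from the list above, and everything else follows the library's standard loading flow.

```python
from adapters import AutoAdapterModel
from transformers import AutoTokenizer

# Base model plus the IMDb sentiment adapter listed above.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoAdapterModel.from_pretrained("roberta-base")
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-imdb")
model.set_active_adapters(adapter_name)

# Classify a single example with the adapter-equipped model.
inputs = tokenizer("A surprisingly touching film.", return_tensors="pt")
logits = model(**inputs).logits
print(logits.argmax(dim=-1))
```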
Adapters from the paper "AdapterFusion: Non-Destructive Task Composition for Transfer Learning" (Pfeiffer et al., 2021)
Adapters for the paper "M2QA: Multi-domain Multilingual Question Answering".
We evaluate 2 setups: MAD-X+Domain and MAD-X²
MAD-X language adapters from the paper "MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer" (Pfeiffer et al., 2020), for BERT and XLM-RoBERTa.
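MAD-X performs zero-shot cross-lingual transfer by stacking a target-language adapter underneath a task adapter at inference time. Below is a hedged sketch of that composition with the `adapters` library; the two adapter identifiers are placeholders, not verified checkpoint names, and should be replaced with the language and task adapters from this collection.

```python
from adapters import AutoAdapterModel
import adapters.composition as ac

model = AutoAdapterModel.from_pretrained("xlm-roberta-base")

# Placeholder identifiers for illustration only; swap in the actual
# MAD-X language adapter and task adapter you want to combine.
lang = model.load_adapter("path/to/target-language-adapter")
task = model.load_adapter("path/to/task-adapter")

# MAD-X composition: language adapter first, task adapter stacked on top.
model.active_adapters = ac.Stack(lang, task)
```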
Adapters from the paper "What to Pre-Train on? Efficient Intermediate Task Selection" (Poth et al., 2021)
-
What to Pre-Train on? Efficient Intermediate Task Selection
Paper • 2104.08247 • Published -
AdapterHub/roberta-base-pf-imdb
Text Classification • Updated • 10 -
AdapterHub/roberta-base-pf-conll2003
Token Classification • Updated • 6 • 1 -
AdapterHub/bert-base-uncased-pf-anli_r3
Text Classification • Updated • 10
Adapters from the paper "AdapterFusion: Non-Destructive Task Composition for Transfer Learning" (Pfeiffer et al., 2021)
Adapters from the paper "Lifting the Curse of Multilinguality by Pre-training Modular Transformers"