arXiv:2004.02984

MobileBERT: a Compact Task-Agnostic BERT for Resource-Limited Devices

Published on Apr 6, 2020
Authors:
Zhiqing Sun, Hongkun Yu, Xiaodan Song, Renjie Liu, Yiming Yang, Denny Zhou

Abstract

Natural Language Processing (NLP) has recently achieved great success by using huge pre-trained models with hundreds of millions of parameters. However, these models suffer from heavy model sizes and high latency such that they cannot be deployed to resource-limited mobile devices. In this paper, we propose MobileBERT for compressing and accelerating the popular BERT model. Like the original BERT, MobileBERT is task-agnostic, that is, it can be generically applied to various downstream NLP tasks via simple fine-tuning. Basically, MobileBERT is a thin version of BERT_LARGE, while equipped with bottleneck structures and a carefully designed balance between self-attentions and feed-forward networks. To train MobileBERT, we first train a specially designed teacher model, an inverted-bottleneck incorporated BERT_LARGE model. Then, we conduct knowledge transfer from this teacher to MobileBERT. Empirical studies show that MobileBERT is 4.3x smaller and 5.5x faster than BERT_BASE while achieving competitive results on well-known benchmarks. On the natural language inference tasks of GLUE, MobileBERT achieves a GLUE score of 77.7 (0.6 lower than BERT_BASE) and 62 ms latency on a Pixel 4 phone. On the SQuAD v1.1/v2.0 question answering tasks, MobileBERT achieves a dev F1 score of 90.0/79.2 (1.5/2.1 higher than BERT_BASE).
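The abstract's central architectural idea is the bottleneck block: each layer projects the wide inter-block representation down to a narrow intra-block size, runs self-attention and a stack of small feed-forward networks there, and projects back up through a residual path. The PyTorch sketch below only illustrates that idea under assumed sizes (hidden 512, intra-block 128, 4 heads, 4 stacked FFNs); the class name and all dimensions are illustrative choices, not the authors' released implementation or the paper's exact configuration.

```python
import torch
import torch.nn as nn


class BottleneckTransformerBlock(nn.Module):
    """Sketch of a MobileBERT-style bottleneck transformer block.

    The wide inter-block representation is projected down to a narrow
    intra-block size, processed by self-attention and a stack of small
    feed-forward networks, then projected back up with a residual
    connection. All sizes are illustrative, not the paper's exact config.
    """

    def __init__(self, hidden=512, intra=128, heads=4, ffn_stack=4):
        super().__init__()
        self.down = nn.Linear(hidden, intra)        # bottleneck: project down
        self.attn = nn.MultiheadAttention(intra, heads, batch_first=True)
        self.norm_attn = nn.LayerNorm(intra)
        self.ffns = nn.ModuleList([
            nn.Sequential(
                nn.Linear(intra, intra * 4),
                nn.GELU(),
                nn.Linear(intra * 4, intra),
            )
            for _ in range(ffn_stack)               # several small FFNs per block
        ])
        self.ffn_norms = nn.ModuleList(
            [nn.LayerNorm(intra) for _ in range(ffn_stack)]
        )
        self.up = nn.Linear(intra, hidden)          # project back to the wide size
        self.norm_out = nn.LayerNorm(hidden)

    def forward(self, x):                           # x: (batch, seq_len, hidden)
        h = self.down(x)
        attn_out, _ = self.attn(h, h, h, need_weights=False)
        h = self.norm_attn(h + attn_out)
        for ffn, norm in zip(self.ffns, self.ffn_norms):
            h = norm(h + ffn(h))
        return self.norm_out(x + self.up(h))        # residual in the wide space


if __name__ == "__main__":
    block = BottleneckTransformerBlock()
    tokens = torch.randn(2, 16, 512)                # dummy batch of token features
    print(block(tokens).shape)                      # -> torch.Size([2, 16, 512])
```

Stacking several narrow feed-forward networks per block is one plausible way to keep the balance between self-attention and feed-forward capacity that the abstract mentions, and keeping the residual path at the wide inter-block width gives the thin student feature maps comparable to the inverted-bottleneck teacher's, which is presumably what makes direct knowledge transfer feasible.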
