AraBART is the first Arabic sequence-to-sequence model in which both the encoder and the decoder are pretrained end-to-end, based on BART. AraBART follows the architecture of BART-Base, which has 6 encoder layers, 6 decoder layers, and a hidden dimension of 768. In total, AraBART has 139M parameters.
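
For illustration, here is a minimal sketch (assuming the standard Hugging Face `transformers` auto classes, which support BART-based checkpoints) that loads the model and checks its configuration against the figures above:

```python
# Minimal sketch: load AraBART and inspect its BART-Base configuration.
# Assumes the standard transformers auto classes; not an official example.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("moussaKam/AraBART")
model = AutoModelForSeq2SeqLM.from_pretrained("moussaKam/AraBART")

print(model.config.encoder_layers)   # 6 encoder layers
print(model.config.decoder_layers)   # 6 decoder layers
print(model.config.d_model)          # 768 hidden dimensions
print(f"{sum(p.numel() for p in model.parameters()):,} parameters")  # ~139M
```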

AraBART achieves the best performance on multiple abstractive summarization datasets, outperforming strong baselines including pretrained Arabic BERT-based models and the multilingual mBART and mT5 models.
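
As a hedged usage sketch: the checkpoint here is the pretrained base model, so it should be fine-tuned on a summarization dataset (or swapped for one of the fine-tuned variants) before its generated summaries are meaningful. The example below uses the standard `generate` API; the input text is a hypothetical placeholder:

```python
# Hedged summarization sketch. The base checkpoint is only pretrained;
# fine-tune it on a summarization dataset first for useful summaries.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("moussaKam/AraBART")
model = AutoModelForSeq2SeqLM.from_pretrained("moussaKam/AraBART")

article = "..."  # hypothetical placeholder for an Arabic article to summarize
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(**inputs, num_beams=4, max_length=128)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```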
