StyleBART: Decorate Pretrained Model with Style Adapters for Unsupervised Stylistic Headline Generation
Abstract
Stylistic headline generation is the task of generating a headline that not only summarizes the content of an article but also reflects a desired style that attracts users. As style-specific article-headline pairs are scarce, previous research has focused on unsupervised approaches using a standard headline generation dataset together with mono-style corpora. In this work, we follow this line and propose StyleBART, an unsupervised approach for stylistic headline generation. Our method decorates the pretrained BART model with adapters that are responsible for different styles, allowing headlines with diverse styles to be generated by simply switching the adapters. Different from previous works, StyleBART separates style learning from headline generation, making it possible to freely combine the base model and the style adapters during inference. We further propose an inverse paraphrasing task to enhance the style adapters. Extensive automatic and human evaluations show that StyleBART achieves new state-of-the-art performance on the unsupervised stylistic headline generation task, producing high-quality headlines with the desired style.
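The core idea of decorating a frozen base model with swappable, style-specific adapters can be sketched as follows. This is a toy numpy illustration under assumed details, not StyleBART's actual implementation: it uses a generic bottleneck adapter (down-projection, nonlinearity, up-projection, residual) applied to a hidden state, and all names and dimensions here are hypothetical.

```python
import numpy as np

d_model, d_bottleneck = 8, 2  # hypothetical sizes

def make_adapter(seed):
    # One small trainable module per style; the base model weights stay frozen.
    r = np.random.default_rng(seed)
    return {"down": r.standard_normal((d_model, d_bottleneck)) * 0.1,
            "up": r.standard_normal((d_bottleneck, d_model)) * 0.1}

def apply_adapter(h, adapter):
    # Bottleneck adapter: down-project, ReLU, up-project, residual connection.
    z = np.maximum(h @ adapter["down"], 0.0)
    return h + z @ adapter["up"]

# Style switching: pick a different adapter at inference, same base model.
style_adapters = {"humor": make_adapter(1), "romance": make_adapter(2)}

h = np.random.default_rng(0).standard_normal((1, d_model))  # base-layer hidden state
out_humor = apply_adapter(h, style_adapters["humor"])
out_romance = apply_adapter(h, style_adapters["romance"])
```

Because each style lives entirely in its adapter, changing the target style is a dictionary lookup rather than a retraining step, which is what lets the base model and style adapters be freely recombined at inference.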