Electronics, Vol. 13, Pages 3558: Efficient Headline Generation with Hybrid Attention for Long Texts

Electronics doi: 10.3390/electronics13173558

Authors: Wenjin Wan, Cong Zhang, Lan Huang

Headline generation aims to condense the key information of an article or document into a concise one-sentence summary. The Transformer architecture is generally effective for such tasks, yet its training time and GPU memory consumption grow dramatically as the input text length increases. To address this problem, a hybrid attention mechanism is proposed that models both local and global semantic information among words, significantly improving training efficiency, especially for long texts. Effectiveness is not sacrificed; in fact, the fluency and semantic coherence of the generated headlines are enhanced. Experimental results on an open benchmark dataset show that, compared to the baseline model's best performance, the proposed model obtains increases of 14.7%, 16.7%, 14.4% and 9.1% in the F1 values of the ROUGE-1, ROUGE-2, ROUGE-L and ROUGE-WE metrics, respectively. Semantic coherence is also improved, as shown by a 2.8% gain in the BERTScore F1 value. These results demonstrate the effectiveness of the proposed headline generation model with the hybrid attention mechanism, which could serve as a reference for related text generation tasks.
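
The abstract does not detail the mechanism, but the general local-plus-global idea can be illustrated as an attention mask in which each token attends to a sliding window of neighbouring tokens plus a few designated global positions. The following is a minimal, hypothetical PyTorch sketch of that idea; the `window` and `n_global` parameters, and the choice of the first positions as global tokens, are assumptions for illustration and not the paper's actual design.

```python
# Hypothetical sketch of a hybrid (local + global) attention mask.
# Each token attends to a local window of neighbours; a few "global"
# positions attend to, and are attended by, every token.
import torch
import torch.nn.functional as F

def hybrid_attention_mask(seq_len: int, window: int = 4, n_global: int = 2) -> torch.Tensor:
    """Boolean (seq_len x seq_len) mask; True means attention is allowed."""
    idx = torch.arange(seq_len)
    # Local band: positions within |i - j| <= window of each other.
    local = (idx[:, None] - idx[None, :]).abs() <= window
    # Global tokens (here assumed to be the first n_global positions).
    glob = torch.zeros(seq_len, seq_len, dtype=torch.bool)
    glob[:n_global, :] = True
    glob[:, :n_global] = True
    return local | glob

def hybrid_attention(q, k, v, window: int = 4, n_global: int = 2):
    """Scaled dot-product attention restricted by the hybrid mask."""
    seq_len, d = q.shape[-2], q.shape[-1]
    scores = q @ k.transpose(-2, -1) / d ** 0.5
    mask = hybrid_attention_mask(seq_len, window, n_global)
    scores = scores.masked_fill(~mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v

# Example: 16 tokens with 8-dimensional heads.
q = k = v = torch.randn(16, 8)
out = hybrid_attention(q, k, v)
print(out.shape)  # torch.Size([16, 8])
```

Note that this sketch still materialises a dense score matrix; the efficiency gains reported for long inputs would in practice require computing only the unmasked entries (e.g. via banded or sparse attention kernels).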
