Deep reinforcement and transfer learning for abstractive text summarization: A review

Alomari, Ayham and Idris, Norisma and Sabri, Aznul Qalid Md and Alsmadi, Izzat (2022) Deep reinforcement and transfer learning for abstractive text summarization: A review. Computer Speech & Language, 71. ISSN 0885-2308, DOI

Full text not available from this repository.


Automatic Text Summarization (ATS) is an important area in Natural Language Processing (NLP) whose goal is to shorten a long text into a more compact version that conveys the most important points in a readable form. ATS applications continue to evolve, drawing on effective approaches that researchers evaluate and implement. The State-of-the-Art (SotA) approaches in abstractive ATS, which demonstrate the strongest performance and accuracy, are deep neural sequence-to-sequence models, Reinforcement Learning (RL) approaches, and Transfer Learning (TL) approaches, including Pre-Trained Language Models (PTLMs). The graph-based Transformer architecture and PTLMs have driven tremendous advances in NLP applications, and the incorporation of recent mechanisms, such as the knowledge-enhanced mechanism, has significantly improved results. This study provides a comprehensive review of recent research advances in abstractive text summarization, covering works from the past six years. Past and present problems are described, along with their proposed solutions. Abstractive ATS datasets and evaluation measures are also highlighted. The paper concludes by comparing the best models and discussing future research directions.

Item Type: Article
Funders: None
Uncontrolled Keywords: Abstractive summarization; Sequence-to-sequence; Reinforcement learning; Pre-trained models
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Divisions: Faculty of Computer Science & Information Technology
Depositing User: Ms. Juhaida Abd Rahim
Date Deposited: 03 Aug 2022 02:22
Last Modified: 03 Aug 2022 02:22
