Research Article

Abstractive Summarization of Document using Dual Encoding Framework

by Monika H. Rajput, B. R. Mandre
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 176 - Number 39
Year of Publication: 2020
Authors: Monika H. Rajput, B. R. Mandre
DOI: 10.5120/ijca2020920544

Monika H. Rajput, B. R. Mandre. Abstractive Summarization of Document using Dual Encoding Framework. International Journal of Computer Applications. 176, 39 (Jul 2020), 50-58. DOI=10.5120/ijca2020920544

@article{10.5120/ijca2020920544,
  author = {Monika H. Rajput and B. R. Mandre},
  title = {Abstractive Summarization of Document using Dual Encoding Framework},
  journal = {International Journal of Computer Applications},
  issue_date = {Jul 2020},
  volume = {176},
  number = {39},
  month = {Jul},
  year = {2020},
  issn = {0975-8887},
  pages = {50-58},
  numpages = {9},
  url = {https://ijcaonline.org/archives/volume176/number39/31463-2020920544/},
  doi = {10.5120/ijca2020920544},
  publisher = {Foundation of Computer Science (FCS), NY, USA},
  address = {New York, USA}
}
%0 Journal Article
%A Monika H. Rajput
%A B. R. Mandre
%T Abstractive Summarization of Document using Dual Encoding Framework
%J International Journal of Computer Applications
%@ 0975-8887
%V 176
%N 39
%P 50-58
%D 2020
%I Foundation of Computer Science (FCS), NY, USA
Abstract

The popularity of the web is growing day by day, and social media has become a huge source of information. Analyzing this enormous volume of information quickly is difficult. Text summarization addresses this problem: it condenses text so that redundant content is removed and the important information is extracted and presented concisely, helping readers grasp it at a glance. Summarizing all of this information manually is impractical, since it comprises vast amounts of unstructured text and reviews, and manual summarization is a tedious, monotonous, and time-consuming task. A method is therefore needed to mine and summarize information and reviews and to produce representative summaries. To address this problem, an abstractive summarization of documents using an encoder-decoder based approach is proposed. Abstractive text summarization captures the most essential content of a text corpus, compresses it into a shorter text, preserves its original meaning, and maintains semantic and grammatical correctness. To this end, it uses a deep learning architecture for natural language processing: recurrent neural networks that connect the input and output sequences in an encoder-decoder architecture, with an added attention mechanism for better results. The proposed work is evaluated on two datasets, CNN/DailyMail and DUC 2004. The experimental results show that the model produces highly coherent, concise, and grammatically correct summaries.
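To make the approach concrete, below is a minimal sketch of an RNN encoder-decoder with additive (Bahdanau-style) attention, written in PyTorch. It is a simplified illustration of the architecture described in the abstract, not the authors' implementation; the GRU cells, hidden sizes, and all module names are assumptions chosen for the example.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    # Bidirectional GRU that reads the source document.
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)

    def forward(self, src):                            # src: (batch, src_len)
        outputs, hidden = self.rnn(self.embed(src))    # outputs: (batch, src_len, 2*hid_dim)
        hidden = (hidden[0] + hidden[1]).unsqueeze(0)  # merge directions for the decoder
        return outputs, hidden

class Attention(nn.Module):
    # Additive attention: scores each source position against the decoder state.
    def __init__(self, hid_dim=256):
        super().__init__()
        self.proj = nn.Linear(2 * hid_dim + hid_dim, hid_dim)
        self.v = nn.Linear(hid_dim, 1, bias=False)

    def forward(self, dec_hidden, enc_outputs):        # dec_hidden: (batch, hid_dim)
        dec = dec_hidden.unsqueeze(1).expand(-1, enc_outputs.size(1), -1)
        scores = self.v(torch.tanh(self.proj(torch.cat([enc_outputs, dec], dim=2))))
        weights = F.softmax(scores.squeeze(2), dim=1)  # (batch, src_len)
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)
        return context, weights

class Decoder(nn.Module):
    # One-step GRU decoder conditioned on the attention context vector.
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.attn = Attention(hid_dim)
        self.rnn = nn.GRU(emb_dim + 2 * hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, token, hidden, enc_outputs):     # token: (batch, 1)
        context, _ = self.attn(hidden[-1], enc_outputs)
        rnn_in = torch.cat([self.embed(token), context.unsqueeze(1)], dim=2)
        output, hidden = self.rnn(rnn_in, hidden)
        return self.out(output.squeeze(1)), hidden     # logits over the target vocabulary

# Illustrative usage: one encoder pass and one attentive decoding step on
# random token ids (a real system loops the decoder, greedily or with beam
# search, until an end-of-summary token is produced).
enc, dec = Encoder(vocab_size=5000), Decoder(vocab_size=5000)
src = torch.randint(0, 5000, (2, 40))                  # 2 documents, 40 tokens each
enc_out, hidden = enc(src)
logits, hidden = dec(torch.zeros(2, 1, dtype=torch.long), hidden, enc_out)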

References
  1. A. M. Rush, S. Chopra, and J. Weston, “A neural attention model for abstractive sentence summarization,” in Proc. Conf. Empir. Methods Nat. Lang. Process. (EMNLP), Lisbon, Portugal, Sep. 2015, pp. 379–389.
  2. S. Chopra, M. Auli, and A. M. Rush, “Abstractive sentence summarization with attentive recurrent neural networks,” in Proc. Conf. North Amer. Assoc. Comput. Linguist. Human Lang. Technol., San Diego, CA, USA, Jun. 2016, pp. 93–98.
  3. R. Nallapati, B. Zhou, C. N. dos Santos, Ç. Gülçehre, and B. Xiang, “Abstractive text summarization using sequence-to-sequence RNNs and beyond,” in Proc. 20th SIGNLL Conf. Comput. Nat. Lang. Learn. (CoNLL), Berlin, Germany, Aug. 2016, pp. 280–290.
  4. W. Zeng, W. Luo, S. Fidler, and R. Urtasun, “Efficient summarization with read-again and copy mechanism,” CoRR, vol. abs/1611.03382, 2016.
  5. A. See, P. J. Liu, and C. D. Manning, “Get to the point: Summarization with pointer-generator networks,” in Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Association for Computational Linguistics, 2017, pp. 1073–1083.
  6. R. Paulus, C. Xiong, and R. Socher, “A deep reinforced model for abstractive summarization,” in Proc. 6th Int. Conf. Learn. Represent. (ICLR), Vancouver, BC, Canada, 2018.
  7. P. Li, W. Lam, L. Bing, and Z. Wang, “Deep recurrent generative decoder for abstractive text summarization,” in Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, 2017, pp. 2091–2100.
  8. D. Bahdanau, J. Chorowski, D. Serdyuk, P. Brakel, and Y. Bengio, “End to end attention-based large vocabulary speech recognition,” in Proc. IEEE Int. Conf. Acoust. Speech Signal Process. (ICASSP), Shanghai, China, Mar. 2016, pp. 4945–4949.
  9. K. M. Hermann et al., “Teaching machines to read and comprehend,” in Proc. Adv. Neural Inf. Process. Syst. Annu. Conf. Neural Inf. Process. Syst., Montreal, QC, Canada, Dec. 2015, pp. 1693–1701.
  10. R. Nallapati, F. Zhai, and B. Zhou, “SummaRuNNer: A recurrent neural network based sequence model for extractive summarization of documents,” in Proc. 31st AAAI Conf. Artif. Intell., San Francisco, CA, USA, Feb. 2017, pp. 3075–3081.
  11. D. Zajic, B. Dorr, and R. Schwartz, “BBN/UMD at DUC-2004: Topiary,” in Proc. Doc. Understanding Conf. HLT/NAACL, 2004, pp. 112–119.
  12. Q. Zhou, N. Yang, F. Wei, and M. Zhou, “Selective encoding for abstractive sentence summarization,” in Proc. 55th Annu. Meeting Assoc. Comput. Linguist., 2017, pp. 1095–1104.
  13. K. Yao and L. Zhang, “Dual encoding for abstractive text summarization,” IEEE Transactions on Cybernetics, 2018, pp. 2168–2180.
Index Terms

Computer Science
Information Sciences

Keywords

Summarization, Abstractive Summarization, Natural Language Processing, Deep Learning