DSpace Repository

Abstractive Method of Text Summarization with Sequence to Sequence RNNs


dc.contributor.author Masum, Abu Kaisar Mohammad
dc.contributor.author Rabby, AKM Shahariar Azad
dc.contributor.author Talukder, Md. Ashraful Islam
dc.contributor.author Abujar, Sheikh
dc.contributor.author Hossain, Syed Akhter
dc.date.accessioned 2021-08-11T09:56:23Z
dc.date.available 2021-08-11T09:56:23Z
dc.date.issued 2019-07-08
dc.identifier.uri http://dspace.daffodilvarsity.edu.bd:8080/handle/123456789/5965
dc.description.abstract Text summarization is one of the well-known problems in natural language processing and deep learning in recent years. Generally, a text summary is a short note on a large text document. Our main purpose is to create a short, fluent and understandable abstractive summary of a text document. To build a good summarizer, we used the Amazon Fine Food Reviews dataset, which is available on Kaggle. We used the review text descriptions as our input data and generated a simple summary of those review descriptions as our output. To help produce an extensive summary, we used a bidirectional RNN with LSTMs in the encoding layer and an attention model in the decoding layer, and we applied the sequence-to-sequence model to generate a short summary of the food descriptions. There are several challenges when working with an abstractive text summarizer, such as text processing, vocabulary counting, missing-word counting, word embedding, improving the efficiency of the model (reducing the loss value) and producing a fluent machine-generated summary. In this paper, the main goal was to increase the efficiency and reduce the training loss of the sequence-to-sequence model in order to build a better abstractive text summarizer. In our experiment, we successfully reduced the training loss to a value of 0.036, and our abstractive text summarizer is able to create a short English summary from English text. en_US
dc.language.iso en_US en_US
dc.publisher Scopus en_US
dc.subject Text Processing en_US
dc.subject Word-Embedding en_US
dc.subject Missing Word Counting en_US
dc.subject Vocabulary Counting en_US
dc.subject Deep Learning en_US
dc.subject Bidirectional RNN en_US
dc.subject Encoding en_US
dc.subject Decoding en_US
dc.title Abstractive Method of Text Summarization with Sequence to Sequence RNNs en_US
dc.type Article en_US
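The abstract describes a bidirectional-LSTM encoder feeding an attention-based decoder. As a rough illustration of the attention step in such an architecture, here is a minimal NumPy sketch of Bahdanau-style additive attention over encoder states; all shapes, names and parameters are illustrative assumptions, not the authors' code:

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def additive_attention(enc_states, dec_state, W_enc, W_dec, v):
    """Bahdanau-style additive attention (illustrative).

    enc_states: (T, H) encoder hidden states, e.g. concatenated
                forward/backward LSTM outputs of a bidirectional encoder
    dec_state:  (H,) current decoder hidden state
    Returns the context vector (H,) and the attention weights (T,).
    """
    # score each encoder timestep against the current decoder state
    scores = np.tanh(enc_states @ W_enc + dec_state @ W_dec) @ v  # (T,)
    weights = softmax(scores)                                     # (T,)
    # context is the attention-weighted sum of encoder states
    context = weights @ enc_states                                # (H,)
    return context, weights

# toy example: 5 encoder timesteps, hidden size 8, attention size 6
rng = np.random.default_rng(0)
T, H, A = 5, 8, 6
enc = rng.normal(size=(T, H))
dec = rng.normal(size=(H,))
W_e = rng.normal(size=(H, A))
W_d = rng.normal(size=(H, A))
v = rng.normal(size=(A,))
ctx, w = additive_attention(enc, dec, W_e, W_d, v)
```

At each decoding step, the context vector `ctx` would be combined with the decoder state to predict the next summary word; the weights `w` indicate which input positions the decoder attends to.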

