
Sequence-to-sequence Bangla Sentence Generation with LSTM Recurrent Neural Networks


dc.contributor.author Islam, Md. Sanzidul
dc.contributor.author Mousumi, Sadia Sultana Sharmin
dc.contributor.author Abujar, Sheikh
dc.contributor.author Hossain, Syed Akhter
dc.date.accessioned 2021-11-07T06:45:26Z
dc.date.available 2021-11-07T06:45:26Z
dc.date.issued 2019-05-31
dc.identifier.uri http://dspace.daffodilvarsity.edu.bd:8080/handle/123456789/6348
dc.description.abstract Sequence-to-sequence text generation is an efficient approach for automatically converting text from a source sequence to a target sequence. Text generation is an application of natural language generation and is useful in sequence modeling tasks such as machine translation, speech recognition, image captioning, language identification and video captioning. In this paper we discuss Bangla text generation using a deep learning approach, Long Short-Term Memory (LSTM), a special kind of Recurrent Neural Network (RNN). LSTM networks are well suited to analyzing sequences of text data and predicting the next word, and they are a reasonable choice when the goal is to predict the very next point of a given time sequence. We propose an artificial Bangla text generator based on LSTM, one of the earliest such models for this language, and validate it with a satisfactory accuracy rate. (An illustrative sketch of such an LSTM next-word model follows this record.) en_US
dc.language.iso en_US en_US
dc.publisher Procedia Computer Science, Elsevier B.V. en_US
dc.subject Language modeling en_US
dc.subject Text generation en_US
dc.subject Machine learning en_US
dc.subject Deep learning en_US
dc.title Sequence-to-sequence Bangla Sentence Generation with LSTM Recurrent Neural Networks en_US
dc.type Article en_US
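
The abstract above describes next-word prediction with an LSTM language model. The following is a minimal illustrative sketch of that idea, not the authors' implementation: the toy corpus, the whitespace tokenizer, and all hyperparameters (embedding size 64, 128 LSTM units, 200 epochs) are assumptions chosen for brevity, and a TensorFlow/Keras backend is assumed.

# Minimal sketch of LSTM next-word prediction; corpus and hyperparameters are illustrative only.
import numpy as np
import tensorflow as tf

# Hypothetical toy Bangla corpus; a real experiment would read many sentences from a file.
corpus = [
    "আমি বাংলায় গান গাই",
    "আমি প্রতিদিন ভাত খাই",
]

# Simple whitespace tokenization and an integer index per word (0 is reserved for padding).
words = sorted({w for line in corpus for w in line.split()})
word_to_id = {w: i + 1 for i, w in enumerate(words)}
id_to_word = {i: w for w, i in word_to_id.items()}
vocab_size = len(word_to_id) + 1

# Build (prefix -> next word) training pairs from every sentence.
prefixes, targets = [], []
for line in corpus:
    ids = [word_to_id[w] for w in line.split()]
    for i in range(1, len(ids)):
        prefixes.append(ids[:i])
        targets.append(ids[i])

max_len = max(len(p) for p in prefixes)
X = np.array([[0] * (max_len - len(p)) + p for p in prefixes])  # pre-pad with zeros
y = np.array(targets)

# Embedding -> LSTM -> softmax over the vocabulary (next-word prediction).
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 64, mask_zero=True),
    tf.keras.layers.LSTM(128),
    tf.keras.layers.Dense(vocab_size, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.fit(X, y, epochs=200, verbose=0)

# Greedy generation: repeatedly append the most probable next word to the seed text.
def generate(seed_text, n_words=3):
    text = seed_text.split()
    for _ in range(n_words):
        ids = [word_to_id.get(w, 0) for w in text][-max_len:]
        ids = [0] * (max_len - len(ids)) + ids
        probs = model.predict(np.array([ids]), verbose=0)[0]
        text.append(id_to_word.get(int(np.argmax(probs)), ""))
    return " ".join(text)

print(generate("আমি"))

A word-level tokenizer is used here purely for simplicity; the same Embedding-LSTM-softmax structure applies at the character or subword level, which is often preferable for a morphologically rich language such as Bangla.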

