DSpace Repository

Bangla Sign Language Conversation Interpreter Using Image Processing


dc.contributor.author Roy, Prosenjit
dc.contributor.author Rahman, Md. Arifur
dc.contributor.author Manik, Md. Radwan Naeem
dc.date.accessioned 2019-07-04T06:41:33Z
dc.date.available 2019-07-04T06:41:33Z
dc.date.issued 2018-11-09
dc.identifier.uri http://hdl.handle.net/123456789/2681
dc.description.abstract This project presents a sign language conversation interpreter system for people with speech disabilities. It allows a person who does not know sign language to hold a conversation with a speech-disabled person. The project uses efficient methods to convert sign language into text, drawing on our own dataset together with a collected dataset of Bangla Sign Language hand gestures. Input is captured by webcam, recognized by a neural network, and rendered as text output. We built a CNN similar to the MNIST classification model; it can be used with both TensorFlow and Keras, and was trained using Keras on video streams. In a live video stream, the interpreter can quickly convert signing into text. We used the histogram back-projection method to detect the user's hand as an object, and computed the grey level of the hand region for better results. en_US
dc.language.iso en_US en_US
dc.publisher Daffodil International University en_US
dc.relation.ispartofseries ;P11747
dc.relation.ispartofseries ;P11748
dc.subject Computer Science en_US
dc.subject neural network en_US
dc.subject Sign language to text conversion en_US
dc.title Bangla Sign Language Conversation Interpreter Using Image Processing en_US
dc.type Working Paper en_US
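The hand-segmentation step named in the abstract, histogram back-projection, can be sketched as follows. In practice this is usually done with OpenCV's `cv2.calcBackProject` on the HSV hue channel; the numpy-only version below, with the hypothetical function name `back_project`, is a minimal illustration of the idea, assuming hue values in the OpenCV range 0-179:

```python
import numpy as np

def back_project(roi_hue, frame_hue, bins=16, hue_range=180):
    """Histogram back-projection on the hue channel (numpy-only sketch)."""
    # 1. Build a hue histogram from the hand region of interest (ROI),
    #    normalized so the most common hue maps to probability 1.0.
    hist, _ = np.histogram(roi_hue, bins=bins, range=(0, hue_range))
    hist = hist / max(hist.max(), 1)
    # 2. Replace every frame pixel's hue with the probability of its bin:
    #    skin-coloured pixels score high, background pixels score low.
    idx = np.minimum(
        (np.asarray(frame_hue) / (hue_range / bins)).astype(int), bins - 1
    )
    return hist[idx]
```

Thresholding the resulting probability map yields a binary hand mask; the masked region can then be converted to grey level and passed to the MNIST-style CNN classifier described in the abstract.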

