dc.description.abstract |
Sign language is the primary mode of communication for deaf and speech-impaired people. However,
integrating them into the mainstream is difficult because most of society is
unaware of their language. To bridge the communication gap between the hearing- and
speech-impaired community and the rest of society in Bangladesh, I conducted research to recognize Bangla
sign language using a computer-vision-based approach.
Sign language recognition not only helps people who cannot speak or hear; it also benefits
human-computer interaction and robotic systems. To achieve my goals, I used Convolutional
Neural Networks (CNNs) to train models on individual signs. In the future, this research, besides
serving as an interpreter, can also open doors to numerous other applications, such as sign
language tutorials and dictionaries, and can help deaf and speech-impaired users search the web or send
e-mails more conveniently.
The system has two parts: a training part and a sign-detection part. The training part uses a
deep-learning method based on a CNN to build a trained model from the sign dataset. The detection
part captures a sign from a webcam and recognizes the corresponding Bengali numeral by classifying it against the trained model. |
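The two-part pipeline described above can be sketched as follows. This is an illustrative example only: the layer sizes, the 64x64 grayscale input resolution, and the class count (ten Bengali numerals) are assumptions for the sketch, not the thesis's exact architecture.

```python
# Hedged sketch of a CNN sign classifier; all hyperparameters here
# (input size, filter counts, 10 classes) are illustrative assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

NUM_CLASSES = 10  # Bengali numerals 0-9 (assumed)

def build_sign_cnn(input_shape=(64, 64, 1), num_classes=NUM_CLASSES):
    """Small CNN mapping a sign image to class probabilities."""
    return keras.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])

model = build_sign_cnn()
# Detection step: classify one frame (a random array stands in for a
# webcam capture; in practice the frame would come from OpenCV).
frame = np.random.rand(1, 64, 64, 1).astype("float32")
probs = model.predict(frame, verbose=0)
predicted_digit = int(np.argmax(probs))
```

In a complete system the training part would fit this model on labeled sign images before the detection part runs, and the webcam frame would be cropped and normalized to the same input size.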
en_US |