Abstract:
A Bangla Sign Language detection system converts Bangla sign language into text so
that deaf-mute people can communicate with hearing people. Deaf-mute people are
often detached from society because of the communication gap between them and
the hearing community. With today's technology, it is possible to capture a
deaf-mute person's gestures and, with the help of machine learning, convert them
into text. In this research, we adopt a gesture-recognition model that uses
MediaPipe to extract hand and pose landmarks and a long short-term memory (LSTM)
network to learn and recognize gestures, converting Bangla sign language gestures
into readable text. Based on a requirements analysis, the proposed model is carried
out in four stages: data collection, preprocessing, training and testing of the
proposed neural network, and finally real-time testing. The gesture model is
trained on a self-built dataset of Bangla sign language. The trained model
successfully identifies each gesture, and its text equivalent is displayed on
screen. The purpose of this model is to help create a medium of communication
between hearing people and the hearing impaired.
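The landmark-extraction stage described above can be sketched as follows. MediaPipe Hands reports 21 landmarks per hand (x, y, z) and MediaPipe Pose reports 33 landmarks (x, y, z, visibility), so each video frame can be flattened into a fixed-length feature vector and consecutive frames stacked into a sequence for the LSTM. This is a minimal NumPy-only sketch; the 30-frame sequence length, the zero-filling of undetected parts, and the helper names are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

# Per-frame feature sizes, following MediaPipe's documented landmark counts:
# Hands: 21 landmarks x (x, y, z) per hand, two hands.
# Pose:  33 landmarks x (x, y, z, visibility).
HAND_DIM = 21 * 3                     # one hand
POSE_DIM = 33 * 4
FRAME_DIM = 2 * HAND_DIM + POSE_DIM   # 258 features per frame

SEQ_LEN = 30  # assumed number of frames per gesture sample

def frame_features(left_hand, right_hand, pose):
    """Flatten one frame's landmarks into a single feature vector.

    Each argument is a NumPy array of landmark coordinates, or None when
    MediaPipe did not detect that part; missing parts are zero-filled so
    every frame vector has the same length.
    """
    left = left_hand.flatten() if left_hand is not None else np.zeros(HAND_DIM)
    right = right_hand.flatten() if right_hand is not None else np.zeros(HAND_DIM)
    body = pose.flatten() if pose is not None else np.zeros(POSE_DIM)
    return np.concatenate([left, right, body])

def make_sequence(frames):
    """Stack per-frame vectors into one fixed-length LSTM input."""
    seq = np.stack(frames[:SEQ_LEN])
    if len(seq) < SEQ_LEN:  # pad short clips with zero frames
        pad = np.zeros((SEQ_LEN - len(seq), FRAME_DIM))
        seq = np.vstack([seq, pad])
    return seq  # shape: (SEQ_LEN, FRAME_DIM)

# Example: a 30-frame clip in which only the right hand was detected.
clip = [frame_features(None, np.random.rand(21, 3), np.random.rand(33, 4))
        for _ in range(SEQ_LEN)]
x = make_sequence(clip)
print(x.shape)  # (30, 258)
```

A batch of such `(30, 258)` arrays is the natural input shape for an LSTM layer, with one softmax output per sign in the vocabulary.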