dc.description.abstract |
Face Emotion Recognition has been one of the most common fields of research ever since Deep
Learning technology was introduced. Although many techniques have achieved high
performance, this technology has not yet been adopted in real-life applications, largely
because of the lack of robustness in Face Emotion Recognition. Most work on Face Emotion
Recognition focuses on extracting facial and emotion features from profile face images.
However, to make the models applicable in real life, it is important to train them to detect
faces and recognize emotions from side or tilted angles as well, since most probable uses of
Face Emotion Recognition require a model that can recognize emotions from various angles,
whether vertically or horizontally deflected. We therefore trained two CNN models,
MobileNetV2 and VGG16, on a dataset that contains profile faces as well as vertically and
horizontally deflected face images. After detecting the face location in each image using the
DNN-based detector from the face_recognition library, we applied a transfer learning
approach and achieved 87% and 60% training accuracy with VGG16 and MobileNetV2,
respectively. The models can therefore recognize facial emotions from both profile and
side-facing images. With this robustness, we are one step closer to a Face Emotion
Recognition technology that can be deployed in various real-life settings to improve service
quality, collect feedback, and analyze demand across different industries. |
en_US |
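
The following is a minimal sketch, not the authors' exact implementation, of the pipeline the
abstract describes: face detection with the face_recognition library's CNN-based detector
(assumed here to be the "DNN" the abstract refers to), followed by transfer learning on
ImageNet-pretrained VGG16 / MobileNetV2 backbones from tf.keras.applications. The input
resolution, classifier head, and seven-class emotion output are illustrative assumptions.

# Sketch of the described pipeline: face detection + transfer learning.
import face_recognition
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16, MobileNetV2

def detect_and_crop(image_path, size=(224, 224)):
    """Detect the first face in an image and return a resized crop, or None."""
    image = face_recognition.load_image_file(image_path)
    # model="cnn" selects the library's deep-learning detector (assumption:
    # this is the DNN detector mentioned in the abstract).
    locations = face_recognition.face_locations(image, model="cnn")
    if not locations:
        return None
    top, right, bottom, left = locations[0]
    face = image[top:bottom, left:right]
    return tf.image.resize(face, size).numpy()

def build_transfer_model(backbone="vgg16", num_classes=7, input_shape=(224, 224, 3)):
    """Frozen ImageNet backbone with a small emotion-classification head.
    num_classes=7 is an assumption (a typical basic-emotion setup)."""
    if backbone == "vgg16":
        base = VGG16(weights="imagenet", include_top=False, input_shape=input_shape)
    else:
        base = MobileNetV2(weights="imagenet", include_top=False, input_shape=input_shape)
    base.trainable = False  # transfer learning: keep pretrained weights fixed
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Hypothetical usage (paths and training arrays are placeholders):
# face = detect_and_crop("sample.jpg")
# model = build_transfer_model("mobilenetv2")
# model.fit(x_train, y_train, epochs=20, validation_split=0.1)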