Abstract:
Identifying the facial expressions that reveal human emotions can help computers better assess a person's state of mind and provide a more customized interaction.
We explore the recognition of human facial expressions through a deep learning approach using a Convolutional Neural Network (CNN). The system uses a labelled dataset of around 32,298 images covering multiple facial expressions for training and testing. The preprocessing phase involves a face detection subsystem with noise removal, followed by feature extraction.
The resulting classification model can identify the seven emotions of the Facial Action Coding System (FACS), a set of facial muscle movements that correspond to a displayed emotion. Using FACS, we can determine the displayed emotion of a participant. This analysis of facial expressions is one of the very few techniques available for assessing emotions in real time.
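
As a rough illustration of the classification stage described above, the sketch below shows a minimal CNN that maps a face crop to seven emotion classes. The 48x48 grayscale input size, layer sizes, and use of the Keras API are assumptions for illustration only; they are not the exact architecture or framework reported in this work.

    # Minimal sketch of a seven-class emotion CNN.
    # Assumptions: 48x48 grayscale face crops and a small Keras architecture;
    # the actual model in this work may differ.
    import tensorflow as tf
    from tensorflow.keras import layers, models

    NUM_CLASSES = 7  # seven FACS-labelled emotions, as in the abstract

    def build_emotion_cnn(input_shape=(48, 48, 1)):
        """Build a small CNN that outputs probabilities over seven emotions."""
        model = models.Sequential([
            layers.Input(shape=input_shape),
            layers.Conv2D(32, 3, activation="relu", padding="same"),
            layers.MaxPooling2D(),
            layers.Conv2D(64, 3, activation="relu", padding="same"),
            layers.MaxPooling2D(),
            layers.Conv2D(128, 3, activation="relu", padding="same"),
            layers.MaxPooling2D(),
            layers.Flatten(),
            layers.Dense(128, activation="relu"),
            layers.Dropout(0.5),
            layers.Dense(NUM_CLASSES, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    model = build_emotion_cnn()
    model.summary()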