Facial emotions are a compelling component of human communication that helps us understand the underlying intentions of others. People primarily interpret the emotional states of others, such as sadness, anger, and happiness, through facial and vocal expressions. Several studies indicate that verbal components convey only one-third of a communication, while nonverbal components convey the remaining two-thirds. Among the nonverbal components, facial emotion is a significant information channel in interpersonal communication, as it carries the underlying emotional context of an exchange.
Therefore, facial emotion has drawn significant attention over the last two decades, not only in perceptual and cognitive science but also in computer science. Interest in facial emotion recognition (FER) has grown recently alongside the rapid development of areas such as artificial intelligence, virtual reality (VR), augmented reality (AR), human-computer interaction (HCI), and advanced driver assistance systems (ADASs). Various sensors can serve as FER inputs, including electrocardiogram (ECG), electromyography (EMG), electroencephalograph (EEG), and camera. Among these, the camera has been the most promising sensor, since it provides more informative detail for FER and does not need to be worn.