
Emotional State Detection in Facial Expressions


Various methods can be used to detect emotional states from facial expressions, all of which rest on tracking and detecting the movement of the head and facial muscles. The complexity of a detection system depends on the complexity of the mental states it must differentiate. A simpler system can detect facial muscle movements by tracking feature points and map them directly onto predefined emotional states.
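The simpler, direct-mapping approach can be sketched as a lookup from a detected movement to an emotional state. The mapping table below is purely illustrative, not taken from any cited system:

```python
# Hypothetical one-step mapping from a detected facial movement to a
# predefined emotional state. The table entries are illustrative only.
DIRECT_MAP = {
    "lip corner pull": "happiness",
    "brow lower": "anger",
    "brow raise": "surprise",
}

def detect_emotion(movement):
    """Map a single detected movement straight to an emotional state."""
    return DIRECT_MAP.get(movement, "unknown")

print(detect_emotion("lip corner pull"))  # "happiness"
```

The limitation is visible in the code itself: each movement maps to exactly one state, so ambiguous or compound expressions cannot be represented.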


A more complex system outlined here has three levels of abstraction (El Kaliouby & Robinson, 2004): the action level, the gesture level, and the mental state level. Each level takes as input a sequence of outputs from the level below: a sequence of muscle actions is recognized as a facial gesture, and a mental state can be inferred from a sequence of facial gestures.
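The flow between the three levels can be sketched as a pipeline of functions, each consuming the sequence produced by the level below. This is a minimal illustration with hardcoded placeholder logic, not the cited system's implementation:

```python
def detect_actions(frames):
    """Action level: map tracked feature-point movement to head/face actions."""
    # Placeholder output; a real tracker would derive these from the frames.
    return ["head pitch up", "head pitch down", "head pitch up"]

def recognize_gestures(actions):
    """Gesture level: group an action sequence into gestures."""
    gestures = []
    for a, b in zip(actions, actions[1:]):
        if (a, b) == ("head pitch up", "head pitch down"):
            gestures.append("nod")
    return gestures

def infer_state(gestures):
    """State level: infer a mental state from the observed gestures."""
    return "agreeing" if "nod" in gestures else "neutral"

result = infer_state(recognize_gestures(detect_actions(["f0", "f1", "f2", "f3"])))
print(result)  # "agreeing"
```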

Face and Head Action Detection


Detecting face and head actions is the basic first step in detecting emotional states. The process tracks a face's feature points across consecutive frames, and from their motion the movements of individual muscles can be identified.

Facial Gestures Recognition


Sequences of face and head movements form gestures; for example, a 'nod' gesture is an alternating sequence of 'head pitch up' and 'head pitch down' movements. Hidden Markov models (HMMs) are used for this pattern recognition.
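An HMM for a gesture scores how likely an observed action sequence is under that gesture's model. The sketch below implements the standard forward algorithm for a tiny two-state 'nod' model; all probability values are made-up illustrations, not parameters from the cited system:

```python
# Minimal discrete-HMM forward algorithm. The "nod" model has two hidden
# phases that tend to alternate, emitting up/down pitch actions.
# All parameter values below are hypothetical.

def forward(obs, states, start_p, trans_p, emit_p):
    """Return P(obs) under the HMM via the forward algorithm."""
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {
            s: sum(alpha[r] * trans_p[r][s] for r in states) * emit_p[s][o]
            for s in states
        }
    return sum(alpha.values())

states = ("up_phase", "down_phase")
start_p = {"up_phase": 0.6, "down_phase": 0.4}
trans_p = {  # phases prefer to alternate
    "up_phase": {"up_phase": 0.2, "down_phase": 0.8},
    "down_phase": {"up_phase": 0.8, "down_phase": 0.2},
}
emit_p = {
    "up_phase": {"head pitch up": 0.9, "head pitch down": 0.1},
    "down_phase": {"head pitch up": 0.1, "head pitch down": 0.9},
}

nod = ["head pitch up", "head pitch down", "head pitch up"]
print(forward(nod, states, start_p, trans_p, emit_p))
```

An alternating sequence scores much higher under this model than a constant one, which is what lets one HMM per gesture discriminate between gestures.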

Mental State Inference


Given the set of facial and head gestures observed at a given time instant, the mental state inference system calculates the probability that the user is experiencing each of the supported mental states.
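This kind of inference can be sketched with Bayes' rule over a small set of mental states, assuming the gesture observations are conditionally independent given the state. The states and probability tables below are invented for illustration and are not taken from the cited system:

```python
# Hypothetical Bayesian inference over mental states from observed gestures.
# Priors and likelihood tables are illustrative placeholders.
STATES = ("agreeing", "disagreeing", "thinking")
PRIOR = {s: 1 / 3 for s in STATES}

# P(gesture observed | mental state)
LIKELIHOOD = {
    "agreeing":    {"nod": 0.80, "head shake": 0.05, "eyebrow raise": 0.3},
    "disagreeing": {"nod": 0.05, "head shake": 0.80, "eyebrow raise": 0.2},
    "thinking":    {"nod": 0.20, "head shake": 0.10, "eyebrow raise": 0.5},
}

def infer(observed_gestures):
    """Posterior P(state | gestures), gestures assumed conditionally independent."""
    post = {}
    for s in STATES:
        p = PRIOR[s]
        for g in observed_gestures:
            p *= LIKELIHOOD[s][g]
        post[s] = p
    z = sum(post.values())
    return {s: p / z for s, p in post.items()}

posterior = infer(["nod", "eyebrow raise"])
print(max(posterior, key=posterior.get))  # "agreeing"
```

The output is a full probability distribution rather than a single label, which matches the section's framing: the system reports the probability of each supported mental state, not just the most likely one.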
