Emotion Recognition through Body Gesture

Body gesture detection is considered one of the hardest ways to carry out emotion recognition. So why would we want to detect gestures when other data is so much easier to acquire? Imagine a system where security cameras can detect whether train conductors or security guards are paying attention on the job, or a learning system that can detect whether a student understands the material in class and provide feedback without the student having to ask. Gesture recognition can play a big part where other methods cannot. In practice, though, gesture data is usually combined with facial and speech data to give a full multimodal analysis.

Constructing a model from body sensors and cameras for analysis

There are many ways gestures can be interpreted. One commonly used and reliable technique is mapping the input model onto a 3D coordinate system, which can be done at many different levels of precision.
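
As a rough illustration of what that 3D mapping can look like in code, here is a minimal Python sketch that represents a tracked body as an array of 3D joint coordinates and normalises it so gestures can be compared across users and camera positions. The joint list, array shape, and normalisation scheme are illustrative assumptions, loosely modelled on a Kinect-style skeleton, not any specific product's API.

```python
import numpy as np

# Simplified, Kinect-style joint list; the names and order are assumptions
# made for this sketch.
JOINTS = ["head", "neck", "torso", "l_shoulder", "l_elbow", "l_hand",
          "r_shoulder", "r_elbow", "r_hand", "l_hip", "r_hip"]

def normalise_skeleton(joints_xyz: np.ndarray) -> np.ndarray:
    """Translate the skeleton so the torso is the origin and scale it
    by the neck-torso distance, removing body size and camera offset.

    joints_xyz: array of shape (num_joints, 3) with one XYZ row per joint.
    """
    torso = joints_xyz[JOINTS.index("torso")]
    neck = joints_xyz[JOINTS.index("neck")]
    centred = joints_xyz - torso                  # torso becomes the origin
    scale = np.linalg.norm(neck - torso)          # rough body-size estimate
    return centred / max(scale, 1e-6)             # guard against divide-by-zero

# Example: one frame of raw joint positions (fabricated values, in metres).
frame = np.random.default_rng(0).uniform(-1.0, 1.0, size=(len(JOINTS), 3))
print(normalise_skeleton(frame))
```

Once every frame is expressed in this normalised coordinate system, the same gesture performed by different people, at different distances from the camera, produces comparable numbers.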

 
Using an RGB camera and depth data

One popular example of this method is the Xbox Kinect system, which uses video cameras to observe the environment the user is in and track the user's movements within it. The data obtained is converted into numerical values (e.g. arousal level, valence) that can be compared against a database.
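
Below is a hedged sketch of how such a comparison might work: a gesture clip is reduced to coarse numerical features (movement energy as a stand-in for arousal, hand-torso distance as a rough valence proxy) and matched to the nearest entry in a small labelled database. The feature definitions, the joint indices (reusing the hand positions from the earlier skeleton sketch), and the database values are invented for illustration; this is not the Kinect's actual pipeline.

```python
import numpy as np

def gesture_features(frames: np.ndarray) -> np.ndarray:
    """frames: (T, num_joints, 3) normalised skeletons over T time steps.
    Returns a 2-vector of (arousal proxy, valence proxy)."""
    velocity = np.diff(frames, axis=0)                      # per-joint motion
    arousal = np.linalg.norm(velocity, axis=-1).mean()      # movement energy
    # Valence proxy: average distance of both hands (indices 5 and 8 in the
    # earlier JOINTS list) from the torso-centred origin, i.e. body openness.
    openness = np.linalg.norm(frames[:, [5, 8], :], axis=-1).mean()
    return np.array([arousal, openness])

# Toy database mapping (arousal, openness) prototypes to emotion labels.
# These values are fabricated for the example.
database = {
    "joy":     np.array([0.80, 0.90]),
    "sadness": np.array([0.10, 0.20]),
    "anger":   np.array([0.90, 0.40]),
}

def classify(frames: np.ndarray) -> str:
    """Return the label of the nearest database entry in feature space."""
    f = gesture_features(frames)
    return min(database, key=lambda label: np.linalg.norm(f - database[label]))

# Example: a 30-frame clip of an 11-joint skeleton (fabricated values).
clip = np.random.default_rng(1).uniform(-1.0, 1.0, size=(30, 11, 3))
print(classify(clip))
```

A real system would use richer features and a trained classifier rather than a three-entry lookup table, but the structure is the same: numerical descriptors extracted from motion, compared against labelled reference data.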

 
Difficulties and Accuracy

There are still many challenges in detecting emotion through gestures, both in accuracy and in ease of use, compared to more established methods such as facial expression and speech recognition.

 
