EmotionNet: Emotion Recognition Based on Multimodal Physiological Signals Through Deep Learning

Figure: Experiment design. Left: a participant watches affective videos on a screen while wearing the Muse headband and the Emotive band. Top right: diagram of the CNN algorithm. Bottom right: illustration of the resulting emotion estimates across 9 emotional states.

Abstract

The field of emotion recognition aims to automatically quantify human emotional states from behavioural and physiological data. Interesting applications of this area of study include the early detection and relief of strong negative emotions, and the diagnosis of neurological disorders that affect people's emotional well-being. Much of the current emotion recognition research aims to differentiate between 3 emotional states, positive, negative, and neutral, and further work is needed to classify more nuanced emotional states. In addition, there is growing interest in classifying emotions based on wearable devices that are available beyond laboratory settings. My research develops a convolutional neural network algorithm that estimates participants' ratings of 9 distinct emotions from EEG, EDA, and BVP data collected with commercial-grade devices. My algorithm produces regression estimates that fall within 0.30 of participants' self-reported emotion ratings on a 0-to-4 scale.
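To make the approach concrete, here is a minimal PyTorch sketch of a multimodal CNN regressor in the spirit of the abstract. This is not the paper's actual architecture: the class structure, channel counts (e.g., 4 EEG channels, as on a Muse headband), window length, and layer sizes are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class EmotionNet(nn.Module):
    """Sketch of a multimodal 1-D CNN that regresses 9 emotion ratings on a 0-4 scale.

    All hyperparameters here are assumptions for illustration, not the paper's values.
    """

    def __init__(self, eeg_channels=4, eda_channels=1, bvp_channels=1, n_emotions=9):
        super().__init__()
        # One small convolutional branch per physiological modality.
        self.eeg_branch = self._branch(eeg_channels)
        self.eda_branch = self._branch(eda_channels)
        self.bvp_branch = self._branch(bvp_channels)
        # Fused features -> 9 regression outputs.
        self.head = nn.Sequential(
            nn.Linear(3 * 64, 128),
            nn.ReLU(),
            nn.Linear(128, n_emotions),
        )

    @staticmethod
    def _branch(in_channels):
        return nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # pool over time -> (batch, 64, 1)
            nn.Flatten(),             # -> (batch, 64)
        )

    def forward(self, eeg, eda, bvp):
        feats = torch.cat(
            [self.eeg_branch(eeg), self.eda_branch(eda), self.bvp_branch(bvp)], dim=1
        )
        # Sigmoid scaled by 4 keeps predictions on the 0-4 rating scale.
        return 4.0 * torch.sigmoid(self.head(feats))

# Training step with an MSE regression loss against self-reported ratings
# (batch size, window length, and targets below are placeholder data):
model = EmotionNet()
eeg = torch.randn(8, 4, 1024)   # 8 windows, 4 EEG channels, 1024 samples
eda = torch.randn(8, 1, 1024)
bvp = torch.randn(8, 1, 1024)
targets = torch.rand(8, 9) * 4  # self-reported ratings in [0, 4]
loss = nn.MSELoss()(model(eeg, eda, bvp), targets)
loss.backward()
```

Treating the task as regression against continuous self-reported ratings, rather than 3-class classification, is what lets the model report an error margin (within 0.30 on a 0-to-4 scale) instead of an accuracy figure.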

See the EmotionNet research paper here and the GitHub repo here.