Abstract: In this paper, we present a novel approach to predicting facial expressions from physiological signals of the brain. The main contributions of this paper are twofold: a) investigating the predictability of facial micro-expressions from EEG, and b) identifying the features relevant to the prediction. To reach these objectives, we conducted an experiment and proceeded in three steps: i) we recorded the facial expressions and the corresponding EEG signals of participants while they looked at picture stimuli from the IAPS (International Affective Picture System); ii) we fed machine learning algorithms with time-domain and frequency-domain features of one-second EEG signals, along with the corresponding facial expression data as ground truth during the training phase; iii) using the trained classifiers, we predicted facial emotional reactions without the need for a camera. Our method yields very promising results, reaching high accuracy. It also provides an important additional result by locating which electrodes can be used to characterize specific emotions. This system will be particularly useful for evaluating emotional reactions in virtual reality environments, where the user wears a VR headset that hides the face and renders traditional webcam-based facial expression detectors obsolete.
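To make step ii) concrete, here is a minimal sketch of the kind of pipeline the abstract describes: time-domain and frequency-domain features extracted from one-second EEG windows, trained against facial-expression labels as ground truth. The sampling rate, electrode count, frequency bands, classifier choice, and the synthetic data below are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

FS = 128  # assumed sampling rate in Hz (not specified in the abstract)
# Assumed frequency bands; the paper may use a different decomposition.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def extract_features(window):
    """Time- and frequency-domain features for one 1-second EEG window.

    window: array of shape (n_channels, FS) -- one second of EEG.
    """
    # Time-domain statistics, computed per channel
    time_feats = np.concatenate([
        window.mean(axis=1),
        window.std(axis=1),
        np.abs(np.diff(window, axis=1)).mean(axis=1),  # mean absolute first difference
    ])
    # Frequency-domain: average band power per channel from Welch's PSD
    freqs, psd = welch(window, fs=FS, nperseg=FS, axis=1)
    band_feats = np.concatenate([
        psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1)
        for lo, hi in BANDS.values()
    ])
    return np.concatenate([time_feats, band_feats])

# Hypothetical data standing in for the recorded experiment:
# 200 one-second windows from 14 electrodes, with binary expression labels
rng = np.random.default_rng(0)
windows = rng.standard_normal((200, 14, FS))
labels = rng.integers(0, 2, size=200)  # e.g. smile vs. neutral, from the camera

X = np.array([extract_features(w) for w in windows])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, labels, cv=5).mean())

# Fitting the classifier also supports the paper's second contribution:
# feature importances map back to (feature type, electrode) pairs, which
# suggests which electrodes best characterize a given expression.
clf.fit(X, labels)
```

In a setup like this, ranking `clf.feature_importances_` by the electrode each feature came from is one plausible way to obtain the electrode-localization result the abstract mentions.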