Using Electroencephalograms to Interpret and Monitor the Emotions
Amin Shahab, Claude Frasson
Detecting human emotions in real time has recently become an important issue in Artificial Intelligence (AI). Numerous studies on emotional facial expressions, the effect of emotion on heart rate, eye movement, and the evolution of emotions over time show the interest in this topic. This paper presents a method for observing a person's emotional evolution (sequence of emotions) based on the activity of different parts of the brain. An Emotiv EPOC headset collects the participant's electroencephalogram (EEG) data, from which arousal and valence are calculated. After the system is trained on the headset output, two levels of filtering remove noise and brain signals unrelated to emotion. Finally, mapping the resulting arousal and valence onto the two-dimensional circumplex space model yields the participant's real-time emotional evolution. This real-time trace shows all the peaks of positive and negative feelings; moreover, analyzing the EEG data allows recognition of the general emotions, i.e., the participant's strongest habitual feelings. Comparing unexpected reactions, the time spent in the general emotion, and the feelings at the peaks provides a tool for observing a person's health condition and, on the other hand, an instrument for measuring people's mood in response to a video game, news, or advertising.
The final publication is available at Springer via https://doi.org/10.1007/978-3-319-67615-9_18.
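The mapping from arousal and valence to an emotion described in the abstract can be sketched as follows. This is an illustrative example, not the paper's implementation: the quadrant labels and the assumption that valence and arousal are normalized to [-1, 1] are ours, chosen to match the standard two-dimensional circumplex model.

```python
def circumplex_quadrant(valence: float, arousal: float) -> str:
    """Return a coarse emotion label for a (valence, arousal) pair.

    Assumes both values are normalized to [-1, 1]; the quadrant labels
    below are illustrative groupings from the circumplex model, not the
    specific categories used in the paper.
    """
    if arousal >= 0:
        # High arousal: pleasant -> excitement, unpleasant -> distress
        return "excited/happy" if valence >= 0 else "angry/afraid"
    # Low arousal: pleasant -> calm, unpleasant -> depression
    return "calm/content" if valence >= 0 else "sad/bored"

# Example: a high-arousal, positive-valence EEG-derived sample
print(circumplex_quadrant(0.6, 0.7))   # excited/happy
print(circumplex_quadrant(-0.4, -0.5)) # sad/bored
```

In a real-time setting, each filtered EEG window would produce one such (valence, arousal) point, and the sequence of labels over successive windows forms the emotional evolution the paper describes.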