Automatic emotion recognition is a challenging task because emotions can be expressed through many different channels. Data collected from real conversations are difficult to classify using a single modality or channel (e.g., facial images). This is why multimodal techniques have recently become more popular in automatic emotion recognition.
Olga Perepelkina, Chief Research Scientist at Neurodata Lab, will give a talk on the multimodal systems for automatic emotion recognition that the company develops. She will discuss body motion analysis in Affective Computing and the role that bodily behaviors play in emotional expression. She will also demonstrate the company's recently released demo tools for automatic emotion recognition, speech processing, and remote photoplethysmography (PPG) estimation.