Emotion Recognition

egoEMOTION: Egocentric Vision and Physiological Signals for Emotion and Personality Recognition in Real-World Tasks

A new multimodal dataset and architecture combining egocentric vision and physiological signals for in-the-wild emotion and personality recognition. NeurIPS 2025.

M. Jammot

On Multimodal Emotion Recognition for Human-Chatbot Interaction in the Wild

A systematic study of multimodal emotion recognition in natural human-chatbot interactions, evaluating fusion strategies across text, acoustic, and behavioral signals. ICMI 2024.

N. Kovacevic

Affective State Prediction from Smartphone Touch and Sensor Data in the Wild

Multimodal affective state prediction from smartphone touch and sensor data in naturalistic conditions, using deep learning fusion. CHI 2022.

Dr. Rafael Wampfler

Affective State Prediction Based on Semi-Supervised Learning from Smartphone Touch Data

Semi-supervised learning for affective state prediction from smartphone touch data, leveraging abundant unlabeled data collected in naturalistic conditions. CHI 2020.

Dr. Rafael Wampfler
Affective Computing & Emotion Recognition

Multimodal deep learning systems for affective state prediction from smartphone sensors, biometric data, egocentric vision, and physiological signals, enabling recognition in real-world conditions.