Multimodal Learning


egoEMOTION: Egocentric Vision and Physiological Signals for Emotion and Personality Recognition in Real-World Tasks

A new multimodal dataset and architecture combining egocentric vision and physiological signals for in-the-wild emotion and personality recognition, presented at NeurIPS 2025 …

m.-jammot

Affective State Prediction from Smartphone Touch and Sensor Data in the Wild

Multimodal affective state prediction from smartphone touch and sensor data in naturalistic conditions, using deep-learning fusion. Presented at CHI 2022.

Dr. Rafael Wampfler