

egoEMOTION: Egocentric Vision and Physiological Signals for Emotion and Personality Recognition in Real-World Tasks

A new multimodal dataset and architecture combining egocentric vision and physiological signals for in-the-wild emotion and personality recognition. Presented at NeurIPS 2025.


A Joint Personality-Emotion Framework for Personality-Consistent Conversational Agents

A joint framework modeling personality and emotion for personality-consistent conversational agents, using contrastive learning to decouple emotion from semantic content. IVA 2025.


On Multimodal Emotion Recognition for Human-Chatbot Interaction in the Wild

Systematic study of multimodal emotion recognition in natural human-chatbot interactions, evaluating text, acoustic, and behavioral signal fusion strategies. ICMI 2024.


Chatbots With Attitude: Enhancing Chatbot Interactions Through Dynamic Personality Infusions

Dynamic personality infusion for chatbots: modulating expressed Big Five personality traits at inference time to improve user engagement and interaction quality. CUI 2024.


The Personality Dimensions GPT-3 Expresses During Human-Chatbot Interactions

A large-scale empirical characterization of the personality dimensions GPT-3 expresses during human-chatbot interaction, using Big Five psychometrics. Published in ACM IMWUT 2024.


Personality Trait Recognition Based on Smartphone Typing Characteristics in the Wild

Personality trait recognition from everyday smartphone typing dynamics in naturalistic conditions, using deep learning on keystroke patterns. IEEE Transactions on Affective …


Affective State Prediction from Smartphone Touch and Sensor Data in the Wild

Multimodal affective state prediction from smartphone touch and sensor data in naturalistic conditions, using deep learning fusion. CHI 2022.

Dr. Rafael Wampfler