
Talking to Your Data: Exploring Embodied Conversation as an Interface for Personal Health Reflection

A system enabling users to "talk to their health data" through an embodied conversational agent using a dual-agent architecture, compared against traditional dashboards in a pilot …

N. Kovacevic

PhonemeNet: A Transformer Pipeline for Text-Driven Facial Animation

A transformer pipeline for text-driven facial animation exploiting phoneme-level speech structure, achieving real-time performance and best-in-class lip synchronization accuracy. …

P. Witzig

egoEMOTION: Egocentric Vision and Physiological Signals for Emotion and Personality Recognition in Real-World Tasks

A new multimodal dataset and architecture combining egocentric vision and physiological signals for in-the-wild emotion and personality recognition, presented at NeurIPS 2025 …

M. Jammot

Steering Narrative Agents through a Dynamic Cognitive Framework for Guided Emergent Storytelling

A dynamic cognitive framework for narrative agents in interactive storytelling, combining BDI representations with LLM generation to balance story coherence with player agency. …

C. Yang

BEE: Belief-Value-Aligned, Explainable, and Extensible Cognitive Framework for Conversational Agents

BEE is a modular cognitive framework for conversational agents featuring belief management, value alignment, transparent reasoning, and extensibility. Best Paper Honorable Mention …

C. Yang

A Joint Personality-Emotion Framework for Personality-Consistent Conversational Agents

A joint framework modeling personality and emotion for personality-consistent conversational agents, using contrastive learning to decouple emotion from semantic content. IVA 2025. …

N. Kovacevic

A Platform for Interactive AI Character Experiences

A full-pipeline platform for interactive AI character experiences, demonstrated through Digital Einstein and deployed at scientific conferences, technology events, and public …

Dr. Rafael Wampfler

Immersive Conversations with Digital Einstein: Linking a Physical System and AI

SIGGRAPH Asia 2024 Emerging Technologies demonstration describing the physical installation and AI integration of Digital Einstein at the Tokyo venue.

Dr. Rafael Wampfler

EmoSpaceTime: Decoupling Emotion and Content through Contrastive Learning for Expressive 3D Speech Animation

EmoSpaceTime decouples emotion and content in 3D speech animation through contrastive learning, enabling fine-grained control over emotional expressivity independent of spoken …

P. Witzig

On Multimodal Emotion Recognition for Human-Chatbot Interaction in the Wild

Systematic study of multimodal emotion recognition in natural human-chatbot interactions, evaluating text, acoustic, and behavioral signal fusion strategies. ICMI 2024.

N. Kovacevic