<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Embodied Conversational Agents | Dr. Rafael Wampfler</title><link>https://rafael-wampfler.github.io/tags/embodied-conversational-agents/</link><atom:link href="https://rafael-wampfler.github.io/tags/embodied-conversational-agents/index.xml" rel="self" type="application/rss+xml"/><description>Embodied Conversational Agents</description><generator>HugoBlox Kit (https://hugoblox.com)</generator><language>en-us</language><lastBuildDate>Mon, 23 Mar 2026 00:00:00 +0000</lastBuildDate><image><url>https://rafael-wampfler.github.io/media/icon_hu_d100f07c298b9e73.png</url><title>Embodied Conversational Agents</title><link>https://rafael-wampfler.github.io/tags/embodied-conversational-agents/</link></image><item><title>Digital Einstein</title><link>https://rafael-wampfler.github.io/projects/digital-einstein/</link><pubDate>Tue, 01 Jun 2021 00:00:00 +0000</pubDate><guid>https://rafael-wampfler.github.io/projects/digital-einstein/</guid><description>&lt;h2 id="overview"&gt;Overview&lt;/h2&gt;
&lt;p&gt;Digital Einstein is a flagship embodied conversational agent that brings the historical figure of Albert Einstein to life through real-time multimodal AI interaction. The system combines speech recognition and synthesis, facial animation, gesture control, and a cognitively grounded language understanding pipeline to deliver immersive, personality-consistent conversations.&lt;/p&gt;
&lt;p&gt;Digital Einstein serves three interconnected roles:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Research platform&lt;/strong&gt; — a testbed for studying human–agent interaction in constrained embodied settings, yielding insights on affective computing, personality modeling, dialog act classification, and conversational AI.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Education platform&lt;/strong&gt; — a live demonstration of conversational AI and multimodal deep learning deployed in university events, science outreach, and public engagement.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Public engagement tool&lt;/strong&gt; — reaching thousands of visitors globally at scientific conferences, tech summits, museums, and public events, generating sustained international recognition for ETH Zurich.&lt;/li&gt;
&lt;/ol&gt;
&lt;h2 id="motivation"&gt;Motivation&lt;/h2&gt;
&lt;p&gt;How can AI systems convincingly portray a well-known historical personality — someone whose knowledge, values, and speaking style are culturally established — in real-time dialogue with arbitrary members of the public? This challenge crystallizes core problems in interactive AI: maintaining factual and characterological consistency, adapting dynamically to unpredictable user inputs, and delivering a compelling embodied experience at scale.&lt;/p&gt;
&lt;p&gt;Digital Einstein was conceived as both a scientific challenge and a communication vehicle: making abstract advances in AI tangible for general audiences while simultaneously driving rigorous research on the underlying problems.&lt;/p&gt;
&lt;h2 id="approach"&gt;Approach&lt;/h2&gt;
&lt;p&gt;The system is built on a full-pipeline architecture described in the SIGGRAPH 2025 paper &lt;em&gt;&amp;ldquo;A Platform for Interactive AI Character Experiences&amp;rdquo;&lt;/em&gt;. Key components include:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Perception layer&lt;/strong&gt;: Real-time speech recognition via Microsoft Azure Speech Services and multimodal input processing through a webcam-based vision pipeline, including face detection, user characterization, head pose estimation, and re-identification.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Cognitive reasoning layer&lt;/strong&gt;: Knowledge-grounded dialogue management with integrated response generation, powered by GPT-4.1 mini, featuring dynamic personality infusion that adapts outputs to user-selectable archetypes: Digital Einstein, Rude Bulldozer, Drama Volcano, Zen Master, and Hashtag Prophet.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Animation synthesis&lt;/strong&gt;: Data-driven facial animation synchronized with speech output using NVIDIA Audio2Face, blended with emotion-conditioned expressions, and complemented by a curated library of motion-captured body animations categorized by avatar state.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Embodiment&lt;/strong&gt;: A stylized Albert Einstein avatar rendered in Unity on a 65-inch display, integrated into a themed early-20th-century physical environment with spatial audio, a hidden microphone, and physical personality sliders built from potentiometers and an Arduino.&lt;/li&gt;
&lt;/ul&gt;
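&lt;p&gt;The three software layers above form a simple real-time loop. The sketch below is purely illustrative: the function names and return shapes are placeholders, not the platform&amp;rsquo;s actual API.&lt;/p&gt;

```python
def perceive(audio: bytes) -> dict:
    # Stand-in for the perception layer (Azure ASR plus the webcam vision
    # pipeline): returns a transcript and a coarse user-state estimate.
    return {"text": "Why is the sky blue?", "emotion": "curious"}

def reason(percept: dict, persona: str) -> str:
    # Stand-in for the cognitive layer (LLM with dynamic personality
    # infusion): a persona- and affect-conditioned reply is generated here.
    return f"[{persona}] (user seems {percept['emotion']}) Light scatters..."

def animate(reply: str) -> dict:
    # Stand-in for animation synthesis (Audio2Face lip sync blended with
    # motion-captured body clips): maps the reply to playable animation data.
    return {"speech": reply, "frames": max(1, len(reply) // 4)}

# One turn of the interaction loop: perceive, reason, animate.
clip = animate(reason(perceive(b""), persona="Digital Einstein"))
```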
&lt;p&gt;The SIGGRAPH Asia 2024 demonstration paper &lt;em&gt;&amp;ldquo;Immersive Conversations with Digital Einstein: Linking a Physical System and AI&amp;rdquo;&lt;/em&gt; details the physical installation setup, including the integration of an animatronic head with the real-time AI pipeline at the Tokyo venue.&lt;/p&gt;
&lt;h2 id="key-results"&gt;Key Results&lt;/h2&gt;
&lt;p&gt;Digital Einstein has been demonstrated at over 20 major events worldwide, including:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;SIGGRAPH Asia 2024&lt;/strong&gt; (Tokyo, Japan) — Emerging Technologies&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;SIGGRAPH 2025&lt;/strong&gt; (Vancouver, Canada)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;GITEX Global 2024 &amp;amp; 2025&lt;/strong&gt; (Dubai, UAE) — Swiss Pavilion&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;World Economic Forum 2024 &amp;amp; 2026&lt;/strong&gt; (Davos, Switzerland)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Berlin Science Week 2025&lt;/strong&gt; (Berlin, Germany)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Swiss Re Resilience Summit 2024&lt;/strong&gt; (Rüschlikon, Switzerland)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Microsoft Initiative to Advance AI Diffusion in Switzerland 2025&lt;/strong&gt; (Berne, Switzerland)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;After the Algorithm Festival 2026&lt;/strong&gt; (Zurich, Switzerland)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;The project has generated sustained international media coverage and public interest, positioning ETH Zurich as a world leader in embodied conversational AI.&lt;/p&gt;
&lt;figure&gt;&lt;img src="https://rafael-wampfler.github.io/projects/digital-einstein/gitex.jpg"
alt="Swiss Ambassador to the UAE, Arthur Mattli, interacting with Digital Einstein at GITEX Global in Dubai."&gt;&lt;figcaption&gt;
&lt;p&gt;Swiss Ambassador to the UAE, Arthur Mattli, interacting with Digital Einstein at GITEX Global in Dubai.&lt;/p&gt;
&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;h2 id="learn-more"&gt;Learn More&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="primary-publications"&gt;Primary Publications&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;R. Wampfler&lt;/strong&gt;, C. Yang, D. Elste, N. Kovačević, P. Witzig and M. Gross (2025). &lt;em&gt;A Platform for Interactive AI Character Experiences&lt;/em&gt;. Proceedings of the SIGGRAPH Conference Papers &amp;rsquo;25 (Vancouver, Canada, August 10–14, 2025), pp. 1–11.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;R. Wampfler&lt;/strong&gt;, N. Kovačević, P. Witzig, C. Yang, M. Gross (2024). &lt;em&gt;Immersive Conversations with Digital Einstein: Linking a Physical System and AI&lt;/em&gt;. In SIGGRAPH Asia 2024 Emerging Technologies (SA &amp;rsquo;24) (Tokyo, Japan, December 3–6, 2024).&lt;/p&gt;
&lt;p&gt;Access to evidence-based psychotherapy remains severely limited worldwide — constrained by long waitlists, resource scarcity, and geographic disparities. The Virtual Psychotherapist project develops embodied conversational AI agents that complement clinical care by extending access, supporting therapeutic practice, and enabling scalable training.&lt;/p&gt;
&lt;p&gt;This initiative is conducted in close collaboration with the &lt;strong&gt;University of Lucerne&lt;/strong&gt;, providing clinical expertise and direct access to real therapy data.&lt;/p&gt;
&lt;h2 id="motivation"&gt;Motivation&lt;/h2&gt;
&lt;p&gt;Two distinct but complementary challenges motivate this project:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Scaling patient access&lt;/strong&gt;: Many individuals who would benefit from psychotherapy cannot access it due to cost, waitlists, or geography. AI-based companions that operate between sessions, provide continuous support, and conduct structured therapeutic conversations could meaningfully improve outcomes at scale.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Improving therapist training&lt;/strong&gt;: Training clinicians in evidence-based interventions requires repeated practice with feedback — but opportunities for safe, standardized training are limited by ethical and resource constraints. Simulated patient systems powered by AI can provide unlimited deliberate practice in controlled settings.&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;h2 id="system-architecture"&gt;System Architecture&lt;/h2&gt;
&lt;p&gt;Both applications are built on a shared platform combining:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Large language model-based dialogue&lt;/strong&gt;: State-of-the-art LLMs for contextually appropriate, therapeutically grounded response generation&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Retrieval-augmented generation (RAG)&lt;/strong&gt;: Responses grounded in evidence-based therapeutic literature, reducing hallucinations and ensuring adherence to clinical frameworks&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Real-time psychological analysis&lt;/strong&gt;: Parallel processing pipelines that extract facts, detect psychological flexibility processes, recognize emotions, and monitor safety in real time&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Embodied avatar presentation&lt;/strong&gt;: Synchronized speech synthesis and 3D avatar animation delivered through mobile and desktop applications&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Clinician oversight&lt;/strong&gt;: Structured analysis outputs accessible to supervising therapists, enabling human-in-the-loop clinical governance&lt;/li&gt;
&lt;/ul&gt;
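&lt;p&gt;To illustrate the retrieval-augmented generation step, the toy sketch below ranks a handful of therapy-manual-style passages against a user utterance using a bag-of-words similarity. The corpus sentences and scoring are illustrative only; the deployed system retrieves from actual evidence-based clinical literature with a learned sentence encoder.&lt;/p&gt;

```python
from collections import Counter
from math import sqrt

# Toy stand-in for the indexed therapeutic literature.
CORPUS = [
    "Cognitive defusion invites observing thoughts without acting on them",
    "Values clarification asks what matters most to the client",
    "Acceptance means allowing difficult feelings rather than fighting them",
]

def embed(text: str) -> Counter:
    # Bag-of-words "embedding"; a real system would use a sentence encoder.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, k: int = 1) -> list:
    # Return the k most similar passages; these would be prepended to the
    # LLM prompt to ground the generated reply.
    q = embed(query)
    return sorted(CORPUS, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

context = retrieve("I keep fighting my feelings instead of allowing them")
```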
&lt;h3 id="patient-facing-application"&gt;Patient-Facing Application&lt;/h3&gt;
&lt;p&gt;The patient-facing component enables individuals to conduct therapeutic conversations with an embodied avatar between sessions. The system follows &lt;strong&gt;process-based therapy&lt;/strong&gt; principles — particularly Acceptance and Commitment Therapy (ACT), an empirically supported approach targeting psychological flexibility through six core processes: acceptance, cognitive defusion, present-moment awareness, self-as-context, values clarification, and committed action.&lt;/p&gt;
&lt;p&gt;Critically, the system is designed for clinical supervision, not autonomous intervention. All session data is structured and accessible to the supervising therapist, who can monitor progress and intervene as needed.&lt;/p&gt;
&lt;p&gt;An evaluation against responses from professional psychotherapists demonstrated that the system&amp;rsquo;s responses were rated significantly higher on understanding, interpersonal effectiveness, collaboration, and ACT alignment — while emphasizing that clinical judgment and the therapeutic relationship remain irreplaceable.&lt;/p&gt;
&lt;h3 id="therapist-training-application"&gt;Therapist Training Application&lt;/h3&gt;
&lt;p&gt;The training application enables psychotherapists to practice and refine therapeutic techniques through role-play interactions with a simulated patient. Key features include:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Clinically grounded patient simulation&lt;/strong&gt;: Virtual patient behavior conditioned on profiles derived from real therapy sessions, covering a range of clinical presentations and scenarios (suicidality, resistance, heightened anxiety, therapeutic rupture, and more)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Real-time ACT fidelity feedback&lt;/strong&gt;: An automated evaluator assesses each therapist utterance for adherence to ACT principles, providing immediate visual feedback and the option to retry alternative responses&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Configurable scenarios&lt;/strong&gt;: Therapists can select specific clinical scenarios to target their practice&lt;/li&gt;
&lt;/ul&gt;
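&lt;p&gt;The automated evaluator is LLM-based, but its utterance-level scoring interface can be sketched with a crude rule-based stand-in. The word lists below are illustrative placeholders, not the clinical criteria used in the actual feedback model.&lt;/p&gt;

```python
# Illustrative markers only; the deployed evaluator is an LLM, not word lists.
ACT_CONSISTENT = {"notice", "willing", "values", "observe", "allow"}
ACT_INCONSISTENT = ("stop thinking", "just relax", "think positive")

def fidelity_score(utterance: str) -> int:
    """Crude -1/0/+1 ACT-adherence score for one therapist utterance."""
    text = utterance.lower()
    # Thought-suppression and reassurance phrases run counter to
    # acceptance and defusion, so they score negative.
    if any(phrase in text for phrase in ACT_INCONSISTENT):
        return -1
    # Acceptance- and defusion-flavored wording scores positive.
    tokens = set(text.replace("?", "").replace(".", "").split())
    if ACT_CONSISTENT.intersection(tokens):
        return 1
    return 0  # no clear ACT signal either way
```

In the training application, a score like this drives the immediate visual feedback after each turn, with the option to retry the response.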
&lt;p&gt;A systematic evaluation across 49 therapy transcripts identified GPT-4o-mini as the optimal feedback model, achieving the closest alignment with human supervisor ACT fidelity ratings.&lt;/p&gt;
&lt;h2 id="safety-and-ethics"&gt;Safety and Ethics&lt;/h2&gt;
&lt;p&gt;Safety is a primary design constraint. The system includes:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Crisis detection&lt;/strong&gt;: Explicit classification of suicidal ideation and self-harm signals, triggering immediate presentation of crisis resources and mandatory clinician review&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Unsafe-interaction detection&lt;/strong&gt;: Identification of conditions (e.g., active psychosis, mania) where LLM interaction may be counterproductive, with protocol-defined fallback responses&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Non-autonomous design&lt;/strong&gt;: The system is explicitly positioned as a complement to clinical care, not a replacement — structured to require and facilitate clinician oversight&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="key-results"&gt;Key Results&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Patient-facing system responses rated significantly higher than human therapist responses by automated evaluation and expert psychotherapists across understanding, collaboration, and ACT alignment&lt;/li&gt;
&lt;li&gt;Therapist training simulation rated as realistic by practicing psychologists; turn-by-turn feedback shown to increase therapist awareness of intervention choices&lt;/li&gt;
&lt;li&gt;Automated ACT fidelity assessment achieves strong agreement with human expert ratings across 49 therapy transcripts&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="research-partners"&gt;Research Partners&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;University of Lucerne&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;</description></item><item><title>RehaBot</title><link>https://rafael-wampfler.github.io/projects/rehabot/</link><pubDate>Mon, 01 Jan 2024 00:00:00 +0000</pubDate><guid>https://rafael-wampfler.github.io/projects/rehabot/</guid><description>&lt;h2 id="overview"&gt;Overview&lt;/h2&gt;
&lt;p&gt;RehaBot is an embodied conversational agent designed to support patients in rehabilitation and home-care settings. The avatar represents a medical professional — capable of conducting structured patient interactions, administering assessments, and delivering health education — to help bridge the gap between in-clinic care and independent recovery at home.&lt;/p&gt;
&lt;p&gt;This project is developed in collaboration with &lt;strong&gt;Inselspital Bern&lt;/strong&gt; (University Hospital) and &lt;strong&gt;Bern University of Applied Sciences (BFH)&lt;/strong&gt;.&lt;/p&gt;
&lt;h2 id="motivation"&gt;Motivation&lt;/h2&gt;
&lt;p&gt;Rehabilitation after medical treatment — whether from stroke, orthopedic surgery, cardiac events, or chronic disease — requires sustained patient engagement over weeks or months. Yet contact with healthcare professionals is necessarily episodic, leaving long gaps during which patients must self-manage. Lack of guidance, motivation, and timely feedback during these intervals is a major driver of poor rehabilitation outcomes and preventable hospital readmissions.&lt;/p&gt;
&lt;p&gt;An embodied conversational agent that patients can interact with at home — to receive reminders, answer questions, conduct structured assessments, and provide health education — addresses this gap directly. By combining medical knowledge with empathetic communication and a human-like embodied presence, RehaBot aims to make professional-quality support continuously available between clinical appointments.&lt;/p&gt;
&lt;h2 id="approach"&gt;Approach&lt;/h2&gt;
&lt;p&gt;RehaBot integrates several complementary AI capabilities within a unified embodied avatar system:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Medical knowledge integration&lt;/strong&gt;: Structured clinical knowledge relevant to the patient&amp;rsquo;s rehabilitation pathway, enabling accurate and safe responses to health questions&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Conversational assessment&lt;/strong&gt;: The ability to administer structured health questionnaires and functional assessments through natural spoken dialogue, adapting pacing and clarification to individual patient needs&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Empathetic communication&lt;/strong&gt;: Affective modeling that allows the agent to detect and respond to emotional signals in patient speech — frustration, discouragement, anxiety — with appropriate supportive responses&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Health education&lt;/strong&gt;: Accessible explanations of rehabilitation exercises, medication adherence, warning signs, and self-management strategies, adapted to the patient&amp;rsquo;s comprehension level&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Patient-professional interface&lt;/strong&gt;: Structured summaries of patient interactions accessible to supervising clinicians, supporting continuity of care and early detection of clinical deterioration&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;The system is built on the same core platform as the Digital Einstein project, enabling rapid deployment of new capabilities while maintaining consistent embodied presentation quality.&lt;/p&gt;
&lt;h2 id="key-results"&gt;Key Results&lt;/h2&gt;
&lt;p&gt;Recent work exploring embodied conversational interfaces for personal health data reflection demonstrates that users who engage with health information through a conversational agent formulate significantly more specific and actionable health plans compared to traditional dashboard-based exploration. Embodied conversation lowers the cognitive burden of interpreting health data and supports a shift from passive data inspection to active health sensemaking.&lt;/p&gt;
&lt;h2 id="research-partners"&gt;Research Partners&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Inselspital Bern&lt;/strong&gt; (University Hospital of Bern)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Bern University of Applied Sciences (BFH)&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;</description></item><item><title>MIND — Cognitive Health Monitoring</title><link>https://rafael-wampfler.github.io/projects/mind/</link><pubDate>Wed, 01 Jan 2025 00:00:00 +0000</pubDate><guid>https://rafael-wampfler.github.io/projects/mind/</guid><description>&lt;h2 id="overview"&gt;Overview&lt;/h2&gt;
&lt;p&gt;The MIND (Monitoring and Supporting Cognitive Health) project creates a mobile platform for the early detection of cognitive impairment in aging populations. Early identification of decline — before it significantly impacts daily life — enables timely intervention, supports independent living, and improves long-term outcomes for older adults and their families.&lt;/p&gt;
&lt;p&gt;MIND is part of the &lt;strong&gt;Future Health Technologies 2 (FHT2)&lt;/strong&gt; initiative, a major research program funded by the &lt;strong&gt;National Research Foundation Singapore (NRF) CREATE&lt;/strong&gt; grant.&lt;/p&gt;
&lt;h2 id="motivation"&gt;Motivation&lt;/h2&gt;
&lt;p&gt;Cognitive decline — from mild cognitive impairment to dementia — affects hundreds of millions of people worldwide. Current clinical detection relies predominantly on periodic in-clinic assessments that are episodic, costly, and often catch decline only after substantial impairment has occurred. A continuously operating, passive monitoring system that can detect subtle early warning signs in everyday behavior could transform the standard of care.&lt;/p&gt;
&lt;p&gt;Two key insights motivate the MIND approach:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Movement and navigation encode cognition&lt;/strong&gt;: Changes in how people navigate their environment — route choices, wayfinding strategies, spatial memory — are among the earliest and most sensitive indicators of cognitive decline. Analyzing GPS and sensor traces at scale can reveal these changes.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Language reflects cognitive health&lt;/strong&gt;: Subtle changes in vocabulary, sentence complexity, topic management, and response latency in everyday conversation are measurable markers of cognitive change, detectable through automated language analysis.&lt;/li&gt;
&lt;/ol&gt;
&lt;h2 id="approach"&gt;Approach&lt;/h2&gt;
&lt;p&gt;MIND integrates three complementary AI systems into a unified mobile platform:&lt;/p&gt;
&lt;h3 id="large-geospatial-models-lgms"&gt;Large Geospatial Models (LGMs)&lt;/h3&gt;
&lt;p&gt;Large Geospatial Models analyze longitudinal GPS and mobility traces to detect patterns indicative of cognitive change. These include:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Route repetitiveness&lt;/strong&gt;: Increasing restriction of daily movement range&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Navigation errors&lt;/strong&gt;: Unusual detours or disorientation in familiar environments&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Mobility diversity&lt;/strong&gt;: Changes in the variety of visited locations over time&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;LGMs are trained on large-scale mobility datasets and fine-tuned to identify individual-level deviations from baseline behavior.&lt;/p&gt;
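&lt;p&gt;As a minimal sketch of the mobility-diversity marker, the snippet below computes the Shannon entropy of visited-place frequencies from a (hypothetical) sequence of labeled location visits; a falling entropy relative to the individual&amp;rsquo;s baseline would indicate a shrinking activity space.&lt;/p&gt;

```python
from collections import Counter
from math import log2

def mobility_diversity(visits: list) -> float:
    """Shannon entropy (bits) over visited-place frequencies.

    Lower values over time suggest a shrinking activity range,
    one of the mobility markers listed above.
    """
    counts = Counter(visits)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Hypothetical place-labeled GPS traces for two observation windows.
baseline = mobility_diversity(["home", "park", "shop", "cafe", "home", "gym"])
later = mobility_diversity(["home", "home", "home", "shop", "home", "home"])
```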
&lt;h3 id="llms-for-conversational-cognitive-assessment"&gt;LLMs for Conversational Cognitive Assessment&lt;/h3&gt;
&lt;p&gt;Large language models analyze conversation transcripts from daily interactions with the MIND app, detecting cognitive markers such as:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Reduced lexical diversity and increased use of filler words&lt;/li&gt;
&lt;li&gt;Difficulty with topic maintenance and coherence&lt;/li&gt;
&lt;li&gt;Slowed response generation and increased hesitation&lt;/li&gt;
&lt;li&gt;Confusion or confabulation in response to everyday questions&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;These markers are tracked longitudinally to identify meaningful change relative to the individual&amp;rsquo;s own baseline.&lt;/p&gt;
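&lt;p&gt;Two of the simplest markers, lexical diversity and filler-word rate, can be sketched directly from a transcript; the filler list and thresholds below are illustrative, and the actual pipeline uses LLM-based analysis over far richer features.&lt;/p&gt;

```python
def lexical_markers(transcript: str) -> dict:
    """Type-token ratio and filler rate for one transcript.

    Both are tracked against the individual's own baseline rather
    than compared across people.
    """
    tokens = transcript.lower().split()
    # Illustrative filler set; a real system would use a curated lexicon.
    fillers = sum(1 for t in tokens if t in {"um", "uh", "er", "hmm"})
    return {
        "type_token_ratio": len(set(tokens)) / len(tokens),  # lexical diversity
        "filler_rate": fillers / len(tokens),
    }

m = lexical_markers("um I uh went to the the place um you know the place")
```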
&lt;h3 id="embodied-avatars-for-guided-assessment-and-training"&gt;Embodied Avatars for Guided Assessment and Training&lt;/h3&gt;
&lt;p&gt;An embodied conversational avatar conducts structured cognitive assessments — such as memory recall tasks, verbal fluency tests, and orientation questions — through natural spoken dialogue. The avatar also guides cognitive training exercises designed to maintain cognitive reserve.&lt;/p&gt;
&lt;p&gt;The embodied format is critical: older adults are more likely to engage regularly with an interactive, socially present agent than with a text-based questionnaire or passive sensor. The avatar adapts its communication style to individual users, adjusting vocabulary, pacing, and support level to ensure accessibility.&lt;/p&gt;
&lt;h3 id="integration-and-privacy"&gt;Integration and Privacy&lt;/h3&gt;
&lt;p&gt;All three systems operate on a shared mobile platform with strong privacy protections. Data is analyzed on-device where possible, and users maintain full control over what is shared with care providers. The platform produces structured reports for geriatricians and primary care physicians, enabling early clinical intervention.&lt;/p&gt;
&lt;h2 id="significance"&gt;Significance&lt;/h2&gt;
&lt;p&gt;MIND represents a convergence of several research threads — geospatial AI, conversational AI, affective computing, and embodied interaction — applied to one of the most pressing health challenges of our time. By enabling early, passive, continuous monitoring, the platform aims to support successful aging in place and to delay or prevent the transition to care dependency.&lt;/p&gt;
&lt;h2 id="research-partners-and-funding"&gt;Research Partners and Funding&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;National Research Foundation Singapore (NRF) CREATE&lt;/strong&gt; — Future Health Technologies 2 (FHT2)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Bond University (Australia)&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;Collaboration with life sciences and medicine partners in Singapore&lt;/li&gt;
&lt;/ul&gt;</description></item><item><title>Talking to Your Data: Exploring Embodied Conversation as an Interface for Personal Health Reflection</title><link>https://rafael-wampfler.github.io/publications/talking-to-your-data-2026/</link><pubDate>Mon, 23 Mar 2026 00:00:00 +0000</pubDate><guid>https://rafael-wampfler.github.io/publications/talking-to-your-data-2026/</guid><description/></item><item><title>A Platform for Interactive AI Character Experiences</title><link>https://rafael-wampfler.github.io/publications/platform-interactive-ai-2025/</link><pubDate>Sun, 10 Aug 2025 00:00:00 +0000</pubDate><guid>https://rafael-wampfler.github.io/publications/platform-interactive-ai-2025/</guid><description/></item><item><title>Immersive Conversations with Digital Einstein: Linking a Physical System and AI</title><link>https://rafael-wampfler.github.io/publications/digital-einstein-siggraph-asia-2024/</link><pubDate>Tue, 03 Dec 2024 00:00:00 +0000</pubDate><guid>https://rafael-wampfler.github.io/publications/digital-einstein-siggraph-asia-2024/</guid><description/></item></channel></rss>