Research Initiative

EASI-Health: Embodied Artificial Social Intelligence as a Universal Interface in Health Communication

EASI-Health is a research project combining embodied conversational AI and communication neuroscience to develop agents that can serve as a universal interface for health communication across bedside care, urgent care, and mental health care.

Authors: Ralf Schmälzle, Gary Bente, Sri Kalyanaraman (Michigan State University) & James Curran (Haptix Studio / Great Lakes Reality Labs)

Summary & Key Goals

Summary

EASI-Health develops believable, adaptable embodied AI agents that act as supportive “early responders” in sensitive health interactions.

  • Embodied conversational AI
  • Communication neuroscience
  • Multi-domain deployment

Key Goals

  • Develop socially aware agents that complement human professionals.
  • Integrate nonverbal skills with advanced language models to “read the room.”
  • Validate via behavioral and neurophysiological measures.
  • Create a flexible platform for basic and applied research.

Core Challenge

LLM-based AI excels verbally but lacks the nonverbal and relational skills needed to build genuine trust and empathy.

The Social Brain & Health Communication

Effective health outcomes rely on successful communication, from understanding instructions to building therapeutic trust.

The human brain has dedicated systems for social signals — faces, voice tone, gestures, and body language — which are primary data for empathy, trust, and understanding.

Key Brain Regions

  • Medial Prefrontal Cortex (mPFC)
  • Temporoparietal Junction (TPJ)
  • Superior Temporal Sulcus (STS)
  • Amygdala

Critical Questions

  • Is the message understood?
  • Does it motivate behavior change?
  • Is the patient anxious or distrustful?
  • Is there rapport?

Methodology

1) Build: Engineering Socially Aware Agents

  • High-fidelity embodied avatars
  • Customized Language Models (CLMs): domain-specific, secure, locally hosted
  • Generative Body Language (GBL): nonverbal repertoire derived from real interactions
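
The build stage above can be sketched as a minimal pipeline that pairs a language model's reply with a nonverbal behavior drawn from a tagged repertoire. All names, the toy intent tagger, and the repertoire below are illustrative assumptions, not project code; a real system would use the CLM itself for intent classification and a GBL module trained on real interactions.

```python
import random

# Illustrative repertoire: nonverbal cues tagged by communicative intent.
NONVERBAL_REPERTOIRE = {
    "reassure": ["slow nod", "open palms", "soft smile"],
    "inquire":  ["head tilt", "raised brows", "forward lean"],
    "neutral":  ["resting pose", "sustained eye contact"],
}

def classify_intent(reply: str) -> str:
    """Toy intent tagger (a real system would use the CLM itself)."""
    if reply.rstrip().endswith("?"):
        return "inquire"
    if any(w in reply.lower() for w in ("don't worry", "normal", "okay")):
        return "reassure"
    return "neutral"

def embody(reply: str, rng: random.Random) -> dict:
    """Pair the verbal reply with a matching nonverbal behavior."""
    intent = classify_intent(reply)
    gesture = rng.choice(NONVERBAL_REPERTOIRE[intent])
    return {"speech": reply, "intent": intent, "gesture": gesture}

rng = random.Random(0)
print(embody("How is your pain today?", rng))
```

The point of the sketch is the coupling: every verbal act is emitted together with a nonverbal one, so embodiment is a first-class output rather than an afterthought.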

2) Test: Real-World Health Scenarios

  • Post-operative recovery coaching
  • Empathetic health motivation & counseling
  • Patient intake & information gathering
  • Mental health screening & support

3) Validate: Measuring “Under the Skin”

  • Psychophysiology: Heart Rate (HR), Electrodermal Activity (EDA)
  • Neurophysiology: EEG and fMRI; future: fNIRS
  • Behavior analysis: facial/eye-tracking, nonverbal & conversation analytics
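
Two of the simplest indices from the measures above can be computed directly from raw signals. The sketch below is illustrative only (not project code): mean heart rate from the intervals between detected R-peaks, and skin-conductance responses (SCRs) counted as rises in the EDA trace exceeding the conventional 0.05 microsiemens amplitude threshold.

```python
def heart_rate_bpm(r_peak_times_s):
    """Mean heart rate (BPM) from R-peak timestamps in seconds."""
    intervals = [b - a for a, b in zip(r_peak_times_s, r_peak_times_s[1:])]
    mean_rr = sum(intervals) / len(intervals)  # mean inter-beat interval
    return 60.0 / mean_rr

def count_scrs(eda_uS, threshold=0.05):
    """Count rises in an EDA trace larger than `threshold` microsiemens."""
    count, trough = 0, eda_uS[0]
    for x in eda_uS[1:]:
        if x < trough:
            trough = x              # track the running minimum
        elif x - trough >= threshold:
            count += 1              # rise from trough exceeds threshold
            trough = x              # reset after a counted response
    return count

print(heart_rate_bpm([0.0, 0.8, 1.6, 2.4]))  # 0.8 s intervals -> ~75 BPM
print(count_scrs([0.30, 0.30, 0.37, 0.36, 0.35, 0.42]))
```

Real pipelines add artifact rejection, filtering, and peak detection; the sketch only shows the final index computation.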

Our Research Streams

Embodied Agents for Coaching

Agents developed and tested in health & wellness contexts.

Neuroscience of Message Processing

fMRI, EEG, and psychophysiology to understand how health messages are received and predict outcomes.

Human–AI Communication Dynamics

How humans perceive, trust, and build rapport with AI counterparts.

Collaborative Ecosystem: University & Industry

Center for Avatar Research & Immersive Social Media Applications

Role: Research on human-to-human and human–AI interaction in VR/AR.

Facilities: Professional motion capture, high-fidelity 3D character creation, VR/XR dev in Unreal/Unity.

Neuroscience of Messages Lab

Role: Communication neuroscience research using real-life interpersonal and mediated messages.

Facilities: Multi-channel mobile EEG, comprehensive psychophysiology, neural responses to media.

Great Lakes Reality Labs & Haptix Studio

Role: Technology development, digital asset creation, implementation.

Broader Impact

Basic Science

Investigate fundamental mechanisms of communication, trust, and message understanding; frontier research in social AI.

Clinical Practice

Pathways to scalable tools that reduce staff burnout, improve patient education, and provide empathetic first contact.

University–Hospital Partnership

Synergy across clinical needs, communication science, neuroscience, and AI development for patient care.

Future Directions

Rapid Prototyping Platform

Flexible system to quickly build and test AI-driven health interventions.

Deeper Neuro-Validation

fMRI, EEG, and fNIRS to investigate the social brain’s response to AI agents in health situations.

Adaptive Interactions

Closed-loop systems where agents adapt verbal and nonverbal behavior in real time to user state (e.g., distress).
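
One turn of such a closed loop can be sketched as sense, estimate, adapt: a running distress estimate (e.g., derived from EDA and heart rate) is mapped onto the agent's verbal and nonverbal style parameters. The mapping, parameter names, and numeric ranges below are hypothetical, purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class AgentStyle:
    speech_rate: float        # words per second
    warmth: float             # 0 (neutral) .. 1 (maximally reassuring)
    gesture_amplitude: float  # 0 (still) .. 1 (expansive)

def adapt_style(distress: float) -> AgentStyle:
    """Map a distress estimate in [0, 1] to agent behavior settings."""
    d = min(max(distress, 0.0), 1.0)  # clamp to valid range
    return AgentStyle(
        speech_rate=2.5 - 1.0 * d,        # slow down for a distressed user
        warmth=0.4 + 0.6 * d,             # become more reassuring
        gesture_amplitude=0.8 - 0.6 * d,  # reduce expansive movement
    )

# One pass of the loop for a calm vs. a distressed user state:
for distress in (0.1, 0.9):
    print(distress, adapt_style(distress))
```

In a deployed system the loop would run continuously, with the distress estimator itself validated against the psychophysiological measures described above.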
