Sara Khalilipicha recently presented our latest work at the 19th EAI International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth 2025) held in Eindhoven, the Netherlands, October 15–17, 2025.
Her talk, “Designing VR for Chronic Pain: Transforming AI-Powered Biofeedback to Interoceptive Awareness and Neuromodulation,” showcased how the Pain Studies Lab is rethinking VR for chronic pain beyond simple distraction.
Instead of using VR just to “take patients’ minds off the pain,” the project builds on our long-term work with the Virtual Meditative Walk (VMW): a foggy forest VR environment designed for mindfulness-based stress reduction. In VMW, patients’ physiological signals drive changes in the environment. As stress rises, fog thickens; as they relax and regulate their breathing, the fog begins to clear and ambient sounds become more present, supporting interoceptive awareness and a sense of agency over their own nervous system.
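To make the adaptation loop concrete, here is a minimal sketch of the kind of mapping it implies: a normalized stress estimate gradually pulling the rendered fog density up or down. This is purely illustrative; the function name, parameters, and values below are assumptions for this post, not the VMW implementation.

```python
# Illustrative sketch (not the VMW implementation): mapping a normalized
# stress estimate to a fog density value a VR engine could consume.
# All names and parameter values are assumptions for illustration.

def update_fog_density(stress_level: float,
                       current_density: float,
                       smoothing: float = 0.05) -> float:
    """Nudge fog density toward the current stress level.

    stress_level    -- estimated stress in [0, 1] from biofeedback signals
    current_density -- fog density currently rendered, in [0, 1]
    smoothing       -- fraction of the gap closed per update, so changes stay gradual
    """
    target = max(0.0, min(1.0, stress_level))
    return current_density + smoothing * (target - current_density)


# Example: as the user relaxes and the stress estimate falls,
# the fog gradually clears over successive updates.
density = 0.8
for stress in [0.7, 0.5, 0.3, 0.2]:
    density = update_fog_density(stress, density)
```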
In this new work, Sara presented an AI-powered biofeedback pipeline that uses Electrodermal Activity (EDA) and Heart Rate Variability (HRV) to detect stress and emotional states far more accurately than our earlier rule-based method in VMW. Trained on the WESAD dataset, the Extra Trees classifier achieved about 98.7% accuracy for multiclass emotion detection and 99.3% for binary stress detection, compared with roughly 76% for the original slope-based biofeedback method. This means the system can now adapt the VR environment much more precisely to users’ changing psychophysiological states, tightening the link between biofeedback, interoceptive awareness, and neuromodulation in chronic pain interventions.
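For readers curious what an Extra Trees setup of this kind looks like in practice, the sketch below shows the general shape of such a classifier using scikit-learn, assuming EDA/HRV features have already been windowed and extracted from WESAD into a feature matrix and labels. The placeholder data and parameter choices are assumptions for illustration, not the exact pipeline from the paper.

```python
# Minimal sketch of an Extra Trees stress classifier, assuming features
# have already been extracted from WESAD's EDA and HRV channels.
# Placeholder data stands in for the real feature matrix and labels.
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 12))    # e.g. 12 EDA/HRV features per time window
y = rng.integers(0, 2, size=1000)  # binary stress / no-stress labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

clf = ExtraTreesClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```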
The work is co-authored with Diane Gromala, Armin Froozanfar, Chris Shaw, Patti Derbyshire, and Efe Erhan, and continues the Pain Studies Lab’s research agenda on immersive, AI-supported interventions for chronic pain.
You can watch the recording of Sara’s PervasiveHealth presentation here:
Video of the presentation