Is Physical Simulation the Future of Healthcare?

From using digital doubles for physical therapy, to virtual reality practitioners and support dogs—interactive character simulation will improve wellness and enrich medicine. We built some prototypes to explore these applications.
Back pain is endemic to modern life. According to the American Chiropractic Association, it is one of the primary reasons for missed work, costs Americans at least $50 billion a year, and is the single leading cause of disability worldwide[1].

And while the causes of back pain are broad, most cases are mechanical rather than the result of a serious underlying condition[2]. The DeepMotion team knows this well: as a group of technologists, we spend long days hunched over our computers. With a large share of our team suffering from chronic back pain, we began thinking about how our work on physical simulation could be leveraged to build virtual reality and simulation tools for health, therapy, and wellness. For us, prototyping meaningful solutions is essential to gaining insight and developing better software for AI-driven simulation.

Character Simulation For Education

Our first prototype to address posture and physical stress was an educational tool for visualizing skeletal pressure. Using articulated physics, we created a biomechanically modeled character: we gave a traditional 3D character a simulated body complete with joints, musculature, skeletal density, and more, using math that describes the physical properties of a human body. When the character moves, it does so via muscle actuation, similar to real-world human locomotion.
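The idea of driving a joint with simulated "muscles" rather than directly posing it can be sketched with a proportional-derivative (PD) controller, a common stand-in for muscle actuation in physics-based animation. This is an illustrative toy, not DeepMotion's actual solver; the gains, the unit inertia, and the function names are all assumptions.

```python
def muscle_torque(target_angle, angle, velocity, kp=120.0, kd=8.0):
    """PD torque pulling a joint toward a target pose, muscle-style."""
    return kp * (target_angle - angle) - kd * velocity

def step(angle, velocity, target_angle, dt=1.0 / 60.0):
    """One semi-implicit Euler step for a single hinge joint (unit inertia)."""
    torque = muscle_torque(target_angle, angle, velocity)
    velocity += torque * dt          # acceleration = torque / inertia, inertia = 1
    angle += velocity * dt
    return angle, velocity

angle, velocity = 0.0, 0.0
for _ in range(600):                 # ten simulated seconds at 60 Hz
    angle, velocity = step(angle, velocity, target_angle=0.5)
# the joint settles near the 0.5 rad target
```

Because the pose emerges from torques rather than keyframes, the same joint will react physically when the environment pushes back, which is what makes this style of character "biomechanical" rather than purely animated.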


Digital characters that are physically responsive to their environment can be used for exciting new interactive experiences. One such application we explored was using a “digital double” to gain insight into healthy motion. To do this, we wrapped the digital character in a visualizer that would measure and report skeletal pressure as the character moved. (You can learn how to create a physically simulated character here.)
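To give a feel for what such a visualizer computes, here is a minimal sketch that estimates the static compressive load on a few joints as the weight of the body mass each one supports, then maps the load to a color "heat" value. The joint names, mass fractions, and normalization constant are assumed for illustration; a real tool would read these quantities out of the running simulation.

```python
G = 9.81  # gravitational acceleration, m/s^2

# Assumed fraction of total body mass supported above each joint when upright.
SUPPORTED_MASS_FRACTION = {
    "neck": 0.08,         # head
    "upper_spine": 0.25,  # head, arms, upper torso
    "lower_spine": 0.55,  # everything above the lumbar region
    "hip": 0.65,
    "knee": 0.85,
}

def joint_loads(body_mass_kg):
    """Static compressive load in newtons at each joint for an upright posture."""
    return {j: f * body_mass_kg * G for j, f in SUPPORTED_MASS_FRACTION.items()}

def load_color(load_n, max_n=800.0):
    """Map a load to a 0..1 'heat' value for visualization."""
    return min(load_n / max_n, 1.0)

for joint, load in joint_loads(70.0).items():
    print(f"{joint:12s} {load:6.1f} N  heat={load_color(load):.2f}")
```

The real value of running this inside a simulation is dynamic: bending, twisting, and lifting shift those supported fractions frame by frame, which is exactly what a static diagram in a textbook cannot show.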


This tool was built to evaluate motion and stress in a dynamic, non-invasive way. Physical trainers can gain insight into how different motions distribute pressure and can develop new exercises in simulation without risking personal injury on unproven methods.

Physical Therapy in VR

It’s worth noting that simulation has long been a heuristic tool for medicine and health. At its core, simulation is a structure for representing cause, effect, and the scope of possibility, and simulated medical experiences are often an essential intermediary between education and practice (both for patients and healthcare professionals). From autopsy simulation, to patient care simulation, to orthodontic simulation—health professionals use simulation to illustrate concepts to their patients, improve training and educational curricula, and save money on raw materials. It is no surprise, then, that extended reality mediums—which offer more integrated, immersive simulated experiences—are being used to develop a new wave of health and medical applications.


To make a more personalized skeletal pressure visualizer, we created a full-body VR Avatar using one of our physically simulated characters. By configuring the biomechanically modeled biped to our six-point tracking system, we were able to represent the user’s body with fidelity to physical reality. Now, when users move, we can see in real time the impact of their movements and posture on their joints. Tools like our VR skeletal pressure visualizer can help patients and doctors alike learn about sustainable posture, the effect of weight distribution, and how to implement simple physical changes to alleviate acute pressure.
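A typical six-point rig pairs the headset, two hand controllers, and three body trackers with the avatar's end effectors, which a full-body solver then uses as targets each frame. The sketch below shows only that pairing step; the tracker names, the effector names, and the flat position tuples are all assumptions for illustration, and a real system would carry orientations and run inverse kinematics on the result.

```python
# Assumed mapping from a six-point tracking rig to avatar end effectors.
TRACKER_TO_EFFECTOR = {
    "headset": "head",
    "left_controller": "left_hand",
    "right_controller": "right_hand",
    "hip_tracker": "pelvis",
    "left_foot_tracker": "left_foot",
    "right_foot_tracker": "right_foot",
}

def ik_targets(tracker_poses):
    """Translate raw tracker poses into per-effector targets for a full-body solver."""
    return {TRACKER_TO_EFFECTOR[name]: pose
            for name, pose in tracker_poses.items()
            if name in TRACKER_TO_EFFECTOR}

# Example frame: each pose is an (x, y, z) position in meters; orientation omitted.
frame = {
    "headset": (0.0, 1.7, 0.0),
    "left_controller": (-0.3, 1.2, 0.2),
    "right_controller": (0.3, 1.2, 0.2),
    "hip_tracker": (0.0, 1.0, 0.0),
    "left_foot_tracker": (-0.1, 0.05, 0.0),
    "right_foot_tracker": (0.1, 0.05, 0.0),
}
targets = ik_targets(frame)  # fed to the full-body solver every frame
```

Because the simulated skeleton in between the six targets obeys the same biomechanical model described above, the in-between joints (spine, elbows, knees) take on physically plausible poses rather than interpolated ones.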

Interactive Characters for Psychological Wellness

Biomechanical simulation is an exciting application of interactive character creation, but it certainly isn’t the only one. Interactive characters can also serve as a therapeutic presence in VR experiences. Much of medicine is, after all, interpersonal in nature. Fostering open communication is essential for getting to the root of psychological and physical ailments—making constraints like a clinic's ambiance, geographic location, or wait times significant factors in the quality of care. Our VR Avatar solution can help bridge the gap between patients and doctors by allowing them to meet virtually, and interactive 3D agents can even simulate connection for those in need of therapeutic companionship.

In recent years, researchers have found compelling evidence for something many pet owners have long suspected: having a furry friend is good for your health and extends life expectancy[3]. However, pets are a big responsibility: they are expensive, require consistent care, and are barred from many public spaces and housing accommodations. We simulated an interactive dog to create the experience of canine companionship that so many people crave but can’t always access in their own lives. When you clap, the dog comes; when you pet him, he responds physically in real time. The more digital pets behave like real animals—in their physical communication and unpredictable exuberance—the more we may see the physiological benefits of therapeutic pets[4] transferred to the end user.

Virtual Reality medicine is still in its infancy as a field, but early applications promise more holistic solutions for therapy and health. We can’t wait to see the ways in which interactive character simulation is adopted for accessible, custom care.

DeepMotion is a pioneer in the emerging field of Motion Intelligence. We are building tools for lifelike graphics using physical simulation and artificial intelligence. Our mission is to enable interactive content and expand creative capability by revolutionizing real-time, procedural animation. DeepMotion's cloud-based services allow anyone to train digital actors in responsive motion skills like parkour, dancing, athletics, martial arts, and more. Led by members from Blizzard, Pixar, Disney, ROBLOX, Microsoft, Ubisoft, Stanford, CMU, and Tsinghua, we're leveraging decades of experience in the field to help individuals and teams create better content for AR, VR, Games, Film, and Robotics.
