Newly diagnosed patients often experience heightened anxiety and a profound loss of control. Imaging scans, frequently among the first steps in diagnosis and staging, can intensify these emotions, especially for individuals who were previously healthy, as uncertainty about what to expect adds to the distress.
Building a Proof of Concept
MSK sought a virtual reality tool to help prepare patients for diagnostic and treatment activities and reduce fear. We created a calming, guided VR experience grounded in evidence-based techniques commonly used in Cognitive Behavioral Therapy, designed to help patients recognize, neutralize, and manage anxiety throughout their cancer care journey.

Lindsay Welhoelter
Software Engineer

Louis Riccardi
Project Manager

Christopher Brause
Sr. Product Designer

Erica Parker
Sr. Product Designer

What to Expect
Participating patients can familiarize themselves in advance with the facility, the flow of the day, and the people they will encounter. The experience also provides a realistic simulation of what the scan or procedure will feel like, helping reduce uncertainty on the actual day of care.
Imaging Modalities
We focused on MRI scans because their intense, unexpected noise and enclosed environment frequently trigger anxiety and claustrophobia, particularly for patients encountering the experience for the first time.

The End-to-End VR Experience
VR enables us to create a controlled, calming, and informative simulation that prepares patients for what to expect on the day of their scan. As part of the user experience, we also accounted for real-world actions patients must take during their journey. The prototype below, built in After Effects, demonstrates how pass-through instructions guide users to safely sit down before entering the MRI.
Patient Journey Map
This map outlines key checkpoints derived from our initial research milestones. It served as a tool to track progress, measure effectiveness, and identify where adjustments to the VR therapy plan were needed. Ultimately, the journey map provided a clear, structured framework for the VR therapy experience, supporting patient outcomes in a safe and effective way.

Product Blueprint
Before and after building the proof of concept, we conducted interviews to assess product viability and understand the patient journey. Ongoing feedback from clinical teams informed the development of the VR walkthrough, and with each iteration, we returned to key stakeholders for additional user testing and validation.

“Would be helpful to do this for Breast Imaging, as a lot of the patients are younger and undergoing screening for the first time.” - MRI Technologist
“It would be great to have this or a version of this video on loop in the waiting room!” - Associate Attending
“Would be great to have this type of prep experience for Image-Guided Breast Radiology…This is great.” - Director of Faculty Development
“At Vanderbilt they are using expensive mock MRIs to help patients with claustrophobia conditioning. With VR, we can do the same conditioning quicker and in a scalable way.” - Clinical Nurse Specialist
Key Takeaways
The greatest value came from creating a tool that meaningfully enhanced the MSKCC patient experience. Patients and caregivers were given the opportunity to sense and engage with the medical environment early, something they would not otherwise experience until arriving at MSK for the first time.

The Virtual Environment
After completing the user journey, we took a closer look at how that journey maps onto a real 2D floor plan of the actual rooms at Koch. We used this floor plan to lay out the 3D checkpoints for VR users, creating 7 main focus areas (checkpoints) for users to experience.

A Detailed User Flow
Once we had determined our locations in 3D, I set out to create interactions for each space. Wireframing each virtual space helped us determine the placement of our UI elements, such as wayfinding signage, selection menus, checkpoints, and more. The auditory script was also planned in sync with the visual wireframe.

Waiting Room - Step 1

Changing Rooms - Step 2

Locker Rooms - Step 3

MRI Tech - Step 4

MRI Room - Step 5

MRI Experience - Step 6
A Complex User Interface
We developed a simple UI in Figma to help users navigate and select their VR environment, starting with menus and buttons that have large text and remain accessible throughout the user journey. A patient can twist their wrist to bring up a shortcut menu, allowing them to exit, pause, or re-read instructions. We used 35+ digital 3D assets to make the environment more closely resemble the 7th floor of the Koch building.
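The wrist-twist shortcut gesture can be sketched as a simple threshold check on the controller's roll angle. This is an illustrative, engine-agnostic sketch, not the actual implementation; the function name, the threshold value, and the menu labels are assumptions for demonstration.

```python
# Hypothetical sketch of the wrist-twist shortcut gesture: when the
# controller's roll angle passes a threshold (value assumed here), the
# shortcut menu (exit / pause / re-read instructions) is shown.

ROLL_THRESHOLD_DEG = 120.0  # assumed: wrist rotated far enough toward the face

def shortcut_menu_visible(controller_roll_deg: float) -> bool:
    """Show the menu while the wrist is twisted past the threshold,
    in either direction."""
    return abs(controller_roll_deg) >= ROLL_THRESHOLD_DEG

MENU_OPTIONS = ["Exit", "Pause", "Re-read instructions"]
```

Gating the menu on a large roll angle keeps it out of the way during normal pointing, while remaining one quick gesture away at any point in the journey.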


User Experience
The next hurdle was figuring out how to move around that environment. As we looked into VR locomotion, we discovered that it can be a major cause of VR sickness, so that became a driving factor in many of our UX decisions.
Locomotion
We built a teleportation-based locomotion system. Using this system, the player aims their controller at a target, makes a selection, and is instantly repositioned at that target location.
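The core of teleportation locomotion is intersecting the controller's aim ray with the floor, then snapping the player to that point with no interpolated motion. The sketch below is a minimal, engine-free illustration of that idea; the `Vec3`, `aim_target`, and `Player` names are hypothetical, and a real engine would use its own physics raycast.

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def aim_target(origin: Vec3, direction: Vec3, ground_y: float = 0.0):
    """Intersect the controller's aim ray with the floor plane y = ground_y.
    Returns the target point, or None if the ray points level or upward."""
    if direction.y >= 0:
        return None
    t = (ground_y - origin.y) / direction.y
    return Vec3(origin.x + t * direction.x, ground_y, origin.z + t * direction.z)

@dataclass
class Player:
    position: Vec3

    def teleport(self, target: Vec3):
        # Instant reposition: no smooth motion between points, which is
        # what avoids the perceived self-motion that triggers VR sickness.
        self.position = target
```

Because the camera never sweeps through intermediate positions, the visual system sees no continuous motion that conflicts with the inner ear, which is why teleportation is a common mitigation for VR sickness.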

Validating UX via A/B Testing
Here is an example of how we used the XR-Rig to create a much better virtual reality user experience. In this example, we are piloting “pressable” buttons. I designed 3 key button states used interchangeably throughout: dormant, hovered, and pressed. We chose white, grey, and yellow to signal each state, creating a more intuitive and confident selection experience.

In the first video, on the left, you can see how hard it is for users to feel self-assured when choosing a menu option: there are barely any signifiers indicating where a selection can be made or confirming that it was successful.

In the video on the right, we have included multiple signifiers beyond the reticle pointer changing color: buttons visibly “push in” when pressed, and visual indicators show that something is happening when the user's hand points at an object (sprite).
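The three-state pressable button described above can be modeled as a small state machine, with color doubling as the visual signifier for each state. This is a hedged, engine-free sketch; the event names (`pointer_enter`, `trigger_down`, etc.) are illustrative, not the actual engine API.

```python
# Sketch of the three-state "pressable" button: dormant -> hovered -> pressed,
# with white / grey / yellow acting as the visible state signifiers.

STATE_COLORS = {"dormant": "white", "hovered": "grey", "pressed": "yellow"}

class PressableButton:
    def __init__(self):
        self.state = "dormant"

    def on_event(self, event: str) -> str:
        """Advance the state machine and return the color to display."""
        transitions = {
            ("dormant", "pointer_enter"): "hovered",
            ("hovered", "pointer_exit"): "dormant",
            ("hovered", "trigger_down"): "pressed",
            ("pressed", "trigger_up"): "hovered",
        }
        # Unrecognized (state, event) pairs leave the state unchanged.
        self.state = transitions.get((self.state, event), self.state)
        return STATE_COLORS[self.state]
```

Because every transition produces an immediate color change (and, in the real build, a “push in” animation), the user gets feedback at each step of the interaction rather than only on a successful selection.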