Ultrasound Part 2: Why Self-directed, Just-in-Time Training is So Important 

By Eric Gantwerker, MD, MMSc(MedEd), FACS (Level Ex VP, Medical Director)

We have witnessed a huge shift in knowledge access over the past several decades. What was once available only in libraries and professors’ brains is now available on the devices everyone carries in their pockets at all times. In the digital age, knowledge shortfalls can be corrected promptly with Just-in-Time access to content and facts. However, learners still need to acquire foundational knowledge and conceptual understanding in order to integrate new information into their knowledge base.

A Post-Pandemic Education Landscape 

COVID-19 undoubtedly changed the landscape of education at every level, as schools scrambled to create online learning platforms to address the massive in-person learning losses. Healthcare education was no different, as training was heavily predicated on in-person experiential learning opportunities. Even simulation suffered during the pandemic, as centers closed and access to hardware-based learning resources ground to a halt.

Through the pandemic’s forced closures and cancellations, we came to appreciate the higher cost, lower access, and element of chance inherent in in-person experiential learning: the greater expense and time investment for clinicians to travel and participate, the reduced access for those with financial or geographical limitations, and the reliance on chance to dictate the learning opportunities available in patient-based training.

As the new normal sets in post-pandemic, educational leaders are analyzing the ways that asynchronous and synchronous, remote learning opportunities can transform education by replacing and augmenting in-person, synchronous learning experiences. This means taking advantage of the Just-in-Time learning, lower cost, and higher touchpoints afforded by mobile and software-based solutions to impart knowledge and sharpen skills.

The future of both medical and surgical education is hybrid learning. We have learned that any cognitive or psychomotor skill is most efficiently acquired through a combination of asynchronous components, synchronous remote components, and synchronous in-person elements. If we can get learners higher on the learning curve before the high-cost, low-access touchpoints, we maximize the efficiency of learning. This asynchronous learning should be available when clinicians need it, on the device they already have.

Asynchronous learning easily accessible for clinicians with a device they already have.

The Value of Self-Regulated Learning  

In education, we discuss the concept of self-regulated learning (SRL): the process by which highly motivated learners take command of their own learning, identifying their own knowledge and skill gaps, actively addressing them, receiving feedback, and continuing the cycle as they work through their learning deficits.

In general, highly motivated and astute learners, such as astronauts, just need access to materials—without oversight or faculty—to address those deficits. As astronauts train for missions, their training schedule is jam-packed, and only a small fraction of that time is devoted to learning about medical emergencies and procedures. 

Typically for near-space missions, crews have access to flight surgeons on the ground to guide them through any medical scenario—but what happens when they are on a deep space mission to Mars, where the communication delay is 20 minutes each way? The emergency may have already played itself out by the time communication to the ground and back has happened. So astronauts need efficient, Just-in-Time mechanisms to quickly train on how to evaluate and treat these emergencies, even if it’s just a refresher. This includes cognitive tasks (what is the diagnosis?) as well as psychomotor tasks (how do I perform an ultrasound, and what am I looking at?).

To meet this need, Level Ex developed a virtual training platform for space crews centered on this Just-in-Time training approach. Building on our prior work and in collaboration with the Translational Research Institute for Space Health (TRISH) and KBR, we built a solution for the upcoming Polaris Dawn mission consisting of two parts—both aimed at enabling astronauts to better monitor their health and maximize their safety in space. 
Our pre-flight orientation and training guide teaches the crew how to use the handheld Butterfly iQ+ ultrasound imaging device that will be onboard the spacecraft. During their 5-day orbital mission, the crew will use Just-in-Time training and procedural guidance created by Level Ex to perform ultrasound procedures on themselves and collect data. The crew will track their blood flow patterns daily to learn more about how the zero-gravity environment influences the human body. This experiment will also test the efficacy of using virtual training solutions like video games for Just-in-Time training on medical technology and procedures.

Polaris Dawn crew members practice using a handheld Butterfly iQ+ device for ultrasound imaging. The Polaris Dawn mission is slated to launch in 2023.  


Training the Mind, Without the Medical Device    

Many may wonder how they’re going to learn a psychomotor/technical task without a specific medical device in their hand. The answer depends on whether they are a novice or an expert in that task. If they are a novice, the first steps in learning any procedure are knowing:

  • The context (where am I, and what am I trying to do?)
  • The specific parameters of the equipment they are using (what does this button do?)
  • The steps of the procedure
  • How to analyze what they see 
  • How to physically perform the task  

Many of these are actually cognitive in nature, meaning the learner doesn’t need the actual medical device in hand. Oftentimes, not having the device in hand actually optimizes cognitive load, allowing them to focus on these elements without fidgeting with the device itself.

Too often, however, this part is done through passive didactics and endless reading of manuals and documents. But there is a better way. Having a meaningful, interactive experience on their own device can create this opportunity to learn. This is not to say they never need to train with the specific medical device, but if they already have a strong understanding when they do have the device in hand, they can focus on enacting the strategy they have already created through countless cycles of trial and error beforehand. 

Even force feedback, the simulation of real-world physical touch, has a significant visual component that your brain processes more than tactile feedback. For example, think of a video of a rubber band around a watermelon about to burst: the viewer perceives the force without ever touching the watermelon.

Expert learners can also do a fair amount of training on the medical device before having it in hand. Again, one needs to orient to what is different from prior experience and understand the strategy and approach. This is why “how I do it” videos are so popular among surgeons: they can simply watch the video and then go to the bedside and enact that strategy. Their expert eyes easily spot changes in patterns and integrate them into their knowledge base. They will need the physical device at some point, but by that time, they will have already run through the procedure in their head hundreds of times.

Regardless of experience level, training beforehand, on their own time and on their own device, is a much lower-cost, higher-frequency touchpoint than any in-person lab or cadaver workshop. If the learner is well trained on the cognitive components of the procedure, they can focus all their attention on actually holding the device and on the mechanical and technical aspects of the procedure. This limits the time needed to learn in person, reduces time spent on passive lectures and technical guides, and drills in what they need to know, when they need it.

Such self-directed, Just-in-Time training maximizes the efficiency of learning any new device or technique, yielding better-quality learning at lower cost. It’s a win-win.


Interested in learning more about Level Ex’s technology and how it’s accelerating medical device training and adoption? Contact us.

Revolutionizing Spine Surgery with Spine XR: A Comprehensive Guide

Explore the advantages of our interactive spine game. Which platform is right for you?

Surgical navigation technology has transformed spine surgery by enabling surgeons to accurately predict trajectories and visualize implant placement during minimally invasive procedures. However, complications occasionally surface during such procedures—and one type of complication that we’ll discuss in this post can derail both the software and the surgical outcome. 

In short, the spine can be displaced when a surgeon applies physical force during surgery. This displacement can throw off the surgical navigation system and potentially lead to complications for the patient. Surgeons often have no indication they’re off course, and the error compounds as they proceed.
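
Conceptually, the navigation system freezes a registration of the spine at one moment, then keeps reporting positions relative to that frozen frame. A toy Python sketch of the failure mode (all coordinates and displacement values here are hypothetical, purely for illustration):

```python
import numpy as np

# Toy model: the navigation system registers a vertebral landmark once,
# then keeps reporting positions relative to that frozen registration.
registered_landmark = np.array([10.0, 25.0, 5.0])   # mm, at registration time
displacement = np.array([0.0, 2.5, 1.0])            # mm, shift under applied force
actual_landmark = registered_landmark + displacement

# The surgeon aims using the stale registration, so the true placement
# error is the magnitude of the unnoticed displacement.
planned_entry = registered_landmark
error_mm = float(np.linalg.norm(planned_entry - actual_landmark))
print(f"silent navigation error: {error_mm:.1f} mm")
```

A pedicle screw corridor can be only a few millimeters wide, so even a shift of this size matters, and the system gives no indication it is off course.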

How can clinicians improve navigational precision in spine surgery given these current limitations?

Enter Spine XR, a mixed reality game that simplifies spine surgery by offering an unobstructed view of the spine, allowing surgeons to hone their skills in preventing displacement.

How Spine XR Works

Level Ex’s team of developers, physicians, and in-house medical experts built Spine XR to strip spine surgery down to the essentials, quite literally. Spine XR separates the spine from the patient’s body in a highly interactive game, enabling players to visualize how the spine reacts to physical forces and understand displacement risks during surgery. 

The gameplay fosters critical and creative thinking, helping players develop a lasting mental model of the actions and techniques that reduce displacement risks. This ultimately enhances procedural accuracy and safety in spine surgery.

Where to Play Spine XR: Multiple Options for Different Needs and Budgets

Spine XR is available across various platforms, ensuring accessibility for users and medical teams with different needs and budgets. Here we’ll explore the dynamic features of Spine XR based on the different platforms surgeons can use to play the game, and outline the benefits of each experience.

Performing Spine Surgery with Augmented Reality (AR)

Augmented reality (AR) technology enhances a player’s physical environment by overlaying simulated objects and information. Players can move freely around the augmented experience as if the simulated content is directly in front of them.  

Spine XR was the first healthcare application for Magic Leap 2 (ML2), a high-performance AR device that runs customized enterprise solutions at scale with one of the highest compute performance capabilities of any standalone AR device on the market. Using the ML2 headset, players can insert screws into a virtual spine and hammer them with a virtual mallet, working to choose the best screw sequence to minimize displacement.

AR is ideal for players who want an immersive experience while still being able to interact with their environment and others around them—for instance, at a conference booth or as part of an interactive educational experience.


Immersive AR Experiences No Matter the Lighting or Environment

Magic Leap’s Segmented Dimming™ feature brings Spine XR to life by dimming portions of the display directly behind virtual assets, creating an immersive experience that works well in various environments, even outdoors. Both global and segmented dimming options ensure easy text legibility and a better experience for users in bright rooms or industrial spaces.

Bright conference rooms or event halls are easy choices for such dimming options. Just take the example of the 2022 North American Spine Society (NASS) annual meeting in Chicago, where we debuted Spine XR. Not surprisingly, the experience was a huge hit with NASS attendees, driving traffic and drawing a steady crowd in the large exhibit hall.

Magic Leap’s Segmented Dimming™ feature brings Spine XR to life even in bright tradeshow exhibit halls.

A More Affordable Spine XR Experience with Oculus Quest 2 (VR)

Spine XR is available on the Oculus Quest 2 virtual reality (VR) platform for players seeking an immersive experience without the cost of Magic Leap. This version offers similar features on a more affordable platform that allows for broader scaling of headset-based training. The Oculus Quest 2 VR experience transports users to a virtual operating room, providing a fully immersive experience with a controlled environment and fewer distractions.

The VR experience is not optimal for every player, however. When users can’t see the world around them, some may experience anxiety or reluctance to move around for fear of bumping into something or causing an accident, making VR a less ideal option for busy in-person events. There is also a chance some may suffer from motion sickness due to the immersion of the experience.

That being said, the VR experience enables learners to be completely immersed in the surgical experience with optimal control. The Oculus Quest VR option is best for players who want fewer distractions and a tighter focus on the Spine XR experience.


Spine XR on Mobile and Desktop for Convenient Access Anytime, Anywhere

To make Spine XR even more accessible, we’ve made it available on desktop computers, tablets, and mobile devices. Users can easily launch the game by scanning a QR code, without any downloads, installs, or additional hardware required. 

This 2D version of the game includes all Spine XR assets, and the multi-user interaction feature allows players to perform spine surgery with friends or colleagues across the globe. 

Limitations of the 2D experience include less physical interaction with the assets, a reduced ability to build spatial awareness, and less immersion compared to AR and VR. However, the convenience and ease of remote play without the need for special hardware is unbeatable. 


Choosing the Best Spine XR Platform for You

The ideal Spine XR platform depends on your specific needs and preferences:

  • AR: AR platforms like Magic Leap 2 are ideal for conferences and interactive training events. Magic Leap’s Segmented Dimming™ feature gives players the ability to move around and interact with Spine XR no matter the lighting condition. 
  • VR: Oculus Quest 2 provides a more affordable VR experience with fewer distractions and a controlled environment. AR and VR are also better options for helping players build muscle memory as it relates to procedures and techniques.
  • Mobile and Desktop: Spine XR for desktop, tablets, and smartphones offers a convenient, powerful, and scalable gameplay experience that can be accessed anytime and anywhere, but with reduced immersion in a 2D environment.

Not surprisingly, Level Ex clients often choose a combination of options to deploy their solutions, including Spine XR. While a more immersive option like AR is ideal for helping clinicians build a lasting mental model of the physical motions and techniques needed for a procedure (not to mention drawing a crowd to your booth), mobile and desktop deployments are ideal for ensuring remote and/or ongoing access to the game. Our mobile and desktop versions of Spine XR ensure clients can drive ongoing engagement and training, which ultimately helps reinforce memorization of the procedural steps and techniques needed to ensure a positive result.

Interested in learning how games like Spine XR can help you achieve your business objectives? Have a medical device challenge you’re itching to solve? Get in touch.

Ultrasound Part 1: Training Your Customers like Astronauts

How Med Device Companies are Using Ultrasound Tech Developed for NASA to Accelerate Training and Adoption

It isn’t just Velcro—numerous investments originally made to solve problems for the space program have yielded huge benefits here on planet Earth. Read on to find out how ultrasound simulation is charting the same path.

Let’s start with a little background on NASA’s approach to solving challenges related to space health.

The Translational Research Institute for Space Health (TRISH) was established by NASA to solve the human challenges associated with deep space exploration—in short, to advance space health for astronauts. TRISH tirelessly pursues and funds high-impact scientific research and technological solutions to help humans stay healthy and thrive on long space flights and expeditions.

As you can imagine, the traditional first step of the engineering process—defining the problem—becomes an order of magnitude harder in the context of an activity like space flight. This is why NASA and TRISH work to imagine almost every possible scenario when defining the problem. When you do that, something interesting happens: new solutions tend to emerge that may not have been considered with a more linear problem-solving approach. In other words, radical problems demand out-of-the-box thinking, and this out-of-the-box thinking begets an incredible range of possible solutions.

Now let’s dive into a real-world example.

The Problem: Just-in-Time Ultrasound Training in Deep Space 

A scenario:

Imagine a human spaceflight crew that is nine months into a Mars mission. One of the astronauts suddenly grabs his chest and passes out. In a resource-constrained environment, what can the crew members do? 

There’s a flight surgeon back on the ground in Houston, but at that distance, it can take 40 minutes or more for communications to make the round trip between Mars and Earth. There’s an ultrasound on board, but ultrasound images are difficult for humans with no medical training to interpret—not to mention that being in space adds its own complexities to the problem. In this case, microgravity causes the human heart to physically change shape and alters the direction of blood flow.
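
The delay figures here follow directly from the speed of light and the Earth-to-Mars distance, which varies enormously as the two planets orbit. A quick back-of-the-envelope check in Python (distances are approximate):

```python
# Round-trip communication delay between Mars and Earth at light speed.
# Earth-to-Mars distance ranges from roughly 55 million km at closest
# approach to roughly 400 million km when the planets sit on opposite
# sides of the Sun.
C_KM_PER_S = 299_792.458  # speed of light in vacuum

def round_trip_delay_minutes(distance_km: float) -> float:
    """Message to Earth plus the reply back: twice the one-way delay."""
    return 2 * (distance_km / C_KM_PER_S) / 60

for label, d_km in [("closest approach", 55e6),
                    ("average", 225e6),
                    ("farthest", 400e6)]:
    print(f"{label}: {round_trip_delay_minutes(d_km):.1f} min round trip")
```

At the far end of that range the round trip exceeds 40 minutes, matching the scenario above; even at closest approach, an unfolding cardiac emergency cannot wait the six-plus minutes for a reply.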

Given this challenge, how does an astronaut know whether the ultrasound image they are looking at is normal for someone who’s been in space for nine months? How can we enable astronauts to quickly train themselves to perform ultrasound-guided diagnostics and procedures remotely, no matter where they are in the solar system?

A Daunting Set of Requirements for Ultrasound Simulation

After surveying the landscape of ultrasound training simulators available today, the team at TRISH quickly realized that existing commercial ultrasound simulators leave much to be desired. 

Most of these simulators work by taking a slice of a simple 3D model and adding basic post-processing (noise, blur) to create ultrasound-like images and training experiences. None meet NASA’s realism requirements.
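
As a rough illustration (not any particular vendor’s code), that slice-and-filter approach can be sketched in a few lines of Python. Note that no wave physics is involved, which is exactly why artifacts like shadowing can never emerge on their own:

```python
import numpy as np

def naive_ultrasound_frame(volume, z, rng=None):
    """Slice a 3D intensity volume, then fake an ultrasound 'look' with
    multiplicative noise plus a blur. No acoustics are simulated, so
    effects like acoustic shadowing would have to be painted on by hand."""
    rng = rng if rng is not None else np.random.default_rng(0)
    slice2d = volume[:, :, z].astype(float)
    # Multiplicative Rayleigh noise gives a cheap speckle-like texture.
    noisy = slice2d * rng.rayleigh(scale=1.0, size=slice2d.shape)
    # Crude 3x3 box blur to soften the result.
    padded = np.pad(noisy, 1, mode="edge")
    blurred = np.zeros_like(noisy)
    for i in range(noisy.shape[0]):
        for j in range(noisy.shape[1]):
            blurred[i, j] = padded[i:i + 3, j:j + 3].mean()
    return blurred
```

Calling `naive_ultrasound_frame(volume, z)` on any 3D intensity array produces a plausible-looking grayscale frame, but the image carries no physical information about wave propagation through the tissues it depicts.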

TRISH reached out to Level Ex in 2018 after their assessment was complete. Given our team’s track record of achieving complex computational fluid dynamics on an iPhone using adapted video game technology, their team wondered: was it possible to use a similar approach to realistically simulate ultrasound in space, and in real time?

The requirements for this challenge were intimidating, to say the least. To start, we knew we were going to need an incredibly realistic ultrasound training simulation. This is to ensure astronauts feel fully immersed and confident in carrying out an ultrasound-guided procedure—and, most crucially, to ensure accurate technique and diagnostics can be achieved.

Specifically, the simulation had to provide:

  • Realistic simulation of millions of ultrasound waves bombarding an array of tissues and participating media (in physics, participating media refers to the material or empty space through which signals, waves, or forces pass)
  • Accurate recreation of ultrasound artifacts—i.e., any interfering feature in an ultrasound image that doesn’t accurately represent the examined area (ringdown, acoustic shadowing, etc.)
  • Technological support for astronauts to access real patient data, so images can be compared to ultrasound, MRI, or CT scans taken on Earth before their space flight
  • Real-time, interactive performance (30 frames per second), with no perceivable lag
  • Optimization for mobile devices used in space—in this case, a lightweight, low-power tablet
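
One requirement above is worth unpacking: artifacts like acoustic shadowing are, physically, nothing more than accumulated attenuation along each scanline, which is why they should emerge from a wave simulation rather than be painted on. A minimal 1D sketch in Python (the attenuation coefficients are illustrative, order-of-magnitude values, not calibrated clinical numbers):

```python
# Illustrative round-trip attenuation model for a single scanline.
# Coefficients are rough dB/cm figures for demonstration only.
ATTENUATION_DB_PER_CM = {"soft_tissue": 0.7, "bone": 20.0, "fluid": 0.02}

def scanline_intensity(tissues, step_cm=0.1):
    """Echo intensity versus depth along one scanline. Everything behind
    a strongly attenuating layer (bone) goes dark: that darkness IS the
    acoustic shadowing artifact, emerging from the physics."""
    total_db = 0.0
    profile = []
    for tissue in tissues:
        total_db -= 2 * ATTENUATION_DB_PER_CM[tissue] * step_cm  # out and back
        profile.append(10 ** (total_db / 10))
    return profile

# 1 cm of soft tissue, then 0.5 cm of bone, then more soft tissue.
line = ["soft_tissue"] * 10 + ["bone"] * 5 + ["soft_tissue"] * 10
profile = scanline_intensity(line)
```

The tissue past the bone returns a signal roughly 20 dB weaker, about one percent of the pre-bone intensity: a shadow that no post-processing filter had to draw.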

The Result

To support NASA’s requirements for a completely reimagined ultrasound training experience, Level Ex’s team of video game developers, physicians, and medical experts created the most realistic real-time ultrasound simulator available to date. 

Our simulator being used to scan the jugular vein in an early prototype for TRISH/NASA.


Our ultrasound simulation technology for TRISH:

  • Runs at 30 frames per second, enabling better resolution and image accuracy
  • Scales across devices—from the low-end mobile phones of today to high-end desktops and Level Ex’s cloud gaming platform, Remote Play™
  • Features physics-based ultrasound simulation, including simulated backscatter, wave propagation, speckle, and realistic interactions with a range of participating media
  • Includes a wide range of realistic ultrasound artifacts, including ringdown and acoustic shadowing (these artifacts emerge from the simulation, as opposed to being added on as a special-case visual effect, as is common in today’s ultrasound simulators)
  • Can be configured to recreate any transducer parameters and layout—from POCUS (point-of-care ultrasonography) to IVUS (intravascular ultrasound, used for scenarios like evaluating the coronary arteries that supply the heart)
  • Is able to generate realistic ultrasound simulation from voxelized patient data (voxels are to 3D what pixels are to 2D)
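
To make “emerge from the simulation” concrete: speckle, for instance, is not noise sprinkled on top of an image; it arises from the coherent interference of echoes from many sub-resolution scatterers. Summing random phasors reproduces the Rayleigh-distributed amplitudes of fully developed speckle, as this toy Python sketch (scatterer and pixel counts are arbitrary) shows:

```python
import numpy as np

def speckle_amplitude(n_scatterers, n_pixels, seed=0):
    """Per-pixel echo amplitude from many sub-resolution scatterers.
    Each scatterer contributes a unit phasor with a random phase; the
    coherent sum yields Rayleigh-distributed amplitudes, i.e. speckle
    that emerges from interference rather than from an added filter."""
    rng = np.random.default_rng(seed)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=(n_pixels, n_scatterers))
    phasor_sum = np.exp(1j * phases).sum(axis=1)
    return np.abs(phasor_sum) / np.sqrt(n_scatterers)

amps = speckle_amplitude(n_scatterers=100, n_pixels=50_000)
# Fully developed speckle has an amplitude SNR (mean/std) of about 1.91.
snr = amps.mean() / amps.std()
```

The classic sanity check holds: the amplitude signal-to-noise ratio of the phasor sum comes out near 1.91, the textbook value for fully developed speckle, with no hand-tuned noise parameters anywhere.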

Thanks to our team’s brilliant work on this important advancement in space health, Level Ex was recognized with a prestigious award at SIGGRAPH, the world’s most influential conference for computer graphics and visual effects.

You can watch my live demo from the conference here. 

Our simulator is validated against phantom tissue analogues—this ensures it generates the same results as an actual transducer:

Ringdown, acoustic shadowing, and other artifacts automatically emerge from the simulation, and are validated against ultrasound phantoms.

You can try a version of it here.

How Our Med Device Clients Are Using NASA-Approved Ultrasound Technology Today

Level Ex maintains the terrestrial rights to its work with NASA, which means that our medical device clients (including Philips, Boston Scientific, and others) are leveraging it to accelerate the adoption of their own ultrasound-based and ultrasound-guided products here on Earth.

This typically involves one (or both) of the following:

  1. Quickly demonstrating the benefits of ultrasound-based and ultrasound-guided products
  2. Training professionals on proper usage of a device across a range of scenarios

Understanding the benefits of a new ultrasound probe often requires “playing with it”—for example, in the patient cases where the device offers the greatest benefit. Phantoms don’t cut it.


In this example, Level Ex’s simulation framework was used to demonstrate the benefits of Philips’ IVUS probe. Here, Philips shows how and why certain lesions are much better visualized with IVUS compared to x-ray fluoroscopy. 

By looking along the simulated IVUS probe, players are able to find a hidden thrombus (blood clot) that would otherwise be invisible under x-ray. Without proper identification via the best-suited devices like this one, blood clots may go on to embolize, causing a stroke. 

Making training accessible for all learners through a mobile device.

You can play through this example by downloading our Cardio Ex app, available on just about every major smartphone.

Other client examples include:

  • Demonstrating the benefits of ultrasound-guided procedures and diagnostics
  • Demonstrating the capabilities of specific transducers and software

In all cases, realistic ultrasound simulation allows the player to intuitively understand the benefits of the procedure or medical device in the context of relevant patient cases. This ensures the use cases for our clients’ various solutions and medical devices truly shine. As an enormous added benefit, end users like ultrasound technicians can play through these experiences remotely without having to go through in-person training regimens or travel to simulation centers to learn how to use the device.

Popular Use Cases for Our Video Game Technology According to Real Med Device Clients

Ultrasound waves are a mysterious and confounding phenomenon to a human brain that has spent its entire life learning to interpret visible light. Sound can cast a shadow, and it can bounce like light (sort of)—but it reacts very differently in different materials (especially fluids and gases). It can also echo—creating the appearance of objects that aren’t there.

To master ultrasound-guided procedures, medical professionals need to develop an intuitive mental model for ultrasound. Research shows that this kind of “brain training” is best achieved using video game technology and design.

Level Ex’s med device clients are reaping the benefits of this highly engaging, memorable, and convenient approach to training thanks to our realistic real-time simulation technology. Medical professionals across the globe can now sharpen their skills and their understanding of medical device usage and best-practice techniques across a range of patient cases.

These skills include:

  • Configuring the transducer to achieve the best image quality
  • Ultrasound-guided needle placement and navigation
  • Diagnostic interpretation
  • Use of ultrasound-integrated software (for example: to size a stent, or to enhance an ultrasound image)

An emerging use case we’re now seeing is Just-In-Time training—wherein the clinician trains on a specific procedure or use case right before they need to perform it. This is a crucial scenario for NASA—one that is explored in the next blog post in this series.

In this example, a user is learning to size a stent using Philips’ software recreated inside Cardio Ex.

How it’s deployed:

Level Ex’s cloud-based simulation platform ensures all of the training examples above can be completed independently (e.g., on the user’s phone) or collaboratively (e.g., training sessions hosted on Zoom, Microsoft Teams, in person, or hybrid). Our clients often combine this type of training with their existing training programs—as a prelude to onsite training or as a follow-up to keep skills sharp and drive retention of new information.

Clients often integrate training into their existing single-sign-on (SSO) and learning management systems (LMS).  

Building such groundbreaking tech for NASA and TRISH to advance space health has been an incredible honor for our team. Using these latest ultrasound simulation technologies developed for NASA, our medical device clients accelerate training efforts and the adoption of their products, which ultimately contributes to better outcomes, shorter sales cycles, and happier end users. By instilling an intuitive understanding of the benefits of an ultrasound (or ultrasound-guided) device, our clients are reducing device abandonment thanks to space health-inspired training solutions.

Interested in learning more about Level Ex’s technology? Contact us.