November 15, 2021 | By Lance Baily

AWEXR Showcased the Latest Mixed Reality Technologies for Healthcare Education & Training

This past week in Santa Clara, California, the annual AWEXR conference took place, bringing together academic researchers, industry executives, and institutional leadership. As demand for remote and digital technologies has skyrocketed since COVID-19 (please share your experiences in our “Impacts of COVID-19 on Healthcare Simulation” survey), we attended the AWEXR event to witness the latest XR mixed reality technologies and share innovative best practices. Here we cover the healthcare track organized by Bob Fine of the IVRHA and the VR in Healthcare conferences, and also share more about the Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (XR) training companies and tools that hold promise for the clinical simulation space.

Presenters Covered the Catalyst of COVID to Healthcare XR

Thursday’s Healthcare Track was opened by organizer Bob Fine of the International Virtual Reality in Healthcare Association (IVRHA), who shared his work developing VR healthcare organizations and conferences since 2017. Bob explained how COVID-19 dramatically impacted the healthcare space, especially with regard to exploding demand for distance-learning tools such as augmented reality and virtual reality training headsets.

Sponsored Content:

Next up, Brendon Hale, Vice President of Health and Market Outcomes Research at Penumbra, Inc., explored why digital technologies and the data they generate can change health and healthcare. Brendon argued that XR has an essential role in realizing the potential of digital technologies to redefine healthcare, but noted there is an opportunity to ensure the value proposition reflects the full value of these technologies. He urged XR innovators in the industry to “align actions of the XR community across the entire continuum of patient care, in order to use XR to enrich the definition of medical evidence.” He also shared that while a large percentage of patients are comfortable sharing their digital data, they are not comfortable sharing that data with the likes of Facebook or Google.

Robert Louis, Chief of the Division of Neurosurgery and Director of the Skull Base and Pituitary Tumor Program at the Pickup Family Neurosciences Institute at Hoag Memorial Hospital, showcased his use of VR training tools to better understand complex neurological anatomy (i.e., brain tumors) for both clinical intervention and patient education. He also demonstrated a real-time augmented reality overlay, utilizing Surgical Theater, that provides visual reminders of patient anatomy during procedures. Through the use of cutting-edge neuroimaging and neuro-navigational equipment, Dr. Louis utilizes the concept of keyhole neurosurgery, minimizing damage to surrounding brain, vascular, and soft tissue structures.

Dr. Louis believes that most brain and skull base tumors can be resected through small openings or by utilizing naturally occurring orifices. This approach has been demonstrated to decrease post-operative pain, minimize neurologic complications, and shorten the length of hospitalization, resulting in better outcomes for his patients. Since 2015, Dr. Louis has been involved with the development and implementation of Virtual and Augmented Reality technologies for pre-operative simulation and rehearsal and intraoperative navigation. The 3D VR/AR platform is provided by Surgical Theater and was developed from F-16 fighter jet flight simulator technology. This technology allows surgeons to rehearse complicated operations in virtual reality, affording them the opportunity to visualize critical anatomy and navigate potential pitfalls. The results are making operations safer and more effective for patients.

Under Dr. Louis’ guidance, Hoag Neuroscience Institute has become the highest-volume center for Augmented Reality in Neurosurgery in the United States. Dr. Louis noted that “the advantage of VR is the ability to simulate multiple different approaches. In the virtual model we can do 2 plans, 3 plans, 5 plans and see what is best for the patient — and so because we have done the pre-op rehearsal I already know where the important surgical structures are going to be.” He continued, “We experience life in 3D, why should our medical care be any different?”, suggesting “we do not let pilots fly without training on simulation and we should not let people operate on our brains without simulation.” He compared the capabilities augmented reality provides surgeons to the targeting a fighter pilot sees in their HUD.


Innovative XR Technologies Impacting Healthcare

In the exhibitor space, we were on hand to witness the latest in AR, VR, and XR technologies which are (or will soon be) impacting outcomes in healthcare education and training. Here are some of our favorite tools from this year’s event:

XR Management Tools: ManageXR and ArborXR

Simulation technicians, or simulation champions operating simulation technologies, will immediately understand how challenging it can be to manage dozens and dozens of manikins, laptops, or now XR headsets. Imagine having to push a new software update to 250 headsets across three campuses. New XR management systems like ManageXR and ArborXR not only oversee software updates across your fleet of XR devices, but also manage boot interfaces, perfect for dealing with large volumes of educators and learners. Imagine, for example, wanting to cut down the time it takes for learners to boot up a headset and connect to the training material your institution wants to focus on. Locking down a headset so additional software can’t be installed, or having users bypass setup screens and launch directly into XR training tools unique to your program, will dramatically lower technology adoption barriers and further reduce tech support requests. Such tools will be absolutely mandatory for medical simulation programs that need to roll out, manage, update, or track XR devices across a wider institution. Stay tuned for a dedicated article regarding these technologies soon!
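For readers curious what this kind of fleet management looks like under the hood, here is a minimal Python sketch of the rollout and kiosk-lockdown workflow described above. The `Fleet` class, device records, and app names are hypothetical illustrations of the concept, not the actual ManageXR or ArborXR API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Headset:
    """One XR device in the fleet (hypothetical record, not a vendor API)."""
    serial: str
    campus: str
    app_version: str = "1.0.0"
    kiosk_app: Optional[str] = None  # app auto-launched at boot, if locked down

class Fleet:
    """Tracks headsets across campuses and batches changes to all of them."""
    def __init__(self) -> None:
        self.devices: list[Headset] = []

    def enroll(self, serial: str, campus: str) -> None:
        self.devices.append(Headset(serial, campus))

    def push_update(self, version: str) -> int:
        """Update every enrolled headset; returns how many were touched."""
        for d in self.devices:
            d.app_version = version
        return len(self.devices)

    def lock_to_app(self, campus: str, app: str) -> None:
        """Kiosk mode: boot straight into one training app on a campus."""
        for d in self.devices:
            if d.campus == campus:
                d.kiosk_app = app

# Example: 250 headsets across 3 campuses, updated in a single call
fleet = Fleet()
for i in range(250):
    fleet.enroll(f"HS-{i:03d}", campus=f"campus-{i % 3}")
updated = fleet.push_update("2.4.1")
fleet.lock_to_app("campus-0", "iv-insertion-trainer")
```

The point of the sketch is the shape of the workflow: one call fans out to every device, and lockdown is a per-campus policy rather than a per-headset chore.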

HaptX Gloves Let Users Feel Virtual Reality Environments

HaptX provides haptic-feedback gloves that enable clinical simulation users to feel the sensation of touch during immersive VR experiences. According to the company, its patented technology displaces the skin the same way a real object would, and with 133 points of tactile feedback per hand, HaptX Gloves DK2 achieves a level of realism that other haptic devices can’t match. Haptics are crucial in the healthcare simulation industry, as many clinical educators cite “lack of touch” as a major barrier to training through Virtual Reality. The demonstration we received helped us experience the tactile differences between touching something solid, such as a fixed metal gear, and something soft and sporadic, like water droplets. We would love to see how the tool could be used in more surgical settings, especially with regard to punctures, insertions, and the feeling of different layers of skin.

OVR Technology Brings Olfaction to Virtual Reality

Bringing realistic smells to VR training, OVR Technology is finding unique ways to provide reliable, standardized olfactory signals to healthcare learners. In high-risk industries like defense, fire, oil & gas, and aviation, incorporating olfaction into VR training increases situational realism and promotes hazard identification and analysis, resulting in fewer financial losses and negative health outcomes. OVR’s water-based scents (IFRA certified for safety) are dispensed and removed with precisely controlled intensity and timing based on both direct interaction and ambient proximity. This allows for purposeful scent interactions and combinations and prevents scent habituation or undesired mixtures over time. We can’t wait to smell more!

Fundamental Surgery Provides Realistic VR Training

Dubbed “the flight simulator for surgeons,” Fundamental Surgery is the only fully accredited (AAOS and Royal College of Surgeons, England) VR education platform, providing multiple modes of access, from standalone “all-in-one” headsets such as the Oculus Quest to tethered VR experiences, all through a single login and a single data repository. FundamentalVR provides remote proctoring, unlimited multi-user learning environments, and accelerated skills acquisition. Key surgical objectives and actions are given specific measurement requirements that are instantly assessed as the user interacts. Feedback is delivered through audible, visual, and haptic mechanisms, and all data on knowledge and skills development is securely held for later review. Such training allows clinical learners to receive flexible, 24/7 access to a realistic VR environment that enables assessment, feedback, and randomized patient changes!

Ekto One Simulator Boots Let Users “Walk” in VR

Walking in VR can be limited by the dimensions of your defined physical “safe space” — that is, unless you are using Ekto One Simulator Boots! These unique boots let Virtual Reality participants keep walking within the boundaries of a predefined space: they “unlock” like ice skates so the user can walk in place, continuing to move through the virtual space without crossing real life’s physical limitations, like walking in just socks on a very smooth surface! These boots let VR users simulate training environments of any shape and size, provide a realistic and intuitive user experience, and deploy with convenience and cost-efficiency!

OptoFidelity BUDDY Enables XR Headset Testing Without Being a Pain in the Neck

Healthcare simulation champions creating innovative tools in Mixed Reality will no longer have to strain their necks just to test the limitations of their VR or AR software, thanks to the OptoFidelity BUDDY. This testing unit holds a developer’s XR headset and automatically rotates its perspective via a 360° gyroscope. BUDDY comes with an integrated vision module and 3 degrees of freedom, enabling measurement of key HMD performance metrics, including motion-to-photon latency, content stability, and pose drifting. The system is based on non-intrusive measurement that compares changes in the virtual world to the movements of the robotics. Measurement performance is produced by OptoFidelity’s proprietary vision module and robotics platform, which enable unbeatable repeatability, time synchronization, and position-based triggering. With BUDDY you can even measure audio-to-video synchronization and so-called motion-to-audio latency for spatial audio.
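To illustrate what motion-to-photon latency means in the simplest possible terms, the Python sketch below pairs timestamped motion events with the first display (“photon”) events that follow them and averages the gaps. The timestamps and pairing rule are invented for illustration; OptoFidelity’s actual measurement relies on its camera-synchronized vision module and robotics, not this simplification.

```python
def motion_to_photon_latency_ms(motion_ts, photon_ts):
    """Pair each motion event with the first display ('photon') event at or
    after it, and average the gaps. Timestamps are in milliseconds.
    Purely illustrative; real test rigs use synchronized optical capture."""
    gaps = []
    frames = iter(sorted(photon_ts))
    frame = next(frames, None)
    for t in sorted(motion_ts):
        # advance to the first frame at or after this motion event
        while frame is not None and frame < t:
            frame = next(frames, None)
        if frame is None:
            break
        gaps.append(frame - t)
    return sum(gaps) / len(gaps)

# Example: the rig moves at 0, 100, and 200 ms; the display first reflects
# each movement at 18, 117, and 221 ms, so average latency is ~18.7 ms
latency = motion_to_photon_latency_ms([0, 100, 200], [18, 117, 221])
```

The headline number a rig like BUDDY reports is essentially this average gap, measured optically at sub-millisecond precision instead of from logged timestamps.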

XR Mobile Storage Solutions Speed Up Distribution and Technology Management

Also present at the event was XR storage solutions provider Looking Glass XR, which offers a unique mobile storage and charging cart for XR headsets called the VR Powerwall. Managing dozens of all-in-one mobile VR headsets is a logistical nightmare, so the company has also developed custom VR software for healthcare educators to help them deploy, manage, and run their mobile VR hardware across a variety of environments. The VR Powerwall works with any series of Pico VR, Oculus Quest and Quest 2, Oculus Go, HTC Vive Focus/Focus Plus, and the HoloLens 2, and is designed to work with future mobile 6DoF VR headsets. The VR Powerwall is durable, fully wired, easy to clean, water/fire resistant, non-toxic, and customizable! We highly recommend anyone interested in the future of XR technologies learn more about AWE and its ongoing global XR events!

Learn more about AWE here!
