Now We Can Render Ourselves Into Mixed VR Thanks to Imverse
What if we could render our own bodies inside our VR experience, creating a more seamless transition between the two worlds? Imagine the possibilities in healthcare simulation: being able to see your own hands inside your virtual training experience! TechCrunch just covered a new demo from Imverse, whose mixed reality engine can render the real you inside a VR headset, so you can look down and see your own arms and legs, as well as other people and objects in the VR area. This groundbreaking volumetric graphics technology could make VR feel more comfortable and familiar.
About Imverse, from TechCrunch:
While there’s certainly some pixelation, rough edges and moments when the rendered image is inaccurate, Imverse is still able to deliver the sensation of your real body existing in VR. It also offers the bonus ability to render other objects, including people, allowing Bello Ruiz to shake my hand while he’s in a VR headset and I’m not. That could be helpful for bringing VR into homes where family members might need to share the living room without knocking into people or things, especially if someone’s trying to get your attention when you have a headset and headphones on.
The first experience built with the real-time rendering is Elastic Time, which lets you play with a tiny black hole. Pull it in close to your body, and you’ll see your limbs bent and sucked into the abyss. Throw it over to a pre-recorded professor talking about space-time phenomena, and his image and voice get warped. And as a trippy finale, you’re removed from your body so you can watch the scene unfold from the third person as the rendering of your real body is engulfed and spat out of the black hole.
“This collaboration came out of an artist residency I did at the lab of cognitive neuroscience in Switzerland,” says Mark Boulos, the artist behind the project. “They had developed their tech to use in their experiments and neuroprosthesis.”