If you follow the development of 3D technology, you hear a lot of big claims. Here’s one: The Fluid Interfaces group at MIT’s Media Lab says its Oasis project “changes the way we create and experience digital content.” This time, though, the claim might be true.
Oasis is a tool that generates a virtual reality scene on top of a 3D scan captured using a Tango smartphone. To unpack that: The system captures depth data for a space, “detects obstacles like furniture and walls, and maps walkable areas to enable real-walking in the generated virtual environment.”
Put more simply, it takes a scan of your real space and crafts a virtual reality experience that fits into that space. This means you won’t have to worry so much about walking into walls. It also uses the depth data to make the VR experience interactive: by tracking real objects, it can include them in the VR experience.
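The description above boils down to a simple idea: treat the scanned room as a grid, and mark any cell containing points above floor level as an obstacle. Here is a toy sketch of that idea in Python; it is not the actual Oasis code, and all names, dimensions, and thresholds are hypothetical:

```python
# Toy sketch (not the Oasis implementation): derive a walkable-area map
# from a 3D point cloud. Points above a floor-height threshold are
# assumed to belong to furniture or walls, so their grid cell is blocked.

def walkable_map(points, cell=0.5, floor=0.1, width=4.0, depth=4.0):
    """points: iterable of (x, y, z) in meters, with y = height.
    Returns a 2D grid of booleans: True = walkable, False = obstacle."""
    cols = int(width / cell)
    rows = int(depth / cell)
    grid = [[True] * cols for _ in range(rows)]
    for x, y, z in points:
        if y > floor:  # point sits above the floor: part of an obstacle
            r, c = int(z / cell), int(x / cell)
            if 0 <= r < rows and 0 <= c < cols:
                grid[r][c] = False
    return grid

# Example: floor points leave cells walkable; a "table" cluster of
# points in one corner blocks that corner.
scan = [(0.2, 0.0, 0.2), (3.8, 0.9, 3.8), (3.6, 0.8, 3.7)]
grid = walkable_map(scan)
```

A real system would work from dense depth frames and fuse many scans, but the walkable/blocked distinction it feeds to the VR scene generator is the same kind of map.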
Here’s how the Fluid Interfaces group describes Oasis:
“Depth data is additionally used for recognizing and tracking objects during the VR experience. The detected objects are paired with virtual counterparts to leverage the physicality of the real world for a tactile experience.
Oasis can be used, for example, to create story spaces where friends and family can remotely participate in a session of storytelling around the campfire. The freedom to move around and interact with the virtual world allows for a new form of storytelling when combined with traditional narration techniques like vocalization, movement, and gestures. We call this human-in-the-loop storytelling, distinguishing it from current VR storytelling experiences where the software system is the storyteller.”
Watch the video, or check out the Oasis project page for more.