October 28, 2014

Intel Promises Computers & Tablets with Tiny 3D Sensors by Next Year

Dell Tablet

Intel’s RealSense 3D depth-sensing technology, announced at the Consumer Electronics Show in January, is finally making its way into devices. First up: the Dell Venue 8 7000 series tablet, which measures a mere 6mm thick.

The RealSense sensor itself is less than 3.5mm thick.

The incorporation of a depth sensor into the thinnest tablet on the market is another milestone in the development of consumer 3D imaging technology, which continues to get smaller and less expensive.

How the RealSense technology works depends on where the sensor is placed. The MIT Technology Review reports that a rear-facing RealSense sensor (like the one on the back of the Dell tablet) will use “twin cameras that gauge depth with stereovision, combined with an infrared camera to help fine-tune the results.” 
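
To get a feel for the stereovision half of that description, consider basic triangulation: two cameras a known distance apart see the same point shifted by a disparity that shrinks with range. The sketch below is a generic illustration of that principle with made-up numbers, not Intel’s actual pipeline.

```python
# Illustrative stereo triangulation, not Intel's implementation.
# focal_px: focal length in pixels; baseline_m: camera separation in meters;
# disparity_px: how far the same point shifts between the two images.
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in meters of a point matched across a parallel stereo pair."""
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: 700-pixel focal length, 4cm baseline, 20-pixel disparity.
print(depth_from_disparity(700, 0.04, 20))  # 1.4 (meters)
```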

An Intel representative told SPAR that the camera array gathers depth information by capturing “different visual layers.” This ability to differentiate visual layers is why Cnet and other sources are reporting that the RealSense sensor will bring the benefits of light-field photography to consumers. Light-field technology, which uses a main lens accompanied by an array of microlenses positioned to capture depth information, allows anyone to re-focus a picture on their computer after it has already been taken. 

The RealSense sensor, which includes only three lenses, is not designed to be a true light-field camera. However, it can offer many of the same benefits — including refocusing functionality, layered filtering, and parallax effects — while providing a much higher picture resolution. More exciting than the uses the RealSense could have for mobile photographers, however, is Intel’s promise that the camera’s real-time depth-sensing will enable distance measurements. As Cnet has it, the sensor will allow “accurate distance measurements, both on its surface and within photos if the subject of the image is within a few meters of the camera lens at the time of capturing.”

In other words, you could take a picture of an object and the camera would capture its depth information instantly. You could then interrogate the photograph for measurements later. 
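
As a rough sketch of what interrogating a photograph could look like in practice, the snippet below deprojects two pixels into 3D using their recorded depths and a simple pinhole camera model, then measures the gap between them. The intrinsics and depth values are placeholders invented for illustration; they are not RealSense specifications.

```python
import math

def deproject(u, v, depth_m, focal_px, cx, cy):
    """Turn a pixel (u, v) with a known depth into a 3D point in meters."""
    x = (u - cx) * depth_m / focal_px
    y = (v - cy) * depth_m / focal_px
    return (x, y, depth_m)

# Hypothetical example: two pixels on an object roughly 1.2 meters away.
a = deproject(400, 300, 1.20, focal_px=600, cx=320, cy=240)
b = deproject(520, 300, 1.22, focal_px=600, cx=320, cy=240)
print(f"{math.dist(a, b):.2f} m")  # straight-line distance between the two points
```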

Said Intel CEO Brian Krzanich, “I like to think of it as an infinite number of layers you can separate out. You can do measurements, filtering, and a variety of other things.” What those other things might be remains to be seen.

As for other implementations of the RealSense technology, front-facing versions of the sensor (above the screen of your computer, for instance) will gather depth information using structured light sensors, much like the Kinect or Mantis Vision’s products. This front-facing RealSense, which works as an interface, reads and responds to the gestures of its users.

I wasn’t able to obtain specs on the accuracy of measurements made using the RealSense technology, but I wouldn’t expect them to be mind-blowing. The technology is still young, and this particular implementation seems more focused on bringing the abilities of light-field sensors to smaller devices than on making those measurements particularly precise. Obviously, you won’t see these used for surveying, but they could offer a way to perform (very) basic, quick-and-dirty distance and volumetric estimations in real time.
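
By way of example, here is the kind of quick-and-dirty estimate I have in mind: a downward-looking depth image of a box on a flat floor, with the volume summed pixel by pixel. Everything here, from the focal length to the scene itself, is fabricated for illustration; treat it as a back-of-the-envelope sketch rather than anything survey-grade.

```python
import numpy as np

def rough_volume(depth_m, floor_m, focal_px):
    """Sum each pixel column's height above the floor times its ground footprint."""
    heights = np.clip(floor_m - depth_m, 0, None)   # object height per pixel (m)
    footprint = (depth_m / focal_px) ** 2           # pixel footprint area (m^2) at that range
    return float(np.sum(heights * footprint))

# Fake 480x640 depth map: a flat floor 1.5m away with a 0.2m-tall box on it.
depth = np.full((480, 640), 1.5)
depth[200:280, 260:380] = 1.3
print(f"{rough_volume(depth, floor_m=1.5, focal_px=600):.3f} m^3")  # roughly 0.009
```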

What’s the next step for the tech? Apps, of course. Intel has released the technology to developers, who are no doubt already cooking up ingenious uses for these extremely small depth sensors. Add their numbers to the developers already at work with the Mantis Vision technology, and it becomes more likely that you could soon see an app that makes your job easier in a way you never expected.
