March 17, 2023

Are we wrong to dismiss the potential of Apple’s lidar-enabled devices?

I think it’s time we talk about the tablet in the room.

In January of 2010, Steve Jobs took the stage and announced an entirely new type of computer - the iPad. I spent the rest of the day joking with others about how silly a name ‘iPad’ was, and thought it probably wouldn’t amount to much. I was wrong, of course (though I still think the name could have used some additional workshopping).

In the professional space, tablets took their time being integrated into actual field workflows, with some detractors considering them too fragile or unwieldy to replace a trusted clipboard and pencil. But as the iPad evolved, so did the software and project management suites that bloomed with this very goal in mind. If you don’t have to walk back to the trailer to send off a quick picture of something that needs attention, or if you can reach someone who isn’t on site with you in seconds, you save real time and effort.

It’s clear, too, that Apple envisioned the iPad not only as a consumer device but as a professional one, releasing ‘Pro’ versions with additional memory, better cameras, and other upscale features meant for work. In 2020, a version of the iPad Pro was announced (to some surprise) that included an onboard lidar/depth sensor, and some iPhone models followed. Apple added the sensor mostly to improve the fidelity of its augmented reality applications by better sensing the environment, but it opened up the potential for developers to use it for more traditional scanning, detached from gaming and other visualization uses.
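For the developers in the audience: that depth data isn’t locked behind any scanning product - it’s exposed through ARKit itself. Here’s a minimal sketch of how an app might pull per-frame lidar depth. The ARKit class and property names are real APIs, but the surrounding structure is illustrative rather than anything Apple or these apps prescribe:

```swift
import ARKit

/// Illustrative receiver that pulls lidar depth from each ARKit frame.
final class DepthReceiver: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Ask ARKit for the lidar-derived depth map (lidar-equipped devices only).
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // sceneDepth is nil on devices without a lidar sensor.
        guard let depth = frame.sceneDepth else { return }
        // depthMap is a Float32 CVPixelBuffer of per-pixel distances in meters;
        // confidenceMap rates each sample low, medium, or high.
        let depthMap = depth.depthMap
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("Received \(width)x\(height) depth frame")
    }
}
```

From there, it’s up to the app to turn those depth frames into a usable point cloud or mesh - which is exactly the gap the scanning apps rushed to fill.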

While a handful of small scanning apps popped up immediately, much of their use was, and still is, focused on subjects that work best within the sensor’s reported 5 meter range. Rather than capturing whole rooms, users began scanning objects they wanted to model in 3D - everything from art to artifacts they could walk around.
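That 5 meter figure isn’t just a spec-sheet detail; apps have to cope with it, since samples near or past the edge of the sensor’s reach come back noisy or low-confidence. Here’s a sketch of the kind of culling an app might do, using ARKit’s real depthMap and confidenceMap buffers (the 5 meter cutoff and the function itself are my assumptions, not anything Apple publishes):

```swift
import ARKit

/// Illustrative filter: keep only high-confidence depth samples within range.
/// The 5.0 m default cutoff is an assumption based on the sensor's reported range.
func usableDepths(from depth: ARDepthData, maxRange: Float = 5.0) -> [Float] {
    let depthMap = depth.depthMap
    guard let confidenceMap = depth.confidenceMap else { return [] }

    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    CVPixelBufferLockBaseAddress(confidenceMap, .readOnly)
    defer {
        CVPixelBufferUnlockBaseAddress(depthMap, .readOnly)
        CVPixelBufferUnlockBaseAddress(confidenceMap, .readOnly)
    }

    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let depthBase = CVPixelBufferGetBaseAddress(depthMap)!
    let confBase = CVPixelBufferGetBaseAddress(confidenceMap)!
    let depthBytesPerRow = CVPixelBufferGetBytesPerRow(depthMap)
    let confBytesPerRow = CVPixelBufferGetBytesPerRow(confidenceMap)

    var kept: [Float] = []
    for y in 0..<height {
        // Advance by bytesPerRow, since rows may be padded.
        let depthRow = (depthBase + y * depthBytesPerRow)
            .assumingMemoryBound(to: Float32.self)
        let confRow = (confBase + y * confBytesPerRow)
            .assumingMemoryBound(to: UInt8.self)
        for x in 0..<width {
            let meters = depthRow[x]
            // Discard low/medium-confidence samples and anything out of range.
            if confRow[x] == UInt8(ARConfidenceLevel.high.rawValue),
               meters > 0, meters <= maxRange {
                kept.append(meters)
            }
        }
    }
    return kept
}
```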

As the apps evolved, the idea of using the iPad and iPhone’s on-board lidar as a professional tool fell somewhat out of popular favor, with drift and accuracy problems significant enough that some thought it would never be good enough for real-world work.

But I think it might be time to rethink that. 

Last year, one of the first scanning apps, SiteScape, was acquired by laser scanning company FARO. A few months later, Capturing Reality, another popular scanning app, was scooped up by Epic Games. Matterport’s capture app was upgraded to take in lidar data from equipped phones as well. New companies have popped up, too, with some taking advantage of advances in AI to squeeze additional accuracy and performance out of the hardware while scanning.

What this signals, to me, is that two factors are coming together at the same time. First, the need for more agile information gathering and reality capture is clearly growing. SLAM scanning from handheld or mobile scanners rapidly filled the market spaces where traditional terrestrial lidar was too cumbersome or slow, and the accuracy trade-off is becoming less of a chasm to overcome and more of a small speed bump. Second, there is a certain appeal to a ‘bring your own device’ approach: the same device you are, perhaps, already using to take notes or grab photographs on the jobsite could be the one you also use to scan construction progress or assign a task to someone.

In addition, the value of having a small lidar-enabled prosumer device is only beginning to be realized. Apple went all-in on lidar applications that help its autofocus and augmented reality - but for those you only need a slightly better sense of where a couch is (for a Pokemon to hide behind), or where to focus on someone’s face in low light. This week, rumors surfaced that Apple will be upgrading to a Sony lidar sensor on the iPhone 15 Pro models. While it’s only a rumor for now, and there are no specs out there yet, it is something that some users and app developers have been waiting for.

We might finally get an answer to the question some of us have been asking - what could these devices do if the lidar was actually good?

Between the sensor upgrade and subsequent upgrades to the API that provides information for outside applications, I think we’re about to see some things shift.
