Geo Week News

June 7, 2022

Is Apple’s RoomPlan coming for Scan-to-BIM?

RoomPlan is the latest addition to ARKit that is turning heads in the reality capture world.

For those of us watching the 3D announcements live during Apple’s Worldwide Developers Conference (WWDC22), the keynote was underwhelming: nothing lidar- or 3D-related made the cut. But some additional announcements - centered on ARKit and made later in the day - have turned some heads.

ARKit is Apple’s development framework for augmented reality. The addition of lidar to iPhones and iPads was touted as a way to improve AR applications, letting apps ‘sense’ the environment more accurately and place AR objects more reliably. ARKit fuses data from an iOS device’s cameras, lidar scanner, and inertial sensors to determine the device’s position, which is essential when creating AR content and apps.
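For a sense of what that looks like in practice, here is a minimal sketch of how an app might turn on lidar-based scene reconstruction in ARKit (view setup and delegate handling are omitted for brevity):

```swift
import ARKit

// Minimal sketch: enabling lidar-based scene reconstruction in ARKit.
let session = ARSession()
let configuration = ARWorldTrackingConfiguration()

// Scene reconstruction requires a lidar-equipped device, so check first.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh
}

// ARKit fuses camera, lidar, and inertial data to track the device's pose.
session.run(configuration)
```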

While Apple itself has yet to take full advantage of the onboard lidar, some app makers have already taken the leap, developing 3D scanning applications that leverage ARKit to capture more accurate data than photogrammetry alone can provide.

But what’s new this week is RoomPlan, a Swift API that uses the camera and lidar scanner on iOS devices to quickly and elegantly create a 3D floor plan of any room, furniture included. Essentially, Apple has created a way to use an iPad to produce something close to a 3D BIM model in just a few minutes.
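Based on the API Apple published alongside the announcement, a minimal scanning screen might look like the sketch below (UI layout and error handling are simplified):

```swift
import UIKit
import RoomPlan

// Minimal sketch of a RoomPlan scanning screen.
class ScanViewController: UIViewController, RoomCaptureViewDelegate {
    var roomCaptureView: RoomCaptureView!

    override func viewDidLoad() {
        super.viewDidLoad()
        roomCaptureView = RoomCaptureView(frame: view.bounds)
        roomCaptureView.delegate = self
        view.addSubview(roomCaptureView)

        // Start scanning; RoomPlan guides the user around the room.
        roomCaptureView.captureSession.run(
            configuration: RoomCaptureSession.Configuration())
    }

    func captureView(shouldPresent roomDataForProcessing: CapturedRoomData,
                     error: Error?) -> Bool {
        // Let RoomPlan post-process the raw scan into a final model.
        return true
    }

    func captureView(didPresent processedResult: CapturedRoom, error: Error?) {
        // processedResult holds the recognized walls, doors, windows, and objects.
    }
}
```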

RoomPlan outputs to the Universal Scene Description (USD) and USDZ file formats, which are becoming the popular choice for 3D applications across a variety of software platforms, including NVIDIA’s Omniverse. The output includes dimensions for each component recognized in the room, such as walls or cabinets, as well as the type of furniture detected. The dimensions and placement of each individual component can be further adjusted after export into USDZ-compatible tools such as Cinema 4D, Shapr3D, or AutoCAD.
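As a rough sketch, assuming a CapturedRoom delivered by the delegate callback in the example above, an app could read those per-component dimensions and write out the USDZ file like this:

```swift
import Foundation
import RoomPlan

// Sketch: inspecting a CapturedRoom and exporting it to USDZ.
// `capturedRoom` would come from a RoomCaptureViewDelegate callback.
func handle(capturedRoom: CapturedRoom) throws {
    // Each recognized surface and object carries its own dimensions
    // (width, height, and depth in meters).
    for wall in capturedRoom.walls {
        print("Wall:", wall.dimensions)
    }
    for object in capturedRoom.objects {
        // `category` identifies furniture types such as .table or .sofa.
        print("Object:", object.category, object.dimensions)
    }

    // Export the full model as a USDZ file for tools like Cinema 4D.
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("room.usdz")
    try capturedRoom.export(to: url)
}
```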

This new API was an unexpected addition - and might have some scan-to-BIM software developers a little nervous. The idea that we will one day be able to use hand-held devices to quickly turn around a floor plan of as-built spaces has been a dream for years, and several companies have tried to tackle the process of translating a lidar point cloud into a 3D model - some through meshing, some by incorporating AI or additional photogrammetry. For Apple, the emphasis seems to have been on keeping the output clean and non-wonky, and creating something visually appealing as an end product. The use cases outlined in the announcement video include interior design, architecture, and real estate - which could mean that companies like Matterport are in the crosshairs here.

While there’s a lot of excitement about this launch, there are of course industry folks saying its accuracy can’t possibly be good enough to replace the higher-cost professional tools for reality capture. But as we’ve asked in this publication a few times: if you can get an answer that is fast enough and accurate enough for the application, is it overkill to pursue submillimeter precision?

In addition, this is the first release of the API, so it is possible that the approach demonstrated here - fusing the sensor inputs simultaneously rather than working backward from a point cloud - will develop over time into something more accurate.

Some encouraging tests are already being shared on LinkedIn and Twitter, and it will be interesting to see whether this is as disruptive to reality capture as it appears it might be.
