
Google Adding Stereo Depth Support for Dual Cameras to ARCore Starting with Pixel 4 & 4 XL

Mar 2, 2021 10:48 PM
[Cover image: Smartphone lying on a wooden surface.]

In response to Apple's implementation of LiDAR sensors in iPad Pro and iPhone 12 Pro models, Google is looking to leverage the dual-camera setups in recent flagship devices as depth-sensing components.

In recent updates to Google's ARCore, as well as the app that installs the AR toolkit on compatible Android devices, the company quietly added mentions of support for "dual camera stereo depth."

The first devices to support this new capability will be the Pixel 4 and Pixel 4 XL, which carry a 12.2-megapixel camera and a 16-megapixel telephoto lens on the rear, plus an eight-megapixel wide-angle camera and Soli radar sensors up front.

Support is slated to arrive "in the coming weeks," according to a recent update of the ARCore Supported Devices page on the Google Developers site.

Dual camera support can improve AR experiences that use the Depth API, such as Five Nights at Freddy's AR: Special Delivery.

Meanwhile, version 1.23.210260603 of Google Play Services for AR also lists "dual camera stereo depth on supported devices" in its changelog.

While details are sparse, conventional wisdom dictates that ARCore will utilize dual cameras to measure depth (in much the same way that Snap Spectacles 3 does) and apply the data with the Depth API for more realistic AR experiences.
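Google hasn't detailed the feature yet, but since stereo depth would be surfaced through the existing Depth API, the app-side opt-in presumably stays the same. As a minimal Kotlin sketch, assuming the standard public ARCore Session and Config classes (nothing here is specific to dual-camera depth):

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Session

// Minimal sketch: opt an existing ARCore Session into the Depth API.
// Whether the depth map comes from single-camera ML estimates, dual-camera
// stereo, or a dedicated sensor is decided by ARCore, not by the app.
fun enableDepthIfSupported(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    session.configure(config)
}
```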

The move is reminiscent of Apple's launch of ARKit in 2017 and Google's quick pivot of Project Tango (which relied on smartphone manufacturers to ship devices with depth sensors) to ARCore, which, like ARKit, was able to detect horizontal surfaces based on machine learning estimates from a standard smartphone camera sensor.

Now, it is Apple that is bringing depth sensors to mainstream smartphones via LiDAR. Repurposing hardware that is already in the wild is a savvy way for Google to match its rival's capabilities, though the results won't be as precise as LiDAR or other dedicated depth sensors. Perhaps a greater drawback to this approach is that the Depth API is currently supported on only about a third of ARCore-eligible devices.
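For apps that already ship with depth, the per-frame workflow shouldn't change with stereo depth either: acquire the latest depth image, sample it where needed, and handle the frames where depth simply isn't available yet. A hedged Kotlin sketch, assuming the depth mode was enabled as in the earlier snippet and using the documented DEPTH16 format (16-bit distances in millimeters); the pixel coordinates are assumed to lie within the depth image's bounds:

```kotlin
import android.media.Image
import com.google.ar.core.Frame
import com.google.ar.core.exceptions.NotYetAvailableException
import java.nio.ByteOrder

// Sketch: read the distance (in millimeters) at pixel (x, y) of the current
// frame's depth image. Returns null when depth isn't available for this frame yet.
fun depthAtPixelMm(frame: Frame, x: Int, y: Int): Int? {
    val depthImage: Image = try {
        frame.acquireDepthImage() // DEPTH16: one 16-bit depth value per pixel
    } catch (e: NotYetAvailableException) {
        return null // ARCore hasn't produced a depth map for this frame yet
    }
    try {
        val plane = depthImage.planes[0]
        val buffer = plane.buffer.order(ByteOrder.LITTLE_ENDIAN)
        val byteIndex = x * plane.pixelStride + y * plane.rowStride
        return buffer.getShort(byteIndex).toInt() and 0xFFFF
    } finally {
        depthImage.close() // always release the Image back to ARCore
    }
}
```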

Google typically rolls out such ARCore capabilities with more fanfare, so we'll keep an eye out for more on this. If I'm placing bets, Snapchat, an early adopter of LiDAR and the Depth API for AR Lenses that also has experience with dual camera stereo depth, is a likely candidate to be the first to take advantage of the new capability.

Cover image by Tommy Palladino/Next Reality

