Google Brings Persistent Content Support to Cloud Anchors, Expands Augmented Faces to iOS

With Android 10 hitting the streets (at least on those mobile devices that get quick updates) and the public release of iOS 13 arriving on Sept. 19, Google is releasing an ARCore update on Thursday that adds some notable new cross-platform capabilities.

For its Cloud Anchors API for multi-user augmented reality experiences, Google has integrated persistent content support, enabling users to "save" AR content in real-world places for others to discover and interact with.

"Imagine working together on a redesign of your home throughout the year, leaving AR notes for your friends around an amusement park, or hiding AR objects at specific places around the world to be discovered by others," said Christina Tong, product manager for augmented reality at Google, in a blog post.

Image by Sybo TV/YouTube

One of the first apps to feature the new persistent content capabilities of Cloud Anchors is Mark AR, an app from developers Sybo and iDreamSky that gives users the freedom to generate and share AR content with others in public spaces.

"Reliably anchoring AR content for every use case — regardless of surface, distance, and time — pushes the limits of computation and computer vision because the real world is diverse and always changing," said Tong. "By enabling a 'save button' for AR, we're taking an important step toward bridging the digital and physical worlds to expand the ways AR can be useful in our day-to-day lives."

Alas, persistent content support is available only in limited fashion at launch, but Google is accepting applications from developers interested in testing the capability.

In addition, the ARCore team has made improvements to Cloud Anchors that streamline the multi-user setup process through more robust anchor creation and visual processing.

"Now, when creating an anchor, more angles across larger areas in the scene can be captured for a more robust 3D feature map," said Tong. "Once the map is created, the visual data used to create the map is deleted and only anchor IDs are shared with other devices to be resolved. Moreover, multiple anchors in the scene can now be resolved simultaneously, reducing the time needed to start a shared AR experience."

Some of the apps that take advantage of Cloud Anchors include Google's Just a Line, NASA Jet Propulsion Laboratory's Spacecraft AR, and Childish Gambino's Pharos AR. Users who were underwhelmed by the multi-user experiences of those apps can now dive back in and see whether the API's new efficiencies improve the experience.

Google also made good on its promise from I/O by officially expanding its Augmented Faces API to iOS, mirroring the cross-platform reach of Cloud Anchors. Augmented Faces matches ARKit's face-tracking functionality, giving developers the ability to create Snapchat-style selfie effects.

"Earlier this year, we announced our Augmented Faces API, which offers a high-quality, 468-point 3D mesh that lets users attach fun effects to their faces — all without a depth sensor on their smartphone," said Tong. "With the addition of iOS support rolling out today, developers can now create effects for more than a billion users."

In a move familiar to developers working in Snapchat's Lens Studio, Google also now offers templates for face effects to simplify the creative workflow.

With the latest updates, ARCore catches up with ARKit in the area of persistent content. It still lags behind ARKit 3, particularly with regard to Apple's People Occlusion, Motion Capture, and dual-camera support, but, by offering cross-platform capabilities, Google is doing its part to break down the walled gardens of the mobile operating systems.

Cover image via Sybo TV/YouTube
