Apple's first AR wearable might not arrive until 2022, but iOS 14 may give us an early preview of what wearing a pair of its smartglasses will be like.
Clues found in the code for the next major version of Apple's mobile operating system point to an app codenamed Gobi that overlays augmented reality data on products in a retail setting.
The app would activate the AR experience by either scanning a QR code or detecting a nearby iBeacon or AirTag, Apple's rumored AR-enabled Tile competitor. The underlying technology for the app, which is reportedly undergoing testing at Apple Stores and Starbucks, would also be available to third-party developers via an SDK or API.
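For context, a comparable trigger can already be built with Apple's public frameworks. Below is a minimal Swift sketch, not the actual Gobi implementation, showing how an app might range for a store's iBeacons with Core Location and hand off to an AR view once a shopper is nearby; the beacon UUID and the presentAROverlay helper are hypothetical placeholders.

```swift
import CoreLocation

/// Minimal sketch: detect a nearby iBeacon and use it as the trigger
/// for an AR product overlay. Not Apple's implementation.
final class BeaconTrigger: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    // Hypothetical UUID identifying the store's beacons.
    private let storeBeaconUUID = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!

    override init() {
        super.init()
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
    }

    func startMonitoring() {
        // Range for any beacon broadcasting the store's UUID (iOS 13+).
        let constraint = CLBeaconIdentityConstraint(uuid: storeBeaconUUID)
        locationManager.startRangingBeacons(satisfying: constraint)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        // Use the closest beacon's major/minor values to decide which
        // product experience to launch.
        guard let nearest = beacons.first(where: {
            $0.proximity == .near || $0.proximity == .immediate
        }) else { return }
        presentAROverlay(major: nearest.major.intValue, minor: nearest.minor.intValue)
    }

    private func presentAROverlay(major: Int, minor: Int) {
        // Hypothetical hand-off to the AR layer, e.g. an ARKit session
        // showing pricing and details for the matched product.
        print("Launching AR overlay for beacon \(major)/\(minor)")
    }
}
```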
Once the AR experience is activated, the app could display product information, prices, and other data in the camera view of a compatible iPhone or iPad, according to a review of the code by 9to5Mac.
With capabilities like multi-user experiences, persistent content, and image and object detection already available in ARKit, along with user-friendly development tools in Reality Composer, Apple has much of the technology needed to power such experiences.
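To illustrate, here is a minimal sketch (again, not Apple's code) of ARKit's image detection attaching a product info card to recognized packaging; the "ProductMarkers" reference-image group and the infoNode helper are assumptions made for the example.

```swift
import UIKit
import ARKit
import SceneKit

/// Minimal sketch of ARKit image detection powering a retail overlay.
final class ProductARViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet private var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self

        let configuration = ARWorldTrackingConfiguration()
        // Load the packaging/label images the session should recognize
        // (assumes a "ProductMarkers" group in the asset catalog).
        if let markers = ARReferenceImage.referenceImages(inGroupNamed: "ProductMarkers",
                                                          bundle: nil) {
            configuration.detectionImages = markers
        }
        sceneView.session.run(configuration)
    }

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        // When a known product image is detected, attach an info card to it.
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        node.addChildNode(infoNode(for: imageAnchor.referenceImage.name ?? "unknown"))
    }

    private func infoNode(for productName: String) -> SCNNode {
        // Hypothetical placeholder: a simple text node standing in for the
        // price/details card a Gobi-style app would render.
        let text = SCNText(string: productName, extrusionDepth: 0.5)
        let node = SCNNode(geometry: text)
        node.scale = SCNVector3(0.002, 0.002, 0.002)
        return node
    }
}
```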
Furthermore, startups like Dent Reality have already demonstrated the ability to pull off the AR interaction that Apple reportedly has planned.
Nonetheless, having this futuristic experience available as a standard iPhone app would bring one of the sci-fi abilities imagined by Hollywood (as well as the famous "Hyper-Reality" short film by Keiichi Matsuda) to life ahead of the smartglasses themselves.
Cover image via Keiichi Matsuda/YouTube