During the keynote presentation at WWDC 2021, Craig Federighi, Apple's senior vice president of software engineering, unveiled Live Text, a new camera mode coming to iOS 15 that delivers much of the same functionality Google Lens offers for Android smartphones and Google Photos.
Live Text will enable iPhone users to point their camera at text in the real world, or select text in captured photos, and interact with it. In addition to copying text, users can run a search based on selected text or call a phone number recognized within the image. The same functionality will be available via the Photos app on iOS and macOS Monterey. Live Text also replicates Google Lens's ability to recognize pet breeds, plants, products, art, and landmarks.
In addition, in iOS 15 Apple is debuting its own version of the AR walking navigation mode we've seen in Google Maps Live View.
Like Live View, Apple Maps will provide AR navigation prompts in the camera view as iPhone users navigate on foot from point A to point B. The AR mode will launch in a limited set of cities later this year, namely London, Los Angeles, New York, Philadelphia, San Diego, the San Francisco Bay Area, and Washington, D.C., with coverage expanding over time.
Apple co-founder Steve Jobs once famously quoted Picasso: "Good artists copy; great artists steal." Google Lens and Live View are easily two of the more useful AR functions Google has introduced over the past few years, so it's on-brand for Apple to want the same functionality in iOS. Moreover, building these features into iOS keeps iPhone users from having to turn to third-party apps.