Apple Bursts onto Augmented Reality Scene with ARKit

Jun 5, 2017 08:21 PM

During the opening keynote of its Worldwide Developers Conference today in San Jose, Apple introduced ARKit, a new framework for iOS 11 that will bring augmented reality apps to millions of compatible iPhones and iPads.

ARKit will allow developers to leverage motion tracking, surface detection, and ambient light estimation using the camera, CPU, and GPU, with no additional hardware necessary. ARKit will also support the Unity and Unreal engines.
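
For developers, tapping into those capabilities is expected to be largely a matter of configuring an AR session. The sketch below is a minimal, hedged example based on the beta documentation, assuming a SceneKit-backed ARSCNView inside an ordinary view controller; class and property names may shift before iOS 11 ships.

```swift
import UIKit
import ARKit

// Minimal sketch of an ARKit session, assuming a SceneKit-backed ARSCNView.
class ARViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self          // receives plane-detection callbacks
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // World tracking runs on the camera, CPU, and GPU -- no extra hardware.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal    // the beta detects horizontal planes only
        configuration.isLightEstimationEnabled = true

        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

Running the configuration starts the camera and motion tracking; pausing the session when the view goes away keeps the device from tracking in the background.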

Craig Federighi, Apple's senior vice president of software engineering, demonstrated the extent of the augmented reality features on stage with a test app.

Using the camera, Federighi showed how an iPhone can detect a horizontal surface and place virtual objects on it. The objects reacted to one another, with a virtual lamp projecting shadows behind the other objects. He noted that ARKit uses ambient light estimation to scale the lighting on virtual objects to match the scene.
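
In code, that surface detection shows up as anchor callbacks: when ARKit recognizes a horizontal plane, it hands the app an ARPlaneAnchor, and the app can attach its own content to the corresponding node. Here is a hedged sketch building on the view controller above, with a plain 10 cm cube standing in for the demo's virtual objects.

```swift
import ARKit
import SceneKit

extension ARViewController {
    // Called when ARKit adds an anchor -- for example, a newly detected horizontal plane.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }

        // Drop a placeholder virtual object (a 10 cm cube) onto the detected surface.
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
        let boxNode = SCNNode(geometry: box)
        boxNode.position = SCNVector3(0.0, 0.05, 0.0)   // rest on top of the plane
        node.addChildNode(boxNode)

        // Read ARKit's ambient light estimate and scale the scene lighting to match.
        if let estimate = sceneView.session.currentFrame?.lightEstimate {
            sceneView.scene.lightingEnvironment.intensity = estimate.ambientIntensity / 1000.0
        }
    }
}
```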

Craig Federighi, Apple’s senior vice president of software engineering, demonstrates ARKit capabilities coming to iOS 11.

Several apps have already begun taking advantage of the features, including offerings from IKEA and Lego as well as enhancements to Pokémon Go.

For a guest demonstration, Federighi yielded the floor to Alasdair Coull from Wingnut AR, a new development company founded by The Lord of the Rings director Peter Jackson. Coull displayed an animated scene, rendered by Unreal Engine 4, of a post-apocalyptic outpost that covered the entire on-stage table.

He brought the scene to life, with the outpost's miniature inhabitants running around the landscape, enemy aircraft flying in from the horizon and dropping bombs on the outpost, and the outpost returning fire. At one point, one unlucky villager jumped off the edge of the table.

Alasdair Coull from Wingnut AR sets the stage for an animated augmented reality battle.

By enabling augmented reality on devices already in the field, Apple jumps into the pool with a cannonball. iOS 11 will support the iPhone 5S, iPhone SE, and newer models, as well as the iPad Air, iPad mini 2, and newer, giving Apple a massive installed user base.

"The new Apple ARKit will ease the transition for consumers into the more sophisticated aspects of AR by reaching millions of Apple users and getting them familiar with the technology. We should expect to see a larger AR ecosystem develop around software and products to provide enhanced capabilities to the Apple AR features. This is the first step in creating an immersive AR experience for consumers," said David Goldman, Vice President of Marketing at Lumus, an augmented reality optics company, in a statement to Next Reality.

However, it remains to be seen whether Apple will be able to match the spatial mapping or depth sensing of HoloLens or Tango, since ARKit relies on estimates of space, lighting, and scale rather than dedicated sensors. In addition, the ARKit documentation notes that the beta will only detect horizontal planes. No wall detection for now.

Along with ARKit, Apple also unveiled new machine learning capabilities, including image recognition. Dubbed Core ML, the new framework will give developers the ability to recognize faces, landmarks, language, and more. With Core ML, Apple matches capabilities recently introduced by Samsung and Google, though Federighi claimed that Core ML runs faster than the competing implementations.
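
As a rough illustration of how that is expected to look in practice, here is a hedged sketch of image classification through Core ML via the companion Vision framework. MobileNet is a placeholder for any Xcode-generated model class bundled with the app, not something Apple showed on stage.

```swift
import UIKit
import CoreML
import Vision

// Hypothetical helper: classify a UIImage with a bundled Core ML model.
// `MobileNet` stands in for any model class Xcode generates from a .mlmodel file.
func classify(_ image: UIImage) {
    guard let cgImage = image.cgImage,
          let visionModel = try? VNCoreMLModel(for: MobileNet().model) else {
        return
    }

    // Vision wraps the Core ML model and handles scaling/cropping the input image.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
        print("Looks like: \(top.identifier) (\(Int(top.confidence * 100))% confidence)")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```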

The iOS 11 beta is available for download today, and documentation for both ARKit and Core ML is now available on Apple's developer portal.

Cover image via Apple
