Dev Report: A Breakdown of What Apple's New ARKit Can Do for iPhones & iPads

At Apple's yearly event, the Worldwide Developers Conference, the tech giant finally announced its decision to enter the augmented reality space. By adding basic AR functionality to the beta release of Xcode 9, its Mac-based development environment, and to the SDK for its line of iOS devices, the company has signaled that it understands the importance of the tech.

Rumors have spread for some time about Apple hiring plenty of new engineers to boost its AR game, most recently Jeff Norris, pulled not only away from NASA but also away from working on some amazing projects for the Microsoft HoloLens. So with all of these rumors and great expectations, the announcement of some new AR functionality added to the iOS SDK was a little underwhelming.

Unreal and ARKit in action.

So What Exactly Does the New ARKit Add to iOS?

Well, let's unpack that.

Using only the RGB camera and sensors already built into iPhones, along with a technique called visual-inertial odometry, which combines information from the device's motion-sensing hardware with computer vision from the device's camera, Apple has added a version of world tracking and virtual object anchoring comparable to what some dedicated depth sensors offer. Considering the lack of additional hardware, the tracking is really impressive.
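
For orientation, here is a minimal sketch of what starting that world tracking looks like in Swift. The ARKit types (ARSCNView, ARWorldTrackingConfiguration, ARSession) are the real API; the view controller and outlet names are illustrative, and exact class names may vary slightly between beta releases.

    import UIKit
    import ARKit

    class ARViewController: UIViewController {
        // An ARSCNView placed in the storyboard; it owns the ARSession.
        @IBOutlet var sceneView: ARSCNView!

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)

            // World tracking runs visual-inertial odometry under the hood;
            // no extra hardware or setup is required beyond this configuration.
            let configuration = ARWorldTrackingConfiguration()
            sceneView.session.run(configuration)
        }

        override func viewWillDisappear(_ animated: Bool) {
            super.viewWillDisappear(animated)
            sceneView.session.pause()
        }
    }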

Image by Jason Odom/Next Reality

In addition, through the use of hit-testing, world tracking can understand the scene it sees, at least on a basic level. Using plane detection, the iPhone can find flat surfaces in the camera view, which allows the user to place a virtual object on a table, for example.
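
As a rough sketch, assuming the same sceneView from the snippet above, enabling plane detection and hit-testing against a detected plane looks something like this (the function names here are illustrative; the ARKit calls are real):

    import UIKit
    import ARKit

    // Enable horizontal plane detection when running the session.
    func runWithPlaneDetection(on sceneView: ARSCNView) {
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called from, say, a tap gesture handler with the touch location in view coordinates.
    func placeAnchor(at screenPoint: CGPoint, in sceneView: ARSCNView) {
        // Hit-test against planes ARKit has already detected in the camera view.
        let results = sceneView.hitTest(screenPoint, types: .existingPlaneUsingExtent)
        guard let result = results.first else { return }

        // The hit's world transform marks the point on the surface; adding an
        // anchor there keeps a virtual object fixed to the table as the device moves.
        let anchor = ARAnchor(transform: result.worldTransform)
        sceneView.session.add(anchor: anchor)
    }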

ARKit also performs ambient light estimation, which allows an application to more accurately match its virtual lighting to the lighting in the area, providing a more seamless and immersive AR experience for the user.
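
A small sketch of how an app might use that estimate, assuming a SceneKit-backed view as in the snippets above (the update function is illustrative; lightEstimate and lightingEnvironment are standard ARKit and SceneKit properties):

    import ARKit
    import SceneKit

    // Call this each frame, e.g. from ARSCNViewDelegate's renderer(_:updateAtTime:).
    func updateLighting(for sceneView: ARSCNView) {
        guard let estimate = sceneView.session.currentFrame?.lightEstimate else { return }

        // ambientIntensity is in lumens; roughly 1000 corresponds to a well-lit scene.
        // Scaling it down drives SceneKit's image-based lighting intensity.
        sceneView.scene.lightingEnvironment.intensity = estimate.ambientIntensity / 1000.0
    }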

Along with all of these new additions to the SDK, Apple has added Unity and Unreal support. Unity has a Bitbucket repository up with a plugin that enables all of the added ARKit functionality in Unity applications for iOS.

So while the revelations presented at WWDC may not be the new AR wearable some of us were hoping for, and a few expected, Apple has, in a single move, created the world's largest AR platform.

In a recent conversation, when asked about his thoughts on the rumors of the upcoming Apple reveal, Jay Wright, president and general manager of Vuforia, let me know that this was what he expected, saying, "Something's going on, we've seen signs long enough — enough's gone in, something ought to come out. I think we should expect further validation of AR going to mainstream."

For a developer, what this move means is that finding real use cases that have some value, whether for instruction and training, education, or entertainment, on a hardware platform with "hundreds of millions of iPhones and iPads" leads to getting the power bill paid and clothes on the children. Apple's move further confirms that AR is becoming a much larger part of the zeitgeist.

Will you be developing AR apps for iOS? Let us know in the comments below.

Cover image by Jason Odom/Next Reality
