News: FaceTime for iOS 13 Uses ARKit to Help Callers Pretend They Are Maintaining Eye Contact

You can now add feigning eye contact to the list of ways that augmented reality is improving our lives.

In the official release of iOS 13 this fall, FaceTime will gain a new feature called Attention Capture, which corrects the user's eye gaze so they appear to be maintaining eye contact with the party on the other end of the video call.

Evidently, the secret sauce behind this magic trick is none other than Apple's native augmented reality toolkit, ARKit. According to Dave Schukin, co-founder of Observant AI, the new Attention Capture feature uses ARKit's face-tracking abilities and the TrueDepth camera's depth-sensing powers to virtually modify the position of the user's eyes.

Images by Dave Schukin/Twitter (1, 2)

Schukin, whose company develops technology that tracks a driver's eye gaze, shared the discovery on Twitter. By passing an object in front of his face, he exposed warping around the eyes, revealing that FaceTime overlays a virtual mask to create the illusion of undivided attention.

In response to another Twitter user, Schukin confirmed that the effect is essentially a Snapchat-style filter, but a more accurate one thanks to the TrueDepth camera on the iPhone X series.

If we have room for another analogy, it's kind of like a high-tech version of drawing eyes on your eyelids so your teacher doesn't think you're sleeping in class.

Moreover, it's a clever application of augmented reality in general, one where the ultimate benefit isn't the technology itself, but the outcome it produces.

In other words, while face filters are fun, give me something truly useful, like Attention Capture!

Cover image via Apple/YouTube
