You may soon add feigning eye contact to the list of ways that augmented reality is improving our lives.
In beta versions of iOS 13, FaceTime contained a new feature called "Attention Correction," which corrected the user's eye gaze so that they appeared to be maintaining eye contact with parties on the other end of the video call. However, the feature was missing from the stable release of iOS 13 as well as subsequent iOS 13 updates. Apple finally brought it back in the stable release of iOS 14.
Evidently, the secret sauce behind the magic trick was none other than Apple's native augmented reality toolkit, ARKit. According to Dave Schukin, co-founder of Observant AI, the Attention Correction feature used ARKit's face-tracking abilities and the TrueDepth camera's depth-sensing powers to virtually modify the user's eye position.
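For the curious, ARKit does expose the raw ingredients such a feature would need. The sketch below shows how an app can read per-eye transforms and a gaze estimate from ARKit's face tracking on a TrueDepth-equipped iPhone; how Apple actually warps the eye region in FaceTime is undocumented, so the final comment is purely an assumption.

```swift
import ARKit

// Minimal sketch: reading eye data from ARKit face tracking.
// ARFaceTrackingConfiguration requires a TrueDepth front camera.
class EyeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Per-eye transforms and an estimated gaze point,
            // expressed in the face anchor's coordinate space.
            let leftEye = faceAnchor.leftEyeTransform
            let rightEye = faceAnchor.rightEyeTransform
            let gaze = faceAnchor.lookAtPoint
            // Assumption: a feature like Attention Correction could use
            // data like this to re-render the eye region so the gaze
            // points at the camera. Apple has not published its method.
            _ = (leftEye, rightEye, gaze)
        }
    }
}
```

This only reads tracking data; the actual image warping would happen in a rendering pass downstream.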
Schukin, whose company develops technology that tracks a driver's eye gaze, shared the discovery on Twitter during iOS 13's beta phase. When he passed an object in front of his face, the warping visible around his eyes revealed that FaceTime was applying a virtual mask to create the illusion of undivided attention.
In response to another Twitter user, Schukin confirmed that the effect was like a Snapchat filter, but a more accurate one thanks to the TrueDepth camera on the iPhone X, XS, and 11 series.
If we have room for another analogy, it's kind of like a high-tech version of drawing eyes on your eyelids so your teacher doesn't think you're sleeping in class. It's also a very clever application of augmented reality in general, one where the ultimate benefit is not the technology itself but the outcome it generates.
In other words, while face filters are fun, give me something truly useful, like Attention Correction! So it's good to see Apple restore the feature in iOS 14, complete with an off switch in FaceTime's settings for those who'd rather keep their gaze unedited.