‘FaceTime Attention Correction’ in iOS 13 Beta 3 uses ARKit to fake eye contact

  • 3 min read
  • 04 Jul 2019

On Tuesday, Apple released iOS 13 beta 3, which came with an interesting feature called FaceTime Attention Correction. This feature aims to fix a long-standing issue of maintaining eye contact in FaceTime calls with the help of augmented reality.

Mike Rundle, an app designer, was the first to spot the feature while testing the latest iOS 13 beta.

https://twitter.com/flyosity/status/1146136279647772673

Back in 2017, he predicted that this feature would become a reality in the "years to come."

https://twitter.com/flyosity/status/1146136649883107328

While FaceTiming, users naturally look at the person they are talking to rather than at the camera. As a result, to the person on the other side, it appears as if you are not maintaining eye contact. When enabled, this feature adjusts your gaze so that it appears to be directed at the camera. This lets you maintain apparent eye contact while still keeping your gaze on the person you are talking to.

Many Twitter users speculated that the FaceTime Attention Correction feature is powered by Apple's ARKit framework. It creates a 3D face map and depth map of the user through the front-facing TrueDepth camera, determines where the eyes are, and adjusts them accordingly. The TrueDepth camera system is the same one used for Animoji, Face ID unlocking, and other augmented reality features.

https://twitter.com/schukin/status/1146359923158089728
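Apple has not documented how the correction works, but the speculated approach above reduces to a simple geometry problem: given the direction the eye is actually looking (toward the screen) and the direction from the eye to the camera lens, find the rotation that re-aims the gaze at the lens. The sketch below is purely illustrative; the function names and vector conventions are our own, not Apple's API.

```python
import math

def normalize(v):
    """Return v scaled to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def rotate(v, axis, angle):
    """Rodrigues' rotation of v around the given axis by angle (radians)."""
    k = normalize(axis)
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    kxv = cross(k, v)
    kdv = dot(k, v)
    return tuple(v[i] * cos_a + kxv[i] * sin_a + k[i] * kdv * (1 - cos_a)
                 for i in range(3))

def gaze_correction(gaze, eye_to_camera):
    """Axis and angle of the rotation that re-aims the tracked gaze
    at the camera lens. Both inputs are 3D direction vectors."""
    g, c = normalize(gaze), normalize(eye_to_camera)
    angle = math.acos(max(-1.0, min(1.0, dot(g, c))))
    return cross(g, c), angle

# Eye looks straight at the screen (0, 0, -1); the camera sits slightly
# above the display, so the eye-to-lens direction is (0, 0.3, -1).
axis, angle = gaze_correction((0, 0, -1), (0, 0.3, -1))
corrected = rotate((0, 0, -1), axis, angle)
```

After the rotation, `corrected` points along the eye-to-camera direction, which is the effect the feature produces on screen: the rendered eyes appear to look into the lens even though the user is looking at the display.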

To enable this feature, one can go to Settings > FaceTime after installing the latest iOS 13 developer beta 3. On Twitter, people also speculated that it is only available on iPhone XS, iPhone XS Max, and iPhone XR devices for now. It is unclear whether Apple plans to roll out the feature more broadly in the future. It would be interesting to see whether this feature works when there are multiple people in the frame.

https://twitter.com/WSig/status/1146149222665900033

Users have mixed feelings about this feature. While some developers who tested it found it a little creepy, others thought it a remarkable solution to the eye contact problem.

A Hacker News user expressed concern, “I can't help but think all this image recognition/manipulation tech being silently applied is a tad creepy. IMHO going beyond things like automatic focus/white balance or colour adjustments, and identifying more specific things to modify, crosses the line from useful to creepy.”

Another Hacker News user said in support of the feature, “I fail to see how this is creepy (outside of potential uncanny valley issues in edge cases). There is a toggle to disable it, and this is something that most average non-savvy users would either want by default or wouldn't even notice happening (because the end result will look natural to most).” 
