To make eye contact with the other person, you would need to look directly at the device’s camera so that your conversation partner perceives you as looking into their eyes. However, if you do that, you won’t be able to see them, because you can’t look at the iPhone’s screen and into its camera at the same time. This angle difference is what prevents natural eye contact.
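How large is that angle difference? A rough estimate with basic trigonometry (the distances below are illustrative assumptions, not Apple's figures, but typical for a handheld phone):

```python
import math

def gaze_offset_degrees(camera_to_gaze_cm: float, viewing_distance_cm: float) -> float:
    """Angle between looking at the camera and looking at a point on the screen,
    given the vertical distance between the two and the viewing distance."""
    return math.degrees(math.atan2(camera_to_gaze_cm, viewing_distance_cm))

# Illustrative numbers: front camera ~7 cm above the point on the screen
# you're watching, phone held ~30 cm from your face.
print(round(gaze_offset_degrees(7.0, 30.0), 1))  # roughly 13 degrees
```

An offset of that size is well within what people notice in conversation, which is why video calls feel like the other person is always looking slightly away.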
FaceTime Attention Correction Feature
How It Works
Starting with iOS 13, Apple has a solution for this small inconvenience. Through a combination of hardware and software improvements, compatible iPhone and iPad models can use the front-facing TrueDepth camera to capture an ARKit depth map and artificially “adjust” your gaze, creating the impression that you’re looking at the camera even though you’re actually looking at the iPhone’s screen.
How To Set Up
The FaceTime Attention Correction feature is optional and currently not enabled by default. You can turn it on from Settings -> FaceTime -> FaceTime Attention Correction and disable it whenever you want.
You need an Apple device equipped with TrueDepth camera hardware and enough image processing power. In practice, the feature is compatible with the iPhone XR, iPhone XS, iPhone XS Max, and the 2018 iPad Pro models, running iOS 13 or later, or iPadOS 13 or later.
Fact: the FaceTime Attention Correction feature doesn’t require both participants to own compatible devices; the correction is applied to your outgoing video on your own device.
Ever since this feature was announced, an interesting debate has been going on in online forums. Is FaceTime Attention Correction actually making conversations more personal, or is it faking the way you look at your conversation partner? Will we soon be video calling someone and appearing to make eye contact while doing something else entirely in the background? Do you plan to use this feature, or will you keep it disabled?
Related: Want to find out more about iOS 13? Read here about the much-awaited Dark Mode!