Apple iPhone 12 Pro gets a new LiDAR-based feature as a part of iOS 14.2 beta
Called ‘People Detection’, this feature aims to help visually impaired users know how far they are from other people.
The iPhone 12 Pro and 12 Pro Max are two of the most powerful smartphones Apple sells right now. And while they boast several new features, one hidden feature has only now come to light. Called ‘People Detection’, this feature aims to help visually impaired users know how far they are from other people. Although the hardware was always there in the new iPhones, the feature has been activated as a part of the iOS 14.2 beta update. It uses the LiDAR scanner and the wide-angle camera to measure the distance between the user and other people.
It is worth adding that since the tech relies on the LiDAR sensor, the feature is limited to the iPhone 12 Pro and 12 Pro Max, as only these models have that sensor; the iPhone 12 and 12 mini miss out on LiDAR. If you are on the iOS 14.2 beta with an iPhone 12 Pro or 12 Pro Max, you can access the feature from inside the Magnifier app.
Besides the LiDAR sensor and the wide-angle camera, the handset also uses augmented reality (AR) and machine learning to detect nearby people and inform the user.
As reported by TechCrunch, the new ‘People Detection’ feature is an extension of the people occlusion capability in Apple's ARKit.
When the feature detects someone in close proximity, it plays a sound in stereo corresponding to that person's direction. You can also set a tone of your choice. In addition, Apple offers a haptic pulse option that speeds up as the person gets closer. The feature is not available on the Apple Watch yet.
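The distance-to-feedback behaviour described above can be sketched in a few lines. This is purely illustrative and not Apple's implementation or any ARKit API: the linear distance-to-interval mapping, the 5-metre range, and the angle-to-pan formula are all assumptions made up for the example.

```python
# Illustrative sketch only -- not Apple's code or an ARKit API.
# Assumes a linear mapping from measured distance to haptic pulse interval,
# and a simple stereo pan derived from the person's horizontal angle.

def haptic_pulse_interval(distance_m, min_interval=0.1,
                          max_interval=1.0, max_range_m=5.0):
    """Shorter interval (faster pulsing) as the detected person gets closer."""
    clamped = max(0.0, min(distance_m, max_range_m))
    return min_interval + (max_interval - min_interval) * (clamped / max_range_m)

def stereo_pan(angle_deg):
    """Map a horizontal angle (-90 = far left, +90 = far right) to a -1..1 pan."""
    return max(-90.0, min(angle_deg, 90.0)) / 90.0
```

The key property is only that the pulse interval shrinks monotonically with distance and the pan tracks direction; the actual curves Apple uses are not public.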