iOS 14 quietly took a big step towards Apple Glasses and we missed it
Apple announced a host of updates coming to its various operating systems at the Worldwide Developers Conference (WWDC) 2020, and there was no mention whatsoever of any new hardware. However, people who have followed WWDC for a while will know that hidden among the software announcements are little tips that reveal quite a lot about what’s coming next for devices.
On the surface it was all about iOS 14 and Apple’s move from Intel to its own ARM-based silicon, but under it all there was a clear hint towards Apple Glasses that you might have missed.
Apple barely mentioned Augmented Reality (AR) while talking about iOS 14 or iPadOS 14, but if you piece together some individual announcements, a clearer picture starts to emerge: spatial audio on the AirPods Pro, location-based AR tools for developers, App Clips, ‘hand pose’ detection in Apple’s Vision framework, and 3D icons.
With so many little hints and clues, all you need are those smart glasses to make it all come together.
For starters, there’s ARKit 4. ARKit is a set of tools for AR app developers, and Apple says it powers the world’s largest AR platform: iOS. You might not recognise iOS as an AR platform, since that tech is still very, very nascent. However, one of the ARKit demos showed the new ‘location anchors’, which point towards how things are going to change in iOS 14 and iPadOS 14.
These location anchors allow apps to place AR creations like signposts, statues, even game characters at specific locations in the real world. This is AR stepping outside the living room, and at all of these locations, people wearing Apple Glasses could see those virtual signposts and objects.
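For developers, the feature surfaces as `ARGeoAnchor` in ARKit 4. Here is a minimal sketch of how an app might pin content to a real-world spot; the coordinates and function name are made up for illustration, and geo tracking only works on supported devices in supported cities:

```swift
import ARKit
import CoreLocation

// Sketch: anchor AR content to a real-world location with ARKit 4's
// ARGeoAnchor. Device and area support must be checked first.
func placeSignpost(in session: ARSession) {
    guard ARGeoTrackingConfiguration.isSupported else { return }
    session.run(ARGeoTrackingConfiguration())

    // Hypothetical coordinates near San Francisco's Ferry Building.
    let coordinate = CLLocationCoordinate2D(latitude: 37.7956,
                                            longitude: -122.3936)
    let anchor = ARGeoAnchor(coordinate: coordinate)
    session.add(anchor: anchor)
    // The app's renderer then attaches a virtual signpost to this anchor,
    // and ARKit keeps it fixed to that spot in the real world.
}
```

The key idea is that the anchor is defined by latitude and longitude rather than by a position relative to the phone, which is exactly what a pair of always-on glasses would need.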
Pokemon Go aside, we have not really seen AR take a step out into the real world; Ikea’s app was just moving virtual furniture around your living room. Apple, meanwhile, has already put a LiDAR scanner in its newest iPad Pro, hardware that is clearly meant for this.
On iOS 14 and iPadOS 14 devices, ARKit 4 can combine geographic coordinates with high-resolution map data from Apple Maps. ARKit engineer Quinton Petty calls this ‘visual localisation’, which means you will be able to locate your device in relation to the surrounding environment more precisely than you could with GPS data alone. That precision is essential for a convincing AR experience. Apple also said that all of its location-based AR uses advanced machine learning that runs on the device: there is no processing in the cloud, and Apple is not sending any images to itself or to anyone else.
Besides these ‘location anchors’, another subtle nod to the AR theme, and therefore to smart glasses, was the spatial audio feature coming to the AirPods Pro. This update brings 3D sound to your earbuds. Why would earbuds need 3D sound, unless you are watching a lot of Dolby Atmos flicks with your AirPods in?
The real benefit of this update could come with AR: your phone giving you directional audio cues as Maps reads out directions, or sound anchored in space for AR on the Apple Glasses.
In the case of App Clips, the pitch was all about immediate benefits, like paying for something without downloading the full app, but the ultimate aim seems to be letting you launch AR experiences by scanning real-world objects. Then there was ‘hand pose’ recognition for gesture controls in Apple’s Vision framework, and ‘scene geometry’ in ARKit 4, which uses LiDAR to identify and categorise the objects and surfaces around you.
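Hand-pose detection is the piece that most obviously anticipates controlling a device without touching it. A minimal sketch of the new (iOS 14) Vision API, assuming `cgImage` comes from your camera pipeline and that the helper name is hypothetical:

```swift
import Vision

// Sketch: detect a hand in a single image with Vision's
// VNDetectHumanHandPoseRequest and read one fingertip joint.
func indexFingerTip(in cgImage: CGImage) -> CGPoint? {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1  // track just one hand

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])

    guard let observation = request.results?.first else { return nil }
    // Look up the index fingertip; a gesture recogniser would track
    // points like this across frames to detect a pinch or a tap.
    guard let point = try? observation.recognizedPoint(.indexTip) else {
        return nil
    }
    return point.location  // normalised image coordinates
}
```

Run across a live video feed, joints like these are enough to recognise pinches and taps in mid-air, which is precisely the input method you would expect glasses without a touchscreen to use.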
Apple didn’t talk about AR at WWDC 2020. But Apple did.