Friday 26 June 2020

How iOS 14 quietly took a big step towards the Apple Glasses

Apple’s annual WWDC keynotes are like an Easter egg hunt for tech fans – you don’t always get big hardware reveals, but hidden among all the software announcements is a trail of evidence that collectively reveals a lot. And so it was at WWDC 2020.

On the face of it, this year's show was all about Apple Silicon, iOS 14 and Craig Federighi’s continually impressive hair. But connect the dots and you’ll clearly see the exciting silhouette of the Apple Glasses.

Naturally, the always-secretive Apple barely mentioned augmented reality (AR) explicitly when talking about iOS 14 and iPadOS 14. And as impressive as Apple CEO Tim Cook’s Space Gray glasses were, they weren’t smart (as far as we could tell).

But piece together some of the slightly puzzling individual announcements and the Apple Glasses picture starts to emerge: a spatial audio update for the AirPods Pro, location-based AR tools for developers, ‘App Clips’ that conveniently serve you little pop-ups of digital info, ‘hand pose’ detection in Apple’s Vision framework, and even new 3D icons that look ideal for AR.

There’s no doubt about it – the AR chess pieces are moving into place right across Apple’s ecosystem with iOS 14. And, like Clark Kent, Apple now needs only a pair of smart glasses to complete the look.

The Invisible Glasses

Talking of Superman, perhaps the most overlooked and impressive demo at WWDC was one that soared over a digital San Francisco in a preview of ARKit 4. ARKit is Apple’s set of software tools for AR app developers, which Apple claims “powers the world’s largest AR platform, iOS”.

You might not be aware that iOS is an AR platform because, well, the tech is still very much in its toddler phase. But a particular ARKit 4 demo, which showed the kit’s new ‘location anchors’, revealed how quickly that’s about to change with iOS 14 and iPadOS 14. These ‘location anchors’ let apps pin AR creations – like statues, game characters or giant signposts – to very specific locations in the real world. In other words, Apple’s AR is stepping outside.

This means that everyone in those locations, some of whom may soon be wearing Apple Glasses, can wander around the same virtual creation and experience it in the same way. Which is a huge deal. Aside from Pokemon Go, true AR has largely been stuck indoors shifting around virtual IKEA furniture. And while virtual home shopping will certainly become big, AR’s move into the great outdoors with iOS 14 is a big leap that paves the way for Apple Glasses.
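
For developers, that outdoor anchoring boils down to a couple of ARKit 4 calls. Here’s a minimal sketch of the idea, assuming an existing ARSession (from a RealityKit ARView, say) – the function name and the San Francisco coordinates are purely illustrative:

```swift
import ARKit
import CoreLocation

// A minimal sketch of ARKit 4's location anchors. `session` is assumed to come
// from an existing AR view; the coordinates are illustrative, not from Apple.
func pinStatue(to session: ARSession) {
    // Run a geo-tracking session, which fuses GPS, the compass and Apple Maps data.
    session.run(ARGeoTrackingConfiguration())

    // Anchor virtual content to a real-world latitude/longitude.
    let coordinate = CLLocationCoordinate2D(latitude: 37.7956, longitude: -122.3936)
    session.add(anchor: ARGeoAnchor(coordinate: coordinate))
}
```

Anything an app attaches to that anchor should then show up at the same real-world spot for anyone else running a geo-tracking session there.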

On location

Perhaps the most exciting thing about ‘location anchors’, though, is the tech behind them. On iOS 14 and iPadOS 14 devices, ARKit 4 can crunch together your geographic coordinates with high-res map data from Apple Maps.

According to Apple ARKit engineer Quinton Petty, this process – which Apple calls ‘visual localization’ – means you’ll be able to “precisely locate your device in relation to the surrounding environment more accurately than could be done before with just GPS”. This is crucial for a good outdoor AR experience, not to mention other smartphone apps. It’s also where Apple’s approach deviates from rivals like Google and Niantic, the maker of Pokemon Go.

Whereas Niantic recently started collecting 3D visual data from its players, raising privacy concerns, Apple said at WWDC that its location-based AR uses advanced machine learning techniques run “right on your device” and that “there’s no processing in the cloud, and no image is sent back to Apple”. Which neatly fitted Apple's wider privacy theme better than an AirPod slotting into its charging case.

Treasure maps

Been wondering why Apple persists with Apple Maps? It’s the foundation for the AR layer Apple is building on top of the real world, rather than just another way to help you get to the supermarket – even if those new cycling directions in Apple Maps on iOS 14 do look incredibly handy.

Naturally, there is still a lot of digital surveying to be done. Right now, those ‘location anchors’ are only available in the San Francisco Bay Area, New York, Los Angeles, Chicago and Miami, with “more cities coming through the summer”. This is because much of the localization accuracy appears to be based on Look Around data, which is Apple Maps’ equivalent of Google’s Street View. 
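
Handily, ARKit 4 can also tell an app up front whether a given spot has been mapped yet, so developers can fall back gracefully outside those cities. Here’s a rough sketch, with the coordinate again purely illustrative:

```swift
import ARKit
import CoreLocation

// A sketch of checking geo-tracking support: first on this device, then at a
// specific place. The coordinate is illustrative, not an Apple-documented value.
func checkGeoTracking() {
    guard ARGeoTrackingConfiguration.isSupported else { return } // needs a recent iPhone or iPad

    let spot = CLLocationCoordinate2D(latitude: 37.7956, longitude: -122.3936)
    ARGeoTrackingConfiguration.checkAvailability(at: spot) { available, error in
        // `available` comes back false anywhere Apple hasn't mapped yet.
        print(available ? "Geo tracking available here" : "Not mapped yet: \(String(describing: error))")
    }
}
```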

It’s going to take a while to make it global, but iOS 14 is a big step towards Apple Glasses (which are expected to arrive in either March 2021 or sometime in 2022) and an outdoor AR experience that’ll see your smartphone apps and games leap into the real world.

The missing pieces

While ‘location anchors’ were the most explicit nod to Apple’s AR plans at WWDC 2020, there were plenty of subtler hints at the theme too.

The AirPods Pro have a new spatial audio feature, for example, that will bring 3D sound to your favorite true wireless earbuds. Which sounds a bit puzzling, unless you watch a lot of Dolby Atmos films with your AirPods. Still, the real benefit could eventually come with AR, with your phone either giving you simple audio cues for Maps directions or working with Apple Glasses for a truly immersive AR experience.

In a similar way, iOS 14’s new ‘App Clips’ feature – which lets you use a small slice of an app without installing the whole thing – could have some immediate benefits, like quickly paying for a smart scooter. But the ultimate aim feels more like it’ll be helping you launch AR experiences by scanning real-world objects.
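
To give a flavour of how that could work, here’s a hedged sketch of an App Clip reading the web URL it was invoked from – the scooter scenario and the ‘id’ parameter are made up for illustration, not taken from any real service:

```swift
import SwiftUI
import Foundation

// A sketch of an App Clip's entry point pulling context out of its invocation URL.
// The scooter scenario and the "id" query parameter are hypothetical.
@main
struct ScooterClip: App {
    @State private var scooterID: String?

    var body: some Scene {
        WindowGroup {
            Text(scooterID.map { "Unlock scooter \($0)" } ?? "Point your phone at the code on the scooter")
                .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                    // The URL that launched the clip carries the context it needs.
                    guard let url = activity.webpageURL,
                          let components = URLComponents(url: url, resolvingAgainstBaseURL: true)
                    else { return }
                    scooterID = components.queryItems?.first(where: { $0.name == "id" })?.value
                }
        }
    }
}
```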

There were countless other hints at WWDC 2020 too – some incredible 'hand pose' recognition for gesture controls in Apple's Vision framework, new 'scene geometry' in ARKit 4 that lets a lidar sensor automatically categorize different objects and surfaces, and, as AR developer Lucas Rizzotto pointed out on Twitter, a new 3D design language that looks ideal for augmented reality and Apple Glasses.
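
To give a taste of the first of those, here’s a small sketch of the Vision framework’s new hand pose request spotting a basic pinch gesture – the confidence and distance thresholds are plucked out of the air for illustration:

```swift
import Vision
import CoreGraphics

// A sketch of iOS 14's hand pose detection: returns true if the thumb and index
// fingertips are close enough to count as a pinch. Thresholds are illustrative guesses.
func detectPinch(in image: CGImage) throws -> Bool {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
    guard let hand = request.results?.first as? VNHumanHandPoseObservation else { return false }

    // Each joint comes back as a normalized point with a confidence score.
    let thumb = try hand.recognizedPoint(.thumbTip)
    let index = try hand.recognizedPoint(.indexTip)
    guard thumb.confidence > 0.3, index.confidence > 0.3 else { return false }

    // Treat fingertips within roughly 5% of the frame of each other as a pinch.
    let dx = thumb.location.x - index.location.x
    let dy = thumb.location.y - index.location.y
    return (dx * dx + dy * dy).squareRoot() < 0.05
}
```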

Considering Apple hardly mentioned AR at WWDC 2020, it was an impressively loud statement for such a ‘quiet’ show. Who knows, by the time WWDC 2021 comes around, Tim Cook might be wearing some considerably smarter spectacles.



from Future - All the latest news https://ift.tt/3fV6Baq
