Apple ARKit 4

UPDATE: ARKit 5 is here! What’s new?

 

Here it is. The latest version of Apple’s Augmented Reality framework, ARKit 4, has just been announced at WWDC 2020. We reviewed ARKit 3 last year, and we were looking forward to what this new version could bring.

We haven’t been able to take it for a firsthand test drive yet (spoiler alert: we will). These are our thoughts based on what was revealed during the WWDC 2020 talks.

Let’s see what’s new in ARKit 4 and for Augmented Reality app development on iOS. 

 


 

Location anchors

ARKit location anchors allow you to place virtual content in relation to real-world anchors, such as points of interest in a city, taking the AR experience outdoors.

By setting geographic coordinates (latitude, longitude, and altitude) and leveraging data from Apple Maps, you can create AR experiences linked to a specific world location. Apple calls this process “visual localization”: locating your device in relation to its surroundings with a higher degree of accuracy.

In the example shown by Apple, the virtual “Ferry Building” sign is associated with the real building location. The code that makes it possible looks like this: 
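(A minimal sketch using the public ARGeoTrackingConfiguration and ARGeoAnchor APIs; the Ferry Building coordinates below are approximate, and error handling is omitted.)

import ARKit
import CoreLocation
import Foundation

func placeFerryBuildingSign(in session: ARSession) {
    // Geotracking needs a supported device *and* Apple Maps coverage at the location.
    ARGeoTrackingConfiguration.checkAvailability { available, _ in
        guard available else { return }
        DispatchQueue.main.async {
            // Run a geotracking session instead of plain world tracking.
            session.run(ARGeoTrackingConfiguration())

            // Latitude, longitude, and altitude of the Ferry Building, San Francisco.
            let coordinate = CLLocationCoordinate2D(latitude: 37.7955, longitude: -122.3937)
            let signAnchor = ARGeoAnchor(coordinate: coordinate, altitude: 11)
            session.add(anchor: signAnchor)
        }
    }
}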

 

All iPhone and iPad models with at least an A12 Bionic chip and GPS are supported.

This is one neat feature that has the potential to enable things such as virtual art/statues or virtual graffiti in public spaces. Could you imagine experiencing Godzilla in the streets of Tokyo through your phone?   

Unfortunately, since location anchors require Apple Maps data, the set of supported cities is initially pretty small. The only supported cities at launch are San Francisco, New York, Los Angeles, Chicago, and Miami. Let’s hope Apple can update this list quickly to include locations such as Boston, Paris, Barcelona, and basically the rest of the world.

 

Depth API

The new ARKit Depth API, coming in iOS 14, provides access to valuable environment depth data, enabling better scene understanding and superior occlusion handling.

It relies on the LiDAR scanner introduced in the latest iPad Pro.   

The new Depth API works together with the scene geometry API (released with ARKit 3.5), which builds a 3D matrix of readings of the environment, each reading carrying a confidence value. Combined, these readings provide detailed depth information, improving scene understanding and virtual-object occlusion.
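As a rough sketch of how you opt in: you enable the .sceneDepth frame semantic and read the depth and confidence buffers from each frame (the process(_:) function name here is just for illustration):

import ARKit

// Enable per-frame depth on LiDAR devices (iOS 14).
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    configuration.frameSemantics.insert(.sceneDepth)
}

// Then, for each ARFrame delivered by the session:
func process(_ frame: ARFrame) {
    guard let depth = frame.sceneDepth else { return }
    let depthMap = depth.depthMap            // CVPixelBuffer of depths, in meters
    let confidenceMap = depth.confidenceMap  // per-pixel .low / .medium / .high
    // Feed both buffers into your occlusion or scene-understanding pipeline.
    _ = (depthMap, confidenceMap)
}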

By the looks of this investment at the API level, it seems reasonable to expect Apple to include a LiDAR scanner in the next iPhone as well.

 

Improved Object Placement

The LiDAR scanner brings more improvements to the table for AR development. In ARKit 3, Apple introduced the raycasting API, which makes it easy to place virtual objects on real-world surfaces by finding 3D positions on them. Now, thanks to the LiDAR sensor, this process is quicker and more accurate than before.

Raycasting leverages scene depth to understand the environment and place virtual objects attached to a real-world plane, such as a couch, a table, or the floor. With the LiDAR sensor, you don’t need to wait for plane scanning before spawning virtual content; it happens instantly.
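A minimal sketch of what that looks like in code, assuming arView is a running RealityKit ARView and screenPoint is where the user tapped:

import ARKit
import RealityKit

// Place a small box where the user tapped (screenPoint is in view coordinates).
func placeBox(at screenPoint: CGPoint, in arView: ARView) {
    // .estimatedPlane returns results right away; LiDAR devices make them precise.
    guard let result = arView.raycast(from: screenPoint,
                                      allowing: .estimatedPlane,
                                      alignment: .horizontal).first else { return }

    let box = ModelEntity(mesh: .generateBox(size: 0.1))
    let anchor = AnchorEntity(world: result.worldTransform)
    anchor.addChild(box)
    arView.scene.addAnchor(anchor)
}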

This, added to the improvements in object occlusion, makes virtual objects appear and behave more realistically.

 

 

Face Tracking

Face tracking detects and tracks faces with the front camera, allowing you to create AR experiences with that input, such as applying face filters or virtual animations in real time.

At first, this feature was only available on devices with a TrueDepth camera. With ARKit 4, support is extended to devices without this camera, provided they have at least an A12 Bionic processor.

Most of the features face tracking brings, such as face anchors and face geometry, are available on all supported devices. However, to leverage complex depth data, you still need a TrueDepth camera, which can now track up to three faces at once, opening a wide range of possibilities for front-camera AR experiences (something I’m sure we’ll see applied to Memojis).
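Configuring it is straightforward; here’s a minimal sketch, assuming session is the view’s ARSession:

import ARKit

// Start multi-face tracking on any supported device.
func startFaceTracking(with session: ARSession) {
    guard ARFaceTrackingConfiguration.isSupported else { return }

    let configuration = ARFaceTrackingConfiguration()
    // Up to 3 faces on TrueDepth devices; fewer on other supported hardware.
    configuration.maximumNumberOfTrackedFaces =
        ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
    session.run(configuration)
}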

 

Video Materials

RealityKit is Apple’s framework for simulating and rendering 3D content in your AR app. Introduced last year with ARKit 3, it’s a great tool that makes creating high-quality AR experiences easier. Now RealityKit is also getting new features with the arrival of iOS 14 and ARKit 4. One of them is video materials.

In a nutshell, video materials let you use a video as both a texture source and an audio source. With a video as the texture, you can have surfaces that change over time, simulate effects on 3D objects, or even bring images to life. In the example shown by Apple, a glowing effect plays across the virtual object thanks to a video material.

On the audio side, video materials let you add spatialized sound to your AR experience, improving its realism. Applying a video material to a virtual object turns it into a spatialized audio source: since the audio is emitted from a specific location (the virtual content itself), the experience becomes even more immersive.
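Here’s a minimal sketch of both ideas, assuming arView is a running ARView; the bundled asset name glow.mp4 is hypothetical:

import RealityKit
import AVFoundation

// Show a video as a texture (and spatialized audio source) on a virtual screen.
func showVideoScreen(in arView: ARView) {
    guard let url = Bundle.main.url(forResource: "glow", withExtension: "mp4") else { return }
    let player = AVPlayer(url: url)

    let screen = ModelEntity(mesh: .generatePlane(width: 1.6, depth: 0.9),
                             materials: [VideoMaterial(avPlayer: player)])

    // The video's audio plays back spatialized from the entity's position.
    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(screen)
    arView.scene.addAnchor(anchor)
    player.play()
}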

 

Final thoughts

So this is it: ARKit 4 is finally here, with improved performance and new capabilities that leverage Apple’s newest and most powerful devices. It’s nice to see that Apple keeps investing in AR development, and we look forward to testing all these features ourselves.

The caveat is that many of these new features depend entirely on hardware only available in the latest devices, limiting widespread adoption.

It seems Apple is laying the foundations for new generations of devices that will include LiDAR scanners. It wouldn’t be a surprise to see one added to the new iPhone Pro models coming later this year, and beyond 2020 they may be part of the much-rumored Apple AR glasses.

It’s not part of the ARKit API, but there are also a bunch of new and exciting features in the Vision API. If you are interested in AR development on the iOS platform, you should check them out too, along with RealityKit.

It’s also worth mentioning that the writing is on the wall for SceneKit: we saw no news of investment in it from Apple at WWDC 2020. On the one hand, it’s good to see the building blocks of mobile augmented reality consolidate on the iOS platform; on the other hand, as developers it generates some uncertainty, since in a relatively new field changes are bound to happen and you may find yourself betting on the wrong horse. I guess it goes with the territory.

If you are looking for augmented reality mobile solutions, or need to implement ARKit 4 in your mobile app, don’t hesitate to get in touch with us. We’d love to help 🙂

 

Let’s talk!