
ARKit 5 — Apple’s framework for developing augmented reality experiences on iOS & iPadOS — has just been announced at WWDC 21. We are glad to see that Apple keeps investing in the AR field, bringing more features and improvements every year.

As we did with previous ARKit versions, we're sharing our thoughts on it as mobile AR developers.

What’s new in ARKit 5? What does it mean for augmented reality development on the Apple ecosystem?

Let’s dive into it!

 

Enhanced Location Anchors

Introduced in ARKit 4, Location Anchors let you place virtual objects outdoors, anchored to the real world at a specific latitude, longitude, and altitude.

It basically lets the user experience virtual content more realistically: you can move around an object and see it from different perspectives, just as you would with a real one.

For instance, imagine witnessing the epic fight between Godzilla and Kong in Hong Kong, as if these two big fellas were actually there. I'm already hyped just picturing the moment.

Since Location Anchors leverage data from Apple Maps, the only supported cities at launch were San Francisco, New York, LA, Chicago, and Miami. Fortunately, ARKit 5 now brings the feature to more than 25 cities across the U.S., including Boston, Houston, and Atlanta. It is also expanding beyond the U.S., starting with London. You can check out the complete list of supported locations here.

With the arrival of iOS 15, the updated Maps app introduces turn-by-turn walking directions (in selected cities) by using AR and Location Anchors. It’s an interesting use case of this technology applied to navigation.

As with other ARKit capabilities such as plane detection, Apple added a guided onboarding process. Users now see a coaching overlay that displays an animation explaining how to use features that rely on Location Anchors. Here's the code behind it:
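What follows is a minimal sketch rather than Apple's exact snippet; it assumes an ARView named arView that is already in your view hierarchy, and the helper's name is ours:

import ARKit
import RealityKit
import UIKit

func startLocationAnchorExperience(in arView: ARView) {
    // Geo tracking only works on supported devices and in supported locations.
    guard ARGeoTrackingConfiguration.isSupported else { return }

    // Run the session with a geo-tracking configuration.
    arView.session.run(ARGeoTrackingConfiguration())

    // Add the coaching overlay with the new geo-tracking onboarding goal.
    let coachingOverlay = ARCoachingOverlayView()
    coachingOverlay.session = arView.session
    coachingOverlay.goal = .geoTracking
    coachingOverlay.translatesAutoresizingMaskIntoConstraints = false
    arView.addSubview(coachingOverlay)

    NSLayoutConstraint.activate([
        coachingOverlay.topAnchor.constraint(equalTo: arView.topAnchor),
        coachingOverlay.bottomAnchor.constraint(equalTo: arView.bottomAnchor),
        coachingOverlay.leadingAnchor.constraint(equalTo: arView.leadingAnchor),
        coachingOverlay.trailingAnchor.constraint(equalTo: arView.trailingAnchor)
    ])
}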

 

And this is the result:

 

Apple also shared some recommendations for working with Location Anchors. First, you can leverage recording and replay in Reality Composer to speed up development. This comes in handy with outdoor content, as you don't need to go outside every time just for testing.

Moreover, make sure to adjust the coordinates of the virtual content relative to the real anchor. For instance, it wouldn't look good if the AR object overlapped the actual building and blocked the user's field of view. So check those altitude values and let the AR content float!
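For illustration, placing a geo anchor with an explicit altitude looks roughly like this (the coordinates and the 50-meter value are made up for the example, and session is assumed to be an ARSession already running an ARGeoTrackingConfiguration):

import ARKit
import CoreLocation

// Hypothetical spot in San Francisco, lifted to an explicit altitude (in meters)
// so the virtual content floats instead of overlapping the real building.
let coordinate = CLLocationCoordinate2D(latitude: 37.7956, longitude: -122.3934)
let geoAnchor = ARGeoAnchor(coordinate: coordinate, altitude: 50)

// Add the anchor to the running session, then attach your content to it.
session.add(anchor: geoAnchor)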

To see how you can integrate Location Anchors, take a look at our previous post.

 

App Clip Codes Integration

Moving forward, Apple introduced the integration of App Clip Codes with AR experiences.

In a nutshell, an App Clip is a small part of an app that can be used when needed. It’s fast, lightweight, and it doesn’t require you to download the full app at once.

App Clip Codes let the user launch the App Clip experience with ease by scanning them. Some App Clip Codes can be scanned with the Camera, or by holding the device close to the code. In a way, they work much like the popular QR codes.

The integration of App Clip Codes with ARKit lets developers create AR experiences that can be launched instantly without installing any app. For retail, this could be useful for showcasing products in a more immersive way.

Take a look at Apple’s example:

There's also another great use case of App Clip Codes and ARKit for tools or machines: it's now easier to display instructions on how to operate a specific device, pointing out the buttons that need to be pushed.


App Clip Codes not only allow integrations with other tracking capabilities to enhance AR experiences over objects, but they also make it easier and faster to launch those experiences.
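As a rough sketch of how the pieces fit together (this is not the sample project's code, and the class name is ours), you enable App Clip Code tracking on a world-tracking configuration and read the decoded URL from the anchor ARKit creates:

import ARKit

final class AppClipCodeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // App Clip Code tracking is only available on recent devices.
        guard ARWorldTrackingConfiguration.supportsAppClipCodeTracking else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.appClipCodeTrackingEnabled = true
        session.delegate = self
        session.run(configuration)
    }

    // ARKit adds an ARAppClipCodeAnchor once a code is detected; its URL decodes asynchronously.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let codeAnchor as ARAppClipCodeAnchor in anchors
        where codeAnchor.urlDecodingState == .decoded {
            if let url = codeAnchor.url {
                print("App Clip Code detected, pointing to \(url)")
                // Anchor your AR content relative to codeAnchor here.
            }
        }
    }
}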

You can get a first-hand test drive by downloading Apple’s sample project.

 

Extended Face Tracking Support

Face tracking lets you detect and track faces with the front-facing camera, place virtual content over them, and drive face animations in real time.

With ARKit 4, support was extended to devices without a TrueDepth sensor, as long as they have an A12 Bionic processor or later.

Now, ARKit 5 leverages the latest iPad Pro's front camera by adding support for its ultra-wide field of view.

It’s good to see improvements not only for the rear cameras. It’s selfie time!

The code to implement ultra-wide FOV support on your app looks like this:
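Apple's exact snippet isn't reproduced here, but a minimal sketch looks roughly like this (the helper's name is ours; on devices without an ultra-wide front camera it simply falls back to the default format):

import ARKit
import AVFoundation

func makeFaceTrackingConfiguration() -> ARFaceTrackingConfiguration {
    let configuration = ARFaceTrackingConfiguration()

    // Pick a supported video format backed by the ultra-wide camera, if the device offers one.
    if let ultraWideFormat = ARFaceTrackingConfiguration.supportedVideoFormats
        .first(where: { $0.captureDeviceType == .builtInUltraWideCamera }) {
        configuration.videoFormat = ultraWideFormat
    }
    return configuration
}

// Usage, assuming an ARSession named session:
// session.run(makeFaceTrackingConfiguration())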

 

Improved Motion Capture

Motion Capture received some improvements as well. Introduced with ARKit 3, Motion Capture allows you to track human movement and use it as input for AR experiences.

In iOS 15, and on devices with at least an Apple A14 Bionic processor such as the iPhone 12, Motion Capture supports even more body poses and gestures. Rotations are more accurate, and you can track these movements from a greater distance.

There’s been an improvement in tracking the motion of limbs as well:

If you compare it with previous ARKit versions, you can see how smoothly it works now.
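For reference, the basic Motion Capture setup these improvements build on looks roughly like this (a minimal sketch; the class name is ours):

import ARKit

final class BodyTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Body tracking is only available on supported devices.
        guard ARBodyTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // Joint transforms are expressed relative to the body anchor's origin (the hip joint).
            if let leftHand = bodyAnchor.skeleton.modelTransform(for: .leftHand) {
                // Use the joint transform as input for your AR experience.
                print("Left hand transform: \(leftHand.columns.3)")
            }
        }
    }
}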

 

Final thoughts

ARKit 5 enters the stage, and it’s clear that Apple keeps working on improving the performance of its AR development framework.

By leveraging the latest iPhone and iPad capabilities, developers can now create enhanced augmented reality experiences that happen outdoors (thanks to Location Anchors' extended support), launch faster (with App Clip Codes), and use the human body as input more accurately than before.

It’s also worth mentioning there have been improvements to RealityKit, Apple’s framework for simulating and rendering 3D AR content. For instance, the new RealityKit 2’s Object Capture API allows devs to create 3D models by using iPhone photos. Check out how this works here.

Are you looking to implement an AR experience on your mobile app? Drop us a line! 

 

Let’s talk!