User level: basic

Since its appearance on the market, Augmented Reality has given us the possibility to create immersive and exciting experiences for mobile users. Used both in big business systems and in everyday apps, AR can save us time: we can check how our dream sofa would look in our apartment, whether the selected paint color will match the interior, or even quickly try on new makeup.

ARKit 3 on iPhone X and newer offers powerful face tracking capabilities. Using the right components we can easily prepare apps for trying on jewelry, makeup, or glasses. In this sample, we’ll focus on the last option – making an AR app for trying on glasses with a simple frame color change.

ARKit 3 requirements & setup

AR face tracking with ARKit 3 requires the A11 Bionic chip introduced in the iPhone X, so we’ll need this device or a newer one to make our sample work. To access all the features of ARKit 3, like multiple face tracking, we’ll need to use Unity 2019.1 or later for our project.
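
Besides checking the hardware spec, you can also verify AR support at runtime. The snippet below is a minimal sketch using AR Foundation’s ARSession.CheckAvailability(); the ARSupportChecker class name is just an example and isn’t part of the project described in this article.

using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ARSupportChecker : MonoBehaviour
{
	private IEnumerator Start ()
	{
		// Ask the platform whether AR is available; the result ends up in ARSession.state.
		if (ARSession.state == ARSessionState.None || ARSession.state == ARSessionState.CheckingAvailability)
		{
			yield return ARSession.CheckAvailability();
		}

		if (ARSession.state == ARSessionState.Unsupported)
		{
			Debug.LogWarning("AR is not supported on this device.");
		}
	}
}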

Once we’ve created our project, we need to install the necessary ARKit packages. Since July 2019, ARKit for Unity has been part of the AR Foundation package, so we open the Package Manager from the Window menu and install the following packages:

  1. AR Foundation
  2. AR Subsystems
  3. AR Face Tracking for all the required face tracking scripts
  4. XR Legacy Input Helpers for the Tracked Pose Driver script

After the installation, it’s a good time to switch the project to the iOS platform and set the player settings. ARKit 3.0 requires a minimum iOS SDK version of 11.0, so go to Other Settings in the Player Settings window and set the Target minimum iOS Version to 11.0.

Next, open the Architecture dropdown in the same window and select ARM64, since ARKit supports only 64-bit devices. Another thing you will need to define is the Camera Usage Description – this is the text that will be shown when the app asks for permission to use the device camera. You can set it to any text you want, like “ARKit 3 Face Detection”.
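
If you prefer to configure these settings from a script instead of clicking through the Player Settings window, an editor helper along the lines below should do the same job. This is a sketch: the menu path and class name are made up, and the integer passed to SetArchitecture is assumed to map to ARM64 on iOS.

#if UNITY_EDITOR
using UnityEditor;

public static class IOSPlayerSettingsSetup
{
	[MenuItem("Tools/Configure iOS AR Settings")]
	public static void Configure ()
	{
		// Switch the active build target to iOS.
		EditorUserBuildSettings.SwitchActiveBuildTarget(BuildTargetGroup.iOS, BuildTarget.iOS);

		// Target minimum iOS Version and 64-bit architecture.
		PlayerSettings.iOS.targetOSVersionString = "11.0";
		PlayerSettings.SetArchitecture(BuildTargetGroup.iOS, 1); // assumption: 1 = ARM64 on iOS

		// Text shown when the app asks for permission to use the camera.
		PlayerSettings.iOS.cameraUsageDescription = "ARKit 3 Face Detection";
	}
}
#endif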

AR session objects

When the project is properly set up, we can start creating the necessary AR session objects. First off, we create a GameObject in our scene that we’ll call ARSession, and we add two components to it – an AR Session script and an AR Input Manager script.

Next, we create an AR Session Origin object with an AR Session Origin script and an AR Face Manager. The face manager script is very important, since ARKit normally doesn’t use the front-facing camera of the device – it’s only used in face tracking/identification scenarios. Notice that the AR Session Origin has an empty field for a Camera script.

Let’s move the Main Camera inside the AR Session Origin object and drop a reference to it in the AR Session Origin script. We will also need to add three scripts to the camera:

  • Tracked Pose Driver (in the Tracked Pose Driver, we need to change the Pose Source to Color Camera and tick the Use Relative Transform setting)
  • AR Camera Manager
  • AR Camera Background

Have you noticed that the AR Face Manager on the AR Session Origin object has an empty field for a Face Prefab? That’s the prefab we will use to visualize our glasses based on ARFace data.
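
For reference, here is how the same hierarchy could be assembled from code. This is an illustrative sketch rather than part of the original project; it assumes the AR Foundation components named above (ARSession, ARInputManager, ARSessionOrigin, ARFaceManager, ARCameraManager, ARCameraBackground) and that ARFaceManager exposes the Face Prefab through its facePrefab property.

using UnityEngine;
using UnityEngine.SpatialTracking;
using UnityEngine.XR.ARFoundation;

public static class ARSceneBuilder
{
	public static void BuildFaceTrackingScene (Camera mainCamera, GameObject glassesPrefab)
	{
		// ARSession object with the session and input manager scripts.
		var session = new GameObject("ARSession");
		session.AddComponent<ARSession>();
		session.AddComponent<ARInputManager>();

		// AR Session Origin with the face manager that will spawn our glasses prefab.
		var origin = new GameObject("AR Session Origin");
		var sessionOrigin = origin.AddComponent<ARSessionOrigin>();
		var faceManager = origin.AddComponent<ARFaceManager>();
		faceManager.facePrefab = glassesPrefab;

		// Move the main camera under the origin and reference it in the AR Session Origin script.
		mainCamera.transform.SetParent(origin.transform, false);
		sessionOrigin.camera = mainCamera;

		// Camera scripts: pose driver set to Color Camera with a relative transform, plus camera manager and background.
		var poseDriver = mainCamera.gameObject.AddComponent<TrackedPoseDriver>();
		poseDriver.SetPoseSource(TrackedPoseDriver.DeviceType.GenericXRDevice, TrackedPoseDriver.TrackedPose.ColorCamera);
		poseDriver.UseRelativeTransform = true;
		mainCamera.gameObject.AddComponent<ARCameraManager>();
		mainCamera.gameObject.AddComponent<ARCameraBackground>();
	}
}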

The AR Glasses Prefab

We’ll have to create a new GameObject; we called ours simply GlassesPrefab. The first thing we need to do with our prefab is attach two components to it. The first one is an ARFace component, which will provide us with the detected face point data. The second is our script called ARGlassesController, which will be responsible for positioning the glasses according to the data updated by ARFace.


Once we’ve done that, we’ll need to create a GameObject inside the prefab that will have our glasses mesh attached to it, which we’ll manipulate from the ARGlassesController. It’s essential to store the manipulated mesh as a child object inside our prefab: the prefab’s position is controlled by the ARFaceManager, while the ARFace vertices are local-space points for the features of the detected face.
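
To make that structure explicit, here is a hypothetical helper that builds the same parent/child layout in code – an ARFace-driven root with the glasses mesh stored as a child object. The GlassesModel name is just an example, and you would still assign the controller’s ModelTransform and FrameMaterial references in the Inspector, since they are serialized properties.

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public static class GlassesPrefabBuilder
{
	public static GameObject Build (Mesh glassesMesh, Material frameMaterial)
	{
		// Root object: receives ARFace data and hosts our controller.
		var root = new GameObject("GlassesPrefab");
		root.AddComponent<ARFace>();
		root.AddComponent<ARGlassesController>();

		// Child object: the actual glasses mesh that the controller will move around.
		var model = new GameObject("GlassesModel");
		model.transform.SetParent(root.transform, false);
		model.AddComponent<MeshFilter>().sharedMesh = glassesMesh;
		model.AddComponent<MeshRenderer>().sharedMaterial = frameMaterial;

		return root;
	}
}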

Our ARGlassesController looks like this:

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

[RequireComponent(typeof(ARFace))]
public class ARGlassesController : MonoBehaviour
{
	public static ARGlassesController Instance { get; private set; }

	[field: SerializeField]
	public Transform ModelTransform { get; private set; }

	[field: SerializeField]
	public Material FrameMaterial { get; private set; }

	private ARFace ARFaceComponent { get; set; }

	private const string MATERIAL_COLOR_SETTING_NAME = "_Color";
	private const int AR_GLASSES_PLACEMENT_VERTICE_INDEX = 16;

	public void ChangeFrameColor (Color color)
	{
		if (FrameMaterial != null)
		{
			FrameMaterial.SetColor(MATERIAL_COLOR_SETTING_NAME, color);
		}
	}

	protected virtual void Awake ()
	{
		if (Instance == null)
		{
			Instance = this;
		}

		ARFaceComponent = GetComponent<ARFace>();
	}

	protected virtual void OnDestroy ()
	{
		Instance = null;
	}

	protected virtual void OnEnable ()
	{
		ARFaceComponent.updated += TryToUpdateModelStatus;
		ARSession.stateChanged += TryToUpdateModelStatus;
		TryToUpdateModelStatus();
	}

	protected virtual void OnDisable ()
	{
		ARFaceComponent.updated -= TryToUpdateModelStatus;
		ARSession.stateChanged -= TryToUpdateModelStatus;
	}

	private void TryToUpdateModelStatus (ARFaceUpdatedEventArgs eventArgs)
	{
		TryToUpdateModelStatus();
	}

	private void TryToUpdateModelStatus (ARSessionStateChangedEventArgs eventArgs)
	{
		TryToUpdateModelStatus();
	}

	private void TryToUpdateModelStatus ()
	{
		bool isFaceVisible = GetFaceVisibility();
		ModelTransform.gameObject.SetActive(isFaceVisible);

		if (isFaceVisible == true)
		{
			ModelTransform.localPosition = ARFaceComponent.vertices[AR_GLASSES_PLACEMENT_VERTICE_INDEX];
		}
	}

	private bool GetFaceVisibility ()
	{
		return enabled == true && ARFaceComponent.trackingState != TrackingState.None && ARSession.state > ARSessionState.Ready;
	}
}

 

ARKit 3: ARFace data and face detection

What happens is very simple – once the ARFaceManager detects a face, it sends new ARFace data. Since our controller is connected to the updated event of ARFace, whenever a new set of face point coordinates is detected, the script updates the position of the glasses.

The vertex array indices always reference the same characteristic points on a human face; the only thing that changes is the overall positioning and face mesh, depending on a person’s facial features. We set our glasses to stick to vertex index 16, which provides the following result:

[Images: ARKit glasses examples]
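
If you want to experiment with other anchor points, a quick way to find a suitable index is to attach a small debugging component like the one below to the prefab and change the inspected index while the app is running. This ARFaceVertexProbe class is only a debugging aid sketched for this article, not part of the final app.

using UnityEngine;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(ARFace))]
public class ARFaceVertexProbe : MonoBehaviour
{
	// Index of the vertex to inspect; tweak it in the Inspector at runtime.
	[SerializeField]
	private int vertexIndex = 16;

	private ARFace Face { get; set; }
	private Transform Marker { get; set; }

	protected virtual void Awake ()
	{
		Face = GetComponent<ARFace>();

		// A small sphere that follows the selected vertex.
		Marker = GameObject.CreatePrimitive(PrimitiveType.Sphere).transform;
		Marker.SetParent(transform, false);
		Marker.localScale = Vector3.one * 0.005f;
	}

	protected virtual void OnEnable ()
	{
		Face.updated += OnFaceUpdated;
	}

	protected virtual void OnDisable ()
	{
		Face.updated -= OnFaceUpdated;
	}

	private void OnFaceUpdated (ARFaceUpdatedEventArgs eventArgs)
	{
		if (vertexIndex >= 0 && vertexIndex < Face.vertices.Length)
		{
			// ARFace vertices are local to the face anchor, so a local position is enough here.
			Marker.localPosition = Face.vertices[vertexIndex];
		}
	}
}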

Color controller

We want to change the color of the frames using a ColorController script attached to buttons in a simple UI. That’s why we made ARGlassesController a singleton – whenever it’s available in the scene, the ColorController can access it.

using UnityEngine;

public class ColorController : MonoBehaviour
{
	[field: SerializeField]
	public Color Color { get; private set; }

	public void SetFrameColor()
	{
		if (ARGlassesController.Instance != null)
		{
			ARGlassesController.Instance.ChangeFrameColor(Color);
		}
	}
}

Each color button in the scene has the SetFrameColor function connected to its On Click event, so whenever we click a button, the frame color changes to the one defined in that button’s controller.
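
Connecting the buttons in the Inspector is enough, but if you would rather wire them up from code, a hypothetical binder like the one below works as well. It assumes each button already has a ColorController attached with its Color set.

using UnityEngine;
using UnityEngine.UI;

public class ColorButtonBinder : MonoBehaviour
{
	[SerializeField]
	private Button[] colorButtons;

	protected virtual void Awake ()
	{
		foreach (Button button in colorButtons)
		{
			// Each button carries its own ColorController with the color it should apply.
			ColorController controller = button.GetComponent<ColorController>();

			if (controller != null)
			{
				button.onClick.AddListener(controller.SetFrameColor);
			}
		}
	}
}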

Conclusion

I hope the above instructions prove helpful in preparing your AR face tracking app. This is just a simple example, but ARKit Face Tracking can be used to create much more complex applications, not only for fashion and cosmetics but also for industrial purposes. Apple already gives us access to very advanced and powerful tools, and we’re excited to see how this technology will evolve in the future.



Unity Developer with more than 12 years of experience in the industry. He's been developing VR & AR applications since 2018.