Augmented Reality (AR) has taken the tech world by storm, and Apple's ARKit framework has been at the forefront of this revolution. ARKit empowers developers to create captivating AR experiences that seamlessly blend digital content with the real world. By offering tools for motion tracking, object detection, and real-time 3D rendering, ARKit has opened up a realm of possibilities for iOS apps.
In this blog, we'll dive into the fascinating world of ARKit and explore how it can be harnessed for face detection and tracking, unlocking innovative features and applications.
Getting Started
To kickstart your ARKit journey, fire up Xcode and create a new project. Begin by adding an ARSCNView to your storyboard. This view acts as your window into the AR world, merging real-world imagery from your device's camera with digital content.
Once you've added the ARSCNView, connect it to your view controller with an IBOutlet so you can reference it in code.
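Throughout this post we'll assume that outlet is named `sceneView` (a placeholder name; use whatever you wired up in your storyboard):

```swift
import UIKit
import ARKit

class ViewController: UIViewController {

    // Outlet connected to the ARSCNView added in the storyboard.
    @IBOutlet weak var sceneView: ARSCNView!
}
```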
Face the Future with ARFaceTrackingConfiguration
With ARSCNView in place, it's time to introduce face tracking. In the viewDidLoad() method of your view controller, add the following code:
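A minimal version of that setup, assuming the outlet is named `sceneView` as above, might look like this:

```swift
override func viewDidLoad() {
    super.viewDidLoad()

    // Face tracking requires the TrueDepth camera, so check support first.
    guard ARFaceTrackingConfiguration.isSupported else {
        fatalError("Face tracking is not supported on this device")
    }

    // The view controller will receive the face-tracking callbacks.
    sceneView.delegate = self

    // Configure and start a face-tracking AR session.
    let configuration = ARFaceTrackingConfiguration()
    sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```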
This snippet sets up the necessary configurations for face tracking and initializes the AR session, allowing you to view your own face within the app.
Delegate Methods: Your Bridge to AR
To fully harness ARKit's capabilities, implement the delegate methods provided by ARSCNViewDelegate. These methods enable you to manipulate and augment the user's face.
Here are the key delegate methods:
Create an extension for `ViewController` and conform it to `ARSCNViewDelegate`.
Now add these two methods.
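Here's a minimal sketch of those two methods. Rendering the tracked face mesh as a wireframe is one simple way to make the tracked geometry visible; you could equally attach small SCNSphere child nodes at individual vertices to show discrete points.

```swift
extension ViewController: ARSCNViewDelegate {

    // Called when ARKit detects a face and needs a node for its anchor.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard let device = sceneView.device,
              let faceGeometry = ARSCNFaceGeometry(device: device) else { return nil }

        // Render the tracked face mesh as a wireframe so you can see the tracking.
        let node = SCNNode(geometry: faceGeometry)
        node.geometry?.firstMaterial?.fillMode = .lines
        return node
    }

    // Called every time the face anchor updates (i.e. on every frame).
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor,
              let faceGeometry = node.geometry as? ARSCNFaceGeometry else { return }

        // Keep the rendered mesh in sync with the latest tracking data.
        faceGeometry.update(from: faceAnchor.geometry)
    }
}
```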
These methods grant you access to SCNNode, allowing you to add points, lines, or overlays to the user's face. Currently, we're adding points to represent facial features.
Unlocking the Magic: Calculating Real-World Distances
While adding points is fascinating, the real magic lies in what you can do with these points. ARKit's face tracking points enable you to calculate real-world distances between facial features like eyebrows, nose, or face width.
To calculate eyebrow lengths, for instance, you need to pick specific vertex indices from the face mesh. The left eyebrow might use points 200, 210, and 420, and the right eyebrow points 850, 648, and 767.
To make these calculations, transform the face mesh's local vertex positions into world-space points (and into 2D screen points when you want to draw them), then measure the distances between them. We've created handy extensions to simplify this process.
Extensions for ARFaceAnchor and SCNVector3
Our ARFaceAnchor extension converts the anchor's 3D face points into world-space positions and into CGPoints within your ARSCNView. We've also provided extensions for `matrix_float4x4` and `Float` for additional functionality.
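Here's a sketch of what those extensions can look like. The helper names (`vertexPositions`, `screenPoints(in:)`) are placeholders, and `facePointsArray` holds the eyebrow indices listed above:

```swift
import ARKit
import SceneKit

extension matrix_float4x4 {
    // Translation component of the transform as an SCNVector3.
    var position: SCNVector3 {
        SCNVector3(columns.3.x, columns.3.y, columns.3.z)
    }
}

extension ARFaceAnchor {
    // Mesh vertex indices for the eyebrows (first three = left, last three = right).
    var facePointsArray: [Int] { [200, 210, 420, 850, 648, 767] }

    // World-space positions of the selected face-mesh vertices.
    var vertexPositions: [SCNVector3] {
        facePointsArray.map { index in
            let vertex = geometry.vertices[index]

            // Translation matrix for this vertex in the face's local space.
            let translation = matrix_float4x4(
                SIMD4<Float>(1, 0, 0, 0),
                SIMD4<Float>(0, 1, 0, 0),
                SIMD4<Float>(0, 0, 1, 0),
                SIMD4<Float>(vertex.x, vertex.y, vertex.z, 1)
            )

            // Combine with the anchor's transform and read the world position back out.
            return simd_mul(transform, translation).position
        }
    }

    // Screen-space CGPoints for the same vertices, useful for drawing overlays.
    func screenPoints(in view: ARSCNView) -> [CGPoint] {
        vertexPositions.map { position in
            let projected = view.projectPoint(position)
            return CGPoint(x: CGFloat(projected.x), y: CGFloat(projected.y))
        }
    }
}
```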
As you can see in the `facePointsArray` variable, we have listed all the face points for the left and right eyebrows.
The last extension converts the measured values to centimetres or inches as required.
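ARKit reports positions in metres, so the `Float` extension can be as simple as this (property names are placeholders):

```swift
extension Float {
    // ARKit distances are in metres; convert to friendlier units.
    var toCentimeters: Float { self * 100 }
    var toInches: Float { self * 39.3701 }
}
```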
Putting It All Together
With these extensions in place, you can now calculate the real-world length of facial features. For example, you can find the length of the left eyebrow by measuring the distances between the specified points. This capability opens up a world of possibilities for creating innovative AR applications.
Now update the delegate methods to calculate the lengths of the left and right eyebrows.
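A sketch of the updated `renderer(_:didUpdate:for:)` is below; the `nodeFor` method can stay as it was. It assumes the extensions above plus the `VectorHelper` distance method defined next:

```swift
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor,
          let faceGeometry = node.geometry as? ARSCNFaceGeometry else { return }
    faceGeometry.update(from: faceAnchor.geometry)

    // World-space positions of the six eyebrow vertices
    // (first three = left eyebrow, last three = right eyebrow).
    let points = faceAnchor.vertexPositions

    // Each eyebrow's length is the sum of its two consecutive segments.
    let leftEyebrowLength = VectorHelper.distance(points[0], points[1])
        + VectorHelper.distance(points[1], points[2])
    let rightEyebrowLength = VectorHelper.distance(points[3], points[4])
        + VectorHelper.distance(points[4], points[5])

    print("Left eyebrow: \(leftEyebrowLength.toCentimeters) cm")
    print("Right eyebrow: \(rightEyebrowLength.toCentimeters) cm")
}
```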
Finally, we need one more method to calculate the distance between two SCNVector3 points.
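A minimal sketch of that method, placed in a `VectorHelper` type as the post refers to it:

```swift
import SceneKit

struct VectorHelper {
    // Euclidean distance between two points, in metres.
    static func distance(_ a: SCNVector3, _ b: SCNVector3) -> Float {
        let dx = a.x - b.x
        let dy = a.y - b.y
        let dz = a.z - b.z
        return (dx * dx + dy * dy + dz * dz).squareRoot()
    }
}
```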
In the `renderer(_:didUpdate:for:)` method, we convert the selected face points into world-space positions using the `ARFaceAnchor` extension we created earlier. With that array of points, we then calculate the distance between each consecutive pair using `VectorHelper`'s distance method.
As we saw earlier, the first three points belong to the left eyebrow and the last three to the right eyebrow. For each eyebrow, the method adds the distance between the first and second points to the distance between the second and third points to get its length.
It's time to run your iOS app and see the magic: you should get the lengths of your left and right eyebrows. Play around with the vertex indices to measure other features, such as the nose, eyes, or face height.
Expanding the Horizons of ARKit
ARKit offers endless opportunities for creativity. You can overlay GIFs onto eyebrows, draw lines from the nose to the eyes, or explore numerous other enhancements. To add a GIF as an overlay, simply create a UIImageView from the GIF and set it as the content for faceGeometry’s first material.
To do so, add this line to the `renderer(_:nodeFor:)` method.
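Assuming a GIF-loading helper like the UIImage extension described below (called `gifImageWithName(_:)` here as a placeholder), the line could look like this:

```swift
// Inside renderer(_:nodeFor:), after creating faceGeometry:
// set an image view built from the GIF as the material's diffuse contents.
faceGeometry.firstMaterial?.diffuse.contents = UIImageView(image: UIImage.gifImageWithName("eyebrowOverlay"))
```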
To convert a GIF into a UIImage, you can create a small extension on `UIImage`. Refer to this gist for the code; once you've copied the extension into your project, you can create a `UIImage` out of any GIF.
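In case it helps, here's a rough sketch of what such an extension can look like, using ImageIO to read the GIF frames; the gist version may differ in details:

```swift
import UIKit
import ImageIO

extension UIImage {
    // Loads a GIF from the app bundle and returns an animated UIImage.
    static func gifImageWithName(_ name: String) -> UIImage? {
        guard let url = Bundle.main.url(forResource: name, withExtension: "gif"),
              let data = try? Data(contentsOf: url),
              let source = CGImageSourceCreateWithData(data as CFData, nil) else { return nil }

        let frameCount = CGImageSourceGetCount(source)
        var frames: [UIImage] = []
        var totalDuration: TimeInterval = 0

        for index in 0..<frameCount {
            guard let cgImage = CGImageSourceCreateImageAtIndex(source, index, nil) else { continue }
            frames.append(UIImage(cgImage: cgImage))

            // Read the per-frame delay from the GIF properties (default to 0.1 s).
            let properties = CGImageSourceCopyPropertiesAtIndex(source, index, nil) as? [CFString: Any]
            let gifProperties = properties?[kCGImagePropertyGIFDictionary] as? [CFString: Any]
            let delay = gifProperties?[kCGImagePropertyGIFDelayTime] as? TimeInterval ?? 0.1
            totalDuration += delay
        }

        // Combine the frames into a single animated UIImage.
        return UIImage.animatedImage(with: frames, duration: totalDuration)
    }
}
```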
To do so, use this line of code.
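Again assuming the helper from the gist is called `gifImageWithName(_:)` and the GIF is bundled with the app:

```swift
let gifImage = UIImage.gifImageWithName("eyebrowOverlay")
```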
With this extension in place, you can easily turn any GIF into an image for your ARKit overlays.
The Final Act: ARKit's Boundless Potential
As you can see, ARKit is a powerful tool that can transform your iOS app into an AR wonderland. Whether you're measuring facial features or adding awesome overlays, ARKit's capabilities are limited only by your imagination.
So, go ahead, dive into the world of ARKit, and start crafting AR experiences. The magic of augmented reality is at your fingertips.
Thanks for joining us on this ARKit adventure!