
Augmented Reality on iOS Devices with ARKit

Face detection · #iOS · #Augmented Reality

Himanshu Joshi, iOS Engineer
September 19, 2023

Augmented Reality (AR) has taken the tech world by storm, and Apple's ARKit framework has been at the forefront of this revolution. ARKit empowers developers to create captivating AR experiences that seamlessly blend digital content with the real world. By offering tools for motion tracking, object detection, and real-time 3D rendering, ARKit has opened up a realm of possibilities for iOS apps.

In this blog, we'll dive into the fascinating world of ARKit and explore how it can be harnessed for face detection and tracking, unlocking innovative features and applications.

Getting Started

To kickstart your ARKit journey, fire up Xcode and create a new project. Begin by adding an ARSCNView to your storyboard. This view acts as your window into the AR world, merging real-world imagery from your device's camera with digital content.

Once you've added ARSCNView to your project, make sure to connect it to your view controller for seamless integration.
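As a minimal sketch of that setup (the outlet name `sceneView` is an assumption, chosen to match the snippets that follow):

```swift
import UIKit
import ARKit

class ViewController: UIViewController {
    // Outlet connected to the ARSCNView added in the storyboard.
    @IBOutlet var sceneView: ARSCNView!
}
```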

Face the Future with ARFaceTrackingConfiguration

With ARSCNView in place, it's time to introduce face tracking. In the viewDidLoad() method of your view controller, add the following code:

```swift
guard ARFaceTrackingConfiguration.isSupported else { fatalError() }
sceneView.delegate = self
let configuration = ARFaceTrackingConfiguration()
sceneView.session.run(configuration)
```

This snippet sets up the necessary configurations for face tracking and initializes the AR session, allowing you to view your own face within the app.
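The snippet above runs the session once; a common pattern (not shown in the original snippet, sketched here as an assumption about your app's lifecycle needs) is to restart the session when the view appears and pause it when the view disappears:

```swift
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    guard ARFaceTrackingConfiguration.isSupported else { return }
    // Re-run the face-tracking session whenever the view comes back on screen.
    sceneView.session.run(ARFaceTrackingConfiguration())
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    // Pause the session to stop camera capture while the view is hidden.
    sceneView.session.pause()
}
```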

Delegate Methods: Your Bridge to AR

To fully harness ARKit's capabilities, implement the delegate methods provided by ARSCNViewDelegate. These methods enable you to manipulate and augment the user's face.

Here are the key delegate methods:

Create an extension for ViewController and conform it to ARSCNViewDelegate.

Now add these two methods.

```swift
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    guard let device = sceneView.device else { return nil }
    let faceGeometry = ARSCNFaceGeometry(device: device)
    let node = SCNNode(geometry: faceGeometry)
    faceGeometry?.firstMaterial?.transparency = 0.0

    for x in 0..<1220 {
        let text = SCNText(string: "\(x)", extrusionDepth: 1)
        let txtnode = SCNNode(geometry: text)
        txtnode.scale = SCNVector3(x: 0.0002, y: 0.0002, z: 0.0002)
        txtnode.name = "\(x)"
        node.addChildNode(txtnode)
        txtnode.geometry?.firstMaterial?.fillMode = .fill
    }
    return node
}

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor,
          let faceGeometry = node.geometry as? ARSCNFaceGeometry else { return }

    for x in 0..<1220 {
        let child = node.childNode(withName: "\(x)", recursively: false)
        child?.position = SCNVector3(faceAnchor.geometry.vertices[x])
    }
    faceGeometry.update(from: faceAnchor.geometry)
}
```

These methods grant you access to the face's SCNNode, allowing you to add points, lines, or overlays to the user's face. Here, we're adding a numbered text node at each face vertex to visualize the tracked points.

Unlocking the Magic: Calculating Real-World Distances

While adding points is fascinating, the real magic lies in what you can do with these points. ARKit's face tracking points enable you to calculate real-world distances between facial features like eyebrows, nose, or face width.

To calculate eyebrow lengths, for instance, you need to define specific face points. Left eyebrow points might include 200, 210, and 420, while right eyebrow points could be 850, 648, and 767.

To make these calculations, convert 3D world points to 2D points and then measure the distances. We've created handy extensions to simplify this process.

Extensions for ARFaceAnchor and SCNVector3

Our ARFaceAnchor extension allows you to convert 3D face points to CGPoints within your ARSCNView. We've also provided extensions for matrix_float4x4 and Float for additional functionality.

```swift
extension ARFaceAnchor {
    struct VerticesAndProjection {
        var vertex: SIMD3<Float>
        var projected: CGPoint
    }

    func convertVectorsToCGPoints(to view: ARSCNView) -> [VerticesAndProjection] {
        let facePointsArray = [200, 210, 420, 850, 648, 767]
        var verticesArray = [VerticesAndProjection]()

        for x in facePointsArray {
            let vertex = geometry.vertices[x]
            let col = SIMD4<Float>(SCNVector4())
            let pos = SIMD4<Float>(SCNVector4(vertex.x, vertex.y, vertex.z, 1))
            let pworld = transform * simd_float4x4(col, col, col, pos)
            let vect = view.projectPoint(SCNVector3(pworld.position.x, pworld.position.y, pworld.position.z))
            let p = CGPoint(x: CGFloat(vect.x), y: CGFloat(vect.y))
            verticesArray.append(VerticesAndProjection(vertex: vertex, projected: p))
        }
        return verticesArray
    }
}
```

As you can see, the facePointsArray variable contains all the face points for the left and right eyebrows.

```swift
extension matrix_float4x4 {
    public var position: SCNVector3 {
        return SCNVector3(self[3][0], self[3][1], self[3][2])
    }
}

extension Float {
    func metersToInches() -> String {
        return String(format: "%.2f", self * 39.3701)
    }

    func metersToCentimeters() -> String {
        return String(format: "%.2f", self * 100)
    }
}
```

The last extension converts the measured values to centimeters or inches as required.
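For example (the input value here is made up for illustration):

```swift
let browLength: Float = 0.045          // 4.5 cm, expressed in meters
browLength.metersToCentimeters()       // "4.50"
browLength.metersToInches()            // "1.77"
```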


Putting It All Together

With these extensions in place, you can now calculate the real-world length of facial features. For example, you can find the length of the left eyebrow by measuring the distances between the specified points. This capability opens up a world of possibilities for creating innovative AR applications.

Now replace the delegate methods with the following to calculate the lengths of the left and right eyebrows.

```swift
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    guard let device = sceneView.device else { return nil }
    let faceGeometry = ARSCNFaceGeometry(device: device)
    faceGeometry?.firstMaterial?.transparency = 0.0
    let node = SCNNode(geometry: faceGeometry)
    return node
}

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor,
          let faceGeometry = node.geometry as? ARSCNFaceGeometry,
          let sceneView = renderer as? ARSCNView else { return }

    let points: [ARFaceAnchor.VerticesAndProjection] = faceAnchor.convertVectorsToCGPoints(to: sceneView)

    DispatchQueue.main.async {
        let vertices = points.map { $0.vertex }

        let leftEyebrowDistance1 = VectorHelper.distance(betweenPoints: SCNVector3(vertices[0]), point2: SCNVector3(vertices[1]))
        let leftEyebrowDistance2 = VectorHelper.distance(betweenPoints: SCNVector3(vertices[1]), point2: SCNVector3(vertices[2]))
        let leftEyebrowTotalDistance = (leftEyebrowDistance1 + leftEyebrowDistance2).metersToCentimeters()

        let rightEyebrowDistance1 = VectorHelper.distance(betweenPoints: SCNVector3(vertices[3]), point2: SCNVector3(vertices[4]))
        let rightEyebrowDistance2 = VectorHelper.distance(betweenPoints: SCNVector3(vertices[4]), point2: SCNVector3(vertices[5]))
        let rightEyebrowTotalDistance = (rightEyebrowDistance1 + rightEyebrowDistance2).metersToCentimeters()

        print(leftEyebrowTotalDistance + "cm")
        print(rightEyebrowTotalDistance + "cm")
    }
    faceGeometry.update(from: faceAnchor.geometry)
}
```

Finally, we need a method to calculate the distance between two SCNVector3 points.

```swift
struct VectorHelper {
    static func distance(betweenPoints point1: SCNVector3, point2: SCNVector3) -> Float {
        let dx = point2.x - point1.x
        let dy = point2.y - point1.y
        let dz = point2.z - point1.z
        return sqrt(dx * dx + dy * dy + dz * dz)
    }
}
```

In the didUpdate node method, we converted the face points using the ARFaceAnchor extension we created earlier. After getting the array of points, we calculated the distances between the 3D vertices using VectorHelper's distance method.

As we saw earlier, the first three points correspond to the left eyebrow and the last three to the right eyebrow. In the didUpdate node method, we calculate each eyebrow's length by adding the distance between the first and second points to the distance between the second and third points.
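To make the summation concrete, here is a quick sketch with made-up coordinates (in meters) standing in for three eyebrow points:

```swift
// Hypothetical vertices for illustration only
let p0 = SCNVector3(0.00, 0.00, 0.00)
let p1 = SCNVector3(0.02, 0.00, 0.00)
let p2 = SCNVector3(0.02, 0.01, 0.00)

let segment1 = VectorHelper.distance(betweenPoints: p0, point2: p1) // 0.02 m
let segment2 = VectorHelper.distance(betweenPoints: p1, point2: p2) // 0.01 m
let total = (segment1 + segment2).metersToCentimeters()             // "3.00" (cm)
```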

Now it's time to run your iOS app and see the magic: you should get the lengths of your left and right eyebrows. You can play around with these points and swap in others to calculate different measurements, such as nose, eye, or face height.

Expanding the Horizons of ARKit

ARKit offers endless opportunities for creativity. You can overlay GIFs onto eyebrows, draw lines from the nose to the eyes, or explore numerous other enhancements. To add a GIF as an overlay, simply create a UIImageView from the GIF and set it as the content for faceGeometry’s first material.

To do so, add these lines to the nodeFor anchor method.

```swift
let gifImageView = UIImageView(image: gifImage)
faceGeometry?.firstMaterial?.diffuse.contents = gifImageView
```

To convert a GIF into a UIImage, you can create an extension.

Refer to this gist for the code.

Now you can copy this extension and use it to create a UIImage from any GIF. To do so, use this line of code:

```swift
let gifImage = UIImage.gifImageWithName("gif_file_name")
```


The Final Act: ARKit's Boundless Potential

As you can see, ARKit is a powerful tool that can transform your iOS app into an AR wonderland. Whether you're measuring facial features or adding awesome overlays, ARKit's capabilities are limited only by your imagination.

So, go ahead, dive into the world of ARKit, and start crafting AR experiences. The magic of augmented reality is at your fingertips.

Thanks for joining us on this ARKit adventure!
