AR Face Tracking for iOS [Backstory]
Not long ago Apple released a new generation of iPhones with the TrueDepth camera. You can learn more about the TrueDepth camera from Apple’s video on Face Tracking with ARKit.
The new camera contains an infrared dot projector that detects and tracks your face by mapping 30,000 invisible dots. Beyond biometrics, initially seen as the prime purpose of this update, the TrueDepth camera and face tracking with ARKit unlock far more powerful and interesting Face AR experiences. It recognizes even tiny facial movements and emotional expressions, allowing developers to convert them into live avatars and emojis or overlay augmented reality face filters with stunning accuracy.
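For reference, those facial movements surface in ARKit as per-expression blend-shape coefficients on a tracked face anchor. Below is a minimal sketch using the standard ARKit API (not our SDK); the threshold value and the effect trigger are illustrative assumptions.

```swift
import ARKit
import SceneKit

// A minimal sketch: read ARKit blend-shape coefficients from a tracked face.
// ARFaceAnchor exposes dozens of normalized expression coefficients (0.0–1.0)
// that can drive live avatars and AR face filters.
class FaceTrackingDelegate: NSObject, ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer,
                  didUpdate node: SCNNode,
                  for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }

        // Each entry is a normalized coefficient, e.g. jawOpen or mouthSmileLeft.
        let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0

        // Hypothetical trigger: react when the mouth is more than half open.
        if jawOpen > 0.5 {
            // Apply or switch an AR effect here.
        }
    }
}
```

In a real app the delegate would be attached to an `ARSCNView` running an `ARFaceTrackingConfiguration` session.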
There is a lot of potential behind the TrueDepth camera, and we have already integrated ARKit face tracking into our Face AR platform to let developers create fascinating face-based AR experiences for iOS.
Also Read: Getting Started With Creating Face-based AR Experiences
However, face tracking with ARKit works only on the newer generation of devices with the TrueDepth camera, including iPhone X, XS, XR, and iPad Pro. On older iPhones, such as the 6 Plus, 6s, 7, and 8, face tracking imposed limits on Face AR experiences and required more “effort” from users in terms of face position and lighting conditions.
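In practice, an app decides between the two tracking paths at runtime. Here is a minimal sketch using ARKit’s standard support check; the fallback branch is an assumption standing in for whatever camera-based tracker (such as an SDK like ours) the app uses on older devices.

```swift
import ARKit

// A minimal sketch: choose the tracking path at runtime.
// ARKit face tracking requires the TrueDepth camera, so older devices
// need a fallback based on regular camera frames.
func startFaceTracking(session: ARSession) {
    if ARFaceTrackingConfiguration.isSupported {
        // TrueDepth camera available (iPhone X and later, recent iPad Pro models).
        let configuration = ARFaceTrackingConfiguration()
        configuration.isLightEstimationEnabled = true
        session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    } else {
        // No TrueDepth camera: fall back to a 2D, camera-frame-based face tracker.
        // (Hypothetical hook — plug in your non-ARKit tracking pipeline here.)
    }
}
```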
Still, many iOS users don’t have the TrueDepth camera, so developers could not realize the full potential of AR face tracking apps, and users couldn’t experience all their benefits. This especially matters for sophisticated Face AR experiences such as face-based AR games, realistic virtual try-ons, and complex AR face filters, which millions of people use daily.
To help solve that, we’ve optimized our iOS Face Tracking SDK for better distance support and more stable performance in different lighting conditions. See the video above to gauge the difference, or contact us if you’d like to try our new iOS face tracking in your Face AR apps.
iOS Face Tracking SDK Performance Improvements
The data below provides rough estimates from our tests of AR face tracking performance on iOS. It is for reference only, not a guarantee of results in your apps. We provide a trial period for all our technology so you can assess face tracking performance within your own iOS apps.
- Distance. AR face tracking supports an average distance of 2.5–3 meters.
- Photo processing time. Avg. 0.5 seconds for most features with the front camera.
- Angles. Wider face angles with 90-degree support and camera rotation.
- Occlusion. Faster face detection and more accurate face tracking with glasses and sunglasses.
- Lighting. More stable AR face tracking in bright sunlight and under street lighting.
- Performance. Min. 30 FPS on iPhone 6s and newer.
Other Face AR SDK Improvements
- Updated switcher for full body segmentation neural networks (iOS)
- Banuba Viewer UI changes
- Switched Action Units to the new face recognition algorithm (iOS)
- Switched triggers to use Action Units (iOS)
Fixes
- Physics behavior in effects on devices with ARKit (iOS)
- Effect rendering on low-end Android devices