Real-time Webcam Face Tracking
Our webcam facial tracking technology is based on a JavaScript API that detects the user's face and renders interactive 3D graphics in real time within a browser. It requires no third-party plugins: all users need to experience AR is a browser and a camera.
You can test it yourself straight away by heading to our Web AR technology page. The same camera face tracking features can be embedded into any web app or website.
Core Components of Webcam Facial Tracking Technology
The core of our web face tracking is a neural network built with TensorFlow Lite, an open-source deep learning framework for on-device inference (not to be confused with TensorFlow.js).
On the surface, TensorFlow Lite might not seem like an obvious choice for webcam face tracking, but its advantages are hard to ignore: a small footprint and fast inference on devices with limited compute and memory. TensorFlow Lite also has a large community, and Google itself uses it for Web AR in Google Meet.
However, the neural network alone does not determine the quality of the final result. The right network architecture, efficient data loading and effective processing also play a crucial role. Let us touch on each aspect and explain what makes our web facial tracking unique.
How We Accelerated In-Browser Face Tracking Performance
Neural network optimization
Unlike TensorFlow.js and similar libraries, face tracking with TensorFlow Lite requires adapting the network architecture to the specifics of the XNNPACK delegate. We optimized our network accordingly and achieved a severalfold increase in performance.
The webcam facial tracking library that comes as part of the SDK delivers a stable 30 fps for both camera capture and rendering (Chrome on a 2017 15" MacBook Pro). That is enough for real-time face filters, in-browser makeup try-on and virtual try-ons.
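A claim like "stable 30 fps" can be verified with a simple rolling frame-rate meter. The sketch below is our own illustration, not part of the SDK: it averages frame timestamps, such as those delivered by requestAnimationFrame, over a sliding window.

```javascript
// Rolling FPS estimator: feed it frame timestamps (in ms) and read the
// average rate over the last `windowSize` frames.
class FpsMeter {
  constructor(windowSize = 60) {
    this.windowSize = windowSize;
    this.timestamps = [];
  }
  tick(nowMs) {
    this.timestamps.push(nowMs);
    if (this.timestamps.length > this.windowSize) this.timestamps.shift();
  }
  fps() {
    if (this.timestamps.length < 2) return 0;
    const elapsed =
      this.timestamps[this.timestamps.length - 1] - this.timestamps[0];
    return ((this.timestamps.length - 1) * 1000) / elapsed;
  }
}

// Simulate 31 frames arriving exactly 1000/30 ms apart.
const meter = new FpsMeter();
for (let i = 0; i <= 30; i++) meter.tick(i * (1000 / 30));
console.log(meter.fps().toFixed(1)); // "30.0"
```

In a browser, you would call `meter.tick(timestamp)` inside your requestAnimationFrame callback and display `meter.fps()` as a debug overlay.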
Fast data processing
Our webcam facial tracking SDKs are written in C++ with a minimum of external dependencies. For the web, we use a JavaScript wrapper generated via Emscripten, which compiles the C++ code into WebAssembly so it can run in the browser.
Emscripten supports fast data input and output via Canvas/WebGL and provides OpenGL support on top of WebGL. In Chrome, we additionally get a 2x speedup by parallelizing mathematical calculations with SIMD instructions.
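Under the hood, Emscripten output is a WebAssembly module that JavaScript instantiates and calls into. As a minimal illustration of that mechanism (a tiny hand-assembled module exporting a single add function, not the actual SDK binary, which is far larger and loaded asynchronously):

```javascript
// A minimal, hand-assembled WebAssembly module exporting add(a, b).
// Calling into it from JavaScript uses the same WebAssembly API that
// Emscripten-generated wrappers rely on.
const wasmBytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,             // magic + version
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f,       // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                                     // function section
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00,       // export "add"
  0x0a, 0x09, 0x01, 0x07, 0x00, 0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b // body: i32.add
]);

const { exports } = new WebAssembly.Instance(new WebAssembly.Module(wasmBytes));
console.log(exports.add(2, 3)); // 5
```

Production code loads larger modules with the asynchronous `WebAssembly.instantiateStreaming`, which compiles the binary while it is still downloading.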
Effect playback
Fast face tracking is not the only thing that makes a good user experience in Web AR. The JS face detection neural network is just one component of the pipeline that processes a frame from input to effect drawing. For smooth 3D graphics rendering on the web, we use EffectPlayer, another SDK component, which runs on OpenGL.
As a result, our face tracking library combines a lightweight core with fast performance on the web. This is achieved through:
- fast input/output via WebGL
- strong performance of the adapted neural networks
- fast effect playback on WebGL
In practice, this allows precise placement of AR masks, virtual makeup, backgrounds, etc.
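The pipeline described above can be pictured as a chain of per-frame stages: camera frame in, face tracking, then effect rendering. This is a hypothetical sketch with stub stages; in the real SDK these steps run in WebAssembly and draw via WebGL.

```javascript
// Compose per-frame stages into a single processing function.
function makePipeline(stages) {
  return (frame) => stages.reduce((data, stage) => stage(data), frame);
}

// Stub stages for illustration only: real tracking returns facial
// landmarks from the neural network; real rendering draws to WebGL.
const trackFace = (frame) => ({ ...frame, landmarks: [[120, 80], [140, 82]] });
const renderEffect = (frame) => ({ ...frame, effectApplied: true });

const processFrame = makePipeline([trackFace, renderEffect]);
const result = processFrame({ width: 640, height: 480 });
console.log(result.effectApplied); // true
```

In a browser this function would be invoked once per frame, typically from a requestAnimationFrame loop, with the current webcam frame as input.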
4 Steps to Add Web AR Face Tracking to Your App
Our AR SDK for JavaScript gives you access to face detection and tracking through a JavaScript augmented reality library that exports a range of APIs for Web AR development.
Face tracking with JavaScript works in any browser with WebGL 2.0 support. It runs on any device, with no app, external dependencies, or server required. Being GPU-accelerated and running its operations on a WebGL backend, the library is a perfect fit for mobile browsers.
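Since WebGL 2.0 support is the one hard requirement, a page can check for it before loading the library. This helper is our own sketch, not part of the SDK; it probes for a real context in the browser and returns false elsewhere (for example, in Node).

```javascript
// Capability check: can this environment create a WebGL 2.0 context?
function supportsWebGL2() {
  if (typeof document === "undefined") return false; // not running in a browser
  const canvas = document.createElement("canvas");
  return canvas.getContext("webgl2") !== null;
}

console.log(supportsWebGL2());
```

If the check fails, you can fall back to a plain camera feed or show a message instead of initializing face tracking.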
Steps to integrate the face tracking features into your website with JavaScript
- Request the Web AR SDK by filling out the form on our website.
- Apply the trial token, which you'll receive together with the Web AR package.
- Download the face effect example. It includes a set of Face AR features like background removal, beautification and AR makeup; each is based on face tracking and can be called through the JavaScript API.
- Refer to the JavaScript code samples in the API methods to design the AR face tracking experience your web environment needs.
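Put together, the steps above might look like the following outline. This is pseudocode: the package name, import names and method signatures are illustrative placeholders, not the SDK's actual API, so consult the SDK documentation for the real calls.

```
// Illustrative outline only; names are placeholders, not the real API.
import { Player, Webcam, Effect } from "web-ar-sdk";      // hypothetical package

const player = await Player.create({ clientToken: "YOUR_TRIAL_TOKEN" }); // step 2
await player.use(new Webcam());                           // request camera access
await player.applyEffect(new Effect("face_effect.zip"));  // effect from step 3
player.render("#webar-container");                        // draw into your page
```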
4 Use Cases of Adopting a Webcam Face Tracker
Our web AR face tracking opens up a myriad of augmented reality scenarios. Here are the most common ones.
Replacing the background and applying the Glasses AR mask using the Banuba SDK
Virtual try on
You can embed web AR face tracking with JavaScript into an e-commerce website for real-time virtual try-on. Customers can test sunglasses, jewelry or hats at true-to-life size before purchasing.
Makeup try on
Combined with neural networks for lips segmentation or hair segmentation, the face tracking API allows showcasing lipstick and hair color products via the web camera.
Webcam filters like Snapchat
You can overlay 3D objects, face filters, beauty effects and virtual backgrounds letting users enjoy a high-quality web AR experience. The experiences can be captured and shared with in-browser photo and video recording features.
Photo booth
You can build AR photo booth experiences with real-time webcam video processing and photo capture. The best part: they can run offline.
Web AR face tracking must detect faces robustly under varied conditions such as low lighting, different angles, and occlusion. We achieved this by leveraging advanced computer vision technology to bring augmented reality to web platforms, applications, and sites.