Be the first to test new features of Face AR SDK!
Glitter Texture for Virtual Lipstick Try-On
Demo of Virtual Lipstick with Glitter Texture
Makeup try-on app development has many pitfalls from a technology standpoint. One of them is rendering realistic colors and showcasing a variety of makeup textures to give consumers a comprehensive try-on experience on mobile. To address this demand, we continue to extend our Makeup SDK and API with new features and possibilities.
The new Glitter Texture combines small particles with a sparkling effect. The technology builds on a lip segmentation neural network, which isolates the lips area, and an algorithm that adds granularity and shine to the lipstick texture.
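To make the idea concrete, here is a minimal sketch of the principle: add a shine lift and sprinkle bright particles only inside the segmented lip area. The function name, the single-channel buffer layout and the parameter values are assumptions for illustration; the SDK renders this on the GPU with its own pipeline.

```swift
import Foundation

// Conceptual sketch: given a lip mask from a segmentation network, apply a shine lift
// and random sparkling particles inside the masked area only.
func addGlitter(to pixels: inout [Float],      // lipstick layer brightness, row-major
                lipMask: [Bool],               // true where the segmentation marks lips
                width: Int, height: Int,
                particleDensity: Float = 0.02, // fraction of lip pixels that sparkle
                shineBoost: Float = 0.15) {
    for y in 0..<height {
        for x in 0..<width {
            let i = y * width + x
            guard lipMask[i] else { continue }
            // Uniform shine lift over the whole lip area.
            pixels[i] = min(1.0, pixels[i] * (1 + shineBoost))
            // Random bright particles add the glitter granularity.
            if Float.random(in: 0..<1) < particleDensity {
                pixels[i] = min(1.0, pixels[i] + Float.random(in: 0.3...0.6))
            }
        }
    }
}

// Usage on a tiny 2x2 buffer where only the first pixel belongs to the lips.
var layer: [Float] = [0.5, 0.5, 0.5, 0.5]
addGlitter(to: &layer, lipMask: [true, false, false, false], width: 2, height: 2)
```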
Platforms
- iOS, macOS: photo, real-time
- Android: photo, real-time (coming soon)
You can integrate a virtual lipstick try-on feature into iOS and Android apps to let consumers test real products in AR, or empower your beauty and photo editors with more cool makeup looks: metallic, matte, gloss and now glitter too.
Unity Face TouchUp
Unity Face TouchUp Demo Video
To improve how users look on camera, we have extended our Unity face tracking and filtering technology with real-time beautification. The new Beauty Scene for Unity lets you test and preview the possibilities of face retouching and see which code triggers specific functionality.
Also Read: Unity Face Beautification To Auto-Enhance The Camera Image
Unity Beauty Effect features
- Makeup: adds eyelashes, eyeshadows and blush
- Soft skin: blurs the skin to make it smooth
- Eye flare: whitens eyes, intensifies the color of the iris and adds a flare for an expressive look
- Teeth whitening
- LUT: improves the color of the image with a lookup table (see the sketch after this list)
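For illustration only, the mechanism behind a LUT is a simple table lookup per input value. The sketch below uses a minimal 1D brightness curve; real color-grading LUTs are 3D tables indexed by all channels, and the gamma value here is an arbitrary assumption.

```swift
import Foundation

// Minimal 1D LUT sketch: map each 8-bit input intensity through a precomputed curve.
// Production color-grading LUTs are 3D tables indexed by (R, G, B).
let lut: [Double] = (0...255).map { pow(Double($0) / 255.0, 0.8) } // slight brightening curve

func applyLUT(_ value: UInt8) -> Double {
    lut[Int(value)]
}

print(applyLUT(128)) // ≈ 0.58 instead of the linear 0.50
```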
The technology is fully compatible with Augmented Reality Video Filters and Animated Backgrounds in Unity. Try all our Unity SDK possibilities to create the best possible camera experience for your users.
Ruler (Distance to phone)
Demo of the Ruler Feature on iOS
Another new feature now available in our SDK is a virtual Ruler. It estimates the distance between a user's face and the camera in real time. The technology uses face tracking and computer vision algorithms that track the face area size to measure the distance.
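The relationship between face size on screen and distance follows the pinhole camera model: the farther the face, the smaller its projection. The sketch below only illustrates that principle, not the SDK's actual algorithm; the average face width and focal length values are assumptions.

```swift
import Foundation

// Pinhole model: distance ≈ realWidth * focalLengthPx / widthInPixels.
let averageFaceWidthMeters = 0.15   // assumed average adult face width
let focalLengthPixels = 1_000.0     // assumed camera focal length in pixels

func estimateDistance(faceWidthPixels: Double) -> Double {
    averageFaceWidthMeters * focalLengthPixels / faceWidthPixels
}

// A tracked face bounding box that spans 300 px is roughly half a meter away.
print(estimateDistance(faceWidthPixels: 300)) // 0.5
```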
Application fields
- Healthcare. Notify users when they are too close to the screen to prevent a negative effect on eyesight (see the sketch after this list).
- Photo and video apps. Advise users on the correct ratio of the face area in a photo.
- Engagement analytics. Developers can gain additional information about user interaction with the app or camera to improve the user experience.
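As an example of the healthcare use case, the snippet below sketches how an app might react to the per-frame distance reading. The callback name, the threshold and the simulated readings are assumptions for illustration; the actual API is covered in the Integration Tutorial linked below.

```swift
import Foundation

// Hypothetical consumer of the Ruler's per-frame distance reading.
let minimumComfortableDistance = 0.30 // meters; an assumed threshold

func onDistanceUpdated(_ distanceMeters: Double) {
    if distanceMeters < minimumComfortableDistance {
        // In a real app this could show a banner or pause the session.
        print("You are holding the phone too close. Move it a bit farther away.")
    }
}

// Simulated readings from the face tracker.
let readings = [0.55, 0.42, 0.28]
for distance in readings {
    onDistanceUpdated(distance)
}
```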
Distance supported
~2-3 meters
Platforms
iOS, macOS, Android, Windows, HTML5
Learn how to add the Virtual Ruler to your app with our Integration Tutorial.
Other Improvements
- Display FPS stats in desktop viewer app
- Updated TFLite for Windows to 2.3
- Both TFLite runners are now created on first request
- Text texture enabled by default
- Android: ability to override detected resolution
Fixes
- iOS: incorrect background processing on photos in landscape orientation
- Unity: incorrect aspect ratio on mobile devices
- Delayed camera start
- Repacking errors
- Front camera flip
- 'Face not found' message after loading a photo from the Gallery
- Added handling of IllegalArgumentException to prevent crashes related to surface configuration
For the full list of bug fixes and improvements, visit our Release Notes page.
Explore how our Face AR SDK can empower your app with amazing camera features.