DeepMotion's Animate 3D now has Face Tracking. Our AI-powered motion capture is now more complete, with the ability to capture full-body motion along with facial expressions. This new feature gives users more control over expressing their vision by quickly and easily generating 3D face animations from a single video in minutes. No special hardware is needed, so video captured on any device can be used to generate your 3D face animations.
Our AI tracks facial features including blinking, expressive mouth motions, eyebrows, and head position with markerless tracking; no dots are necessary. To complement this new feature, we recently launched half-body tracking and tight headshots, which further enable tracking of the irises and higher-fidelity facial features. You can also stick with full-body tracking, which still captures your character's general expressions even though the face is farther from the camera. Because we do not require any special hardware or markers, a clear video with an unobstructed face is important for best results.
We recommend trying our face motion capture with the Default Character or with a Custom Character created via the built-in character creator. User-uploaded custom characters can also be used for face tracking, as long as they are set up with the standard ARKit blendshapes.
All users now have access to this feature, available in the Animation Settings when creating a new animation. Flip the toggle and the animation will generate as it did before, but now with expressions! The animation downloads include the full-body motion data plus facial blendshape weights based on the ARKit blendshapes. Check out our FAQ to learn more about how to use Animate 3D Face Tracking for your projects.
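To illustrate what per-frame blendshape weights mean in practice, here is a minimal sketch of how exported ARKit-style weights can be combined with a mesh's shape deltas. The blendshape name "jawOpen" is a standard ARKit key, but the mesh data, function name, and weight values below are purely hypothetical examples, not DeepMotion's actual export format or API.

```python
# Sketch: applying per-frame ARKit-style blendshape weights to a base mesh.
# All mesh data here is invented for illustration; only the blendshape
# naming convention ("jawOpen", "eyeBlinkLeft", ...) comes from ARKit.
import numpy as np

def apply_blendshapes(base_vertices, shape_deltas, weights):
    """Blend a rest-pose mesh with weighted per-shape vertex deltas.

    base_vertices: (V, 3) array of rest-pose vertex positions.
    shape_deltas:  dict mapping blendshape name -> (V, 3) delta array.
    weights:       dict mapping blendshape name -> weight in [0, 1],
                   e.g. the weights exported for one animation frame.
    """
    result = base_vertices.copy()
    for name, delta in shape_deltas.items():
        w = weights.get(name, 0.0)
        if w:
            result += w * delta
    return result

# Tiny two-vertex mesh with one hypothetical "jawOpen" shape.
base = np.zeros((2, 3))
deltas = {"jawOpen": np.array([[0.0, -1.0, 0.0], [0.0, -0.5, 0.0]])}
frame_weights = {"jawOpen": 0.5}  # weight for a single frame
posed = apply_blendshapes(base, deltas, frame_weights)
```

Because the weights follow the shared ARKit convention, any character rigged with those blendshape names can consume the same animation data, which is why setting up custom characters with the standard ARKit blendshapes matters.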