Animate 3D: Face Tracking FAQ
DeepMotion's Animate 3D Face Tracking is markerless facial motion capture from any RGB camera, with no extra software or hardware needed. Full-body tracking with face is now possible! Check out our full announcement here, and read on for how to integrate Face Tracking into your projects.
How do I access the new Face Tracking Feature?
The new Face Tracking Toggle is located under the ‘Animation Output’ settings. Simply turn it on and generate your animations as normal.
What kind of video do I need to upload?
You can upload any video that is full-body (head to toe), half-body (head to waist), or a headshot (full face within frame). The larger and clearer the face is, the more accurate your results will be.
What tracking quality will I get with a full-body video vs half-body video?
- Full-body videos place the camera farther from the face, so eye tracking is turned off, but general facial expressions, mouth movements, and head position are still captured.
- Half-body videos place the camera closer to the face and capture more detail, so we can add eye/iris tracking along with higher-fidelity facial expressions, mouth movement, and head position. The closer the camera is to the face, the better the quality. TIP: Keep the full head within the frame for best results.
How do I download the Face Tracking animation?
The Face Tracking animation is included as blendshapes within the normal default animation export, or is already retargeted onto your custom character. You can then retarget the blendshape animation to a character of your choice.
Can I use the Custom Character feature with Face Tracking?
Our default characters work best with face tracking. You can also use your own character if its face rig includes the 39-blendshape subset of the 52-blendshape ARKit standard, or use a custom character generated by the built-in avatar creator.
See the Face Tracking Technical Specifications below.
What do I need to retarget the face animation?
If you use our default characters to create the facial animations, the exported animations contain blendshape weights that conform to the ARKit Blendshape standard, so you can retarget them to your own characters yourself in your favorite DCC tools. If you use Custom Characters and your custom character has a face rig containing the 39-blendshape subset of the 52 ARKit blendshapes, the downloaded .FBX or .GLB animations will already be retargeted to your custom character.
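Name-based retargeting can be illustrated with a small sketch. The data structures below are hypothetical, not DeepMotion's actual export format: each frame is assumed to be a mapping of ARKit blendshape names to weights, and the target rig is assumed to be a list of the blendshape names it supports.

```python
# Hypothetical sketch of name-based blendshape retargeting: for each frame,
# copy a captured weight onto the target only when the target's face rig has
# a blendshape with the same name. Frame/rig structures are illustrative.

def retarget_frames(frames, target_blendshapes):
    """frames: list of {blendshape_name: weight} dicts, one per frame.
    target_blendshapes: blendshape names present on the target face rig.
    Returns per-frame weights for the target; unmatched names are skipped."""
    retargeted = []
    for frame in frames:
        out = {name: frame[name] for name in target_blendshapes if name in frame}
        retargeted.append(out)
    return retargeted

# One captured frame applied to a rig that only has a jawOpen blendshape:
frames = [{"jawOpen": 0.42, "eyeBlinkLeft": 1.0}]
print(retarget_frames(frames, ["jawOpen"]))  # [{'jawOpen': 0.42}]
```

This is why exact name matching matters: a weight whose name has no counterpart on the target rig is simply dropped.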
Face Tracking Technical Specifications
Our face tracking output uses a subset of the 52 ARKit Blendshape standard. Our specific setup includes 39 blendshapes in total, plus rotations on one head joint and two eyeball joints. You can use the full standard, but make sure the Blendshape Specifications below are followed so that animation retargeting and custom characters work correctly.
Custom Character Face Tracking Requirements
Name Your Blendshapes Correctly: Custom characters with the 39 ARKit blendshapes listed below can be used for face tracking; the full standard set of 52 blendshapes can also be used if desired. When face tracking is enabled, Animate 3D applies the blendshape weights according to the blendshape names, so your model's blendshape names must match the ARKit standard exactly. If they don't, rename them before uploading to Animate 3D.
Joint Setup: Your character rig needs to have a head joint and two eyeball joints. The eyeball joints should control the rotation of your eyeball mesh, which should be looking straight ahead by default.
Full-Body Humanoid Characters Needed: We currently support only full-body custom characters, not head-only ones. Face tracking is a supplement to body tracking, and we don't support capturing the face alone. This means your character must also satisfy our custom character requirements for body tracking. You can check out our Custom Character FAQ here.
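Because blendshapes are matched by exact name, it can help to check your model's names before uploading. The sketch below is a hypothetical pre-upload check (not a DeepMotion tool), and the name set shown is only a small sample of ARKit names, not the full 39-blendshape subset:

```python
# Hypothetical pre-upload check: compare a model's blendshape names against
# ARKit standard names, flagging capitalization mismatches that exact
# matching would reject. Only a few ARKit names are listed for illustration.

ARKIT_SAMPLE = {"eyeBlinkLeft", "eyeBlinkRight", "jawOpen",
                "mouthSmileLeft", "mouthSmileRight", "browInnerUp"}

def check_blendshape_names(model_names, standard=ARKIT_SAMPLE):
    lower_to_standard = {s.lower(): s for s in standard}
    report = {"ok": [], "rename": {}, "unknown": []}
    for name in model_names:
        if name in standard:
            report["ok"].append(name)
        elif name.lower() in lower_to_standard:
            # Same shape, wrong capitalization: rename before uploading.
            report["rename"][name] = lower_to_standard[name.lower()]
        else:
            # No ARKit counterpart: this weight would never be driven.
            report["unknown"].append(name)
    return report

print(check_blendshape_names(["jawOpen", "Mouthsmileleft", "blink_L"]))
# {'ok': ['jawOpen'], 'rename': {'Mouthsmileleft': 'mouthSmileLeft'},
#  'unknown': ['blink_L']}
```

In practice you would run a check like this against the full list of 39 required names below and rename any flagged blendshapes in your DCC tool before upload.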
These 39 Blendshapes are a subset of the ARKit blendshapes: