VentureBeat sat down with DeepMotion’s CEO, Kevin He, to discuss the company’s pioneering work on motion intelligence, physics simulation and AI. Check out the excerpt below:
Instead of individually animating characters or avatars, the company leverages AI to do some of the heavy lifting. Kevin He said the team’s long-term vision is to treat the real world “as a resource that we can use to train AI. Like in the TV series Westworld, the AI can observe how humans do everyday things, and the AI will learn how to do the movements itself.”
The following demo was created to further illustrate the recent work on the new Motion Brain discussed in the article. The demo uses the Perceptive Motion Brain to track real-time body motion from a single camera with three-point tracking, and the new Generative Motion Brain to generate realistic motions for the NPC on the fly. The real-world user interacts with the NPC, which is controlled by the Generative Motion Brain through physics simulation. The result is a lifelike, procedurally animated character responding to digital touch, with no trackers needed.
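DeepMotion's Motion Brain internals are not public, so as a rough, generic illustration of the "physics-responsive character" idea described above, the toy sketch below models a single character joint as a damped spring: a user's touch imparts an impulse, and the physics simulation carries the joint back toward its rest pose over time. Every function name and parameter here is an illustrative assumption, not DeepMotion's actual API.

```python
# Toy sketch (not DeepMotion's actual system): one character joint
# modeled as a spring-damper that recovers from a "digital touch".

def simulate_touch(rest=0.0, impulse=2.0, stiffness=40.0, damping=8.0,
                   dt=0.01, steps=500):
    """Semi-implicit Euler integration of a spring-damper joint.

    The touch imparts an initial velocity; the spring pulls the joint
    back to its rest pose while damping bleeds off the energy.
    """
    pos, vel = rest, impulse
    trajectory = [pos]
    for _ in range(steps):
        accel = -stiffness * (pos - rest) - damping * vel
        vel += accel * dt   # update velocity first (semi-implicit Euler)
        pos += vel * dt
        trajectory.append(pos)
    return trajectory

traj = simulate_touch()
# The joint is displaced by the touch, then settles back toward rest.
```

A full character would run many such simulated joints (plus collision handling) so the NPC visibly reacts to contact instead of playing a canned animation, which is the behavior the demo highlights.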
If you're interested in learning more about this demo, contact us today!