What is the difference between Neuron and other character animation methods?
Interaction and speed. Alternative character animation methods fall into a few categories: keyframe
animation, motion capture, and procedural techniques like inverse kinematics. These methods are
useful for creating hand-crafted, canned animations (or, in the case of IK, approximating procedural
motion), but they can be time-consuming and costly. Neuron enables entirely procedurally generated,
physically interactive character simulation.
Will DeepMotion make traditional animators obsolete?
Far from it. DeepMotion empowers animators to focus on the details and expressive touches that make
animation great, while handling the basics of movement and interaction—which can be quite tedious.
Photoshop didn’t eliminate photographers; it gave them tools to create great pictures faster and more
easily. We want to empower creatives to take their talent even further with AI assistance.
What are the applications of DeepMotion Neuron?
Primary use-cases include interactive character simulation in MR, VR, AR, and Gaming, as well as
character animation in Film, Visual Effects, Crowd Simulation, Rapid Prototyping, 3D Avatars, and
Digital Emojis.
We are also exploring Neuron use-cases in the realms of Robotics, Medical Visualization, Industrial
Training, and more.
What about non-bipedal characters?
While our technology can support training for additional character types like quadrupeds, the
first release of Neuron only supports bipedal characters. Users can still leverage the Unity
SDK to create multi-legged, physically simulated characters; tutorials on how to do this are
available.
What do you mean “Interactive Characters”?
Our “Simulation Rig” imbues each character with physical attributes and constraints. This
physicalization simulates the joints and muscles found in a real-world human: we add torque
limits and joint rotation constraints (for example, in the elbows and knees). The simulation
rig also optimizes functional objectives, like maintaining balance, mimicking the way people
learn to stay upright. This is the basis for real-time physical interaction between end-users
and characters. For example, a Neuron character will stumble when pushed.
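As a rough illustration of what a physicalized joint involves (this is a generic PD-control sketch with made-up gains and limits, not our actual implementation), a single joint can be driven toward a target angle while respecting rotation and strength limits:

```python
# Conceptual sketch of one physicalized joint: a PD controller drives the
# joint toward a target angle, subject to rotation limits (like an elbow
# or knee) and a finite "muscle" torque. All names, gains, and limits
# here are illustrative, not DeepMotion's actual values.

def joint_torque(angle, velocity, target_angle,
                 kp=50.0, kd=5.0,
                 angle_min=0.0, angle_max=2.6,  # e.g. elbow: ~0-150 deg in radians
                 max_torque=40.0):
    """Return a clamped PD torque driving the joint toward target_angle."""
    # Keep the target inside the joint's rotation limits.
    target = min(max(target_angle, angle_min), angle_max)
    # PD control: proportional to angular error, damped by velocity.
    torque = kp * (target - angle) - kd * velocity
    # Muscles have finite strength: clamp the output torque.
    return min(max(torque, -max_torque), max_torque)
```

Because the torque is bounded, an external push can genuinely perturb the character; the controller then has to work its way back to balance rather than snapping instantly to a pose.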
How does Neuron stitch multiple behaviors together?
After training your character in the desired motion skills, we run a second round of training to
create a “Motion Brain”. Using machine learning, the Motion Brain defines connections between
different behaviors to allow transitional locomotion. These simulated transitions look like
blended animations at runtime.
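To give a feel for the runtime effect described above (the Motion Brain itself is learned; this toy cross-fade only mimics what a blended transition looks like, and all names are hypothetical):

```python
# Toy sketch of stitching two behaviors: cross-fade the per-joint targets
# of an outgoing and an incoming controller over a short transition window.
# The real Motion Brain learns these transitions; this just illustrates
# why they read as blended animations at runtime.

def blend_weight(t, transition_start, duration=0.3):
    """Linear cross-fade weight for the incoming behavior at time t."""
    if t <= transition_start:
        return 0.0
    return min((t - transition_start) / duration, 1.0)

def blended_pose(pose_from, pose_to, w):
    """Blend two per-joint target-angle lists by weight w in [0, 1]."""
    return [(1.0 - w) * a + w * b for a, b in zip(pose_from, pose_to)]
```

For example, halfway through a 0.3-second transition the character tracks an even mix of the "walk" and "run" joint targets.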
Everyone says they use "AI". What techniques or algorithms are you using?
We employ a variety of machine learning and deep reinforcement learning techniques, in addition
to robotics algorithms for physical character modeling. Our behavior training algorithm is not
open source and will remain proprietary. However, those interested in the science behind our
technology can review the work of our Chief Scientist, Libin Liu; see our blog for coverage of
his latest research.
How efficient/performant are characters trained using DeepMotion?
Neuron character control files are fairly performant. We anticipate users being able to simulate
upwards of 10 characters in real time on PC, and upwards of 2 characters on mobile.
My characters have different body types. Can I use the same behavior on both?
Yes. You will still need to retarget the motions to each character to ensure the motion is scaled
to its body. Characters do need to have physically plausible bodies for training to work as
intended.
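As a simplified illustration of why retargeting is needed (real retargeting works per joint and per axis; this hypothetical helper only scales root translation by a limb-length ratio):

```python
# Toy retargeting sketch: scale a motion's root-translation samples by the
# ratio of the target character's leg length to the source's, so a taller
# character covers proportionally more ground per step. Illustrative only,
# not the Neuron retargeting pipeline.

def retarget_root_positions(positions, src_leg_length, dst_leg_length):
    """Scale (x, y, z) root-position samples by the leg-length ratio."""
    scale = dst_leg_length / src_leg_length
    return [(x * scale, y * scale, z * scale) for x, y, z in positions]
```

Without this step, a behavior trained on one body would make a larger character shuffle (or a smaller one overstride), which is why the same behavior must be retargeted to each body it runs on.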
Can I create my own training data?
Yes! Upload your own .bvh file to train your character on custom data.
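A quick local sanity check can catch malformed files before upload. This is a hypothetical helper (not part of the Neuron toolchain) that just confirms a file has the two sections every BVH needs and reads its frame count:

```python
# Minimal BVH sanity check: a valid .bvh file contains a HIERARCHY section
# (the skeleton) followed by a MOTION section (per-frame channel data) with
# a "Frames:" count. Hypothetical helper for pre-upload checks only.

def check_bvh(text):
    """Return the frame count if text looks like a BVH file, else None."""
    if "HIERARCHY" not in text or "MOTION" not in text:
        return None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("Frames:"):
            return int(line.split(":")[1])
    return None
```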
What is Neuron compatible with?
Neuron cloud training is compatible with rigged biped character FBX/GLTF
files containing 60+ bones. Users will also get access to our runtime SDKs for Unity and Unreal.
What is your licensing model?
Neuron users will pay a one-time fee per behavior license. These behaviors can be used on
unlimited characters and across multiple projects, with a nominal cloud training fee charged
for retargeting or fine-tuning a Neuron behavior. However, each account will be restricted to
one user. Our runtime package will be free for Neuron Indie users. Neuron Enterprise includes
support services and custom training; please contact [email protected] to discuss.
What is the timeline for Neuron's BAAS release? When will I have access?
The Neuron BAAS cloud release is slated for Q1 2019. Backers of this campaign will be
invited to join prior to the full launch and will additionally receive an exclusive Unity
demo package for early testing. Perks like free retargeting will be redeemable once early
adopters gain access to the BAAS platform.