DeepMotion Avatar is a physics-based character engine designed to transform traditional keyframe animations into lifelike simulations. Configure 3D assets in a matter of minutes using our simulation rig editor and import your newly interactive characters into your favorite game engine. Reduce the cost of producing complex animations with our self-balancing, autonomous character rigs and soft body simulation. Optimize real-world machinery by running articulated physics-based simulations, and take VR to the next level with 3-point-tracked, full-body avatars. Apply to participate in our DeepMotion Avatar closed alpha, currently underway.
Enhance presence in your experiences by forgoing floating torsos for interactive humanoids. Achieve lifelike embodiment with our 3-to-6-point tracked simulation rig. Using as few as three positional trackers (headset plus hand controllers), you can place your users in immersive environments, complete with natural lower-body movements and collision detection. Driven by our inverse dynamics algorithm, this tool helps developers achieve true immersion in VR. Unlike traditional inverse kinematics, our 3-point tracking technology provides highly realistic, physics-based interaction with objects and other players.
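The contrast between inverse kinematics and a physics-driven approach can be sketched in one dimension: an IK solver snaps the body directly onto the tracked pose, while a physics-based body is pushed toward the tracker by forces, so it can still react to collisions and contact along the way. This is a minimal illustrative sketch, not DeepMotion's actual algorithm; the function names and gains are assumptions.

```python
def step_kinematic(pos, target):
    """IK-style update: teleport the body part directly onto the tracker."""
    return target

def step_physics(pos, vel, target, kp=50.0, kd=10.0, dt=0.01):
    """Physics-style update: a spring-damper (PD) force pulls the body
    toward the tracked target instead of snapping, leaving room for the
    solver to also apply collision and contact forces."""
    force = kp * (target - pos) - kd * vel  # pull toward tracker, damp oscillation
    vel += force * dt                       # semi-implicit Euler integration
    pos += vel * dt
    return pos, vel

pos, vel = 0.0, 0.0
for _ in range(500):                        # 5 simulated seconds
    pos, vel = step_physics(pos, vel, target=1.0)
```

After enough steps the physics-driven body converges on the tracked target, but unlike the kinematic snap, every intermediate pose is produced by forces that other objects can push back against.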
Whether you’re an indie developer or a major studio, you can now bring lifelike interactive characters to your games and experiences in a matter of minutes. Allow your users to play tug-of-war with interactive pets, or defeat villains with the real force of their hands. Powered by our machine learning algorithm and biomechanical model, the DeepMotion Avatar simulation rig will prepare any character asset for live interaction at runtime in augmented, mixed, or virtual reality.
Use our intuitive pipeline to author self-balanced, self-walking characters driven by first principles of physics. Transform your 3D model into an autonomous agent and produce natural-looking locomotion by simply running a simulation. Coming in 2018: Use Avatar’s Self-Balancing Simulation with our forthcoming Neuron product to expand your character’s motor skill repertoire; our machine learning algorithm will train your characters to navigate their world using any set of skills or movement styles.
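The core idea behind self-balancing characters can be shown with the textbook inverted-pendulum model: gravity tips the character away from upright, and a feedback torque at the support pushes it back. This is a hedged sketch of the general technique, assuming a simple PD controller; it is not DeepMotion's implementation, and all gains are illustrative.

```python
import math

def balance_step(theta, omega, kp=30.0, kd=10.0, g=9.81, length=1.0, dt=0.005):
    """One step of an inverted-pendulum balance controller.

    theta: lean angle from upright (rad); omega: angular velocity.
    Gravity accelerates the fall; a PD ankle torque counteracts it.
    The upright pose is stable when kp exceeds g / length.
    """
    torque = -kp * theta - kd * omega                 # corrective ankle torque
    alpha = (g / length) * math.sin(theta) + torque   # net angular acceleration
    omega += alpha * dt                               # semi-implicit Euler
    theta += omega * dt
    return theta, omega

theta, omega = 0.2, 0.0          # start leaning about 11 degrees
for _ in range(1000):            # 5 simulated seconds
    theta, omega = balance_step(theta, omega)
print(f"final lean angle: {theta:.4f} rad")
```

With the corrective gain larger than the gravitational term, the simulated character recovers upright on its own; turn the gain down below g / length and the same loop falls over, which is why balance emerges from the simulation rather than from authored keyframes.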
The Articulated Physics Engine is Avatar’s foundational feature, imbuing your characters with interactive, physics-based movement. Take advantage of the engine’s robust joint simulation and rigid body physics, bolstered by comprehensive tuning parameters for each joint. Model and simulate mechanical, industrial, and robotic vehicles; passive ragdolls; and characters stable enough to self-balance as they navigate the digital world.
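A typical articulated joint of this kind combines three ingredients: angular limits, a PD motor that drives toward a target angle, and a torque cap on the actuator. The sketch below shows those pieces for a single hinge joint; the parameter names are illustrative analogues of per-joint tuning knobs, not DeepMotion's API.

```python
from dataclasses import dataclass

@dataclass
class JointParams:
    """Per-joint tuning knobs (illustrative names, not DeepMotion's API)."""
    lower_limit: float   # minimum joint angle (rad)
    upper_limit: float   # maximum joint angle (rad)
    stiffness: float     # PD motor proportional gain
    damping: float       # PD motor derivative gain
    max_torque: float    # actuator torque cap

def joint_step(p, angle, velocity, target, inertia=1.0, dt=0.01):
    """Advance one hinge joint: a clamped PD motor drives toward `target`,
    and the angle is held inside the joint's limit range."""
    torque = p.stiffness * (target - angle) - p.damping * velocity
    torque = max(-p.max_torque, min(p.max_torque, torque))  # actuator limit
    velocity += (torque / inertia) * dt
    angle += velocity * dt
    if angle < p.lower_limit:      # hard stop at the lower joint limit
        angle, velocity = p.lower_limit, 0.0
    elif angle > p.upper_limit:    # hard stop at the upper joint limit
        angle, velocity = p.upper_limit, 0.0
    return angle, velocity

elbow = JointParams(lower_limit=0.0, upper_limit=2.5,
                    stiffness=40.0, damping=12.0, max_torque=15.0)
angle, velocity = 0.0, 0.0
for _ in range(800):               # 8 simulated seconds
    angle, velocity = joint_step(elbow, angle, velocity, target=1.2)
```

Because the motor torque is capped, the joint takes time to reach its target and can be overpowered by external forces, which is what makes motor-driven articulated characters feel physical rather than animated.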
Transform your static 3D model into a biomechanically simulated character using our easy-to-use drag-and-drop editor. Configure your character with any of our simulation rigs in 10 minutes or less and export your updated FBX file for immediate use. Adjust hundreds of skeleton, mass, joint, and motor parameters for further customization. Streamline iteration with our Live Sync feature: adjust your simulation rig in Avatar’s SimRig Editor and see the changes reflected on the configured character inside your game engine immediately.
Simulate in real time the behavior of deformable materials like flesh, fat, and muscle, a capability previously available only to major animation studios, to enrich your characters with true-to-life secondary motion. Ditch costly keyframe animation cycles for procedural soft body dynamics.
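The secondary motion described above is commonly modeled by attaching soft "flesh" points to the rigid skeleton with spring-dampers: when a bone moves, the soft point lags behind, overshoots, and jiggles to rest. This is a generic one-dimensional sketch of that technique, assuming illustrative stiffness and damping values, not DeepMotion's soft body solver.

```python
def jiggle_step(x, v, anchor, k=80.0, c=6.0, mass=0.5, dt=0.005):
    """Spring-damper 'flesh' point following a rigid 'bone' anchor.
    The lag, overshoot, and settle behind the anchor is the secondary motion."""
    force = k * (anchor - x) - c * v   # spring toward the bone, plus damping
    v += (force / mass) * dt           # semi-implicit Euler integration
    x += v * dt
    return x, v

x, v = 0.0, 0.0
trail = []
for i in range(1000):
    anchor = 1.0 if i > 100 else 0.0   # the bone snaps to a new pose
    x, v = jiggle_step(x, v, anchor)
    trail.append(x)
overshoot = max(trail) - 1.0           # how far the flesh jiggles past the bone
```

Because the damping here is below critical, the soft point visibly overshoots the bone before settling; raising the stiffness and damping stiffens the flesh, which is the kind of per-material tuning a soft body system exposes.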
Create a multitude of realistic, interactive agents using Avatar’s scalable character engine. Author physically simulated characters quickly while maintaining variation and quality in the final animations. Reduce the cost and time of rapid prototyping, and inject character movements with emergent detail to simulate diverse crowds.
Create a digital double for real-time physical feedback with our Biomechanical Model Simulation tool. From joint and muscle stress to skeletal pressure mapping, this Avatar feature opens the door to motion analysis, biomedical simulation, health education, and virtual training.
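The simplest form of the joint-stress analysis mentioned here is a static torque calculation: the torque a joint must resist equals the held load's weight times its lever arm. This is a basic biomechanics worked example under textbook assumptions (static pose, point load), not output from the Avatar tool itself.

```python
def static_joint_torque(load_kg, lever_m, g=9.81):
    """Static torque (N*m) a joint must resist to hold a load
    at a given lever arm: torque = m * g * r."""
    return load_kg * g * lever_m

# Example: holding a 5 kg weight 0.3 m from the elbow joint
torque = static_joint_torque(5.0, 0.3)
print(f"elbow torque: {torque:.2f} N*m")
```

A full biomechanical simulation extends this idea across every joint, every frame, and under dynamic motion, which is what makes per-joint stress and pressure mapping possible for training and analysis scenarios.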