What is Root Motion and Why is it Important?
Root Motion
Root motion is a technique in 3D animation in which movement is keyed relative to a root bone, which determines the speed, translation, and rotation of an animation. This means, for example, that a walk cycle's speed is governed by the pace of the animation itself rather than by code. Using code to determine an animation's movement is known as Treadmill animation, as these animations are usually done "on the spot". The difference in practice can be seen in the following demonstrations - the left being Treadmill animation (on the spot), the right being an animation created with root motion in mind.
These are animations I made for our game's old robot before redesigning and remaking it from scratch
So why are we using Root Motion?
Root Motion is useful in games because, done well, it adds an extra layer of believability to a game's characters. In many games one will notice that characters glide around the environment, their animations merely filling in the visual illusion of walking or running. By taking advantage of Root Motion, an animator can pace a character's animation to match the cadence of their walk, with the realistic peaks and troughs in movement speed that occur with bipedal movement - or any other form of movement. Additionally, Root Motion allows a character to repeat an animation without resetting to its initial position, instead continuing from where the previous loop ended - an infinite walk, for example. Further reasons for an animator to adopt root motion are explained in this video by Birdmask Studio on YouTube [1].
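To make the difference concrete, here is a tiny toy script (not from the project - every number is invented) that compares a constant "treadmill" speed with summed per-frame root deltas, which reproduce the push-and-coast rhythm of a stride:

```python
# Toy comparison only - all values here are made up for illustration.
FRAMES = 8
TREADMILL_SPEED = 0.10                    # metres per frame, a constant glide
ROOT_DELTAS = [0.16, 0.14, 0.06, 0.02,    # strong push while a foot is planted,
               0.16, 0.14, 0.06, 0.02]    # then the stride "coasts"

treadmill_pos = 0.0
root_motion_pos = 0.0
for frame in range(FRAMES):
    treadmill_pos += TREADMILL_SPEED       # movement dictated by code
    root_motion_pos += ROOT_DELTAS[frame]  # movement dictated by the animation
    print(f"frame {frame}: treadmill = {treadmill_pos:.2f} m, "
          f"root motion = {root_motion_pos:.2f} m")
```

Both characters end up roughly as far forward, but the root-motion version speeds up and slows down with each footfall instead of sliding at a fixed rate.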
Rigging our G.S.A. Tainted Robot Character
In order to create animations for our game's character - the General Service Bot [Tainted] (also known as the G.S.A. Tainted) - I first needed to create a rig and skin the character's many meshes to it. A character rig is what allows a 3D character to be posed and moved, and is necessary when animating any character that isn't made up of a single primitive.
Using Blender 2.8, I created an armature that allowed for full articulation of the robot's body. To allow root motion to be used with this character, it needed a bone to which all subsequent bones could be parented. This was placed at [0, 0, 0] in the scene and labelled "Root". Next, I moved on to creating a rig for the rest of the model. This generally went smoothly, as I had already learned the process by rigging the old, scrapped character - but there were a few issues and challenges I encountered along the way. Firstly, I had no notion of how one would go about rigging a piston (as seen in the middle part of the character's left leg), so I had to do some research. Thankfully, I found a video on the topic made with an older version of Blender; since Blender's fundamental functionality is the same as it was before, I was able to follow the video step by step in the newer version. The result of this process can be seen here:
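For reference, the same root-bone setup can also be expressed in Blender's Python API. This is only a rough sketch with a placeholder armature name, rather than how I actually built the rig in the viewport:

```python
import bpy

# Rough sketch only - "GSA_Tainted_Rig" is a placeholder armature name.
arm = bpy.data.objects["GSA_Tainted_Rig"]
bpy.context.view_layer.objects.active = arm
bpy.ops.object.mode_set(mode='EDIT')

edit_bones = arm.data.edit_bones

# Create the "Root" bone at the world origin [0, 0, 0].
root = edit_bones.new("Root")
root.head = (0.0, 0.0, 0.0)
root.tail = (0.0, 0.5, 0.0)   # give the bone some length so it is valid

# Child every other top-level bone to "Root" so the whole rig
# follows any movement keyed onto it.
for bone in edit_bones:
    if bone != root and bone.parent is None:
        bone.parent = root

bpy.ops.object.mode_set(mode='OBJECT')
```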
The next challenge I faced had to do with how I would rig the robot's left arm. The bases of the robot's arms are intended to be bellows which deform according to the arm's opening angle. I had a lot of trouble getting this to work as intended and needed to find a compromise. I first tried to rig the bellow by creating bones along its entire length and connecting them to a bicep bone, but I failed to skin this in a way that gave the result I wanted, as the shoulder pad would clip through the bellow. I tried a number of times to rectify this by merging the shoulder pad and the bellow, but ultimately to no avail. The method I ended up keeping was to rig the arm so that the bicep bone acts as the hinge for the bellow, with the shoulder pad and bicep parts skinned to it. After a few days of frustration, the results of this attempt were a nice compromise. The setup and results of this approach can be seen below.
Once I had figured these problems out and fixed some skinning issues on a few of the character's small tentacles, the rig could be considered complete. I tested it a few times to ensure that it moved as intended, and created the following video to show the rig's capabilities to my team.
Reflection
During the process of rigging this character I learned a number of things about IK (inverse kinematics), bone weights, and bendy bones in Blender. These were all thanks to the organic forms in the character's design - the tentacles, and the tentacle arm in particular - which were something I had never really tried rigging until now.
There are some things that I believe could have been added to or improved in this character's rig had I had more time to research and experiment. These include:
- Physically rigged peripheral tentacles (i.e. the torso & neck tentacles) that respond to physics and flop around in a realistic fashion.
- Rigged & animated peripheral tentacles that cycle independently of the character's movement.
Otherwise, the rig was complete enough for all of the required animations to be made and added to the game using root motion.
Creating Animations for our G.S.A. Tainted Robot Character
Animating the Character's Walk Cycle
In accordance with the requirements of my team-mate Jason's A.I. code, I needed to create a number of unique animations for the robot character. These were:
- Left Turn (90 Degrees)
- Right Turn (90 Degrees)
- Investigate
- Walk
- Chase
- Attack
- Detect Player
For each animation, every pose of the sequence was first blocked out with rough timing in mind. Looping back over the pose-to-pose sequence, the timing was then adjusted to suss out a suitable pace and distance travelled. Next came the in-between keyframing, in which the transitions between each pose were given more detail and structure, allowing for smoother, more believable movement. It was at this point that reference material was used most heavily, since a keener eye is needed to pick out the little details in movement. Like all animations made for this project, these were authored at the game's chosen frame rate of 60 fps.
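The timing setup itself can also be expressed in a few lines of Blender Python. This is only a sketch with placeholder names and lengths, and it assumes the bones are left in their default quaternion rotation mode:

```python
import bpy

scene = bpy.context.scene
scene.render.fps = 60       # all of the game's animations run at 60 fps
scene.frame_start = 1
scene.frame_end = 120       # placeholder length; each animation differed

# Block out a pose: key every bone's rotation and location at the current
# frame, then refine the timing later by sliding keys in the Dope Sheet.
arm = bpy.data.objects["GSA_Tainted_Rig"]   # placeholder armature name
for pbone in arm.pose.bones:
    pbone.keyframe_insert(data_path="rotation_quaternion")
    pbone.keyframe_insert(data_path="location")
```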
Each sequence was animated on the spot, without any transformations keyframed on the character's Root bone. This allowed the animation to be made without the hassle of chasing the character around the Blender scene, and reduced the potential for errors. Once the on-the-spot animation was complete, the Root bone was then keyed to travel along the forward axis at a rate matching the pace of the character's walk. On its own, however, this gives the character a 'gliding' effect. To mitigate this I would hold each foot in place while it was in contact with the ground, giving the illusion that the character is pushing itself forward with whichever foot is currently planted.
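A stripped-down sketch of that root-pacing step is shown below. The armature name, stride length, and frame range are placeholders (my actual values came from the animation itself), and the forward axis depends on how the Root bone is oriented, but it captures the idea: two keys on the Root bone with linear interpolation, so the forward travel stays constant while the planted foot does the pushing.

```python
import bpy

# Sketch only - armature name, stride length, and frame range are placeholders.
arm = bpy.data.objects["GSA_Tainted_Rig"]
root = arm.pose.bones["Root"]

cycle_start, cycle_end = 1, 60   # one full walk cycle at 60 fps
stride = 1.2                     # metres travelled over the whole cycle

# Key the Root bone at the start and end of the cycle so it travels
# forward (+Y in the bone's local space here) at a steady rate.
root.location = (0.0, 0.0, 0.0)
root.keyframe_insert(data_path="location", frame=cycle_start)
root.location = (0.0, stride, 0.0)
root.keyframe_insert(data_path="location", frame=cycle_end)

# Force linear interpolation so there is no ease-in/ease-out on the travel.
for fcurve in arm.animation_data.action.fcurves:
    if fcurve.data_path == 'pose.bones["Root"].location':
        for key in fcurve.keyframe_points:
            key.interpolation = 'LINEAR'
```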
The result of the walk animation can be seen here:
Due to the success I had in creating the demo walk cycle, the process outlined above was applied to all of the remaining animations. Over a two-week sprint, I went through the list and animated the rest, one after the other, in Blender. I did this in the same project file, resetting the character's transform between each animation, both to reduce the amount of file management needed and so that the robot could be exported as a single .FBX with all of its animations attached. I would then separate these animations in Unity for use with the Animator Controller and Jason's A.I.
The animations play out in the following order:
- Walk
- Turn Left
- Detect Player
- Chase
- Attack
- Investigate (on the spot)
- Turn Right
- Investigate (root motion)
Importing the Animated Character into Unity with Root Motion Enabled
In order to bring my animated robot character into our Unity game project, I first needed to export the robot mesh and rig as .FBX with the animations baked onto it. I exported it from Blender with the following export settings.
I needed to ensure that the robot was exported with its armature (rig), at a scale of 1 with unit scale applied, and facing the positive Z axis. This ensured that the scale in Unity would match Blender's, and that the robot would face the forward direction in-scene. I also disabled Leaf Bones and NLA strips because, for use in Unity, these extra settings do nothing but needlessly inflate the size of the exported file.
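Those settings map onto Blender's FBX exporter operator roughly as follows. This is a hedged sketch rather than my exact export call (I used the export dialog), and the file path is a placeholder:

```python
import bpy

# Approximate export call - the robot mesh and its armature are assumed
# to be the only objects selected.
bpy.ops.export_scene.fbx(
    filepath="//GSA_Tainted.fbx",
    use_selection=True,
    object_types={'ARMATURE', 'MESH'},
    global_scale=1.0,
    apply_unit_scale=True,           # keep 1 Blender unit = 1 Unity unit
    axis_forward='Z',                # face the positive Z axis
    axis_up='Y',
    add_leaf_bones=False,            # no extra "_end" bones cluttering the rig
    bake_anim=True,                  # bake the keyed animation into the file
    bake_anim_use_nla_strips=False,  # NLA strips disabled, as noted above
)
```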
Once I brought the Robot into Unity, I needed to change a number of settings on the import for root motion to work properly. Firstly, on the Rig tab, I set the Avatar Definition option to "Create From This Model", then chose the root bone of the character's armature as the Root node.
Next, I moved over to the Animation tab and separated the individual animations (highlighted in red) into clips from the baked animations that came with the .FBX import. I did this using the timing sheet that I had created when making the animations in Blender.
And just like that, the root motion animations had been imported into Unity. To demonstrate that root motion was working, I played the animation previews to check that each loop continued from the position where the previous loop ended.
The walk animation looping in the Unity preview window with root motion enabled
Reflection
All in all, this process taught me a LOT about animation for games. From root motion, to animating in Blender, to importing an animated character into Unity, I've become much more comfortable with this key area of game development.
Looking back, the improvements I would make mostly concern the animations I made for the robot in Blender, all of them tied to the rig changes mentioned in the reflection section of "Rigging our G.S.A. Tainted Robot Character". These being:
- Adding physically based armatures and constraints for the peripheral tentacles on the robot - essentially ragdoll physics.
- OR individually rigging all of the peripheral tentacles and animating them separately from the main character - jiggling motions.
Other changes I would make relate to how the character's physical traits are communicated in their animations. I would:
- Add more character to the walk & chase animations so that the robot looks less balanced overall and moves with more weight.
- Create 180 degree turn animations in both directions (L & R) in order to expand the movement options for the Robot A.I.
- Create a wider array of attack animations that make use of more of the robot's extremities.
- Investigate procedural animation and look into how it could be implemented to generate robot walking and blends between animations.
References & Resources
[1] Birdmask Studio - published 14 Mar 2018 - "Should You Use Root Motion?" - https://www.youtube.com/watch?v=j7XZ3Q8JNfM
[2] Medhue - published 18 Mar 2015 - "Medhue Elephant & Unity Root Motion" - https://www.youtube.com/watch?v=d5z9dEnE4DE
[3] 25games - published 4 Jul 2018 - "Unity Root Motion - in 6 easy Steps | Unity 2018.1 and Blender 2.79 | Tutorial" - https://www.youtube.com/watch?v=SsHCkK4iou0