Here’s a look at my experiments with iPi Soft Mocap Studio. The video below is my iPi Soft contest entry for the month of August. I downloaded the 30-day trial, picked up a couple of used Kinects for under $30 each, and set about experimenting. I tried a single Kinect at first and the results were promising, but, as expected, when joints were occluded by my body as I turned away from the Kinect, it was unable to track them.
Next I tried the two-Kinect setup and found that I could get really good results by placing one in front of me and the other behind, basically 180 degrees apart. I did some frenetic motion, jumping up and down, turning around, etc., to see if I could still get a good capture. I let the shot process, and it looked really good!
I added some minor jitter removal, imported an FBX file of my target character into Mocap Studio, and transferred the animation from the default rig to a custom character that I designed and built from scratch. That left me with an FK skeleton, skinned to my character and carrying the mocap keyframes, which I exported as an FBX. At this point you could just merge it into a scene with your character rig, and the keys would apply to the bones. Instead, I wanted the keys on the animation control objects that drove the skeleton, so that I could use the motion within my pipeline just as I would any other keyframed animation.
I created scripts for this purpose in 3DS Max. Note: these were written specifically for the custom rig I use in my pipeline, but the concept should carry over to any toolset. Here’s what the scripts did:
1. Renamed each joint that shared a pivot with one of my control objects to that control object’s name.
2. Turned Animate/Auto Key on, and aligned each control on my animation rig to the corresponding joint on the mocap rig on every frame of the animation.
3. Deleted any unwanted keyframes, such as scale keys or transforms on joints that should only rotate.
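The three steps above can be sketched in plain Python. This is a hypothetical illustration using stand-in data structures, not the MAXScript I actually used; names like `Node`, `rename_by_pivot`, and `bake_alignment` are made up for the example.

```python
TOLERANCE = 1e-3  # max pivot distance for two nodes to count as "the same"

class Node:
    """Stand-in for a scene node (mocap joint or animation control)."""
    def __init__(self, name, pivot, keys=None):
        self.name = name          # scene name
        self.pivot = pivot        # (x, y, z) rest-pose pivot position
        self.keys = keys or {}    # frame -> {"rot": ..., "pos": ..., "scale": ...}

def dist(a, b):
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

# Step 1: rename each mocap joint to match the control sharing its pivot.
def rename_by_pivot(mocap_joints, controls):
    for joint in mocap_joints:
        for ctrl in controls:
            if dist(joint.pivot, ctrl.pivot) < TOLERANCE:
                joint.name = ctrl.name
                break

# Step 2: with auto-key "on", align each control to its (now same-named)
# joint on every frame -- here modeled as copying the joint's per-frame keys.
def bake_alignment(mocap_joints, controls):
    by_name = {c.name: c for c in controls}
    for joint in mocap_joints:
        ctrl = by_name.get(joint.name)
        if ctrl is not None:
            for frame, xform in joint.keys.items():
                ctrl.keys[frame] = dict(xform)

# Step 3: delete unwanted keys, e.g. scale keys everywhere and position
# keys on controls that should only rotate.
def clean_keys(controls, rotate_only):
    for ctrl in controls:
        for xform in ctrl.keys.values():
            xform.pop("scale", None)
            if ctrl.name in rotate_only:
                xform.pop("pos", None)
```

In the real scripts, "copying keys" is the auto-key recording while aligning each control frame by frame; the sketch only shows the data flow.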
Next I used a Leap Motion Controller with Brekel Hands to record some finger movement. This gave me an FBX file of a skeleton with animated fingers, and I used a script to transfer that motion to my animation rig as well. After that, I saved out the animation and reloaded it onto a clean rig to remove any unwanted joint transforms that may have crept in during the process.
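The save-and-reload round trip can be sketched like this, assuming a toy representation of the rig's animation as a dict of control name → frame → value. The file format and function names here are mine for illustration, not a 3DS Max or Brekel API; the point is that only keys are written out, so nothing else survives the trip onto the clean rig.

```python
import json

def save_animation(rig_keys, path):
    # rig_keys: {control_name: {frame: value}} -- keys only, no node state.
    with open(path, "w") as fh:
        json.dump({name: {str(f): v for f, v in keys.items()}
                   for name, keys in rig_keys.items()}, fh)

def load_animation(path, control_names):
    # Apply saved keys only to controls that exist on the clean rig;
    # anything else in the file is ignored.
    with open(path) as fh:
        data = json.load(fh)
    return {name: {int(f): v for f, v in data.get(name, {}).items()}
            for name in control_names}
```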
This got me a clean animation rig with the motion capture data faithfully placed on my animation controls, ready for further refinement. For this test, I didn’t want to change the mocap itself. I removed a few keyframes where the arms came too close to the body, then thought about what minimal animation I could add on top of the motion capture to bring the character to life.
1. I ran a cloth sim to give the clothes some nice secondary motion.
2. I set baked keys on the eye control object and simply offset the keys so that the eyes would lead all movement.
3. Lastly, I set some keyframes on the face rig to drive morph targets/blendshapes and give the character some quick facial animation.
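The eye-lead trick in step 2 above boils down to a uniform time shift on baked keys. A minimal sketch, with an assumed three-frame lead (the frame → value layout is illustrative, not a 3DS Max API):

```python
LEAD_FRAMES = 3  # assumed lead amount; tune to taste

def offset_keys(keys, lead=LEAD_FRAMES):
    """Shift each frame -> value key `lead` frames earlier,
    so this motion anticipates the body's."""
    return {frame - lead: value for frame, value in keys.items()}

eye_keys = {10: "look_left", 20: "look_right", 30: "look_center"}
print(offset_keys(eye_keys))
# {7: 'look_left', 17: 'look_right', 27: 'look_center'}
```

Because the keys were baked on every frame first, the shifted curve keeps the same shape; the eyes simply arrive at each pose a few frames before the head does.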
I added some lights and rendered the final shot of my mocap test.