3.12.13

Further Clean Up

Here is the result of the clean-up after making those required adjustments, shown in the sequence of the film. I attempted to blend between the second capture of the rushed walk and the first capture, but the result was not very appealing, so the next best option was to adjust the first capture to fit with the previous walk and action. After viewing it in the sequence, though, some of the actions don't align with the set: as he rushes off, he walks off the path.

1.12.13

Constraints


In this sequence, James is holding a pencil in one hand and clutching a backpack with the other. To constrain the objects to James while he walked, each required a joint and a rigid bind. In MotionBuilder, a Parent/Child constraint was applied to the joints of each object; the left wrist effector was then connected to the pencil constraint, and the chest effector to the backpack's.
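The constraint itself boils down to a small piece of transform math: at bind time you record the object's offset relative to its parent joint, and every frame after that the object's world transform is the parent's transform composed with that fixed offset. A minimal sketch in plain Python, using simplified 2D transforms (angle, tx, ty) and made-up bind values rather than MotionBuilder's actual API:

```python
import math

def compose(parent, local):
    """Compose two 2D transforms (angle in radians, tx, ty): parent * local."""
    a, tx, ty = parent
    b, lx, ly = local
    c, s = math.cos(a), math.sin(a)
    return (a + b, tx + c * lx - s * ly, ty + s * lx + c * ly)

def inverse(t):
    """Invert a 2D transform: R(-a), translation -R(-a) * t."""
    a, tx, ty = t
    c, s = math.cos(-a), math.sin(-a)
    return (-a, c * -tx - s * -ty, s * -tx + c * -ty)

# Bind time: record the pencil's offset relative to the wrist joint.
# (Hypothetical values, purely for illustration.)
wrist_bind = (0.0, 10.0, 15.0)    # wrist joint world transform
pencil_bind = (0.0, 12.0, 15.0)   # pencil world transform
offset = compose(inverse(wrist_bind), pencil_bind)

# Each subsequent frame: the pencil follows the wrist with the same offset.
wrist_now = (math.pi / 2, 20.0, 5.0)
pencil_now = compose(wrist_now, offset)
```

The same idea extends to full 3D matrices; the constraint just keeps that bind-time offset constant while the parent joint animates.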

Scenery & Camera Test


From Maya, we sent the scene across via the "Send to" options to test the environment and the cameras. This allowed us to adjust the direction of the action in relation to other objects and to achieve the desired result. Above is an example of this with some further clean-up and blending of takes. The cameras in the scene were animated in Maya, and the pause in James' motion is down to timing and the camera being from his point of view. The rushed walk away drew some criticism for its pace, the unusual arm movement, and the relaxed feel at the end of the shot. The next step would be to slow the rushed walk down a little and then extend the shot so that the character feels as if he is completely following through with it. The current rushed walk is part of the recaptured data, and one solution could be to blend it with the previously captured data.

Recapture

We decided to try to enhance the motion of the character by recapturing some shots. On the first attempt we lacked experience in directing, and after applying the capture to the character, some of the action felt too realistic, more natural than we envisioned. So on the second attempt we aimed to draw more personality out of the actor, which would help reduce the overly natural feel of the motion.

Character Walk Clean Up

For my assigned shot from the motion capture, I began the clean-up by straightening out the initial walk. As you can see below, we were restricted by the spatial set-up of the capture volume, which meant having the actor walk in a rough circle/square that would allow us to develop the full walk within the film. Using Story mode within MotionBuilder, I was able to cut the walk into sections and then realign them using the "Match" setting. From that point it was a matter of fixing any jerking or jumping actions and creating a smooth blend between the separated clips.
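The "Match" step can be pictured as offsetting each cut clip so that its first frame lands on the last frame of the clip before it, straightening the circuit into a continuous walk. A toy sketch of that idea, assuming a clip is just a list of (x, z, heading) root samples (not MotionBuilder's actual data model):

```python
import math

def align_clip(prev_end, clip):
    """Offset a clip (list of (x, z, heading) samples) so its first frame
    lands on prev_end, mimicking Story mode's 'Match' on translation and
    rotation. Purely illustrative names and data."""
    px, pz, ph = prev_end
    cx, cz, ch = clip[0]
    dh = ph - ch                      # heading correction
    c, s = math.cos(dh), math.sin(dh)
    out = []
    for x, z, h in clip:
        # rotate each sample about the clip's first frame,
        # then translate it onto prev_end
        rx, rz = x - cx, z - cz
        out.append((px + c * rx - s * rz, pz + s * rx + c * rz, h + dh))
    return out
```

After aligning, the remaining work is exactly what the clean-up describes: smoothing any pop at the seam with a short blend.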



2.11.13

Shadow Production Capture

For our short animated film we decided to try using motion capture to accompany our animation. We were required to set up the capture volume, and each member of the team was to direct a scene. This meant preparing any props to help motivate the actor and to make him aware of the spatial boundaries of the digital world. Motion capture also called for each director to think about sound and timing cues, and whether the actor had to react to them.


Our roles in creating the short film also came into play. I was required to develop the rig for the character, which proved difficult at first: the initial rig created in Maya was an FK/IK rig and did not transfer to MotionBuilder successfully, so a basic FK setup was required.

Simple FK setup for our character
Once this rig was imported into MotionBuilder, it had to be characterized, and a control rig was developed for it. The character was then merged with the captured data and the bone structure imported from Motive. After assigning the data as the source for the main character, the scene could be played back and the designed character would follow. From there, to make any adjustments to the animation, the Motive actor and marker/optical sets were baked down to the control rig.
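At its core, assigning a source means driving each joint of the characterized rig from the corresponding joint of the capture skeleton via a name mapping. A deliberately tiny sketch of that mapping idea (real characterization also compensates for differing proportions and rest poses; all names here are made up):

```python
def retarget(source_pose, joint_map):
    """Drive a target character from a source actor by mapping joint
    rotations by name -- the essence of assigning a 'source' to a
    characterized character. source_pose maps source joint names to
    rotation tuples; joint_map maps target joints to source joints."""
    return {target: source_pose[src] for target, src in joint_map.items()}

# Hypothetical capture-skeleton pose and rig mapping:
source_pose = {"Hips": (0, 0, 0), "LeftArm": (90, 0, 0)}
joint_map = {"hip_jnt": "Hips", "l_arm_jnt": "LeftArm"}
target_pose = retarget(source_pose, joint_map)
```

Baking to the control rig then amounts to evaluating this mapping on every frame and storing the result as keys on the rig's own controls.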
 



1.11.13

Motive & MotionBuilder

We looked at how MotionBuilder fits into the pipeline after trajectorizing and cleaning up our captured data in Motive. Trajectorizing the data filled the gaps where data was not 100% captured: when a marker was lost or undetected, a spline curve was created from the last point before the marker went missing to the point where it returned. We were able to export the mocap data in .fbx format, which is readable by MotionBuilder. Within MotionBuilder we set an actor to the marker set and then applied a MotionBuilder character to that actor. The result is below.
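The gap filling can be sketched as fitting a smooth curve between the last good sample before a marker dropped out and the first sample after it returned. Below is a simplified stand-in for that spline fill, using a cubic Hermite segment on a 1D marker track; Motive's actual interpolation options differ, and this sketch assumes gaps are interior to the track:

```python
def fill_gap(track):
    """Fill None gaps in a 1D marker track with a cubic Hermite segment,
    estimating boundary tangents from the samples around the gap.
    Assumes gaps do not touch the first or last two samples."""
    out = list(track)
    i = 0
    while i < len(out):
        if out[i] is None:
            j = i
            while out[j] is None:     # find the end of the gap
                j += 1
            p0, p1 = out[i - 1], out[j]
            # tangents from finite differences at the gap edges
            m0 = p0 - out[i - 2] if i >= 2 else p1 - p0
            m1 = out[j + 1] - p1 if j + 1 < len(out) else p1 - p0
            n = j - i + 1
            for k in range(i, j):
                t = (k - i + 1) / n
                h00 = 2*t**3 - 3*t**2 + 1   # Hermite basis functions
                h10 = t**3 - 2*t**2 + t
                h01 = -2*t**3 + 3*t**2
                h11 = t**3 - t**2
                out[k] = h00*p0 + h10*m0 + h01*p1 + h11*m1
            i = j
        i += 1
    return out

track = [0.0, 1.0, None, None, 4.0, 5.0]   # marker lost for two frames
filled = fill_gap(track)
```

Running the same idea per axis on each marker reconstructs a plausible trajectory through short occlusions, which is why props like chairs blocking markers are recoverable up to a point.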



From this point, MotionBuilder is used to correct any actions and body parts that have gone askew. Further animation can be applied using animation layers, and any data that was not recorded, such as fingers and foot rolls, can be added. MotionBuilder is also used to blend from one take to another to get a complete action.
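Blending two takes is essentially a crossfade over an overlap region: one take's influence ramps down while the other's ramps up. A minimal per-value sketch (real tools blend rotations with quaternions rather than straight linear interpolation, and the function name here is made up):

```python
def blend_takes(take_a, take_b, overlap):
    """Crossfade the last `overlap` frames of take A into the first
    `overlap` frames of take B. Each take is a flat list of samples;
    a real blend operates on full poses per frame."""
    body_a = take_a[:-overlap]
    body_b = take_b[overlap:]
    blended = []
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)   # weight ramps from A toward B
        a = take_a[len(take_a) - overlap + i]
        b = take_b[i]
        blended.append((1 - w) * a + w * b)
    return body_a + blended + body_b
```

Choosing where the overlap sits, and how long it is, is most of the craft: too short and the seam pops, too long and both takes go mushy.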

First Capture and Setup


For our first day of motion capture we set up the camera tripods and cameras and linked them to the laptop running the Motive software. Each tripod held two OptiTrack cameras, for a total of 18 cameras. All but one captured only infrared data; the exception could also provide a live, low-quality video feed. The cameras were connected to USB hubs, which were connected to the laptop via USB cables. The hubs were also synced to one another and required their own power source.


Layout of the MoCap setup

Through Motive we were able to determine where the cameras should be pointing and whether any objects were reflecting or producing infrared light. Once the space was clear, we began the wanding process to calibrate the cameras and the capture space. Next we set up the character in Motive and aligned the markers on the actor to correspond with the software.

Overall the setup was fairly straightforward, and the software's UI is clear and simple to use. Next we played around with capturing footage, testing the limits of the system by pushing the capture region's boundaries and directing the actor through different actions. It was also useful to test and introduce props: we discovered that a chair, for example, could occlude certain markers and cause issues. This was good practice, not only for making sure props didn't reflect infrared, but also for making sure the actor does a T-pose before and after each recording.
Here's a behind-the-scenes video of Juan acting like a monkey:



Introduction to Motion Capture

In the first week we were introduced to the roles in motion capture and the tasks within them, from directors to actors to animators. Luke showed us some of his work and the clean-up and editing required once the footage has been captured. We also spoke about achieving the desired result, and reducing the amount of clean-up and editing, by giving the motion capture actors props, tools and direction to work with. Some examples that we looked at are below. They include the penguins in Happy Feet, where the actors/dancers were given direction on how penguins behave when they move and then had to adapt that behaviour to the dance numbers.






We then went on to look at other uses of motion capture, which is especially prominent in high-end games development. These included the Uncharted games and Beyond: Two Souls, where we were able to see the props and sets built to guide the actors. In Uncharted, they produced the feeling of walking in sand by giving the actors crash mats to walk on when capturing the mocap footage.