16 April 2014

No Spring Chicken....

A bit more progress on the digging character, this time he's rather worryingly doing the chicken.....



This is really just an exercise to test the bind on the character, but it was peculiar enough to make me render it out and present it as part of the workflow of this project.

I downloaded the mocap data from the Carnegie Mellon University database.... it's pretty extensive and, most importantly, it's free for anyone to play with!!! The data hasn't been cleaned up or adjusted and, as you can probably see from the rendered animation, I've not bothered doing any of that for this test, as I am just checking the bind.

Initially I wanted to rig the character with IK, FK and a third skeleton that could be controlled by mocap data, but I have decided to forgo the IK and FK parts and have opted to have only mocap driving the main body of the rig. I did still bind a version of the character with a manual rig and IK/FK switching on the arms and legs.... I will probably only use it for posing still renders rather than animation, but it was a good refresher exercise in rigging and binding.

The hands and face of the character do still rely on manual keyframes, which I have not demonstrated in this video, but please believe me... they work. I have used a couple of simple lines of MEL script to drive the mouth and the eyes; it uses noise over time to drive the rotation of the joints. I felt that the lack of movement in the face of the last video was a bit unsettling... I suppose that in its own way this new video is equally, if not more, unsettling, but it's not because of the face being motionless.
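For anyone curious, the noise-over-time trick amounts to just a few lines. This is a sketch rather than my exact script, and the joint names are hypothetical examples; it relies on Maya's built-in noise() expression function, which returns a smooth pseudo-random value between -1 and 1, so it lives in the Expression Editor.

```mel
// Sketch of a noise-driven facial expression (joint names are hypothetical).
// noise() returns a smooth pseudo-random value in the range -1 to 1.
jaw_jnt.rotateZ    = 4 * noise(time * 2);         // gentle jaw chatter
eyes_L_jnt.rotateY = 6 * noise(time * 1.5 + 10);  // time offset so eyes and jaw don't sync
eyes_R_jnt.rotateY = eyes_L_jnt.rotateY;          // both eyes track together
```

Multiplying the noise scales the rotation range, and offsetting the time input keeps the different joints from moving in lock-step.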

The next stage for this build is the blend shapes for facial expressions and geometry correction in extreme poses.

6 April 2014

Texturing & Re-targeting......

So, a progress report on my character is probably about due so here goes.....

Since my last post I've made quite a bit of progress on the old guy character that I'm working on. I spent a little more time on sculpting and then moved on to texturing the mesh.

As I'd been using Mudbox for the sculpting, I thought I might as well investigate its texturing tools as well; these allow you to paint directly onto the model in 3D space.... this workflow was a revelation for me. For years I have created 2D textures in Photoshop using the UV maps that I made in Maya as a template. This workflow is fine, but even the best UV maps can distort the finer detail.

Mudbox still uses the UV map as a template, but being able to paint directly onto your model means that what you see is what you get; there's no need to jump between Photoshop and Maya to check that the texture is fitting properly, or to tweak UVs after the texture has been applied. Obviously, 3D painting tools are not a new thing... it has actually been possible to paint textures in Maya for many generations of the software, but the functionality has been more suited to roughing out than detail work. The painting tools in Mudbox are fantastically responsive; working on layers allows the build-up of texture, and being able to use reference images as colour stencils is genius.....


The above image illustrates a visual evolution of the shader network that I have used in Maya for this character. Until this point I had never really used the Subsurface Scattering (SSS) shader, so I figured that, with all of the other new knowledge that I have crammed into my brain through this project, a little more development wouldn't hurt.

The SSS shader works like skin: different layers allow light to penetrate and scatter as it interacts with the material. Rather than getting bogged down trying to explain it all, here's a link to an explanation from Autodesk.......

Mentalray Fast SSS Tutorial
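In Maya terms, the shader in question is mental ray's misss_fast_skin_maya node (the mental ray plug-in must be loaded). The snippet below is only a sketch of what Hypershade wires up for you when you assign the shader; the node and geometry names are hypothetical examples.

```mel
// Sketch: create mental ray's fast skin shader and wire it to a shading group.
// Hypershade normally does all of this; names here are hypothetical examples.
string $sss = `createNode misss_fast_skin_maya -n "oldGuy_skin_sss"`;
string $sg  = `sets -renderable true -noSurfaceShader true -empty -name "oldGuy_skin_sssSG"`;
connectAttr ($sss + ".message") ($sg + ".miMaterialShader");
// The shader's epidermal, subdermal and back-scatter layers are then tuned
// in the Attribute Editor to control how deeply light penetrates the skin.
select -r oldGuy_geo;
sets -e -forceElement $sg;
```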

Another area that I've been looking at since my last post is joint re-targeting through the HumanIK (HIK) system in Maya. Since this character is going to be used as a motion capture puppet, I figured it would be prudent to begin to develop my understanding of how this system works.

I figured it would be more entertaining to see the old guy move than a naked rig, so I quickly bound a very rudimentary skeleton to the mesh and defined it as an HIK character. The fundamentals of the HIK system are pretty straightforward, and the GUI (see below) pretty much guides you through the process of setting up the character.


It is as easy as selecting a joint on your rig and then assigning it to the relevant indicator on the GUI. So, after defining my simple rig as Character 1, I defined a mocap example rig as Character 2 and used it as the motion source for Character 1.... it's really that simple.
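Under the hood, each click in the GUI maps a joint to a numbered slot in the character definition, and the same mapping can be scripted with MEL's setCharacterObject command. The sketch below assumes a definition named Character1 already exists (created via the GUI), and the joint names are hypothetical examples.

```mel
// Sketch of scripting the HIK definition instead of clicking the GUI.
// Joint names are hypothetical; slot IDs follow HIK's fixed numbering
// (0 = Reference, 1 = Hips, 8 = Spine, and so on).
setCharacterObject("Hips_jnt",  "Character1", 1, 0);
setCharacterObject("Spine_jnt", "Character1", 8, 0);
// ...repeat for each joint the Character Controls window highlights.
// The motion source (Character 2 in my case) is then picked from the
// Source drop-down in the Character Controls window.
```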

Below is a render of the driven motion..... Like I said, the rig is very rudimentary and no attention was paid to setting the joint influences, but it kind of works. I find that because there is no facial or finger animation the resulting motion is rather unsettling; dead eyes and limp hands give it a peculiar reanimated appearance, reminding me of Overtime, a dark tribute to Jim Henson from Supinfocom.



The actual rig that I eventually apply will have facial controls as well as hand controls, which will be animatable through keyframes... I don't think I want to start looking at facial mocap for this project, but who knows, I might end up doing just that!!