6 April 2014

Texturing & Re-targeting......

So, a progress report on my character is probably about due so here goes.....

Since my last post I've made quite a bit of progress on the old guy character that I'm working on. I spent a little more time on sculpting and then moved onto texturing the mesh.

As I'd been using Mudbox for the sculpting I thought I might as well investigate its texturing tools as well; these allow you to paint directly onto the model in 3D space.... this workflow was a revelation for me. For years I have created 2D textures in Photoshop using the UV maps that I have made in Maya as a template. That workflow is fine, but even the best UV maps can distort the finer detail.

Mudbox still uses the UV map as a template, but being able to paint directly onto your model means that what you see is what you get; there's no need to jump between Photoshop and Maya to check that the texture is fitting properly, or to tweak UVs after the texture has been applied. Obviously, 3D painting tools are not a new thing; it has actually been possible to paint textures in Maya for many generations of the software, but the functionality has been more suited to roughing out than detail work. The painting tools in Mudbox are fantastically responsive, working on layers allows the build-up of texture, and being able to use reference images as colour stencils is genius.....

The above image illustrates a visual evolution of the shader network that I have used in Maya for this character. Until this point I had never really used the Subsurface Scattering (SSS) shader, so I figured that with all of the other new knowledge that I have crammed into my brain through this project, a little more development wouldn't hurt.

The SSS shader works like skin: different layers allow light to penetrate and scatter as it interacts with the material. Rather than getting bogged down trying to explain it all, here's a link to an explanation from Autodesk.......

Mentalray Fast SSS Tutorial
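The layered idea can be sketched in a few lines of code. This is just an illustration of the principle, not the shader's actual maths: the layer names echo the fast-skin shader's epidermal/subdermal/back-scatter split, but the weights and scatter radii below are made-up numbers.

```python
import math

def layer_response(depth, scatter_radius):
    """Fraction of light surviving to a given depth in one scattering
    layer, modelled as a simple exponential falloff."""
    return math.exp(-depth / scatter_radius)

def skin_response(depth):
    """Weighted sum of three layers, loosely mirroring the epidermal,
    subdermal and back-scatter layers of a fast-skin style shader.
    Weights and radii are illustrative, not real shader defaults."""
    layers = [
        (0.25, 0.1),   # shallow epidermal layer: tight scatter
        (0.45, 0.5),   # subdermal layer: broader scatter
        (0.30, 2.0),   # deep layer: light passing right through
    ]
    return sum(weight * layer_response(depth, radius)
               for weight, radius in layers)

# At the surface every layer contributes; deeper in, only the
# wide-radius layers still pass light, which is what softens
# thin areas like ears and nostrils.
for depth in (0.0, 0.2, 1.0):
    print(depth, round(skin_response(depth), 3))
```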

Another area that I've been looking at since my last post is joint re-targeting through the HumanIK (HIK) system in Maya. Since this character is going to be used as a motion capture puppet, I figured it would be prudent to begin developing my understanding of how this system works.

I figured it would be more entertaining to see the old guy move than a naked rig, so I quickly bound a very rudimentary skeleton to the mesh and defined it as an HIK character. The fundamentals of the HIK system are pretty straightforward, and the GUI (see below) pretty much guides you through the process of setting up the character.

It is as easy as selecting a joint on your rig and then assigning it to the relevant indicator on the GUI. So, after defining my simple rig as Character 1, I defined a mo-cap example rig as Character 2 and used it as a motion source for Character 1.... it's really that simple.
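Conceptually, what the Character Controls GUI is doing is mapping each skeleton's own joint names onto a shared set of HIK slots, so motion can be carried across by slot rather than by joint name. Here's a rough sketch of that idea in plain Python; all of the joint names are made up for illustration, and this is not the actual Maya API.

```python
# A shared vocabulary of slots, echoing the indicators in the
# Character Controls GUI (the real system has many more).
HIK_SLOTS = ["Hips", "Spine", "Head",
             "LeftArm", "RightArm", "LeftLeg", "RightLeg"]

# Character 1: my quick-bound skeleton, slot -> joint (names invented)
old_guy = {
    "Hips": "oldguy_pelvis",     "Spine": "oldguy_spine01",
    "Head": "oldguy_head",
    "LeftArm": "oldguy_l_shoulder", "RightArm": "oldguy_r_shoulder",
    "LeftLeg": "oldguy_l_hip",      "RightLeg": "oldguy_r_hip",
}

# Character 2: the mo-cap example rig used as the motion source
mocap = {slot: "mocap_" + slot for slot in HIK_SLOTS}

def retarget(source_pose, source_map, target_map):
    """Copy each slot's rotation from the source skeleton's joint to
    the matching joint on the target skeleton."""
    return {target_map[slot]: source_pose[source_map[slot]]
            for slot in HIK_SLOTS}

# One frame of source motion: joint name -> (rx, ry, rz) in degrees
frame = {joint: (0.0, 0.0, 0.0) for joint in mocap.values()}
frame["mocap_LeftArm"] = (0.0, 45.0, 10.0)

# The old guy's shoulder picks up the mo-cap arm rotation via the slot
driven = retarget(frame, mocap, old_guy)
```

The point of the shared slots is that neither skeleton needs to know the other's joint names, which is why defining two characters and picking a source is all the setup the GUI requires.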

Below is a render of the driven motion..... Like I said, the rig is very rudimentary and no attention was paid to setting the joint influences, but it kind of works. I find that because there is no facial or finger animation the resulting motion is rather unsettling; dead eyes and limp hands give it a peculiar reanimated appearance, reminding me of Overtime, a dark tribute to Jim Henson from Supinfocom.

The actual rig that I eventually apply will have facial controls as well as hand controls, which will be animatable through keyframes... I don't think I want to start looking at facial mocap for this project, but who knows, I might end up doing just that!!
