Debugging MotionBuilder Scripts in PyCharm (3.4)

So, you want to debug Python in MoBu with PyCharm but either aren’t sure how to set it up or missed some detail along the way. Well, this post should help.

Note: Much of this information holds true for Maya and other applications with integrated python. More >
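As a quick preview of the setup, attaching generally boils down to putting PyCharm's bundled debug egg on MoBu's Python path and calling `pydevd.settrace` while PyCharm's Python Debug Server is listening. A minimal sketch follows; the egg path and port are assumptions you should match to your own PyCharm install and debug-server run configuration:

```python
import sys

# Assumed paths/ports -- adjust to your PyCharm install and
# your "Python Debug Server" run configuration.
PYCHARM_DEBUG_EGG = r"C:\Program Files\JetBrains\PyCharm 3.4\debug-eggs\pycharm-debug.egg"
DEBUG_PORT = 5678

def attach_to_pycharm(host="localhost", port=DEBUG_PORT):
    """Run this inside MotionBuilder while PyCharm's debug server is listening."""
    if PYCHARM_DEBUG_EGG not in sys.path:
        sys.path.append(PYCHARM_DEBUG_EGG)
    import pydevd  # ships with PyCharm's debug egg
    # Connect back to the IDE; redirect MoBu's stdout/stderr to PyCharm.
    pydevd.settrace(host, port=port,
                    stdoutToServer=True, stderrToServer=True,
                    suspend=False)
```

With this in place, breakpoints set in PyCharm will hit when the corresponding script runs inside MoBu.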

Python FBX SDK, brief intro…

I had the opportunity to crack open the Python FBX SDK the other day.  Doing so happily validated an assumption I had made: I can read/modify/write .fbx files without going through a DCC.  While none of the hurdles were hard to overcome, I thought it might be nice to collect some of the initial items I stumbled on in a single post.

First, get the Python FBX SDK here:
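For context, a standalone round trip (read a file, write it back out) looks roughly like this. This is a sketch assuming the SDK's `fbx` module is importable; the file paths are placeholders:

```python
def roundtrip(src_path, dst_path):
    """Sketch: read an .fbx and write it back out, no DCC involved.
    Assumes the Autodesk Python FBX SDK ('fbx' module) is installed."""
    import fbx

    manager = fbx.FbxManager.Create()
    manager.SetIOSettings(fbx.FbxIOSettings.Create(manager, fbx.IOSROOT))

    scene = fbx.FbxScene.Create(manager, "scene")

    importer = fbx.FbxImporter.Create(manager, "")
    if not importer.Initialize(src_path, -1, manager.GetIOSettings()):
        raise IOError("Could not open %s" % src_path)
    importer.Import(scene)
    importer.Destroy()

    # ...inspect or modify the scene here...

    exporter = fbx.FbxExporter.Create(manager, "")
    if not exporter.Initialize(dst_path, -1, manager.GetIOSettings()):
        raise IOError("Could not write %s" % dst_path)
    exporter.Export(scene)
    exporter.Destroy()
    manager.Destroy()
```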

Second, be aware that due to its C++ underpinnings it relies heavily on iterator-style traversal to build up collections.  This was a little foreign to me, as I’m used to looping over existing collections.  For example, to build a list of properties on an object, one needs to loop until an invalid property is returned…

props = []
prop = obj.GetFirstProperty()
while prop.IsValid():
    if prop.GetFlag(FbxPropertyAttr.eUser):
        props.append(prop)

    prop = obj.GetNextProperty(prop)

Third, the Python exposure of FbxProperty has no “Get()” method. So, you have to cast it into an appropriate property type that does.  I found a post on Autodesk’s Area that covers this well.
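The casting pattern looks roughly like this. This is a sketch assuming the `fbx` module, and the helper name and type list are my own (and not exhaustive):

```python
def get_property_value(prop):
    """Sketch: cast an FbxProperty to a typed wrapper that has Get().
    Assumes the 'fbx' module; handles only a few common data types."""
    import fbx

    t = prop.GetPropertyDataType().GetType()
    if t == fbx.eFbxBool:
        return fbx.FbxPropertyBool1(prop).Get()
    if t == fbx.eFbxInt:
        return fbx.FbxPropertyInteger1(prop).Get()
    if t == fbx.eFbxDouble:
        return fbx.FbxPropertyDouble1(prop).Get()
    if t == fbx.eFbxDouble3:
        return fbx.FbxPropertyDouble3(prop).Get()
    if t == fbx.eFbxString:
        return fbx.FbxPropertyString(prop).Get()
    return None  # unhandled type -- extend as needed
```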

Fourth, the Python FBX SDK is heavily enumerated.  An example of this is in the code above, where I inspect a property to see if it is a custom, user-defined property…

if prop.GetFlag(FbxPropertyAttr.eUser):

Well, that’s about it for now.  I’m pretty excited about the doors opened by being able to leverage this library in standalone python scripts.

THQ is no more…

The studio I work for, Volition, has been auctioned off, with the winning bid going to Koch Media. While we’ve seen this coming for a while, and I’m thankful Volition has made it through these troubled times, I’m still in shock and have many questions.

Here is the official THQ letter to its employees:

More >

Exporting to FBX from MAX (Morpher Issues)

I’ve been looking into porting assets to Maya, MotionBuilder, and Softimage via FBX. One problematic area I ran into early on was the ‘Morpher’ modifier…

To maintain quads, be sure to do the following:

  1. All mesh objects should be of class ‘Editable_Poly’
  2. In the exporter options, “Preserve Triangle Orientation” should not be checked.

Morpher compatibility…

  1. It seems that only the top-most Morpher modifier is exported with FBX; the others are ignored. Furthermore, Morpher modifiers appear to be limited to 100 channels. While you can set channels above this in mxs, the results are not as intended: I’ve only seen one channel above 100 work, and any index over 100 seems to map to it.
  2. All morph targets need to exist in the scene as objects and be targeted via the Morpher UI. Failing to do this results in the channel names being truncated to the first letter of their name, e.g. “Jaw_open” and “Jaw_back” both become “J”. A script below helps address this by extracting and re-targeting morph target objects embedded in the Morpher modifier.

More >

Facial Animation R&D

For years now, we’ve been using morph target based facial animation on our games at Volition. At the start of a new project there is a typical “bones vs. morphs” debate, and we’ve always stuck to what we know. In the past, I’ve been on the receiving end of the results of those debates and proposals. On a recent project, I had the opportunity to dive into this topic more hands-on, and I was quite surprised to hear myself recommending a bones-based approach…

(Embedded video: screen capture of the animated facial rig in 3ds Max.)

NOTE: Normal map is inverted and wrinkle maps are not setup or shown here.
  • I made this simply to get an animated facial bone setup that we could use for testing in our engine.  It is by no means final or indicative of quality; the animation is rough because the quality of the performance was not under evaluation here.
  • This video is a screen capture from within 3ds Max of a 44-bone facial setup (counting the head, jaw, and tongue).
  • I set up and animated the head in Softimage: Face Robot.
  • I did the eyes and head movement by capturing mouse tracking; the eye blinks were captured by setting keys during playback.
  • Lip sync was done with SI’s lip sync tool, and it did a very nice job IMO.

Head (work-in-progress) model courtesy of Ben Eoff.

More >