In|Framez Papers

"Tea-time" talk for "real-time" animators

By Homam Bahnassi

Most of us who apply 3D scenery to cinematic productions are used to working with certain advanced techniques during the production workflow. The bad news is that 90% of these techniques aren't applicable when it comes to real-time game production.
This is mainly caused by the limitations set by today's hardware. Because of this, we have to obey these limitations if we really want our work to see daylight through the real-time games' window.
Yes, these are transient days that we have to live through until the hardware gets advanced (and purchasable) enough to do real-time ray-tracing and similarly complex calculations, delivering just what we imagine, without any limitations. And until that time, some of us might already be in the graveyard with Walt Disney!
So, if you do want to ship some real-time productions before your time actually comes to an end, you'll have to master those dumb limitations to the degree that lets you get good quality while still conforming to them.

Optimization, optimization, optimization. That's what it's all about! And that's what we're gonna talk about today.

Introduction (again)

Of all the game pipelines I've worked with, XSI is the one I felt most comfortable with. This, coupled with the evolution of the dotXSI file format, provides a consistent way of transporting data to and from the game's 3D graphics engine.
Well, yes, there are a few glitches that lie here and there, and I'll address some of them in this article as we face them.
This article will draw some broad guidelines that you should keep in mind when you're creating media for real-time games. Along with this article, I've attached an XSI project that contains a bunch of scenes for a real-time optimized character made by one of my fellow workers for a 3D game. Make sure you download it and use it to apply the techniques we'll talk about shortly.
So, let's get started...

Know Thy Limitations

Just like cinematic production, real-time game content requires modeling, animating, technical directing, and all the other stuff you're used to doing.
However, in the games world, there is much more emphasis on the optimization process, as well as a new and interesting operation called real-time shader setup. The third (and perhaps most important) operation is preparing the content for export to the 3D engine.
Before you start your project, have a talk with the project's Lead Programmer, get him a small drink, and discuss all the things that he can (and can't) process from your content. For example, does the 3D engine support animation? (Of course it should!) If so, to what degree? Does it support cubic-interpolated FCurves, or just linear FCurves? I think you get the idea.
This is so important because it gives you a chance to plan how you should construct the scene; if you can't use a specific technique, maybe you can find a way to work around it, and so on.
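To make the FCurve question concrete, here's a minimal Python sketch (with made-up key values, not XSI API calls) showing how far a linearly interpolated in-between can drift from the cubic curve XSI displays:

```python
# Sketch: the same two keys evaluated with linear vs. cubic (Hermite)
# interpolation. If the engine only supports linear FCurves, in-between
# frames will differ from what XSI shows you. Key data is invented.

def lerp(a, b, t):
    return a + (b - a) * t

def hermite(p0, p1, m0, m1, t):
    # Cubic Hermite basis; m0/m1 are tangents at the keys.
    t2, t3 = t * t, t * t * t
    return ((2*t3 - 3*t2 + 1) * p0 + (t3 - 2*t2 + t) * m0
            + (-2*t3 + 3*t2) * p1 + (t3 - t2) * m1)

# Two keys on a rotation channel, with flat ease-in/ease-out tangents.
p0, p1, m0, m1 = 0.0, 90.0, 0.0, 0.0

for t in (0.25, 0.5, 0.75):
    print(t, round(lerp(p0, p1, t), 2), round(hermite(p0, p1, m0, m1, t), 2))
```

With flat tangents the two agree at the midpoint but diverge everywhere else, which is exactly the kind of detail worth settling over that drink.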

Oh, one more thing. Throughout the rest of the article, there's a certain phrase that I'm gonna repeat over and over, so I decided to use a short form of it. So, when you read 'TFELY', it means "The Fucking Engine Limits You".

Polygons Vs. NURBS

Specifying what type of geometry to use is usually a matter of the modeler's preference. Based on his skills and the model's topology, he might find that modeling a carrot with NURBS is easier and faster than doing it with plain polygons.
But generally speaking, when it comes to real-time, poly-meshes are the winners. That's because most engines don't support NURBS directly, so they might have to do the tessellation back to polygons in real-time, which is not a speedy operation (one game that did work directly with NURBS is Messiah).
Moreover, the low-level 3D APIs themselves (like Direct3D) don't support NURBS rendering, so ultimately, NURBS must be tessellated. And if you do this tessellation early, you get a better opportunity to perform further optimizations.
In XSI, you can convert any NURBS mesh to a poly-mesh at any time, but remember that you'll lose the UV coordinates, so always do texturing after tessellating your NURBS.
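If you're curious what that early tessellation amounts to, here's a rough Python sketch of sampling a parametric surface into a fixed triangle list offline, the way an exporter might (the half-cylinder surface here is just a stand-in for a real NURBS patch):

```python
import math

# Sketch: tessellating a parametric surface into a triangle list offline,
# instead of letting the engine pay for it every frame.

def surface(u, v):
    # u, v in [0, 1] -> point on a half-cylinder of radius 1, height 2.
    angle = u * math.pi
    return (math.cos(angle), math.sin(angle), v * 2.0)

def tessellate(nu, nv):
    """Evaluate an (nu+1) x (nv+1) grid and emit two triangles per cell."""
    verts = [surface(i / nu, j / nv)
             for j in range(nv + 1) for i in range(nu + 1)]
    tris = []
    for j in range(nv):
        for i in range(nu):
            a = j * (nu + 1) + i           # top-left corner of the cell
            b, c, d = a + 1, a + nu + 1, a + nu + 2
            tris += [(a, b, d), (a, d, c)]
    return verts, tris

verts, tris = tessellate(8, 4)
print(len(verts), len(tris))  # 45 vertices, 64 triangles
```

Once the surface is frozen into triangles like this, you can weld, reduce, and re-normal it as aggressively as you like before export.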
Anyhow, this is one TFELY case.
You can open the file "model.scn" and meet Jack the Thief after the 'magic wand-o-optimization' enchanted him, removing all his extra faces and replacing them with a render-mapped texture.

Envelope waits.. err, weights

One day, I finally managed to arrange that long-awaited tea talk with our 3D engine programmer, and we came up with a new major TFELY, which lies in the skinning operation.
Most low-priced graphics hardware on today's market allows only up to 4 skinning deformers per mesh. For example, this is the case with NVIDIA's GeForce256 and GeForce2 chipsets.
If you still want to squeeze hardware acceleration for skinned meshes out of these cards, you'll have to divide your mesh into smaller subsets that are each affected by a maximum of 4 deformers at once. When that's the case, keep an eye on the polygons' normals while chopping out the subsets, to keep them unified and smooth.
Unfortunately, in XSI 2 (or earlier), dividing polygons regenerates the normals for the affected polygons, which causes visible seams at the disconnection areas.

Discontinuity in lighting, caused by normals facing different directions

This requires another workaround for the workaround! You can do this process in SI|3D, where you can control normals with 'edit shading normals', or use the 'normals modifying' add-on for XSI 3 to fix things up.
In "envelope.scn" you can see two versions of the same character: one split into subsets, each enveloped with a maximum of 4 bones, and the other a one-piece mesh with about 22 deformers.
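The 4-deformer budget can also be pictured per vertex: a common trick is to keep only the four heaviest weights and renormalize them. A minimal Python sketch with an invented weight table (this is the idea, not XSI's API):

```python
# Sketch: enforcing a 4-deformers-per-vertex budget by keeping the four
# largest weights and renormalizing, roughly what you do by hand when
# chopping the mesh into subsets. Weight values are invented.

MAX_INFLUENCES = 4

def prune_weights(weights, limit=MAX_INFLUENCES):
    """weights: dict of deformer name -> weight for one vertex."""
    kept = dict(sorted(weights.items(),
                       key=lambda kv: kv[1], reverse=True)[:limit])
    total = sum(kept.values())
    return {bone: w / total for bone, w in kept.items()}

vertex = {"spine": 0.40, "chest": 0.30, "l_clav": 0.15,
          "l_arm": 0.10, "neck": 0.05}
pruned = prune_weights(vertex)
print(pruned)  # "neck" is dropped; the rest sum to 1.0 again
```

Renormalizing matters: if the dropped weight isn't redistributed, the skinned vertex sags away from the surface.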

When you're done chopping and skinning your mesh, you're left with one additional little step, which concerns deformer assignments.
XSI assigns the effector of the chain you select as a 'zero-weight deformer', even though you didn't select it as a deformer. So you should remove it manually to get rid of unnecessary deformers on your mesh (which are very bad, believe me).
Here's a simple script that automates this step for you. Read the notes in the script header for more info.
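The logic of such a script boils down to spotting deformers that never receive any weight. A small Python sketch of that idea on stand-in envelope data (the real script would go through XSI's scripting API, which isn't shown here):

```python
# Sketch: finding deformers that carry zero weight over the whole mesh
# (such as the chain effector XSI adds) so they can be removed.
# The weight table is a stand-in for real envelope data.

def zero_weight_deformers(weight_table):
    """weight_table: deformer name -> list of per-vertex weights."""
    return [bone for bone, w in weight_table.items() if sum(w) == 0.0]

table = {
    "bone1":    [0.7, 0.5, 0.0],
    "bone2":    [0.3, 0.5, 1.0],
    "effector": [0.0, 0.0, 0.0],   # assigned but never weighted
}
print(zero_weight_deformers(table))  # ['effector']
```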

Each time you do enveloping, XSI adds a bunch-o-clusters to your mesh to store weight data. Some of these clusters are visible, and you can access them directly from the explorer, such as the "EnvelopeWeightsCls" clusters.
However, other clusters are hidden and cannot be selected directly, like the "EnvelopeSelCls#" clusters, which color-code the mesh vertices to help identify which vertices are weighted to which deformer.
The count of these clusters depends on the number of deformers, which means you could get about 34 additional clusters for a standard enveloped character.
Try counting the clusters on the same poly-character from "envelope.scn" by using the following script: "cluster_info.vbs".
Leaving these additional clusters to be exported with the scene would make the exported file really huge, and thus mean longer load/processing times (which will make the programmer less than happy with your work).
What we need to do is simply delete all the additional clusters that our real-time application won't make use of, such as the color-coding clusters. Unfortunately, deleting such clusters from each mesh is painful, especially since they're hidden and there's no direct way to access and delete them like other clusters. So you're stuck with scripting to accomplish this dumb task.
You can use the following script "filter_cls.vbs" (which I wrote while working in the DOTT demo) for filtering and deleting these additional clusters.
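For illustration, the filtering logic amounts to matching cluster names against the hidden "EnvelopeSelCls#" pattern. Here's a Python sketch on a made-up cluster list (the actual filter_cls.vbs works through XSI's object model instead):

```python
import re

# Sketch of the filtering logic: keep the weight clusters, flag the
# hidden per-deformer selection clusters ("EnvelopeSelCls1", ...) for
# deletion. The cluster names below are modeled on the article's.

def clusters_to_delete(cluster_names):
    pattern = re.compile(r"^EnvelopeSelCls\d*$")
    return [name for name in cluster_names if pattern.match(name)]

clusters = ["EnvelopeWeightCls", "EnvelopeSelCls1", "EnvelopeSelCls2",
            "UV_Cluster", "EnvelopeSelCls3"]
print(clusters_to_delete(clusters))
# ['EnvelopeSelCls1', 'EnvelopeSelCls2', 'EnvelopeSelCls3']
```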
Apply the script to the enveloped character in "envelope.scn" to filter out the unwanted clusters. After you finish optimizing your enveloped mesh, test it to be sure you didn't break anything before moving on to the next step.

Dealing with Animation

In XSI, animation is way too advanced to be implemented as-is in today's real-time games. Although game engines are starting to implement simple animation mixers, they still won't reach the level of complexity of XSI's animation mixer.
The bottom line is, you can't port the same animation techniques straight into your real-time application.
What level of animation you can export depends mainly on the engine and the file format. As for dotXSI, you can't blame the file format, because theoretically it can handle all animation data that's relevant to a real-time game through its standard templates or through additional custom parameters. So, this stands as another TFELY!
Speaking of custom parameters, these guys act just like the joker for dotXSI templates. They work for everything!
Anytime you need to export animation for an unsupported piece of data, all you have to do is plot it to a custom parameter and leave the rest to the damn engine.
For example, version 3.0 of DirectSkeleton accepts animation for any common parameter (e.g. material colors, camera roll, SRT, shape animation, etc.), plus an open conduit to pass custom parameters' animation through user-defined custom functions.
So we need to plot the entire animation mixer's data, expressions, constraints, and scripted ops to standard function curves prior to exporting the scene.

Let's test this operation right away: open the file "animation.scn", then add the "Hard_Run" animation clip from the XSI_NET local library into the "ManSkeleton_Compatible" mixer.
We need to convert the animation back to FCurves using the 'apply action' command. So we choose the action source "Hard_Run" from the explorer, and press 'apply action' in the motion module.
Next, we delete the mixer just to make sure no mixer data will be exported.
After plotting the animation, we're left with one additional task: optimizing the resulting FCurves. Use the FCurve cleaning (fitting) tools in the animation editor to get lighter FCurves. Yep! That should make the programmer's day happier.
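The idea behind FCurve cleaning can be sketched in a few lines of Python: throw away every key that linear interpolation between its neighbors can reproduce within a tolerance (XSI's fitting tools are more sophisticated, but this is the spirit):

```python
# Sketch of FCurve cleaning: drop every key that linear interpolation
# between its surviving neighbors reproduces within a tolerance.

def reduce_keys(keys, tolerance=0.01):
    """keys: list of (frame, value), assumed sorted by frame."""
    if len(keys) <= 2:
        return list(keys)
    kept = [keys[0]]
    for i in range(1, len(keys) - 1):
        (f0, v0), (f1, v1), (f2, v2) = kept[-1], keys[i], keys[i + 1]
        t = (f1 - f0) / (f2 - f0)
        predicted = v0 + (v2 - v0) * t
        if abs(predicted - v1) > tolerance:
            kept.append(keys[i])
    kept.append(keys[-1])
    return kept

dense = [(f, f * 2.0) for f in range(0, 11)]  # plotted: one key per frame
dense[5] = (5, 10.5)                          # one genuine bump
print(reduce_keys(dense))  # ends plus the bump region survive; the rest go
```

On plotted animation, which has a key on every frame, a pass like this is what turns a bloated export into something the programmer can actually load.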

Mental Ray Shaders won't shade

Even though today's gamers' hardware does support shaders in one way or another, it's a long way from being able to do the heavy computations that Mental Ray shaders do.
Real-time shaders even have limits on the number of instructions they can execute!
Since you should only depend on OpenGL or DirectX shaders (whichever the engine supports), it's good practice to remove all Mental Ray shaders from the render tree; otherwise you might be fooling yourself with something you won't be able to see in the final production.
Decoupling MR shaders will also reduce the final file's size by omitting the shaders' many additional parameters (the simple phong shader alone has about 20 different parameters!).
Note that if you're exporting your scenes to dotXSI version 3.0, the entire render tree will be discarded and the shaders will be replaced by an SI3D material (soft_material) node.

A side effect of exporting to dotXSI 3.0 (or earlier)

You could use some scripting here if you want to automate the "MR to real-time" shader replacement operation, thus making your life much easier.

Write a script that automates this operation
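As a sketch of what such a script would do, here's the replacement pass in Python on invented parameter names: keep the handful of parameters a real-time material understands and drop the Mental Ray extras (a real version would traverse the render tree via XSI scripting):

```python
# Sketch of an "MR to real-time" replacement pass: collapse a Mental Ray
# phong node's many parameters down to the few a typical real-time
# material supports, dropping the rest. All parameter names here are
# illustrative placeholders, not the actual shader's.

REALTIME_PARAMS = ("diffuse", "specular", "shininess", "texture")

def to_realtime_material(mr_params):
    kept = {k: v for k, v in mr_params.items() if k in REALTIME_PARAMS}
    dropped = sorted(set(mr_params) - set(kept))
    return kept, dropped

mr_phong = {
    "diffuse": (0.8, 0.7, 0.6), "specular": (1.0, 1.0, 1.0),
    "shininess": 32.0, "texture": "jack_color.pic",
    "reflectivity": 0.2, "refraction_index": 1.3, "transparency": 0.0,
}
mat, dropped = to_realtime_material(mr_phong)
print(sorted(mat), dropped)
```

Logging what got dropped, as this sketch does, is worth keeping in the real script: it tells the artist exactly which effects won't survive the trip to the engine.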

Package the stuff, we're traveling

As you noticed above, many of the pre-export chores can be automated with simple scripts. Once you've written them, you can package 'em all together into one single script for easy, managed usage.
And if you're handy with a programming language like C++, you can build a compiled version and use it as a custom exporter for your engine.
This step is really important, especially when many animators are working on the same project and they're all involved in preparing scenes for real-time production.

For Whom The Bell Tolls

You've done all the cleaning and optimization chores already, and you've finally exported the file.
Now you throw it in the programmer's face and say: "Here's your work, Kreskin. I'm out!"
Then you happily leave your desk and head back home feeling like the whole world is smiling at you. Suddenly, your mobile rings in your pocket, aggressively warning you of an upcoming disaster. You glance at the calling number and start to sweat; it's your boss!
What does he want? Should you just cancel the call? Finally, you make up your mind and answer: "Hello?"
And suddenly all hell's gates open up in your face.
Innocently, you reply: "What?"
The storm keeps roaring.
"Yes?"
The world slowly fades to black in your eyes, and you fall unconscious.

On the bright side, if Kim had been a little more patient, he wouldn't have fallen into that situation.
He forgot to view and debug the results.
This is a very important and vital task, which can be done either by using the XSI Viewer to view the data you export, or by using a custom viewer based on the engine you work with.
In some cases viewing isn't enough; you may need to open and debug the file manually to check for specific errors. In that case, you can use XSIDump to track down the error and fix it.

Now, open "export.scn" and use the 'dotXSI export' command to export this file to dotXSI. Notice how we replaced the bones with nulls to make sure that the scene is compatible with the dumbest engine around.
Check the file and make sure it looks correct. Once that's done, you can throw the file in the programmer's face again. And this time, if your boss shouts at you, just say:
"LISTEN, MISTER! I'M NOT THE ONE TO SHOUT AT WHEN THAT DUMB PROGRAMMER CAN'T IMPORT THINGS RIGHT! THE FILE IS CORRECT AND YOU CAN SEE IT YOURSELF!" And this time, it is the programmer's day that will turn dark. Isn't it his fault, after all?

Die programmers, die! (just kidding)