FBX SDK skinned animation

Started by
39 comments, last by Dirk Gregorius 5 years, 10 months ago

After completing static model FBX import into my engine using the SDK, I'm now going to start round TWO of trying to get skinned animations to work with the FBX SDK. Does anyone have any _current_ tips to share on this topic? Last time I tried for weeks and gave up after debugging got very difficult and I could find only a few examples of what to do, a good one being ViewScene in the FBX SDK sample folder. 


What are you looking for? Just asking for "tips on this topic" isn't helping :)

Why do you need the FBX SDK? I thought it was more for import/export kind of things, not for using the format. I know you can use FBX models, animation and all, without it, using some other (maybe simpler to use?) libraries such as assimp.

I'm looking for more of a traditional guide going step by step, or some general tips on how to import an FBX with skinned animation.  I mention the ViewScene example in the samples, but it doesn't have a bind pose when you open the file up in a 3D editor and instead uses a "key" system. I'm just wondering more about how this works when everything I read about skinned animation is about 'inverse bind pose this, inverse bind pose that'.

 

Regarding assimp, I found that I could use it, but it had many shortcomings and wasn't flexible enough for my needs; I had to write and hunt for a bunch of assimp-specific code that got more dated as the years went on.

On 9/5/2018 at 5:48 AM, mrMatrix said:

I mention the Viewscene example in the samples but it doesnt have a bindpose when you open the file up in a 3d editor and instead uses a "key" system.

If you are using bones/joints then you have a bind pose; there is no other way to derive the "from here to there" movement of vertices if you don't have a "from."  Keyframes are a typical part of boned animations.  An alternative form of animation is vertex morphing/blend shapes, but that is not common due to the amount of data required and processing; it is typically reserved for facial animations.  For boned animations you will have a bind pose which you can always retrieve from the FBX SDK (or error otherwise).

 

But before that we need to back up.  General tips?

  1. Your engine should never be directly loading FBX files.  FBX is a transport format.  Your actual job is to convert that format to the format that your engine will load.
    1. Would you be planning to send out every game with a legal note on your use of the FBX SDK?  Why would you bloat your users' downloads with this?
    2. Would you be planning to ship games with actual FBX files that every kid can open, view, and steal?  Plus they have data that you won't ever use, so the file sizes are insanely huge, and load times are ridiculously slow.
    3. This is obvious insanity.  If you are actually directly linking to the FBX SDK from your engine, holy hell stop what you are doing.  Make a tool that links to your engine's SDK (or can be a special build directly using some of your engine's modules) and the FBX SDK and export to a file format that is made for your engine.

 

 

As for loading animations, the general approach is:

  1. Get the bind pose.
  2. Go to the animation you want to export.  Multiple animations can be layered into a single file.  For each track (you would expect 1 track to handle 1 component of rotation from T=0-5, then another track to handle a different rotation component from T=5-7):
    1. First, store the times of all the keyframes.  Let's say the animation is nothing but a base bone which does not move and an arm bone 2 feet away that first swivels up (from fully horizontal to 45 degrees) and then over to the left (or whatever; the point is it does 2 basic motions in 2 directions—imagine raising your right arm to 45 degrees and then bringing your hand to your chest, all without moving the elbow).
      • For this kind of rudimentary animation you could expect a key frame at T=0, then a key frame at T=5 seconds where the pose has the arm raised to 45 degrees, and then a 3rd keyframe at T=7 seconds with the hand swiveled over to the chest (hand raises for 5 seconds, then moves to chest during 2 seconds).  These are the minimum keyframes necessary for the motion, but there could actually be more keyframes for a number of reasons, and it is often the case that many keyframes are unnecessary.  If the actual motion is exactly the same, you could eliminate all keyframes but these 3, and I will get to that later.
    2. Go to each keyframe in your sorted list and have the FBX SDK evaluate the entire scene position at that time.  FbxAnimEvaluator::GetNodeGlobalTransform().
      • FBX can do a full scene evaluation and give you the positions of every bone at any time in the animation.
      • So you have a list of times which are sorted.  Go to each in order, evaluate the scene positions via the FBX SDK, and get the locations of all bones.
        • You have a choice to simply store the matrix that describes the bone orientation, but I heavily regretted doing this and there is no point in anyone ever again making the same mistake.
          • The rotation can be decomposed at run-time to interpolate between orientations (interpolation is what moves your bones between keyframes), but this is expensive and inflexible.  You will find later that the matrices do not have all the information you need to make your animations more dynamic.  For example it will be impossible to correctly override all of the individual elements of the matrix (position.x, position.y, scale.z, rotation.y, etc.) to create a system in which you can have POS.XYZ, SCALE.XYZ, and ROT.XY all interpolated via the decomposed matrices while you manually override only ROT.Z.  You will find that you can get some of them to work (especially POS.XY or Z), but you will find rotations impossible to handle correctly in some edge cases.
        • So store the matrices only if you plan to do the most basic crappy animations ever.  If you are just going to run a static animation in front of some businessmen, have fun, but for real work you need to store the actual components of the node transforms and combine them yourself the same way the FBX SDK does: https://help.autodesk.com/view/FBX/2017/ENU/?guid=__files_GUID_10CDD63C_79C1_4F2D_BB28_AD2BE65A02ED_htm
          • It's extremely easy and reliable despite looking complex.  You can do all of the testing while you are working on loading the data, since you can combine them yourself and check to see if your final matrix matches theirs.
        • Now you have all of the components as vectors and scalars (NOT as matrices) and you have tested code you know combines them into the final matrices.  This is run-time code.  It can be optimized later to skip scales that are all 1 etc., but work on a reference implementation before you make a fast implementation.
          • At run-time you can now override correctly any of the properties that go into the transform, so now your system much more closely matches what an artist wants to control, which is obviously vital, but more importantly it retains the concept of tracks, which animate individual scalars.  If you had stored just the matrices, your animations would be basically "baked."  You no longer work with the concept of tracks, and focus on purely animation via matrix interpolation, which of course gives rise to the limitations mentioned above.

This is the basic idea behind simply accessing the data.  All of this goes into your own custom model file format.  Nothing here should be done inside your engine (except the code that recombines the properties to get a final matrix, which will be shared between your engine and your model converter).
Note again we do not export matrices, we export the values along tracks, and retain a track-based animation system.  When you want to play this animation back, update the tracks (they will interpolate between key frames and give you a single scalar back) for each individual bone/joint element (POS.X, ROT.Z, etc.) and then use those values to generate your final matrix.
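To show what "combine them yourself the same way the FBX SDK does" means in practice, here is a minimal sketch in plain C++ (Vec3, Mat4, and ComposeLocal are my own stand-ins, not FBX SDK types). It multiplies the components in the order given on the linked Autodesk page; pivots and offsets are pure translations, so their inverses are just negated translations:

```cpp
#include <array>

struct Vec3 { double x = 0, y = 0, z = 0; };

// Minimal row-major 4x4 matrix with only what this sketch needs.
struct Mat4 {
    std::array<double, 16> m{};

    static Mat4 Identity() {
        Mat4 r;
        r.m[0] = r.m[5] = r.m[10] = r.m[15] = 1.0;
        return r;
    }
    static Mat4 Translation(const Vec3& t) {
        Mat4 r = Identity();
        r.m[3] = t.x; r.m[7] = t.y; r.m[11] = t.z;
        return r;
    }
    static Mat4 Scaling(const Vec3& s) {
        Mat4 r = Identity();
        r.m[0] = s.x; r.m[5] = s.y; r.m[10] = s.z;
        return r;
    }
};

Mat4 operator*(const Mat4& a, const Mat4& b) {
    Mat4 r;
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col) {
            double sum = 0.0;
            for (int k = 0; k < 4; ++k)
                sum += a.m[row * 4 + k] * b.m[k * 4 + col];
            r.m[row * 4 + col] = sum;
        }
    return r;
}

// Recombine the per-node components in the documented order:
//   Local = T * Roff * Rp * Rpre * R * Rpost^-1 * Rp^-1 * Soff * Sp * S * Sp^-1
Mat4 ComposeLocal(const Vec3& t, const Vec3& rOff, const Vec3& rPivot,
                  const Mat4& rPre, const Mat4& r, const Mat4& rPostInv,
                  const Vec3& sOff, const Vec3& sPivot, const Vec3& s) {
    auto Neg = [](const Vec3& v) { return Vec3{ -v.x, -v.y, -v.z }; };
    return Mat4::Translation(t)
         * Mat4::Translation(rOff) * Mat4::Translation(rPivot)
         * rPre * r * rPostInv * Mat4::Translation(Neg(rPivot))
         * Mat4::Translation(sOff) * Mat4::Translation(sPivot)
         * Mat4::Scaling(s) * Mat4::Translation(Neg(sPivot));
}
```

With all pivots, offsets, and pre/post rotations left at identity this collapses to the familiar T * R * S, which is an easy first sanity check before comparing against the matrices the SDK produces.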

Before moving on, you will quickly find a lot of useless data there.  You need to trim your file sizes down and also make your run-time faster, so:

  • Trim your model file size down by eliminating redundant key frames.
  • Make your run-time faster by only using linear interpolation between key frames.  FBX supports numerous types of interpolation, so to avoid run-time checks you need to reduce them all to only linear interpolations (no run-time branching to check the type of interpolation, and linear is the fastest type of interpolation).

Eliminating Redundant Keyframes

I had you first store the times of keyframes for a reason.  It makes this part extremely simple.  After you have stored all the times of the keyframes in step 3, now just add a bunch of times spaced apart by a fixed amount that you decide.  For example you may already have T=0, T=5, and T=7, now add T=0.1, T=0.2, T=0.3, etc. (you can decide how densely to pack in these fake times, but typically the more the better, because virtually all of them are about to be removed, and the more dense they are the fewer artifacts your results will have).

If you use std::set<double> for this then your times are automatically sorted and duplicates are eliminated.  The end result is that you have T=? for every 0.1 (or whatever) seconds in the animation, and (critically) you have the actual keyframe times all in this set.
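As a sketch (the function name and parameters are made up for illustration), merging the real keyframe times with the dense fake sampling could look like:

```cpp
#include <set>

// Merge the real keyframe times with a dense fixed-step sampling of the clip.
// std::set keeps everything sorted and drops duplicate times automatically.
std::set<double> BuildSampleTimes(const std::set<double>& keyframeTimes,
                                  double clipLength, double step) {
    std::set<double> times = keyframeTimes;
    for (double t = 0.0; t <= clipLength; t += step)
        times.insert(t);
    return times;
}
```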

Now you are going to go over this list of times and once again use FbxAnimEvaluator::GetNodeGlobalTransform() to evaluate the positions at all of these times, but this is an extra step, not #4 yet.
This time, your goal is to eliminate times from this set by determining if they are redundant.  A keyframe (B) is redundant if you can interpolate between keyframes A and C and get the same result as B.  You can see where this is going.
This time going over your times, you need to examine 3 T values at once.

Let's say A=0.2, B = 0.3, and C=0.5:

  1. Get the final scalars on your track for A and C by having the FBX SDK evaluate the scene/track at these times.
  2. Interpolate by yourself to derive your own value for B using linear interpolation.
    1. If A=0.2, B = 0.3, and C=0.5, then from A to C is 0.3 seconds, and B is 0.1 seconds into that, or 33.3333333%.  So you are interpolating to 33.3333333% between A and C to derive your own value for B.
  3. Now have the FBX SDK evaluate the track at T=0.3 (B's time) and compare your values.  If your values are close enough (use your own epsilon compare, and you get to decide what epsilon is: Smaller values leave more keyframes in the file but replicate the animation via linear interpolation more accurately; higher values lose run-time accuracy but create smaller files) then B is redundant and should be eliminated.
    1. If you eliminate a value, you try again from A.  B will have been removed, so C will have slid over to become your new B, and what was previously D will be your new C.  All of this happens automatically simply by removing B from the vector so there are no special cases except to check that you are not too close to the end of the list.
    2. If you do not eliminate a value, then you advance to the next time value and repeat.  This means your previous B becomes your new A and you repeat.  Simple as that.

When you are done you should find that in our simple example you have just reduced the whole set of times back down to the 3 keyframes, but for larger animations it is extremely common that you will end up with more or fewer keyframes, and both are important.  If you have ended up inserting T values that were not there before, that will be explained below, and if you eliminated keyframes that were originally part of the data then you have just removed data that was originally redundant.  This could happen even if you didn't add a bunch of fake keyframe times to your std::set<>.  This same algorithm, without adding your fake T values, would allow you to remove redundant keyframes that exist in the FBX file (and these are common, ESPECIALLY with motion-capture data).
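The pruning pass described above can be sketched like this (the evaluate callback stands in for the FBX SDK sampling the track at a given time; all names here are mine):

```cpp
#include <cmath>
#include <functional>
#include <vector>

// Remove every time B whose value can be recovered by linearly interpolating
// its neighbours A and C.  'evaluate' samples the track at a given time.
std::vector<double> PruneRedundantKeys(std::vector<double> times,
                                       const std::function<double(double)>& evaluate,
                                       double epsilon) {
    size_t i = 0;
    while (i + 2 < times.size()) {
        double a = times[i], b = times[i + 1], c = times[i + 2];
        double t = (b - a) / (c - a);                 // B's fraction between A and C
        double lerped = evaluate(a) + t * (evaluate(c) - evaluate(a));
        if (std::fabs(lerped - evaluate(b)) <= epsilon)
            times.erase(times.begin() + i + 1);       // B is redundant; retry from the same A
        else
            ++i;                                      // keep B; it becomes the new A
    }
    return times;
}
```

Running this on a densely sampled track that rises linearly for 5 seconds and then holds still collapses the whole set back down to the 3 keyframes in the arm example.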

Only Use Linear Interpolation

Actually you will have already prepared to do this by adding the fake T values in the above algorithm.  The fake T values allow you to examine the animation tracks with a fine comb.  At each T value you are checking to see if linear interpolation between the previous and following T values result in the same value as your current T value.  This wording is not accidental.  If your linearly interpolated value does not match the value given by the FBX SDK, then you have determined that your run-time playback of the track, which as we said only uses linear interpolation, will not give you a result with the accuracy you desire.  Then you keep that T value and move on.
By the end of the algorithm, you have kept only the keyframes that, when played back purely through linear interpolation, give you a result that matches "close enough" (based on your epsilon).

So there is nothing left to do.  Play the track back via only linear interpolation.


Once again, an engine does not load an FBX file.  That is bonkers.  You are responsible for going through an FBX file, taking out the data you want, loading your custom file via your engine, and playing back the animation yourself, which you do by attaching tracks (which you code from scratch) to scalars on your model (such as POS.X) and coding your tracks to give you back interpolated values between keyframes as you yourself tell them to "tick" (advance by a given amount of time).

Once all of your custom tracks, which you definitely wrote yourself and 14,000,605% did not come from the FBX SDK, because we have very very firmly established that it is not directly connected to your engine, have ticked and written an updated scalar value into whatever properties have tracks assigned to them, create your final matrices and render.


L. Spiro

I restore Nintendo 64 video-game OST’s into HD! https://www.youtube.com/channel/UCCtX_wedtZ5BoyQBXEhnVZw/playlists?view=1&sort=lad&flow=grid

15 hours ago, L. Spiro said:


Once again, an engine does not load an FBX file.  That is bonkers.  You are responsible for going through an FBX file, taking out the data you want, loading your custom file via your engine, and playing back the animation yourself, which you do by attaching tracks (which you code from scratch) to scalars on your model (such as POS.X) and coding your tracks to give you back interpolated values between keyframes as you yourself tell them to "tick" (advance by a given amount of time).

Once all of your custom tracks, which you definitely wrote yourself and 14,000,605% did not come from the FBX SDK, because we have very very firmly established that it is not directly connected to your engine, have ticked and written an updated scalar value into whatever properties have tracks assigned to them, create your final matrices and render.
 

 

I'm using a custom XML file as my file format. Before, I was using the TLang1991 article as a reference, as well as the OGLdev assimp FBX article, to store 'baked transforms'. I tried to get it to work for weeks and my results were quite poor. Thus, I appreciate your reply very much - but I have a few questions:

 

The function you mention, FbxAnimEvaluator::GetNodeGlobalTransform(), is not present in the 2019 FBX SDK ViewScene example. Could you give an example of its use? 

 

In the link with the transformation equation

WorldTransform = ParentWorldTransform * T * Roff * Rp * Rpre * R * Rpost^-1 * Rp^-1 * Soff * Sp * S * Sp^-1

how are these values determined using the FBX SDK? Do you have some sample code to go along with your arm example scenario? Are LclTranslation / LclRotation / LclScaling the right options to use in your opinion?


void FBXtoAbj::ProcessSkeletonHierarchy(FbxNode *inRootNode)
{
    //cout << "inRootNode->GetChildCount() = " << inRootNode->GetChildCount() << endl;

    for (int i = 0; i < inRootNode->GetChildCount(); ++i)
    {
        FbxNode *currNode = inRootNode->GetChild(i);
        ProcessSkeletonHierarchyRecursively(currNode, 0, -1);

        // the components that combine into the node's local transform
        currNode->LclTranslation.Get();  // T
        currNode->RotationOffset.Get();  // Roff
        currNode->RotationPivot.Get();   // Rp
        currNode->PreRotation.Get();     // Rpre
        currNode->LclRotation.Get();     // R
        currNode->PostRotation.Get();    // Rpost (used inverted)
        currNode->RotationPivot.Get();   // Rp (used inverted)
        currNode->ScalingOffset.Get();   // Soff
        currNode->ScalingPivot.Get();    // Sp
        currNode->LclScaling.Get();      // S
        currNode->ScalingPivot.Get();    // Sp (used inverted)
    }
}

 

I have a hard time conceptualizing how I will put this into my gbuffer at runtime to get my object to actually move - I know this sounds bad, but I should be using a timer of some sort...but how?

Don't do this. You would also need to take limits and scale propagation rules into account. The function you are looking for is on the node itself. Here is an example of how to extract the skeleton. Note that I implemented this as an iterative pre-order traversal (https://en.wikipedia.org/wiki/Tree_traversal), but normal recursion is fine too!

void RnModel::Read( fbxsdk::FbxScene* Scene )
    {
    int Count = 0;
    FbxNode* Stack[ FBX_STACK_SIZE ];
    Stack[ Count++ ] = Scene->GetRootNode();

    while ( Count > 0 )
        {
        // Pop FBX source node
        FbxNode* Node = Stack[ --Count ];

        // Inspect node attributes
        FbxNodeAttribute* Attribute = Node->GetNodeAttribute();
        if ( Attribute && Attribute->GetAttributeType() == FbxNodeAttribute::eSkeleton)
            {
                // Get name without 'namespace'
                FbxString Name = Node->GetNameOnly();
                
                // The global node transform is equal to your local skeleton root if there is no parent bone 
                FbxAMatrix LocalTransform = Node->EvaluateGlobalTransform();
                
                // Do we have a parent bone, if yes then evaluate its global transform and apply the inverse to this nodes global transform
                if ( FbxNode* Parent = Node->GetParent() )
                    {
                    FbxNodeAttribute* ParentAttribute = Parent->GetNodeAttribute();
                    if ( ParentAttribute && ParentAttribute->GetAttributeType() == FbxNodeAttribute::eSkeleton )
                        {
                        FbxAMatrix GlobalParentTransform = Parent->EvaluateGlobalTransform();
                        LocalTransform = GlobalParentTransform.Inverse() * LocalTransform;
                        }
                    }
              
                // DON'T get the geometric transform here - it does not propagate! Bake it into the mesh instead! This makes it also easier to read the skinning from the clusters!
              
                // Now you can decompose
                FbxVector4 LocalTranslation = LocalTransform.GetT();
                FbxQuaternion LocalRotation = LocalTransform.GetQ();
                FbxVector4 LocalScale = LocalTransform.GetS();
            }

        // Recurse
        for ( int Index = Node->GetChildCount() - 1; Index >= 0; --Index )
            {
            RN_ASSERT( Count < FBX_STACK_SIZE );
            Stack[ Count++ ] = Node->GetChild( Index );
            }
        }

 

    // Now read meshes (simply iterate geometries in the scene)

    for ( int Index = 0; Index < Scene->GetGeometryCount(); ++Index )
        {
        if ( FbxMesh* Mesh = FbxCast< FbxMesh >( Scene->GetGeometry( Index ) ) )
            {
            // Export your mesh and bake the geometric transform into the vertex attributes! 

            FbxNode* Node = Mesh->GetNode();

            FbxVector4 GeometricTranslation = Node->GetGeometricTranslation( FbxNode::eSourcePivot );
            FbxVector4 GeometricRotation = Node->GetGeometricRotation( FbxNode::eSourcePivot );
            FbxVector4 GeometricScaling = Node->GetGeometricScaling( FbxNode::eSourcePivot );
            FbxAMatrix GeometricTransform( GeometricTranslation, GeometricRotation, GeometricScaling );
            }
        }
    }

 

After considering it for a while I realized my latest version of my engine does not use FbxAnimEvaluator::GetNodeGlobalTransform().  That was for getting the matrices for "baked" animations which sucked in my first engine.  I had considered putting a disclaimer on that post suggesting that it's been a while and I may have scrambled slightly some details.

You only need to store the bind pose (to calculate the "from" of from -> to) and the tracks.  The bind pose can be a tree of matrices because it is only used for rendering, so in that one case you can just get the "baked" set of final matrices for each bone in the bind pose.

Aside from the bind pose, the only other things you need are:

  1. Your own run-time bone class with all of the individual necessary scalars (which you may elect to group into vectors, but not matrices).  This means your bones all have a local translation vector, a rotation offset (another vector I believe), a local scaling vector, etc.
    1. Normally these properties are not considered specific to bones.  In FBX they are on every single node, because they all form the most basic properties of every single object in your scene.  If you ignore this fact, you will find that you are unable to correctly animate objects without bones.  I believe in my current engine I use CEntity to store all of this and CActor to implement the object hierarchy ("entity" and "actor" are the common names for the most basic class from which all other objects in the scene inherit, though most engines use just one or the other—I wanted to separate the parent/child relationship into its own class and the "these are the properties and functions to use them that all objects in the scene have" into another, so I used both, as "CActor : public CEntity").
    2. So if you follow my way, CEntity will have all the base properties needed to position anything in the scene.  This is where your local position vector goes, your local scaling vector, etc., and CActor will add the parent/child relationship you need to correctly depict the bone structure (but note again this is not specific to bones; everything but CEntity inherits from CActor so all objects in the scene can be parented).  CActor, because it introduces the concept of parenting, also introduces the idea of using matrices to represent the final orientation of the object, thus it uses the local rotation, local scale, offsets, pivots, etc., to generate the final local matrix I mentioned before which should match FBX's, and from there uses its parent/child structure to generate "final" world matrices (where myWorldMatrix = parentWorldMatrix * myLocalMatrix).
      CBone (or whatever) would, like every other object, inherit from CActor and add anything specific to bones.
  2. Your own tracks.  This is why you do not have to evaluate the global positions.  By using proper animations rather than baked ones you only need to store the track data, processed only by the algorithm above to remove redundant key frames etc.
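A skeletal sketch of the CEntity/CActor layering described above (the class names come from the post; the members are only my reading of it, not a real engine API):

```cpp
#include <vector>

struct Float3 { float x = 0, y = 0, z = 0; };

// CEntity: the raw transform components every object in the scene has.
struct CEntity {
    Float3 localTranslation;
    Float3 localRotation;            // Euler angles, as FBX stores LclRotation
    Float3 localScaling{1, 1, 1};
    Float3 rotationOffset, rotationPivot;
    Float3 scalingOffset, scalingPivot;
};

// CActor: adds the parent/child relationship on top of CEntity.
// LocalMatrix() would recombine the CEntity components exactly as FBX does;
// WorldMatrix() would then be parent's WorldMatrix() * LocalMatrix().
struct CActor : CEntity {
    CActor* parent = nullptr;
    std::vector<CActor*> children;
};
```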

 

Putting This Data Together

  1. Go over each track.
    1. Each track should be connected to exactly one scalar.  That means a float, uint32_t, int32_t, or bool.  I believe I used templates, but note that interpolation always happens as a float and then is cast back to the scalar type, which means your track itself can store all of its keyframe values in floats.  A bool track would have floats with values of only 0 or 1, and you could interpolate a value to 0.2, which should convert to true; note that truncating 0.2f to an integer gives 0 (false), so your track system should use a logical conversion instead (return interpValue /* 0.2f */ != 0.0f; // returns true).  Tracks in Maya/FBX/whatever can be attached to bool properties.
    2. So if you have to update the full position, you need 3 tracks.  One connects to POS.X, one on POS.Y, and one on POS.Z.  This matches the FBX SDK exactly and also how all 3D authoring software works.
  2. Advance each track by however many microseconds you need for the update you are doing.
    1. Never accumulate time in float or double; they drift on accumulation, and even when using uint64_t you can accumulate errors if your game loop is wrong.  float can be used to represent the current delta, but any objects that need to accumulate all the tick time they have had so far should tally the microseconds in unsigned integers.
    2. When a track is advanced by some microseconds, it determines its current track time (it loops if necessary, stops if necessary, advances backward, advances forward, does nothing if paused, etc., depending on its settings/state) and from there determines the value of the scalar it wants to produce via linear interpolation between the previous and next keyframes.
    3. My tracks use a pointer directly to the scalar they are intended to update.  As an optimization you often want to use dirty flags to avoid updating large data sets if there have been no changes to the data inside them, so my tracks also take an optional pointer to a uint32_t and a value to |= into that value if it does change the value of the scalar (in other words if it updates the scalar then it also sets some bits in a dirty flag).  For example if your rotation has not been modified then you don't want to generate the new rotation matrix later, so you could make a dirty flag such that bit 0 = UpdateRot, bit 1 = UpdateScale, bit 2 = UpdatePos, bit 3 = UpdateRotPivot, etc., and if—and only if—the track changes the (let's say) ROT.Y scalar value then it will also set bit 0 in the dirty flags.  This optimization can come later, but it is generally easy enough to implement on your first pass.  Attaching a track would look like: someFloatTrack.Attach( &myObj.Pos.x, &myObj.DirtyFlags, LS_DIRTYFLAG_ROT /* = 1 */ );.  Do take care to ensure the track is detached when the object is destroyed.

  3. That's mostly it.  You have advanced all the tracks, so your components (rotation, scale, pivots, offsets, etc.) are all updated.  I already explained in the first post that your engine will combine these into the final matrices that should match those in FBX, but now as you can see if you wanted to fully customize your animations it is now possible to do so while being fully compatible with the animations your artists provided.  You can decide to add your own track to a scalar if you want it to be dynamic, or pause a single track or whatever.  Your final result will always match what would appear in Maya if the same thing were done.

    1. Note that dynamic animations that cause a character to always look at a certain point are applied on the bone level after the final matrices have been created.  You would implement this by attaching an override to a bone, and after the matrices have all been created after this track update is done, execute all custom overrides on each bone (once again this would actually be more appropriate on CActor, since you would want this to be possible on anything, not just bones).  This particular override would take a target point and a weight, create a look-at matrix towards the target object, and interpolate between the default keyed animation matrix on that bone and the one it generated, so a weight of 0 = just the regular keyframe matrix, 0.5 would have the character look 50% towards the target, and 1 would have the character look directly at the target.

    2. Creating the final world matrices is also an implicit task of the CActor class.  The tracks have only generated a local matrix for that bone.  All CActor objects in the entire scene will be updated so that they build their final world matrices (with dirty flags again to avoid unnecessary updates) so there are no extra steps here.
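The Attach()-with-dirty-flags idea from step 2.3 could be sketched like so (LS_DIRTYFLAG_ROT and FloatTrack are illustrative names modeled on the post, not a real API):

```cpp
#include <cstdint>

enum : uint32_t {
    LS_DIRTYFLAG_ROT   = 1u << 0,
    LS_DIRTYFLAG_SCALE = 1u << 1,
    LS_DIRTYFLAG_POS   = 1u << 2,
};

// A track bound to one scalar, with an optional dirty-flag word it marks
// whenever it actually changes that scalar.
struct FloatTrack {
    float*    target     = nullptr;
    uint32_t* dirtyFlags = nullptr;
    uint32_t  dirtyBit   = 0;

    void Attach(float* scalar, uint32_t* flags, uint32_t bit) {
        target = scalar;
        dirtyFlags = flags;
        dirtyBit = bit;
    }
    void Detach() { target = nullptr; dirtyFlags = nullptr; }

    // Write a new value only if it changed, recording the change in the flags.
    void Write(float value) {
        if (!target || *target == value) return;
        *target = value;
        if (dirtyFlags) *dirtyFlags |= dirtyBit;
    }
};
```

Downstream code can then skip rebuilding, say, the rotation matrix whenever the corresponding bit was never set during the track update.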

 

Now you have a bind pose full of matrices and all the matrices that represent the current pose of your objects.  Applying skinning from here is entirely general and unrelated to FBX, so Google is your next resource.

 

18 hours ago, mrMatrix said:

I know this sounds bad, but I should be using a timer of some sort...but how?

Your game loop determines how much time has passed since its last update and updates all objects accordingly.  This is a matter of your game loop, not timers. http://lspiroengine.com/?p=378
Once you have determined how long an update needs to be in your game loop, that is the delta you pass to tracks, which handle the data as described above.
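A minimal sketch of the integer-microsecond accumulation described earlier (names are my own; the float only ever appears as the per-update delta handed to the tracks):

```cpp
#include <cstdint>

// Accumulate total elapsed time in integer microseconds so it never drifts;
// only the per-frame delta is exposed as a float.
struct GameClock {
    uint64_t totalMicros = 0;

    // Advance by this update's delta and return it in seconds as a float.
    float Advance(uint64_t deltaMicros) {
        totalMicros += deltaMicros;
        return float(deltaMicros) * 1e-6f;
    }

    double TotalSeconds() const { return double(totalMicros) * 1e-6; }
};
```

Sixty updates of 16667 microseconds accumulate to exactly 1000020 microseconds, with no floating-point rounding creeping into the total.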


L. Spiro


I think it will be much easier for the OP, as a first step, to simply iterate each frame over each bone in the skeleton and export the local transform similar to what I have shown above for the skeleton. This gives you an array of poses for your skeleton that is really simple and efficient to evaluate at run-time. After you have that working you can start thinking about animation compression or curve fitting or whatever is the new sexy. Personally I don't see much value in iterating the curves in the FbxScene. I would rather build the curves myself for compression as shown below, since this is way less error prone. E.g. you would need to take limits, constraints, and scale propagation into account. From my experience you really want to avoid dealing with the individual transform components. Note that I am calling FbxNode::EvaluateGlobalTransform() for both the child and the parent myself. This is what FbxNode::EvaluateLocalTransform() should do as well to get everything right, but it didn't, and therefore had bugs in some earlier SDK versions. It might be safe now with more recent SDK versions.

Here is an example of how to read an animation from an FBX file. It is the simplest I could come up with to give the OP the basic idea. It will give you the frame rate and length of the animation clip, and the keyframes as an array of poses. 

 

void RnAnimation::Read( FbxScene* Scene, FbxAnimStack* AnimStack, const std::vector< FbxNode* >& Nodes )
    {
    Scene->SetCurrentAnimationStack( AnimStack );

    FbxString Name = AnimStack->GetNameOnly();
    FbxString TakeName = AnimStack->GetName();
    FbxTakeInfo* TakeInfo = Scene->GetTakeInfo( TakeName );
    FbxTimeSpan LocalTimeSpan = TakeInfo->mLocalTimeSpan;
    FbxTime Start = LocalTimeSpan.GetStart();
    FbxTime Stop = LocalTimeSpan.GetStop();
    FbxTime Duration = LocalTimeSpan.GetDuration();

    FbxTime::EMode TimeMode = FbxTime::GetGlobalTimeMode();
    FbxLongLong FrameCount = Duration.GetFrameCount( TimeMode );
    double FrameRate = FbxTime::GetFrameRate( TimeMode );
    
    for ( FbxLongLong Frame = Start.GetFrameCount( TimeMode ); Frame <= Stop.GetFrameCount( TimeMode ); ++Frame )
        {
        FbxTime Time;
        Time.SetFrame( Frame, TimeMode );

        for ( FbxNode* Node : Nodes )
            {
            // The global node transform is equal to your local skeleton root if there is no parent bone 
            FbxAMatrix LocalTransform = Node->EvaluateGlobalTransform( Time );

            // Do we have a parent bone, if yes then evaluate its global transform and apply the inverse to this nodes global transform
            if ( FbxNode* Parent = Node->GetParent() )
                {
                FbxNodeAttribute* ParentAttribute = Parent->GetNodeAttribute();
                if ( ParentAttribute && ParentAttribute->GetAttributeType() == FbxNodeAttribute::eSkeleton )
                    {
                    FbxAMatrix GlobalParentTransform = Parent->EvaluateGlobalTransform( Time );
                    LocalTransform = GlobalParentTransform.Inverse() * LocalTransform;
                    }
                }

            // Decompose and save...

            FbxVector4 Translation( LocalTransform.GetT() );
            mTranslationKeys.push_back( Translation );
            FbxQuaternion Rotation( LocalTransform.GetQ() );
            mRotationKeys.push_back( Rotation );
            FbxVector4 Scale( LocalTransform.GetS() );
            mScaleKeys.push_back( Scale );
            }
        }
    }

7 hours ago, Dirk Gregorius said:

I would rather build the curves myself for compression as shown below since this is way less error prone.

This is quite close to what I warned him not to do in my first post, with the only difference being that you stored the decomposed matrix instead of the matrix.  It’s a baked animation either way, and I highlighted the problems with such a system in my first post.  It actually is the preferred method for animating cut-scenes and demos or creating a reference playback system, but manually handling pivots, scale propagation, constraints, etc., specifically is the method-of-choice for many games, because handling these is not really difficult and it gives you the control you need to publish a wide variety of games.

It is of course worth considering the experience of the person who has to actually implement it, and you are correct that your way is simpler, but I found it harder (impossible) to go from there to a correct system and had to start entirely over, and when I finally did I found that it was a lot simpler than I expected (all of my propagation code, creating all matrices via each property, and combining the matrices into the final result worked on the first try), plus using tracks/curves to animate made everything easier.  The run-time was cleaner and flexible enough to allow artists to work on model animations in a custom scene editor in a way that was natural to them.  In fact it was a custom scene editor that exposed all of the weaknesses of a baked-based system, and it all had to be scrapped.

I fully stand by my suggestion to walk the tracks and recompose manually, as it is deceptively easy and won’t leave a person with an empty feeling, knowing he or she has made nothing more than a static animation player…backer.


L. Spiro


Yes, I agree. I wouldn't have chosen to build the curves myself. Although I understand your reasoning for wanting to take on the task, some things are better off left alone.

E=MC Squared.

