While this project is a little rough around the edges, the output looks to be worth the effort. It's also something I've seen asked for a number of times: mapping Kinect gestures onto a character model...
XNA Skinned Model - Kinect Rig
This is an XNA application, written in C#, that uses the Kinect skeleton tracker.
It can animate a skinned model in real time, in this case a warrior (the data is not being recorded, but it could be). The difference from MSDN's existing code-sample tutorial is that they use manual keyframe animations, whereas here the motion comes live from the Kinect.
The joint positions given by the Kinect skeleton data are used to calculate the relative angles between adjacent body parts, and those angles then drive the rotation matrices on each logical bone of the 3D model.
In terms of code, nothing needs to change in order to use a different 3D model, as long as it uses the same bone hierarchy.
----------------------------------------------- Code walk-through
The SkinningSample from MSDN has animation logic over time, and reads each keyframe's transformations from the 'dude' file.
But we want the Kinect to provide the animation data, so I removed that keyframe part from the code.
The model has Bones, and a Mesh (skin and clothes).
The mesh is linked to the bones, so if we transform a bone with a matrix, the corresponding body section is transformed automatically.
For example, if we apply a rotation matrix to the left hip, the whole left leg rotates with it, thanks to the bone hierarchy.
In the AnimationPlayer class, you will find UpdateWorldTransforms, UpdateSkinTransforms, and, most importantly, the UpdateBoneTransforms method. Between them they fill three arrays: boneTransforms, worldTransforms, and skinTransforms.
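From memory of the MSDN SkinningSample, the three fields look roughly like this (the names are the sample's own; the propagation lines are paraphrased from UpdateWorldTransforms and UpdateSkinTransforms, so treat them as a sketch rather than a verbatim quote):

```csharp
// Fields of AnimationPlayer in the SkinningSample:
Matrix[] boneTransforms;   // local transform of each bone, relative to its parent
Matrix[] worldTransforms;  // absolute transform of each bone in model space
Matrix[] skinTransforms;   // what the vertex shader actually consumes

// UpdateWorldTransforms walks the hierarchy, so a rotation written into
// a parent bone (e.g. the left hip) propagates to every child bone:
//   worldTransforms[bone] = boneTransforms[bone] * worldTransforms[parentBone];
//
// UpdateSkinTransforms then folds in the inverse bind pose:
//   skinTransforms[bone] = inverseBindPose[bone] * worldTransforms[bone];
```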
If, instead of reading keyframe data, the UpdateBoneTransforms method uses the Kinect's skeleton data to create a transformation matrix for each bone, all we need to do is render.
Kinect provides us with the 3D coordinates of each joint of the player. That would give us a translation matrix, but for each bone we already have the position; what we need is the rotation, for the leg for example.
So how do you turn points into angles, for the rotation matrix?
You take three points, such as the Kinect positions of the left ankle, left knee and left hip, and create two vectors from them (as in physics).
Normalize these two vectors, take their dot product, and get the angle between them with the arccosine function (C#: Math.Acos()).
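In plain C# (no XNA types, and the helper name is mine), the three-points-to-angle step looks like this; the clamp guards against floating-point drift pushing the dot product just outside [-1, 1], which would make Math.Acos return NaN:

```csharp
using System;

static class JointAngles
{
    // Angle at the middle joint (e.g. the knee), from three joint positions:
    // build the vectors knee->hip and knee->ankle, normalize them,
    // take the dot product, then Math.Acos gives the angle in radians.
    public static double AngleBetween(double[] hip, double[] knee, double[] ankle)
    {
        double[] a = Normalize(Subtract(hip, knee));
        double[] b = Normalize(Subtract(ankle, knee));
        double dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
        // Clamp so rounding error can't push us outside Acos's domain.
        dot = Math.Max(-1.0, Math.Min(1.0, dot));
        return Math.Acos(dot);
    }

    static double[] Subtract(double[] p, double[] q) =>
        new[] { p[0] - q[0], p[1] - q[1], p[2] - q[2] };

    static double[] Normalize(double[] v)
    {
        double len = Math.Sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        return new[] { v[0] / len, v[1] / len, v[2] / len };
    }
}
```

A straight leg (hip, knee and ankle collinear) gives an angle of π; a leg bent at a right angle gives π/2.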
Consider this thread from msdn: https://social.msdn.microsoft.com/Forums/en-AU/kinectsdknuiapi/thread/8516bab7...
Create a rotation matrix for the XY plane using Vectors with z = 0.
Create a rotation matrix for the ZY plane using Vectors with x = 0.
Add your calculated matrix to the boneTransforms array, at the correct index.
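The three steps above can be sketched like this, using XNA's Matrix helpers. The variable names (hipPos, kneePos, leftThighIndex) are mine, and the axis and sign conventions depend on your model's bind pose, so expect to flip a sign or two:

```csharp
// Direction of the thigh, from the Kinect joint positions (my variable names).
Vector3 dir = kneePos - hipPos;

// Step 1: angle in the XY plane (z = 0), i.e. a rotation about the Z axis.
float angleXY = (float)Math.Atan2(dir.Y, dir.X);

// Step 2: angle in the ZY plane (x = 0), i.e. a rotation about the X axis.
float angleZY = (float)Math.Atan2(dir.Z, dir.Y);

// Combine the two planar rotations into one bone matrix.
Matrix rotation = Matrix.CreateRotationZ(angleXY) * Matrix.CreateRotationX(angleZY);

// Step 3: write it into the bone's slot.
boneTransforms[leftThighIndex] = rotation;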
(I opened the example 'dude.fbx' in a text editor and saw that the order of the bones in the file gives you the index: index 1 is the pelvis, 3 the spine, 7 the head, 13 the left upper arm, etc...)
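If you stick with the 'dude' model, it helps to keep those indices as named constants; these are only the ones I actually checked in the file (a convenience of mine, not part of the sample):

```csharp
// Bone indices read from the order of bones inside dude.fbx.
// Only the ones mentioned above; find the rest by reading your own file.
static class DudeBones
{
    public const int Pelvis = 1;
    public const int Spine = 3;
    public const int Head = 7;
    public const int LeftUpperArm = 13;
}
```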
------------------------------------------------- Replacing the Model
A while back I tried some tutorials for the "3ds max" modeling software, where you learn to edit bones (specifically bipeds) and use the "Envelope" and "Physique" modifiers to attach those bones to your model's 3D mesh.
To animate a different model through the Kinect with the same XNA code/program, we need the same bone hierarchy, so let's copy it over.
I didn't have time to try this myself, but this is what I would do: