Started experimenting with hand states and made an alarming discovery: hand state tracking doesn’t work correctly on the Azure. It works well on the V2, so I can always revert to it, but I’ve been pretty stoked to move forward with this Azure upgrade. I’m not sure whether the bug lives in the Azure Kinect Demos package or in Microsoft’s SDK, so I decided to draft an email to the asset creator and see if I can get some help. On his site, he asks that people not contact him on weekends, so I’ll have to wait till Monday to send it off. For now, I’ll work with the V2 sensor. Hello again, old friend.


I experimented with the idea of a unidirectional particle flow from one hand, through the body, to the other hand. Paired with hand state tracking, it would enable some really cool behaviors: one hand would pull the particles back into it, and the other would push them out into space. It sounded good on paper…
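Roughly what I had in mind, as a minimal sketch. None of these names come from the actual project, and the hand state gating is left out entirely:

using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of the pull/push idea; every name here is illustrative.
public class HandFlowSketch : MonoBehaviour
{
    public Transform pullHand;        // hand that draws particles back in
    public Transform pushHand;        // hand that ejects particles into space
    public float pullStrength = 5f;
    public float pushStrength = 5f;
    public List<Rigidbody> particles; // particle rigidbodies to drive

    void FixedUpdate()
    {
        foreach (Rigidbody p in particles)
        {
            // Attract toward the pull hand, repel away from the push hand.
            p.AddForce((pullHand.position - p.position).normalized * pullStrength);
            p.AddForce((p.position - pushHand.position).normalized * pushStrength);
        }
    }
}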

I quickly decided this would be too tricky to implement properly on my current timeline.


I finally figured out how to scale the UserMesh and Cubeman together. I want to get the size of the user as close to the original version’s dimensions as possible so I can recreate the physics more easily. Now, my particles don’t need to be microscopic.
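For posterity, the gist is driving both roots from one shared factor. A sketch, with placeholder names of my own (userMeshRoot and cubemanRoot aren’t from the package):

using UnityEngine;

// Sketch only: apply one shared scale so the mesh and the Cubeman stay in sync.
public class BodyScaler : MonoBehaviour
{
    public Transform userMeshRoot; // placeholder name
    public Transform cubemanRoot;  // placeholder name

    public void SetBodyScale(float scale)
    {
        Vector3 s = Vector3.one * scale;
        userMeshRoot.localScale = s;
        cubemanRoot.localScale = s;
    }
}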

Now that I’m back on the V2 camera, I even considered reusing my original KinectManager script. It’s a whole lot simpler than the one bundled with the demos in the Asset Store, and at this point, simpler is better. I’d better email the asset developer first, though, and see if we can get the Azure issue fixed.


I was feeling very frustrated about the lack of progress I made on this project this weekend. Then, suddenly, everything stopped working and I had no idea why. The Kinect userIds started instantiating as 12-digit numbers, and bodies would no longer get tracked properly. I couldn’t tell what I’d done to break the code, and frankly didn’t feel like digging deep when I knew it had to be something ridiculously stupid. I reverted to my last git commit and redid everything I’d done this weekend, but more cleanly.

Doing this again was actually refreshing. Sometimes, the flow is blocked and other times, it isn’t. Profound, and quotable—I know.

Now that everything is set up again, I realized that scaling the player body doesn’t work with an actual user’s hand joints (yet). The culprit: rather than moving the hand rigidbodies with forces, the Cubeman controller script manually sets the joint positions each frame, so rigidbody.velocity reports 0 even while the game object is in motion. The fix was to replace that velocity with one calculated manually from the change in position. Now, scaling sort of works.
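The replacement velocity is just a finite difference over the physics step. A sketch of the idea (the class and field names are mine, not the actual controller’s):

using UnityEngine;

// Sketch of the fix: derive velocity from the change in position, since
// rigidbody.velocity reads 0 when the joint is positioned directly each frame.
public class ManualVelocityTracker : MonoBehaviour
{
    public Vector3 ManualVelocity { get; private set; }
    private Vector3 lastPosition;

    void Start()
    {
        lastPosition = transform.position;
    }

    void FixedUpdate()
    {
        ManualVelocity = (transform.position - lastPosition) / Time.fixedDeltaTime;
        lastPosition = transform.position;
    }
}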

I’m not sure why it starts and stops the way it does. This calls for visible debugging rather than log statements: I’ll make the colliders and body visible, and I’ll draw rays along the velocity vectors of the hands. That’s a job for another day, but since I haven’t shut down my computer yet, I’ll say this: the rays hit the opposite hand but not the body (that much I could determine from the logs). For now, I’ve removed the requirement that the raycasts intersect the body.
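The ray drawing itself should be cheap. Something like this, where every field is a placeholder and the velocities would come from the manual calculation above:

using UnityEngine;

// Sketch of the planned visualization; all fields here are placeholders.
public class HandRayDebug : MonoBehaviour
{
    public Transform leftHand;
    public Transform rightHand;
    public Vector3 leftHandVelocity;  // fed by the manual velocity calculation
    public Vector3 rightHandVelocity;

    void Update()
    {
        // Draw each hand's velocity vector, visible in the Scene view.
        Debug.DrawRay(leftHand.position, leftHandVelocity, Color.green);
        Debug.DrawRay(rightHand.position, rightHandVelocity, Color.red);
    }
}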

The scale amount is the product of a displacement scaler and three dampers.

float scaleAmt =
    displacementScaler                 // scales with how far the hand moved
    * distanceDamper                   // damping based on distance
    * hitAccuracyDamper                // damping based on ray hit accuracy
    * controller.so.pulseScaleDamper;  // static tuning value

Either the scale amount is getting zeroed out by one (or more) of the dampers, or the rays stop hitting the collider during hand motion (which seems less likely). I can rule out the pulse scale damper since that’s a static value, so it comes down to the distance and hit accuracy dampers.
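The quickest way to narrow it down is probably to log each factor on its own and watch for the zero:

// Log each factor separately so a zero (or near-zero) value is easy to spot.
Debug.Log($"displacement={displacementScaler} distance={distanceDamper} " +
          $"hitAccuracy={hitAccuracyDamper} pulse={controller.so.pulseScaleDamper}");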


Tags: gamedev unity kinect vfx physics debugging programming