I made an important decision a couple of days ago. Seeing how quickly the days are flying by, I realized that if I’m going to deliver on my promise of a more interactive energy ball experience at the August 1st demo, I’ll need to take the code I already have and get it working with my newer Kinect sensor. So, from now until the demo, I’ll prioritize setting up the currently working player-scaler scene with the Azure Kinect instead of trying to implement the metaballs first.

_Metaballs are beautiful and complex and not to be rushed._ I’ll keep repeating that phrase until I have everything ready to go. As a token of my commitment to this more logical direction, I just purchased the Azure examples pack in the Unity Asset Store. Time to get moving.


Using the asset’s documentation, I managed to get my Azure sensor up and running. At first, body tracking didn’t work because I’d installed an outdated version of the Body Tracking SDK, but once I installed the newer version, I was good to go. Side note: I’m hoping it works with RTX 4000 series cards; I’ll need to test that. Anyway, I’m super impressed with the smoothness of the tracking!
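
For anyone who hits the same wall, a quick runtime sanity check that the sensor and body tracking are actually delivering data could look something like the logger below. This is only a sketch against the K4A examples asset: the `com.rfilkov.kinect` namespace and the `KinectManager`, `IsInitialized()`, and `GetUsersCount()` calls are assumptions about that asset’s API, so double-check the bundled docs for the exact names.

```csharp
using UnityEngine;
// KinectManager ships with the Azure Kinect Examples asset; the namespace and
// method names here are assumptions about that asset's API, not verified against it.
using com.rfilkov.kinect;

// Logs whether the sensor initialized and how many bodies are currently tracked.
// Handy when a wrong Body Tracking SDK version silently breaks tracking.
public class BodyTrackingCheck : MonoBehaviour
{
    void Update()
    {
        KinectManager kinect = KinectManager.Instance;

        if (kinect == null || !kinect.IsInitialized())
        {
            Debug.LogWarning("Kinect sensor not initialized yet.");
            return;
        }

        // If this stays at 0 while someone is in frame, body tracking is the culprit.
        Debug.Log($"Tracked users: {kinect.GetUsersCount()}");
    }
}
```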

I’m disappointed that hand-pose tracking still isn’t supported.

The green screen demo works pretty well. I’ve been wondering how to improve the avatars, and since it looks like fancy hands are out of the picture, perhaps I’ll just throw the actual user into the experience. Honestly, though, I like this idea better:

By getting the position of each joint, we can map the joints directly onto a VFX Graph! This approach lets me work on a joint-by-joint basis. My only issue is that I can’t remember where I grabbed this scene from, but at least I can port it over from the Kinect V2.
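
To make that concrete, here’s roughly the shape of the script I’m picturing: a small component that reads each tracked joint every frame and pushes its position into an exposed Vector3 property on the VFX Graph. `VisualEffect.SetVector3` is standard Unity VFX Graph API; the `KinectManager` calls and the `"HeadPos"`-style exposed property names are assumptions about the K4A examples asset and my own graph, so treat this as a sketch rather than the scene I’m porting.

```csharp
using UnityEngine;
using UnityEngine.VFX;
// KinectManager / KinectInterop come from the Azure Kinect Examples asset;
// the exact signatures below are assumptions about that asset's API.
using com.rfilkov.kinect;

// Forwards tracked joint positions into exposed Vector3 properties on a VFX Graph,
// so each joint can drive its own part of the effect.
public class JointsToVfx : MonoBehaviour
{
    [SerializeField] VisualEffect vfx;                    // graph with exposed Vector3 properties
    [SerializeField] KinectInterop.JointType[] joints =   // one exposed property per joint
    {
        KinectInterop.JointType.HandLeft,
        KinectInterop.JointType.HandRight,
        KinectInterop.JointType.Head
    };

    void Update()
    {
        KinectManager kinect = KinectManager.Instance;
        if (kinect == null || !kinect.IsInitialized())
            return;

        ulong userId = kinect.GetPrimaryUserID();
        if (userId == 0)
            return;

        foreach (var joint in joints)
        {
            if (!kinect.IsJointTracked(userId, (int)joint))
                continue;

            // Exposed properties are assumed to be named "HandLeftPos", "HeadPos", etc.
            Vector3 pos = kinect.GetJointPosition(userId, (int)joint);
            vfx.SetVector3(joint + "Pos", pos);
        }
    }
}
```

Hooking it up would just mean dropping this component into the scene next to the Kinect manager object and exposing matching Vector3 properties on the graph, one per joint I want to drive.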


Tags: gamedev unity kinect vfx scripting