I was unpleasantly surprised when I opened this site on my iPhone and saw that the videos I added autoplay muted loop attributes to two days ago automatically played fullscreen on load. I don’t understand why iOS even allows that behavior, and I’m surprised annoying ads don’t hijack the browser viewport with this technique. Anyhow, the fix was adding playsinline to the video attributes, which tells mobile Safari to keep playback inside the page instead of forcing fullscreen. Annoyance gone.


It’s the last night before tomorrow’s event, so I decided to do some final reviews and tests. You know how it goes. I found some bugs, fixed some bugs, then found some more bugs.

The first todo I crossed out (technically untrue, since I never got it working) was switching to a WebCamTexture for the Kinect color stream. I wasn’t able to get this working (even though I swear I had it working on 2025-06-23). Still, my efforts were not wasted, because I fixed the size of the render texture I was outputting to and changed its color format to B8G8R8A8_UNORM, which, according to Claude, is the “closest match to your Kinect’s BGRA32 data format and should provide the best quality with minimal conversion overhead.” Additionally, I stopped flipping the texture during the Graphics.Blit operation and instead just inverted my RawImage quad’s X scale in the Inspector while rotating it 180 degrees on the Z axis. Tiny optimization.
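
For reference, a minimal sketch of that render-target setup, assuming a Kinect v2 color stream at 1920x1080. The class and method names here are hypothetical stand-ins, not my actual code:

using UnityEngine;
using UnityEngine.Experimental.Rendering;

// Hypothetical sketch: allocate a render target matching the Kinect v2
// color stream so the Blit needs no format conversion.
public class KinectColorTarget : MonoBehaviour
{
	public RenderTexture colorTarget;

	void Awake()
	{
		// B8G8R8A8_UNorm mirrors the Kinect's BGRA32 frame layout.
		colorTarget = new RenderTexture(1920, 1080, 0, GraphicsFormat.B8G8R8A8_UNorm);
		colorTarget.Create();
	}

	// Called once per Kinect color frame with the uploaded Texture2D.
	public void CopyFrame(Texture2D kinectColor)
	{
		// No flip material here; the RawImage is mirrored in the Inspector
		// (X scale inverted, 180-degree Z rotation) instead.
		Graphics.Blit(kinectColor, colorTarget);
	}
}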

Next, I tried adding a volume layer specifically for my particles because I want to increase their visibility against the camera-feed background. I quickly learned that implementing this wouldn’t be a walk in the park: the camera rendering my particles is stacked as an overlay on the main camera, and there’s no way to restrict post-processing effects to an overlay camera without them also applying to its base.

The cube only renders to the base layer, yet still inherits PP from the overlay

ChatGPT offered me a few potential solutions, but none were easy, so I abstained. This is something I can get to further down the line. In the meantime, I took the simpler approach to increasing particle visibility: I reduced bloom, enabled particle colors, and increased their size slightly.
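
I made those tweaks in the editor, but for reference, a runtime version of the bloom adjustment might look something like this, assuming URP. The component and field names are hypothetical:

using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Hypothetical sketch: dial bloom down on the global Volume at runtime
// so it can be tweaked live from the Inspector.
public class BloomTuner : MonoBehaviour
{
	public Volume globalVolume;          // the scene's global post-processing Volume
	[Range(0f, 10f)]
	public float bloomIntensity = 0.5f;  // illustrative value, not my actual setting

	void Update()
	{
		// Override the profile's bloom intensity each frame.
		if (globalVolume.profile.TryGet(out Bloom bloom))
			bloom.intensity.Override(bloomIntensity);
	}
}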

White is the enemy of visibility

Lastly, I fixed a bug I noticed the other day but couldn’t replicate: extra metaball indices were getting assigned to a single player without that player ever respawning. I realized I could replicate the bug consistently by clicking the toggle for custom colors in my runtime settings menu. That seemed like enough context to pass to Claude and hope for a fix.

Claude and I make a great team. It described the issue incorrectly, but still pointed me to the right piece of code, which was this:

// Debug convenience from a while back: left-click anywhere in play mode
// wipes every tracked body.
if (Input.GetMouseButtonDown(0))
{
	DeleteAllBodies(knownIds);
}

I’d built a handy left-click scene-reset function into play mode a while ago and forgot about it when I created my in-game settings menu. The custom colors toggle had nothing to do with the issue; left-clicking (the toggle, or anything else) was driving it. The first part of the fix was adding a condition to that if statement that checks whether the menu is open. The second part consisted of actually fixing (gasp) the underlying issue: my DeleteAllBodies function was removing players from the scene without resetting their corresponding metaball indices.
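
A rough reconstruction of both halves of the fix. To be clear, settingsMenu and the index bookkeeping below are stand-ins for my actual fields, not code lifted from the project:

using System.Collections.Generic;
using UnityEngine;

// Hypothetical reconstruction; field names are stand-ins.
public class BodyManager : MonoBehaviour
{
	public GameObject settingsMenu;
	private readonly Dictionary<ulong, GameObject> bodies = new Dictionary<ulong, GameObject>();
	private readonly Dictionary<ulong, int> assignedIndices = new Dictionary<ulong, int>();
	private readonly Queue<int> freeMetaballIndices = new Queue<int>();
	private readonly HashSet<ulong> knownIds = new HashSet<ulong>();

	void Update()
	{
		// Part one: ignore clicks while the settings menu is open, so UI
		// interaction no longer triggers the scene reset.
		if (Input.GetMouseButtonDown(0) && !settingsMenu.activeSelf)
		{
			DeleteAllBodies(knownIds);
		}
	}

	// Part two: return each body's metaball index to the free pool when the
	// body is deleted, instead of leaking it (the original bug).
	void DeleteAllBodies(HashSet<ulong> ids)
	{
		foreach (ulong id in ids)
		{
			if (assignedIndices.TryGetValue(id, out int index))
			{
				freeMetaballIndices.Enqueue(index);
				assignedIndices.Remove(id);
			}
			if (bodies.TryGetValue(id, out GameObject body))
			{
				Destroy(body);
				bodies.Remove(id);
			}
		}
		ids.Clear();
	}
}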


Oh no. I just realized something. My laptop has a single HDMI output and two Thunderbolt outputs. My capture card requires HDMI input, and so does my projector. Therefore, I need a Thunderbolt-to-HDMI cable to be able to use both. Or an HDMI splitter. I own neither of these things.

Thankfully, judging from Best Buy’s website, the store near my office has a few cables left in stock. Let’s hope that’s true. Guess I’ll find out tomorrow.


One bug I discovered tonight is that it’s not that difficult to exit the boundaries of the marching cubes container I use to render the metaball SDFs. Escaping the bounds leads to chaos as particles flee back to the center of the screen looking for the nearest attractor.

Right as my dad walked in lol

Luckily, I already have a setting in my game meant to address this: maxDistanceFromCamera. I was already using it to hide a player’s skeleton (when enabled) when their z-position is far away. I added logic to hide the particles too by disabling the game objects they’re attached to. Then I realized I already had logic for that and just needed to decrease maxDistanceFromCamera. Lel. This fixes the issue for participants that are far away, but does not solve it for energy balls tossed in any of the other 5 directions.
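
A minimal sketch of that distance cull, assuming the camera sits at the origin looking down +Z. maxDistanceFromCamera is a real setting in my project, but the component and field names here are hypothetical:

using UnityEngine;

// Hypothetical sketch: attached to a player object; hides the skeleton and
// particle objects once the player's z-position exceeds the cutoff.
public class DistanceCuller : MonoBehaviour
{
	public float maxDistanceFromCamera = 4.5f; // illustrative value
	public GameObject particleRoot;            // particle objects attached to this player
	public GameObject skeletonRoot;            // optional skeleton visualization

	void Update()
	{
		bool inRange = transform.position.z <= maxDistanceFromCamera;
		if (particleRoot.activeSelf != inRange)
			particleRoot.SetActive(inRange);
		if (skeletonRoot.activeSelf != inRange)
			skeletonRoot.SetActive(inRange);
	}
}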


I’m glad I decided to test my capture card setup again. I learned that mirroring one output to multiple displays is not possible via Windows display settings, but is possible using a dirty display-cloning hack in NVIDIA Control Panel. Instead of purchasing a USB-C-to-HDMI cable, I’ll be better off purchasing an HDMI splitter. Luckily, Best Buy also has one of those in stock.


And…for my final magic trick of the night, I fixed particles exploding back toward the center when the energy ball exits the metaball bounding box in any direction, not just the backwards one. I did this by adding an inverted Collision Shape (Box) node to my VFX graph that sets the lifetime of the particles to 0 as soon as a collision with its boundary is detected.

Time to code freeze.


Tags: debugging ios unity vfx ar