This is gonna be a long log because I’ve updated my code without posting in here. Yeah, yeah. I broke that rule. The beautiful thing about a good rule is that any punishment for breaking it lies within the rule itself. What I mean by that is, because I decided to skip writing in here the last few times, I have a lot more writing to do now. I’ll go chronologically…

I may not have expressly written this in here, but the job that I am now three months into is a coding job. I don't work in Unity, but I do write code daily. As such, I've taken quite a liking to ChatGPT. In fact, I'm not confident I'd be able to get by without it anymore. I say that to preface this documentation of my sorry attempt at getting that metaballs compute shader to work. I ChatGPT'd the hell out of that post until I realized it was not actually an all-encompassing tutorial. Thanks to ChatGPT, I realized that I actually needed to understand the code (and add on to it) before I could hope to get metaballs working. In the frenzy of the workweek, I decided that would have to wait for another time. What's cool is that I can export my conversation with ChatGPT. Here it is for your viewing pleasure.

Having given up on coding the metaballs compute shader myself, I thought that would be it for now. But then, a week or two later, I saw this demo of Devin and was inspired to request early access to it. I filled out their Google form, specifically submitting my compute shader questions to Cognition Labs in the hope that AI would solve my code for me. Here's what I wrote:

I would like Devon to help me with an interactive energy ball game experience I’ve been independently building in Unity over the last year and a half. This game experience allows a participant to form and manipulate an energy ball made of VFX graph particles using the tracked motion of their hands from a camera sensor. I’ve demoed V1 of this still unnamed experience at several festivals around the US and people have loved using it. However, I know that it can be improved and am working hard to make V2 of the experience even better. Specifically, I intend for V2 to introduce interactivity between the energy balls of different players using gravitational forces. Energy balls from different players will then be able to merge together and separate again based on each player’s movements, while the gravitational force of a given player’s energy ball will depend on its size. I expect this to really gamify the experience on a deeper level.

However, for the merge/separation effects to look perfect, I want the spherical meshes that the energy balls are based on to behave as metaballs. Therefore, I need to create an optimized compute shader that creates metaballs from all spherical meshes inside a parent game object. A compute shader is needed rather than a vertex shader because, from what I understand, vertex shaders don't write data back to the mesh object, and I need the modified mesh data to be passed over to Unity's SDF bake tool in order to generate a single combined SDF that I can conform VFX graph particles to.
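
To show what I mean by writing data back, here is a rough sketch of the C# plumbing I have in mind. The kernel name, property names, and class are placeholders rather than code from my project: dispatch a kernel over the vertex positions, read the results back, and push them onto the Mesh so the SDF bake step has something to consume.

```csharp
using UnityEngine;

// Sketch only: "CSMain", "_Vertices", and "_DeltaTime" are placeholder names.
// The point is that a compute shader can write modified vertex positions into
// a buffer that C# reads back onto the Mesh, unlike a vertex shader.
public class ComputeDeformSketch : MonoBehaviour
{
    public ComputeShader deformShader;   // would hold the metaball/deform kernel

    Mesh mesh;
    Vector3[] baseVertices;              // undeformed input
    Vector3[] deformedVertices;          // readback target
    ComputeBuffer vertexBuffer;
    int kernel;

    void Start()
    {
        mesh = GetComponent<MeshFilter>().mesh;
        baseVertices = mesh.vertices;
        deformedVertices = new Vector3[baseVertices.Length];
        vertexBuffer = new ComputeBuffer(baseVertices.Length, sizeof(float) * 3);
        kernel = deformShader.FindKernel("CSMain");   // placeholder kernel name
    }

    void Update()
    {
        vertexBuffer.SetData(baseVertices);
        deformShader.SetBuffer(kernel, "_Vertices", vertexBuffer);
        deformShader.SetFloat("_DeltaTime", Time.deltaTime);
        deformShader.Dispatch(kernel, Mathf.CeilToInt(baseVertices.Length / 64f), 1, 1);

        // The results come back to the CPU-side Mesh, so the modified mesh
        // data can then be handed to the SDF bake step.
        vertexBuffer.GetData(deformedVertices);
        mesh.vertices = deformedVertices;
        mesh.RecalculateBounds();
    }

    void OnDestroy()
    {
        vertexBuffer?.Release();
    }
}
```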

Additionally, there is another effect, also involving the modification of mesh data, that would enhance the physics of the scene and could perhaps be combined into the code of the metaball compute shader. I don't know if that's actually the case, and I hope Devon can help me figure that out. Let me explain.

In my current code, a spherical mesh will grow or shrink uniformly in size on its X, Y, and Z axes based on the movement of the player's hands, but this scale effect would be even better if the mesh scaled only on an axis perpendicular to the midpoint of the velocity vectors of each of the player's hands. For example, if the player's left hand was moving at a 45 degree angle from the origin towards the upper left, and the player's right hand was moving from the origin at 25 degrees to the upper right, then the midpoint of the two hands' velocity vectors would be 80 degrees from the origin to the upper right, and I'd want the mesh to transform in scale along an axis perpendicular to that midpoint vector, which in this case would be at 10 degrees from the origin towards the upper left and 10 degrees from the origin to the bottom right. The deformation should increase the scale along that axis when the hands are moving toward one another, and decrease the scale when the hands are moving away from each other. I'd want there to be an exposed parameter in the inspector that allows me to control the scaling amount relative to the change in distance between the hands between frames (Time.deltaTime).

The reason I want this scaling to happen on the game object’s mesh rather than on its transform is because scaling the transform in a specific direction would require me to lock the transform’s rotation, which I expect would interfere with the overall physics of the scene at some point down the line. A big principle of mine is to not build features that will cause bugs later on! Anyways, using a compute shader to scale the mesh would give the game object’s transform freedom to rotate while forces move it around 3D space.
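
To make the math concrete, here is a CPU-side sketch of what I'm describing. The class and field names are placeholders, the hand transforms are assumed to come from whatever the tracking script already exposes, and the real version would live inside the compute shader so the vertex work stays off the CPU:

```csharp
using UnityEngine;

// Rough sketch of the directional-scale idea, written CPU-side for readability.
public class DirectionalScaleSketch : MonoBehaviour
{
    public Transform leftHand, rightHand;   // fed by the hand tracker (assumed)
    public float scalePerMeter = 1f;        // the exposed scaling-amount parameter

    Mesh mesh;
    Vector3[] baseVertices;
    Vector3 lastLeft, lastRight;
    float lastDistance;

    void Start()
    {
        mesh = GetComponent<MeshFilter>().mesh;
        baseVertices = mesh.vertices;
        lastLeft = leftHand.position;
        lastRight = rightHand.position;
        lastDistance = Vector3.Distance(lastLeft, lastRight);
    }

    void Update()
    {
        // Per-frame hand velocities and their midpoint (average) vector.
        Vector3 leftVel  = (leftHand.position  - lastLeft)  / Time.deltaTime;
        Vector3 rightVel = (rightHand.position - lastRight) / Time.deltaTime;
        Vector3 midVel   = (leftVel + rightVel) * 0.5f;

        // One choice of perpendicular axis: the in-plane perpendicular relative
        // to the camera plane, which matches the 2D example above. Picking the
        // perpendicular in full 3D is a design decision of its own.
        Vector3 axis = Vector3.Cross(midVel, Vector3.forward).normalized;

        // Hands closing => distance shrinks => stretch along the axis;
        // hands separating => squash. scalePerMeter controls how much.
        float distance = Vector3.Distance(leftHand.position, rightHand.position);
        float stretch = 1f - (distance - lastDistance) * scalePerMeter;

        // Scale each vertex's component along the axis; the transform itself
        // stays free to rotate and be pushed around by forces.
        Vector3 localAxis = transform.InverseTransformDirection(axis);
        Vector3[] verts = new Vector3[baseVertices.Length];
        for (int i = 0; i < baseVertices.Length; i++)
        {
            Vector3 v = baseVertices[i];
            float along = Vector3.Dot(v, localAxis);
            verts[i] = v + localAxis * along * (stretch - 1f);
        }
        mesh.vertices = verts;
        mesh.RecalculateNormals();
        mesh.RecalculateBounds();

        lastLeft = leftHand.position;
        lastRight = rightHand.position;
        lastDistance = distance;
    }
}
```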

I have several resources for Devon. Most helpful, I think, would be my dev log, which details much of the process I've taken in building my V2 project. That link is here: https://www.notion.so/ntify/Dev-Log-c0e582a4488d4f898fe9dbb7bf45ac98

Second is a link to my repo for the V1 project, which uses a 2nd gen Xbox Kinect. The production scene for V1 is located in Assets/Scenes/KinectParticles. I'm not sure if Devon can parse it, but either way, it's over here: https://github.com/arghhhhh/KinectParticles

Last are some links to compute shader tutorials that I tried following. One of them proposes a method for creating compute shader metaballs: https://gamedev.net/blogs/entry/2267304-were-back-with-unity-compute-shaders-and-some-delicious-metaballs/

The other one covers the different techniques for mesh deformation: https://blog.logrocket.com/deforming-mesh-unity/

Thanks for taking the time to consider my application. I’m not versed in compute shaders at all so this project has stalled a bit. I would love Devon’s help here!

Hopefully, they still consider me even though I misspelled Devin. I just noticed that and am now cringing. Wow.


All of that happened over the last couple weeks. With that past covered, I can now write about the past that just happened tonight. It’s Friday and I’ve been alone in the office for several hours. I decided to finally get back to building this thing and create Scene #5.

The purpose of Scene #5 is to get the VFX Graph functionality merged with all the physics stuff I’ve built. The sphere meshes from each player will get combined into one SDF that I’ll pipe into a VFX graph and make pretty.
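
For reference, the combining half of that pipeline is conceptually simple. This is not my actual combiner script, and the names are placeholders, but a minimal sketch looks something like this:

```csharp
using UnityEngine;

// Sketch: gather every sphere MeshFilter under this parent (the parent itself
// is assumed to have no MeshFilter), bake them into one Mesh in the parent's
// local space, and feed that single mesh to the SDF bake step.
public class SphereCombinerSketch : MonoBehaviour
{
    public Mesh combined;

    void LateUpdate()
    {
        MeshFilter[] filters = GetComponentsInChildren<MeshFilter>();
        CombineInstance[] instances = new CombineInstance[filters.Length];
        for (int i = 0; i < filters.Length; i++)
        {
            instances[i].mesh = filters[i].sharedMesh;
            // Bring each sphere into this parent's local space.
            instances[i].transform = transform.worldToLocalMatrix * filters[i].transform.localToWorldMatrix;
        }

        if (combined == null)
        {
            combined = new Mesh();
            // Lots of sphere vertices can overflow 16-bit indices.
            combined.indexFormat = UnityEngine.Rendering.IndexFormat.UInt32;
        }
        combined.Clear();
        combined.CombineMeshes(instances);   // one mesh for the SDF bake
    }
}
```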

I started by going into Scene #2 and Scene #4 and copying stuff over. I attached my mesh combiner script to a parent game object that my dummy players sat inside. I kept trying to drag my VFX graph from the Project window into the VFX slot in my script's inspector. It took me way too long to realize why what I was doing wasn't working. I'd allowed myself to stop thinking about this project on a daily basis and definitely had some rust to brush off in Unity. Eventually, I got my setup working. However, I soon realized that I'd need to tweak the spawn position of the VFX particles to match the hand positions of the players. I wanted to avoid having multiple VFX graphs at all costs, so I decided superposition was the way to go here. I knew Scene #1 would come in handy. At first, I thought I might be better off writing the positions of each hand in the scene to an array that gets passed into a graphics buffer and processing that buffer with my VFX graph, but clearly I only thought that way because I didn't understand how graphics buffers worked. The buffer still gets updated each frame in the Update() loop, and once I learned that, I realized that all I needed to do was modify a Vector3 each frame and pass it to the graph. So that's what I did!
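
For posterity, the "just update a Vector3" version boils down to something like this. The exposed property name is a placeholder for whatever I actually call it on the graph's blackboard:

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Sketch: an exposed Vector3 on the graph drives the particle spawn position,
// updated once per frame from the tracked hand.
public class HandSpawnPositionSketch : MonoBehaviour
{
    public VisualEffect vfx;
    public Transform hand;   // tracked hand in the scene

    void Update()
    {
        vfx.SetVector3("SpawnPosition", hand.position);   // placeholder property name
    }
}
```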

All of that runs through a single VFX graph. Upon typing that sentence, I just realized that for the cool effects to work, I absolutely need to have more than one VFX graph.

The verdict is in: each player will get its own VFX graph. Because the SDF is shared by all graphs, and the particles are drawn to the nearest mesh, it’ll look great. But that means I just wasted time with the superposition again. Like I said, I’ve been away from this project for too long. Damn it feels good to work again!
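
The per-player setup should be as simple as instantiating the same graph asset for each player and pointing every instance at the same SDF texture. A quick sketch of what I mean, with placeholder property and field names:

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Sketch: one VFX graph instance per player, all sharing a single baked SDF.
public class PerPlayerGraphSketch : MonoBehaviour
{
    public VisualEffect graphPrefab;   // the same VFX asset for every player
    public Texture sharedSdf;          // the single combined SDF

    public VisualEffect AddPlayer(Transform player)
    {
        VisualEffect vfx = Instantiate(graphPrefab, player.position, Quaternion.identity, player);
        vfx.SetTexture("SDF", sharedSdf);   // every graph conforms to the same field
        return vfx;
    }
}
```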


Tags: unity gamedev vfx compute-shaders metaballs mesh ai prompting