New idea. Since I know how to use attraction force/speed and stick distance/force separately to control particle attraction, I should place an attraction sphere at the same position as each player’s metaball. The sphere draws the particles toward that player’s metaball, and a strong stick force on the metaball’s SDF makes the particles conform to its surface. The advantage is that each player’s particles favor their own metaball, even when another player’s is closer to a hand emitter.
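A rough sketch of how that per-player pairing might be driven from a script, assuming the graph exposes hypothetical properties named `AttractorPosition`, `AttractorRadius`, and `StickForce` (those names are placeholders, not Unity built-ins; only `VisualEffect.SetVector3`/`SetFloat` are real API):

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Keeps one attraction sphere locked to a single player's metaball so that
// player's particles are pulled toward (and stick to) their own surface,
// even when another player's metaball is nearer to a hand emitter.
// "AttractorPosition", "AttractorRadius", and "StickForce" are assumed
// exposed properties on this player's VFX graph.
public class MetaballAttractor : MonoBehaviour
{
    [SerializeField] private VisualEffect playerVfx;   // this player's particle graph
    [SerializeField] private Transform metaballCenter; // this player's metaball transform
    [SerializeField] private float attractorRadius = 0.5f;
    [SerializeField] private float stickForce = 50f;   // strong stick so particles conform to the SDF

    void LateUpdate()
    {
        // Re-sync every frame so the sphere tracks the metaball as it moves.
        playerVfx.SetVector3("AttractorPosition", metaballCenter.position);
        playerVfx.SetFloat("AttractorRadius", attractorRadius);
        playerVfx.SetFloat("StickForce", stickForce);
    }
}
```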
I need to refine my animation subgraph a bit to add stopgaps for interrupted animations…
The more I talk to ChatGPT, the more I realize I’ve limited myself with this VFX graph-based animation approach. It seems Unity’s animation system is still king when it comes to blend trees: there’s no way for me to smoothly blend between two animations entirely within VFX graph, since that would require tracking each animation’s current output value, something that’s only available on the CPU. However, since I’d absolutely like to avoid using the Animator, I want to take the fair middle ground: script-controlled animation.
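A minimal sketch of that middle ground, under stated assumptions: the two "animations" are stand-in `AnimationCurve`s, and `PoseValue` is a hypothetical exposed float on the graph. The point is that the blend weight and both current output values live on the CPU, and only the already-blended result gets pushed into the graph:

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Script-controlled blending: the CPU tracks the current output of both
// animations (modeled here as two AnimationCurves) and crossfades between
// them, then writes the single blended value into the VFX graph.
// "PoseValue" is an assumed exposed float property, not a Unity built-in.
public class ScriptedVfxBlend : MonoBehaviour
{
    [SerializeField] private VisualEffect vfx;
    [SerializeField] private AnimationCurve animA;
    [SerializeField] private AnimationCurve animB;
    [SerializeField] private float blendDuration = 0.25f;

    private float blend;       // 0 = fully animA, 1 = fully animB
    private float blendTarget; // set to 0 or 1 to start a crossfade

    public void PlayA() => blendTarget = 0f;
    public void PlayB() => blendTarget = 1f;

    void Update()
    {
        // Ease the blend weight toward its target over blendDuration seconds.
        blend = Mathf.MoveTowards(blend, blendTarget, Time.deltaTime / blendDuration);

        float a = animA.Evaluate(Time.time);
        float b = animB.Evaluate(Time.time);

        // Only the blended result crosses the CPU/GPU boundary.
        vfx.SetFloat("PoseValue", Mathf.Lerp(a, b, blend));
    }
}
```

Because a crossfade can be retargeted mid-blend (just flip `blendTarget`), this also covers the interrupted-animation case without extra state.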
The more I talk to ChatGPT, the more I think a fully contained VFX animation system with animation blending is actually possible by leveraging custom attributes. However, I need to consult the full council of LLMs (Claude, Gemini, GPT o3). I always get confused about which ChatGPT model to use because of their stupid naming system; it makes zero sense that o3 is more powerful than o4. I think I screwed up my chat’s context by switching to o4 midway through “designing” this adaptable VFX animation system, because I wanted to use what I assumed was the smarter model for a very complex task. Oh well.
Anyways, what’s more important is that I send a demo reel for this project to Film Gate, because they need promo materials for the event in less than two weeks. I told them I’d get it to ’em today, and if you count “today” as any point before I go to sleep, regardless of when midnight happens, I will certainly make that cutoff. Since I’ve been recording all my visual progress updates in my room and the project uses a live camera feed now, it’d be pretty gross (from an artistic standpoint) to send any current footage for the demo. Therefore I’ll once again rely on the good ol’ footage I took one time in my cousin’s friend’s art studio. Once I get these new animations worked out and implement a few more roadmap features, I’ll get some more studio footage.
Tags: unity vfx animation particles metaballs scripting gamedev