I decided to stay in last night. Did I wake up early? Not really. But I did recharge and reaffirm my mission to get these hand animations done, and that itself is worth something.

I started by watching several YouTube videos on the Playables API and on animating VFX graphs with the Unity Timeline. I didn’t see anyone using it the way ChatGPT had described as possible last night (per-instance binding), but decided I’d need to test things out before calling BS. I decided to clean up my player prefab structure even more first, making the left hand game object group a prefab and then cloning it into a variant for the right hand. This means that all game objects inside the hand prefab have the same name in both the original and the variant. This compromises hierarchy clarity in exchange for ease of animating, since animation clips targeting children of a parent’s animator only work on clones of the parent if the children have the same names (parent names can differ). Thankfully, I found an editor script that lets me work around that in a Stack Overflow thread ChatGPT linked me to.


I wanted to start with something simple to help me get a feel for how the Timeline works. However, I quickly realized that the Timeline was not what I needed because it’s built for linear animation sequences. In my case, I’m blending between different animation clips but not really playing anything sequentially. I decided to test out the Playables API instead and see if I could come up with a working graph…with Claude’s help of course.

It took three tries to get right, but for those of you out there wondering what proper prompt engineering for code looks like, you might wanna study the below:

Prompt

Take a look at these docs: https://docs.unity3d.com/6000.1/Documentation/Manual/Playables-Examples.html.

I want to use the Playables API to trigger and smoothly blend animation clips in my Unity scene. I’ll be animating exposed parameters of a VFX graph with 4 different animation clips: empty, init, open, and close. Here’s how I want this to work:

  1. At the time the VFX graph instantiates in the scene, the empty clip is applied. This effectively hides any particles from showing.
  2. When a particular “HandOpen” trigger in my game occurs, I want to call a function to start the particle emissions by immediately playing the init animation (without blending, since the empty animation is a 1 frame value setter). I should be able to trigger this function from another script so it should be public.
  3. When the init animation clip is reasonably close to finished, we transition smoothly into the open animation clip.
  4. When a “HandClose” trigger in my game occurs, call a function to blend from the current state into the close animation clip. This means that even if we’re somewhere in between blending init and open, we should begin blending into close.
  5. If 2 full seconds elapse between the time that the close animation clip finishes and a new “HandOpen” trigger occurs, then we blend between the close and init clips and then to the open clip as described previously. However, if less than 2 seconds elapse between the time the close animation clip finishes and a new “HandOpen” trigger occurs, we blend between the close and open clips, skipping the init clip entirely.
  6. Expose AnimationCurve fields in the inspector that I can use to control the blending between the different events. From what I gather, I only need 2 curves—one for the close-to-open transition and another for open-to-close.
  7. Expose optional delay params I can use to delay the start of the close-to-open and open-to-close transitions when the VFX graph events are sent.

Aside from creating the scripts needed to get this system working, I also want you to describe how each node in the Playable Graph is connected.

Suffice it to say, the graph did what it was supposed to!
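For context, here’s a stripped-down sketch of the kind of graph this describes: one AnimationMixerPlayable with four inputs (empty, init, open, close) whose weights get lerped over time. This is my own minimal reconstruction, not Claude’s actual output—the class and method names are mine, and the weight-ramping coroutines are elided.

```csharp
using UnityEngine;
using UnityEngine.Animations;
using UnityEngine.Playables;

// Hypothetical sketch of the graph structure: a 4-input mixer driven
// by an AnimationPlayableOutput bound to this object's Animator.
public class HandVfxPlayableController : MonoBehaviour
{
    public AnimationClip emptyClip, initClip, openClip, closeClip;
    public AnimationCurve closeToOpenCurve = AnimationCurve.Linear(0, 0, 1, 1);
    public AnimationCurve openToCloseCurve = AnimationCurve.Linear(0, 0, 1, 1);

    PlayableGraph graph;
    AnimationMixerPlayable mixer;

    void Awake()
    {
        graph = PlayableGraph.Create("HandVFX");
        var output = AnimationPlayableOutput.Create(graph, "Anim", GetComponent<Animator>());

        // Inputs: 0 = empty, 1 = init, 2 = open, 3 = close
        mixer = AnimationMixerPlayable.Create(graph, 4);
        graph.Connect(AnimationClipPlayable.Create(graph, emptyClip), 0, mixer, 0);
        graph.Connect(AnimationClipPlayable.Create(graph, initClip),  0, mixer, 1);
        graph.Connect(AnimationClipPlayable.Create(graph, openClip),  0, mixer, 2);
        graph.Connect(AnimationClipPlayable.Create(graph, closeClip), 0, mixer, 3);

        mixer.SetInputWeight(0, 1f); // start on the empty clip (requirement 1)
        output.SetSourcePlayable(mixer);
        graph.Play();
    }

    // Requirement 2: cut straight to init with no blend.
    public void HandOpen()
    {
        for (int i = 0; i < 4; i++) mixer.SetInputWeight(i, 0f);
        mixer.SetInputWeight(1, 1f);
        // ...a coroutine would then ramp weight 1 -> 2 as init nears its end.
    }

    void OnDestroy() => graph.Destroy();
}
```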

Unfortunately, even though this Playable graph was clearly working, and I could see the spawn rate exposed parameter of my VFX graph changing in the inspector, nothing happened to my particles. I asked Claude if this was a bug.

QUOTE

This is a known issue with VFX Graph and animated properties! The problem is that VFX Graph doesn’t automatically detect when properties are being animated by external systems like the Animation system or Playables API.

Mind you, in each of my three separate prompts to Claude (and one additional prompt to o3), I explained that “I’ll be animating exposed parameters of a VFX graph with 4 different animation clips”. I spent all this time vibe-coding a working Playable Graph without being forewarned of the INABILITY to animate exposed VFX graph parameters. Unbelievable.
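For anyone hitting the same wall: the usual workaround is to animate a plain field on a MonoBehaviour (which the Animation system can drive) and push that value into the VFX graph’s exposed parameter every frame. A hedged sketch—“SpawnRate” stands in for whatever your exposed parameter is actually named:

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Relay an Animator/Playables-animated field into a VFX Graph exposed
// parameter. Keyframe 'spawnRate' in your clips instead of the VFX
// property directly.
[RequireComponent(typeof(VisualEffect))]
public class VfxParameterRelay : MonoBehaviour
{
    public float spawnRate; // animate this field in your clips
    static readonly int SpawnRateId = Shader.PropertyToID("SpawnRate");

    VisualEffect vfx;

    void Awake() => vfx = GetComponent<VisualEffect>();

    // LateUpdate runs after animation evaluation each frame, so the
    // relayed value is always the freshest one.
    void LateUpdate() => vfx.SetFloat(SpawnRateId, spawnRate);
}
```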

The good news is that the regular ol’ Animator system still works (and blends). I’m going back to that. I can fade my init animation into my open animation using a coroutine. People say coroutines are inefficient. They work, and that’s what matters.
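The coroutine approach looks roughly like this—assuming Animator states named “Init” and “Open” (those names are mine), and letting CrossFade do the actual weight lerp:

```csharp
using System.Collections;
using UnityEngine;

// Play init immediately on HandOpen, then CrossFade into open once
// init is "reasonably close to finished".
public class HandAnimatorBlend : MonoBehaviour
{
    public Animator animator;
    public float blendDuration = 0.25f;
    [Range(0f, 1f)] public float initHandoffPoint = 0.9f; // how far through init before fading

    public void HandOpen()
    {
        animator.Play("Init", 0, 0f);
        StartCoroutine(FadeIntoOpen());
    }

    IEnumerator FadeIntoOpen()
    {
        // Wait until init's normalized time passes the handoff point.
        while (animator.GetCurrentAnimatorStateInfo(0).normalizedTime < initHandoffPoint)
            yield return null;

        // CrossFade blends state weights over blendDuration seconds.
        animator.CrossFade("Open", blendDuration);
    }
}
```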

The bad news is that my boss is praying I fix the state of the iOS app we’re building because our CEO is back in town from his tropical vacation tomorrow, and you know he wants to see that great progress was made while he lounged in the sun with his annoying ass kids. I kid. But seriously. The code freeze will have to wait.


Tags: unity gamedev animation vfx programming chatgpt debugging