Lag Spike / High GC allocation when enabling behavior


New member

I'm trying to track down the cause of a lag spike that occurs when I enable the behavior tree on some GameObjects. It looks like it is caused by JSON serialization inside the `EnableBehavior` call, but I'm not sure why.

I am instantiating the GameObject with `Start When Enabled` unchecked and then disabling it. Later I enable the GameObject with a trigger and call `BehaviorTree.EnableBehavior()`. This is when I see the spike. If I instantiate and enable the behavior when the scene loads, I don't see the same GC allocations.
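Roughly what I'm doing, as a simplified sketch (class and field names here are just for illustration, not from my actual project):

```csharp
using UnityEngine;
using BehaviorDesigner.Runtime;

public class UnitSpawner : MonoBehaviour
{
    public GameObject unitPrefab; // prefab with Start When Enabled unchecked

    private GameObject unit;

    void Start()
    {
        // Instantiate up front, then disable until the trigger fires.
        unit = Instantiate(unitPrefab);
        unit.SetActive(false);
    }

    void OnTriggerEnter(Collider other)
    {
        // Re-enable later -- this is where the GC spike shows up.
        unit.SetActive(true);
        unit.GetComponent<BehaviorTree>().EnableBehavior();
    }
}
```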

I've attached screenshots to give an idea of the size of my tree and variables, along with Profiler info. I can send a Profiler data file if that helps; I just can't seem to attach it to this post.

Also, I did try detaching the first node in the BT and replacing it with a single Idle task, and I still had the same issue.

I'm using Unity 2018.3.6f1 on a Win10 machine with the latest Behavior Designer version.

Any help is appreciated



New member
I changed over to using an External Tree. This didn't help by itself, but I added some logic to force-initialize the external tree before disabling the game objects.

So basically (inside a coroutine):
        // Wait until the external tree has finished initializing
        // before the GameObject gets disabled.
        var behaviorTree = go.GetComponent<BehaviorTree>();
        while (!behaviorTree.ExternalBehavior.Initialized)
            yield return null;
I will look at adding units to my object pool at some point and see how it goes.

EDIT: I spoke too soon. As soon as I increased the number of units spawned, I saw the lag spikes again...


Staff member
I recommend going with the pooling option - this will allow you to initialize the tree during a loading screen so the extra time doesn't matter as much.
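As a rough sketch of the idea (the pool class and field names below are illustrative, not part of Behavior Designer), you can pay the initialization cost during the loading screen by enabling and immediately disabling each tree before gameplay starts:

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using BehaviorDesigner.Runtime;

public class UnitPool : MonoBehaviour
{
    public GameObject unitPrefab;
    public int poolSize = 20;

    private readonly Queue<GameObject> pool = new Queue<GameObject>();

    // Run this coroutine during the loading screen so the per-tree
    // initialization cost is paid before gameplay starts.
    public IEnumerator Prewarm()
    {
        for (int i = 0; i < poolSize; i++)
        {
            var unit = Instantiate(unitPrefab);
            var tree = unit.GetComponent<BehaviorTree>();
            tree.EnableBehavior();   // forces the tree to initialize now
            tree.DisableBehavior();  // stop it again until actually spawned
            unit.SetActive(false);
            pool.Enqueue(unit);
            yield return null;       // spread the work across frames
        }
    }

    public GameObject Spawn(Vector3 position)
    {
        var unit = pool.Dequeue();
        unit.transform.position = position;
        unit.SetActive(true);
        unit.GetComponent<BehaviorTree>().EnableBehavior();
        return unit;
    }
}
```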


New member
Using a pre-loaded external tree definitely helped, although I still had issues when loading more than a handful of units at a time. Not as big of a spike, but still noticeable. I got around this by load balancing/queuing the spawning of objects and the assigning of the external trees.
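The queuing amounts to something like this fragment (`spawnQueue` and `perFrame` are my own names, not Behavior Designer API):

```csharp
// Only activate/enable a couple of trees per frame
// instead of all at once.
private readonly Queue<GameObject> spawnQueue = new Queue<GameObject>();

private IEnumerator ProcessSpawnQueue()
{
    const int perFrame = 2; // tune to taste
    while (true)
    {
        for (int i = 0; i < perFrame && spawnQueue.Count > 0; i++)
        {
            var unit = spawnQueue.Dequeue();
            unit.SetActive(true);
            unit.GetComponent<BehaviorTree>().EnableBehavior();
        }
        yield return null;
    }
}
```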

Maybe I'm misunderstanding the benefit of external trees in terms of performance (I do see the usefulness for creating modular AI). I would have thought that once one external tree was initialized/deserialized, others that used the same one would not have the same overhead when loaded, but that doesn't seem to be the case.

It is a little tedious having to do a three-step initialization, but at least it no longer noticeably affects the FPS. Thanks for your help.