Almost a year after I posted the last comment here, I've made some progress.
In fact, I rewrote the whole behavior tree thing. Twice. Mainly because I was not really happy with the existing implementation.
In addition, I have had some bad experiences with developing abstract "frameworks" without a concrete use case to show off their features and verify their usability. So I wrote a small game using libgdx (which has an entity system comparable to Terasology's), and I finally found some places where a behavior tree comes in handy. This is true for anything timed - in the game I use the behavior trees to model the (damage/buff) effects on entities. So instead of having counters and "if (currentTime>duration) {...}" code all over the place, I use small trees written in JSON, e.g. { sequence: [{deal_damage: {base:50, dice:"2D25"}}, {delay: {duration:0.5}}, {slow_down: {factor:0.5}}] }.
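To give a rough idea of what that buys you (just a sketch with made-up names, not the actual tdx API): a node is ticked every frame and keeps its own state, so a "delay" leaf replaces the scattered timer checks.

```java
// Minimal sketch of the idea, not the actual tdx API: a behavior node is
// ticked every frame and reports whether it is still running or finished.
enum Status { RUNNING, SUCCESS, FAILURE }

interface Node {
    Status tick(float delta);
}

// A "delay" leaf keeps its own timer, so the surrounding game code no longer
// needs "if (currentTime > duration)" checks all over the place.
class Delay implements Node {
    private final float duration;
    private float elapsed;

    Delay(float duration) { this.duration = duration; }

    @Override
    public Status tick(float delta) {
        elapsed += delta;
        return elapsed >= duration ? Status.SUCCESS : Status.RUNNING;
    }
}
```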
If someone wants to take a look at it, it's on my GitHub, as usual (https://github.com/synopia/tdx). I also put together some overview documentation there.
The trees can be evaluated in two ways: the simple, direct way of iterating through the tree each tick, or by compiling the tree into bytecode. I did some profiling to compare both. The performance is more or less equal, but especially for big trees the compiled version can free up a lot of memory for other, more important things. All in all, I am quite unsure whether the compiling stuff is really necessary - but since it is really fancy and not *that* complex, I tend to leave the code in there, and everyone using it can decide which way to go.
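For illustration, the direct evaluation path boils down to something like this (again only a sketch with assumed names, reusing the Node/Status types from the sketch above, not the real tdx code):

```java
import java.util.List;

// Sketch of the direct evaluation path: a sequence simply ticks its children
// in order every frame, resuming at the child that was still running.
class Sequence implements Node {
    private final List<Node> children;
    private int current = 0;

    Sequence(List<Node> children) { this.children = children; }

    @Override
    public Status tick(float delta) {
        while (current < children.size()) {
            Status status = children.get(current).tick(delta);
            if (status == Status.RUNNING) return Status.RUNNING;
            if (status == Status.FAILURE) { current = 0; return Status.FAILURE; }
            current++; // child succeeded, move on to the next one
        }
        current = 0;
        return Status.SUCCESS;
    }
}
```

The compiled variant does the same work, just without walking the node objects each tick.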
Now I am working on integrating everything into Terasology, which shouldn't be that much work.