Aware stage movement and animation

I know we won’t be allowed to make the animations for our animals, but will we be able to pick certain movement animations? For example, instead of a normal walking animation, have the creature hop like a frog or a kangaroo, or instead of chewing, have it swallow something whole. Other options could include mating rituals, fighting (like using horns instead of the mouth), or communication.

Other mechanics that could be implemented are having young follow their parents or go off on their own, and hunting in packs or alone, including ambush tactics.

I think it’s way too general to have a thread called “future gameplay mechanics”. So I renamed this thread.

I think these are good ideas, but I don’t know whether they’d be too hard to implement. I also think it would be better if the animation was chosen by the game rather than the player, based on the type of creature and its stats; a frog-legged creature would already hop on its own.

I think this might be incredibly hard to implement, but might be worth it in the long run (plus the aware stage is still years away). Hear me out.
We might use machine learning like this one to either generate at least the keyframes for each animation on demand, OR use it offline to create a library based on body types and then assign each creature the type it most resembles. Both have benefits and downsides.
When generating the animations on the fly, everything will look much more natural and the creature won’t feel weightless, too heavy, or just too silly, but I doubt it is going to be a super quick process.
On the other hand, if we create a library of hundreds of body types and shapes, all with their assigned keyframes, it might sometimes look a bit off, but at least it won’t take ages. The bigger problem would be if someone makes a creature whose body type was not thought of. Yes, the library could just get updated, but that does not seem right.
Once we have the keyframes, the animation should also be seamless and not look too clunky. Of course, we could have an animation for every single movement and every single situation, or we could take a different approach. I was absolutely amazed by this video and how seamless the animation is with so few keyframes. Another example would be this video, though it did not amaze me as much as the first one did.
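To illustrate why so few keyframes can still read as smooth motion: sampling the pose at any point in time is just an eased blend of the two surrounding keyframes. Here is a minimal sketch with a single hypothetical joint angle (nothing here is actual Thrive code):

```cpp
#include <cstdio>
#include <vector>

// Hypothetical keyframe: one joint angle, purely for illustration.
struct Keyframe {
    float time;   // seconds
    float angle;  // joint rotation in radians
};

// Smoothstep easing: the joint accelerates and decelerates between
// keyframes instead of moving linearly, which is a large part of why
// sparse keyframes can still look fluid.
float ease(float t) { return t * t * (3.0f - 2.0f * t); }

// Sample the animation at an arbitrary time by blending the two
// surrounding keyframes.
float sample(const std::vector<Keyframe>& keys, float time) {
    if (time <= keys.front().time) return keys.front().angle;
    if (time >= keys.back().time) return keys.back().angle;
    for (size_t i = 0; i + 1 < keys.size(); ++i) {
        if (time <= keys[i + 1].time) {
            float t = (time - keys[i].time) / (keys[i + 1].time - keys[i].time);
            float e = ease(t);
            return keys[i].angle * (1.0f - e) + keys[i + 1].angle * e;
        }
    }
    return keys.back().angle;
}

int main() {
    // Three keyframes of a leg swing; intermediate poses are computed.
    std::vector<Keyframe> swing{{0.0f, -0.4f}, {0.5f, 0.6f}, {1.0f, -0.4f}};
    for (float t = 0.0f; t <= 1.0f; t += 0.25f)
        std::printf("t=%.2f angle=%.2f\n", t, sample(swing, t));
}
```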
What do you think? And would the first approach even be possible? (the last question is probably a question for @hhyyrylainen )

The problem with machine learning type animation generation is that it takes a long time. Quote from the paper (https://www.goatstream.com/research/papers/SA2013/SA2013.pdf): “On a standard PC, optimization time takes between 2 and 12 hours”. Yeah, we are not going to show a progress bar to the player for 12 hours after making edits to their creature.

I’m not sure how possible it is to make a library of premade body types and how they move. It’s very unlikely the player would make something that exactly matches, so in any case there needs to be some code to map the animations in the best possible way onto the currently made creature, and it needs to be fast. That neural network for blending animations seems very nice, but training it on a character probably takes a long time (so it is only useful for premade characters).

This is also a classic case of people making fancy videos and/or papers WITHOUT PUBLISHING ANY SOURCE CODE. It just drives me wild. We can’t replicate that work without major, major effort in trying to understand the underlying concepts from their explanation and then building up a similar solution from that.

So for us the most likely thing to work is that we have a set of basic animations and a really complicated algorithm for comparing the creature the player made against them and blending together parts of the different animations to end up with a good-looking animation set for the made creature. Additionally it needs to be fast; it should take just a couple of seconds.
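To make that concrete, here is a minimal sketch of the selection half of such an algorithm (the feature descriptor, names, and numbers are all made up for illustration, not actual Thrive code): reduce each creature to a few proportion numbers, sort the premade animation sets by distance, and turn the distances into blend weights.

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Hypothetical morphology descriptor; real code would have to derive
// something like this from the creature built in the editor.
struct BodyFeatures {
    float legCount;
    float legLengthToBody;  // average leg length / body length
    float aspectRatio;      // body length / width
};

// A premade animation set, authored for one reference body.
struct AnimationSet {
    const char* name;
    BodyFeatures features;
};

// Squared distance between feature vectors; cheap enough to check
// hundreds of library entries well within a couple-of-seconds budget.
float distanceSq(const BodyFeatures& a, const BodyFeatures& b) {
    float dl = a.legCount - b.legCount;
    float dr = a.legLengthToBody - b.legLengthToBody;
    float da = a.aspectRatio - b.aspectRatio;
    return dl * dl + dr * dr + da * da;
}

int main() {
    std::vector<AnimationSet> library{
        {"worm crawl",     {0.0f, 0.0f, 8.0f}},
        {"frog hop",       {4.0f, 1.2f, 1.5f}},
        {"quadruped trot", {4.0f, 0.7f, 3.0f}},
        {"biped walk",     {2.0f, 1.0f, 1.2f}},
    };

    BodyFeatures creature{4.0f, 1.0f, 2.0f};  // what the player happened to build

    // Sort the library by similarity to the player's creature.
    std::sort(library.begin(), library.end(),
              [&](const AnimationSet& a, const AnimationSet& b) {
                  return distanceSq(a.features, creature) < distanceSq(b.features, creature);
              });

    // Blend the two closest sets; the closer one gets the larger weight.
    float d0 = distanceSq(library[0].features, creature);
    float d1 = distanceSq(library[1].features, creature);
    float w0 = (d0 + d1 > 0.0f) ? d1 / (d0 + d1) : 1.0f;
    std::printf("%.0f%% %s + %.0f%% %s\n",
                w0 * 100.0f, library[0].name,
                (1.0f - w0) * 100.0f, library[1].name);
}
```

The actual blending of the animation data is of course the hard part; this only shows that the matching step itself can easily be fast enough.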

Seems like I underestimated the size of the problem. Enormously. I was guessing the machine learning would be slow, but my guess was “yeah, maybe a few minutes”. Boy was I wrong! And the library would not have to match perfectly, just give the closest match: for example dozens of bipeds based on their body proportions, dozens of quadrupeds based on their body proportions, and so on. Maybe comparing the sizes of certain body parts and then, based on those relations, assigning the most fitting model (imagine it as taking all the cubes the creatures in the video are composed of and ordering them by size)? I am really not sure; perhaps I’m just beating a dead horse now, as I have no experience with any of this.

Maybe a combination of the two: have a library, detect the closest match, then modify it slightly.
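The “modify slightly” step could be as simple as rescaling the matched animation’s keyframes to the creature’s actual proportions. A rough sketch, with invented keyframe fields and ratios (not anything from Thrive):

```cpp
#include <cstdio>
#include <vector>

// Hypothetical keyframe of a limb cycle, authored for a reference creature.
struct LimbKeyframe {
    float time;         // seconds into the gait cycle
    float liftHeight;   // how high the foot is raised, in reference units
    float strideReach;  // how far forward the foot lands
};

// After picking the closest library match, rescale its keyframes to the
// player's creature instead of playing the reference animation verbatim:
// longer legs step higher and further, and cycle a bit more slowly.
std::vector<LimbKeyframe> retarget(std::vector<LimbKeyframe> keys,
                                   float legLengthRatio,    // creature leg / reference leg
                                   float cycleTimeRatio) {  // gait period scaling
    for (auto& k : keys) {
        k.time *= cycleTimeRatio;
        k.liftHeight *= legLengthRatio;
        k.strideReach *= legLengthRatio;
    }
    return keys;
}

int main() {
    std::vector<LimbKeyframe> reference{{0.0f, 0.0f, 0.0f},
                                        {0.3f, 0.2f, 0.5f},
                                        {0.6f, 0.0f, 1.0f}};
    // Say the player's creature has legs 1.4x the reference length.
    for (const auto& k : retarget(reference, 1.4f, 1.2f))
        std::printf("t=%.2f lift=%.2f reach=%.2f\n", k.time, k.liftHeight, k.strideReach);
}
```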

I was about to necro an old thread in multicellular about this same thing; good that I broadened my search.

I’ve been thinking about plant gameplay, and the difficulty of allowing players to interact with growth in an intuitive way. It’s dawned on me through that process that assigning movements of an arbitrary (but fixed) body plan to player actions programmatically is related and just about as difficult.

It sounds like the plan is to make some kind of algorithm that assigns a roughly sensible swimming or walking animation to the creature, and use it for every movement. That’s a tall order on its own, but what about non-movement actions, like fighting or eating? And how will movement speed be decided? What about tolerance of slopes for a walking gait (it could have a big impact in later stages)?

I don’t have tons of answers yet here, mostly just questions. For movement speed I suppose the base numbers would be the size of the creature and the percentage of it that’s used for locomotion, and then a factor from the stock gaits used (worm-like locomotion is slow, cheetah-like locomotion is fast).
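That could reduce to a one-line formula. A sketch with invented constants, just to show the shape it might take (every number here is an assumption for illustration):

```cpp
#include <cmath>
#include <cstdio>

// Hypothetical speed heuristic following the idea above: start from creature
// size and the fraction of the body devoted to locomotion, then apply a
// per-gait factor.
float movementSpeed(float bodyLength,         // metres
                    float locomotionFraction, // 0..1, share of body used to move
                    float gaitFactor) {       // e.g. worm-like ~0.2, cheetah-like ~3.0
    const float base = 1.5f;  // body lengths per second for an "average" gait
    // Sub-linear size scaling so big creatures are faster in absolute terms,
    // but not absurdly so.
    return base * std::pow(bodyLength, 0.75f) * locomotionFraction * gaitFactor;
}

int main() {
    std::printf("worm-like:    %.2f m/s\n", movementSpeed(0.3f, 0.9f, 0.2f));
    std::printf("cheetah-like: %.2f m/s\n", movementSpeed(1.2f, 0.5f, 3.0f));
}
```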

I didn’t forget about this topic and searched for more information about it. I found a video with a link to a website where the code was published. I hope it will help you, @hhyyrylainen. If this was already known, I’m sorry; I just want to help realize the idea of machine learning for creatures. And I agree with OoferDoofer about combining a library with learning. I hope it can help.
P.S. One more program using Unity. GitHub here

From a quick look that’s just a walkthrough of implementing a certain kind of evolution simulation. I basically knew most of the big details already, and could have just googled them when needed.
Thrive is a bit of a special case when it comes to all of these evolution algorithms, as we can’t have really long pauses when exiting the editor: whatever algorithm we use will need to be able to figure out animations for the player creature in a few seconds, instead of learning to walk through an evolution-type simulation that probably takes at least minutes to get a somewhat working animation.

I also want to say that while it is nice to see what other people have made, and there is a possibility that something is a novel idea, I’m not getting much out of these. Someone will at some point need to write the code for this kind of simulation in Thrive directly, and while a few ready-made examples help, they don’t magically make suitable code appear out of thin air.

I understand that players mustn’t wait a long time (whole hours), but if we combine a library of many ready-made variants, where the most similar variant and its movements are searched for even while the player is still editing, with machine learning, the process could be much, much shorter because the first steps would no longer be needed. But the hardest thing here is the program itself. Sorry if I’m stating obvious things.