Reference KSP: neurons for actions and action groups

One neuron controls one muscle, with a key bind set for every muscle. Add another 5 neurons to control one action group (a whole finger), then another 10 neurons to control the whole hand with a single key bind. Now you have a grab action.

If players keep building up action groups like this, they can eventually press W to walk forward, E to eat, and F to grab.
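A rough sketch of what this hierarchy could look like, with purely hypothetical names (not actual Thrive code):

```python
class Muscle:
    def __init__(self, name):
        self.name = name

    def contract(self):
        print(f"contracting {self.name}")


class ActionGroup:
    """A named set of muscles and/or sub-groups fired together."""

    def __init__(self, name, members):
        self.name = name
        self.members = members  # Muscles or nested ActionGroups

    def fire(self):
        for member in self.members:
            if isinstance(member, Muscle):
                member.contract()
            else:
                member.fire()  # nested groups fire recursively


# One finger = a few muscles grouped; the hand = five fingers grouped.
fingers = [
    ActionGroup(f"finger_{i}",
                [Muscle(f"finger_{i}_muscle_{j}") for j in range(5)])
    for i in range(5)
]
hand = ActionGroup("hand", fingers)

# A key bind maps a single key to a whole group: the grab action.
keybinds = {"f": hand}
keybinds["f"].fire()  # pressing F closes the whole hand
```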


I do like it, but I can't see the devs doing this.

Someone already did.

I do like the idea of having to add brain regions to control your body, though; that sounds cool.


Unfortunately, this is probably not feasible at the intended scale of Thrive, but it is a good opportunity to discuss an often underestimated problem (mainly of the Aware stage): movement control. Individually grouping muscles into action groups and binding them to keys may work for something like grasping, but now imagine you have an arm attached: multiple muscles, often working in different directions and, most importantly, at different strengths of contraction to coordinate the limb in space.
Even if you could set macros (e.g. "when W is pressed, contract the biceps, relax the triceps, etc."), you would probably need quite a few for a single limb, one for each of the different functions it can have that require different movement patterns (e.g. walking, running, jumping, swimming, crouching, etc. for legs).
But even if we assume that we could further fine-tune these macros, so that one key controls all legs to walk, setting these macros up would be frustratingly complex for a player. And automatically generating a movement pattern isn't feasible either. After all, how is the program supposed to know how to use all the limbs of your beautiful abomination, which may be a cornucopia of tentacles, legs, arms, feeding limbs and so on, when teaching AIs to walk with a standard quadrupedal 3D model is already a monumental project demanding a lot of time and computing power?
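To make the combinatorial problem concrete, here is roughly what a single such macro would have to encode, assuming a hypothetical `set_contraction` call (0.0 = relaxed, 1.0 = fully contracted); note that even this static snapshot ignores the timing a real gait needs:

```python
# Purely illustrative macro format; none of these names exist in Thrive.
macros = {
    "w": {"left_quadriceps": 0.8, "left_hamstring": 0.1,
          "right_quadriceps": 0.2, "right_hamstring": 0.7},  # one walk frame
    "e": {"biceps": 0.9, "triceps": 0.1},                    # hand to mouth
}

def apply_macro(key, creature):
    # Set every muscle named in the macro to its target contraction level.
    for muscle_name, strength in macros.get(key, {}).items():
        creature.set_contraction(muscle_name, strength)
```

Multiply this by every movement pattern, every limb, and every moment of a gait cycle, and the scale of the authoring problem becomes apparent.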


I made a concept that might help with this:
[image]

@Spode, what do you think?

First of all: I can always appreciate a well thought-out model, and it is obvious that you put quite a bit of effort into this. Keep up the good work, and don't be discouraged by me nitpicking about little problems in terms of gameplay or biology. After all, long-term motivation might very well be the most important aspect of one's mindset when contributing to the development of a game as ambitious as Thrive.

Despite this, I am not entirely happy with it. Intuitive as this system might be, I fear it wouldn't scale well. By that I mean that it might be an elegant solution for early multicellular life and even primitive aware creatures, with simple body plans and a lack of precise, coordinated movements, but as soon as you involve actual specialized limbs in the equation, it could become a mess quite quickly. Let's model a human arm with this system, shall we? We will notice right off the bat that it has a ton of pivots (upper arm, lower arm, hand, every part of each finger…) which only allow certain types of movements against each other (the upper arm can't rotate against the lower arm, etc.). Fine-tuning even the basics of this might be tedious, but with grouping of pivots together and a good UI, this should not be too big of a problem. Instead, my main concerns are these:

Most limbs can perform very different movement sets depending on the situation. An arm can grasp with its hand, punch, grab and pull or push something, perform gestures for intraspecific communication, etc. We would have to set a hotkey for each of these movement sets, resulting in half of the keyboard being taken up by just the different options of a single limb. Then why don't we just hotkey the corresponding movement sets of multiple limbs together? After all, there's no need to control both legs separately when running, right?

But this leaves us with another problem: the relative position of the body to its environment. A meticulously designed bipedal walking movement set may work perfectly on flat ground, but if we introduce even a slight slope, the creature would probably topple over, since it performs the same movements but with its center of gravity shifted. Again, this might work out for the simple movements of multicellular organisms floating in a tide pool, but for complex terrestrial organisms it would probably not be enough. For example, to grasp something, you have to coordinate your arm in a way that your fingers are in the correct position to actually hold the object when they close, but always moving hands and arms separately would be tough for the player. How about moving sideways or jumping, which would be highly dependent on the surface? What about climbing, where precise grasping motions and balance (something the game will probably never be able to handle realistically by itself, since it is just so complex) are key?
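The slope problem above is easy to state in code: a pose is statically stable only while the center of mass projects inside the span of the contact points, and a fixed movement set never checks this. A toy 2D sketch, with all numbers illustrative:

```python
def center_of_mass(parts):
    """parts: list of (mass, x) pairs; returns the x position of the CoM."""
    total = sum(m for m, _ in parts)
    return sum(m * x for m, x in parts) / total

def is_statically_stable(parts, contact_xs):
    # Stable only if the CoM lies between the outermost contact points.
    com_x = center_of_mass(parts)
    return min(contact_xs) <= com_x <= max(contact_xs)

body = [(60.0, 0.0), (5.0, 0.3), (5.0, -0.3)]     # torso plus two legs
print(is_statically_stable(body, [-0.3, 0.3]))    # True on flat ground

# A slope effectively shifts the CoM without moving the feet:
tilted = [(m, x + 0.4) for m, x in body]
print(is_statically_stable(tilted, [-0.3, 0.3]))  # False -> topples over
```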

So, in short, I don't see this model in its current form working well for the Aware stage, despite it looking really good for the multicellular stage. That said, I don't think switching the movement system up completely between stages would be a good idea either, so we will probably need to work on a good base that can easily be expanded, making it intuitive for the player to learn as they progress towards more complex anatomies and different environments.
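As an aside, the "pivots which only allow certain types of movements against each other" point above maps naturally onto constrained joints. A toy sketch of what such a pivot could look like (hypothetical classes, not taken from the concept image):

```python
import math

class Pivot:
    """A joint that only rotates about allowed axes, within angle limits."""

    def __init__(self, name, limits):
        self.name = name
        self.limits = limits  # axis -> (min, max) angle in radians
        self.angles = {axis: 0.0 for axis in limits}

    def rotate(self, axis, delta):
        if axis not in self.limits:
            raise ValueError(f"{self.name} cannot rotate about {axis}")
        lo, hi = self.limits[axis]
        # Clamp to the joint's range of motion.
        self.angles[axis] = max(lo, min(hi, self.angles[axis] + delta))

# The elbow is a hinge: one axis, and it only bends one way.
elbow = Pivot("elbow", {"pitch": (0.0, math.radians(150))})
elbow.rotate("pitch", math.radians(45))   # fine
# elbow.rotate("roll", 0.1)               # would raise: axis not allowed
```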

Thanks for the feedback! I do agree with you, but I guess there is only so much explanation of a concept you can provide in a limited space. You seem to have listed four problems:

  1. Specialized limbs are hard to fine-tune.

  2. The variety of movements is tedious to create, and seems infinite depending on pivot organization.

  3. Exactly repeated movement sequences cause stability problems.

  4. Complex anatomies are hard to simulate.

I'll try to list solutions to these problems. For the majority, the problems seem to focus on movement constraints. Let me say first of all that the pivots would move using inverse kinematics and procedural animation. At first, this sounds like less player control, but mind you that the player can add to the armature of the organism at any time, as well as have a playback of the action on screen. Walking won't require moving only one side's pivots, but most likely a mirrored side as well. Also, the pivots aren't limited to rotational movement; I would expect linear movement as well.

Second of all, when you walk or feel a wall, you don't normally aim through or toward the center of the wall/floor, but rather at the surface. It is only in high-force movements that you aim more inward, for more power. There should be a line between passive low-force moves and aggressive (in an energy sense) high-force moves. Speaking of which, there should probably be a break force for both objects and the organism's pivots, since it doesn't seem viable for a tiny soft ant to be banging its head against a wall like a pachycephalosaur. This would limit the familiar KSP mishap where you EVA your Kerbal while moving fast over a planet and their meshes glitch out.

Movements aren't exact either; they aim to do something, but follow the most viable path to do it. Let's say you want to make two legs walk up a hill. Your feet's main target pivots won't be set relative to your organism, but to the farthest allowed surface they can reach.

Also, yes, complex anatomies are hard to simulate. A muscle fiber is pretty easy to understand; a chain, a bit harder; an entire human muscular system, very hard. Although it is hard to understand the overall picture, simplifying things can theoretically produce the same effect as the actual anatomy. Any more feedback on this reply would be nice.
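For readers unfamiliar with inverse kinematics, here is the core of a two-bone analytic IK solve (law of cosines) in illustrative Python, including the clamping that makes a foot aim for the farthest reachable point instead of failing. This is a sketch of the general technique, not Thrive code:

```python
import math

def two_bone_ik(target_x, target_y, upper_len, lower_len):
    """Given a foot target, return the hip and knee angles in radians."""
    dist = math.hypot(target_x, target_y)
    # Clamp the target to the reachable range, so an out-of-reach
    # foothold becomes "the farthest allowed surface" instead of an error.
    dist = max(abs(upper_len - lower_len) + 1e-6,
               min(upper_len + lower_len - 1e-6, dist))
    # Law of cosines for the knee bend...
    cos_knee = (upper_len**2 + lower_len**2 - dist**2) / (2 * upper_len * lower_len)
    knee = math.acos(max(-1.0, min(1.0, cos_knee)))
    # ...and for the hip angle relative to the target direction.
    cos_hip = (upper_len**2 + dist**2 - lower_len**2) / (2 * upper_len * dist)
    hip = math.atan2(target_y, target_x) + math.acos(max(-1.0, min(1.0, cos_hip)))
    return hip, knee

# Reach for a foothold up the slope with a 1.0 + 1.0 long leg:
print(two_bone_ik(1.2, 0.5, 1.0, 1.0))
```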

Auto IK binding and automatic footstep tuning have become standard features in game engines.
If we aren't changing the engine in the next few years, can we rebuild these functions in-game?

Spode, as in the god referenced by certain alien civilisations in Spore’s space stage?

I got 40 meters in QWOP without using the O and P keys.