Visualization of the five senses

I’ve been thinking about this topic a lot, and I think I have a good rundown of how the five senses in the macroscopic stage may work.

There hasn’t been much talk about it, but from what I’ve heard, some people suggested having your creature’s sight blurred, which I vehemently disagree with. I hate when things are blurred; they make my eyes water. Instead, if your creature has bad eyesight, things should look simpler. I was thinking closer to this style at its simplest, probably even simpler, while with better eyesight it may look more like this Macroscopic editor concept art - #2 by Mr_42. Having no eyesight will result in pitch-black surroundings.

Eyesight could scale in three separate ways: color spectrum, depth perception, and eye sensitivity.

Color spectrum - affects what colors you can see, starting from greyscale and then into UV and beyond.

Depth perception - affects how detailed a shape is; you start from barely being able to tell where the head of a creature is and progress to seeing a human level of detail.

Eye sensitivity - the more sensitive you make your eyes, the better you can see in the dark, but the more blinded you’ll be in the light, to the point that you can’t see anything in the daytime.
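One way this trade-off could be tuned is a simple overexposure curve. This is only a sketch; the function name and the `OVEREXPOSURE` constant are made-up placeholders, not anything from the game.

```python
def perceived_brightness(ambient_light: float, sensitivity: float) -> float:
    """Map ambient light (0 = pitch black, 1 = full daylight) to what the
    creature perceives, clamped to [0, 1].

    High sensitivity brightens dim scenes, but past an overexposure
    threshold the creature is blinded and sees nothing at all.
    """
    OVEREXPOSURE = 2.0  # made-up threshold where glare washes everything out
    signal = ambient_light * sensitivity
    if signal > OVEREXPOSURE:
        return 0.0  # blinded: can't see anything in the daytime
    return min(signal, 1.0)
```

With sensitivity 4, a dim night scene (0.1) reads as 0.4, but full daylight blinds the creature entirely; with sensitivity 1, night stays dim and day stays normal.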

The discussion I’ve seen on this is pretty much to do things like this, which is exactly what I’m suggesting, with lines representing soundwaves passing over creatures and objects. I also think leaving an outline would make things easier, without having to scream constantly to know where things were. How long this outline lasts would scale with your creature’s memory, giving improved brain cells a use beyond just progressing.
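The memory-scaled outline could be as simple as a linear fade whose lifetime grows with the brain stat. A sketch, with made-up names and constants:

```python
def outline_alpha(seconds_since_seen: float, memory_stat: float) -> float:
    """Opacity (0..1) of a remembered outline left behind by a soundwave.

    The outline's total lifetime scales with a hypothetical memory stat,
    so improved brain cells keep the world 'drawn' for longer.
    """
    BASE_LIFETIME = 2.0  # seconds per point of memory; placeholder value
    lifetime = BASE_LIFETIME * memory_stat
    if lifetime <= 0.0:
        return 0.0
    return max(0.0, 1.0 - seconds_since_seen / lifetime)
```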

This is the strongest sense the player will start with on average. The sense acts like Minecraft blindness in that it illuminates a small area around you, but this version will only be in black and white and will only illuminate things the player is touching. The illuminated area could last for a bit before fading away, so the player doesn’t get turned around.

Smell and taste, from what I’ve seen, were just thought of as maybe a pop-up or line of text, but I believe we can go much more in depth than that.
Smell could be represented by having everything blanketed in colors that represent each smell, similar to compound clouds. To keep everything from looking like rainbow barf, all the smells will mix into only one color, which is more realistic in my opinion.

This color can then be resolved into distinct ones if you use the “sniff” key; the better smell you have, the more defined the colors will be, but also the more overpowering stronger smells will be.
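A minimal sketch of that blend-then-sniff behaviour, assuming smells are just (color, concentration) pairs; the detection-threshold formula is invented purely for illustration:

```python
def blended_smell_color(smells):
    """Mix every ambient smell into one color, weighted by concentration.

    `smells` is a list of ((r, g, b), concentration) pairs; the colors
    and the exact weighting are placeholder assumptions.
    """
    total = sum(conc for _, conc in smells)
    if total == 0:
        return (0, 0, 0)
    return tuple(
        round(sum(rgb[i] * conc for rgb, conc in smells) / total)
        for i in range(3)
    )


def sniff(smells, smell_stat):
    """Resolve the blend into individual smells.

    A better smell stat lowers the detection threshold, but the threshold
    is relative to the strongest smell, so overpowering smells still mask
    faint ones.
    """
    if not smells:
        return []
    strongest = max(conc for _, conc in smells)
    threshold = strongest / (1.0 + smell_stat)  # hypothetical formula
    return [rgb for rgb, conc in smells if conc >= threshold]
```

With a strong red smell and a faint blue one, the ambient blend leans red; a weak sniffer only resolves the red, while a strong sniffer picks out both.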

Taste is a bit more difficult. I think having some text for this would work well, and making it work like smell but with touch-based mechanics would also work fine, so licking something would be an alternative to “sniff”. I also think having each text say something like “tastes sweet”, “gross”, “yummy”, etc., would help give insight into what it does to your creature. These texts could then give more details if you know where the tastes are coming from; it could say “tastes like x creature” if you already know that creature’s smell.

The same thing could also work for smell.


Sense layering
In order for you to be able to sense things quickly without always switching between senses, they can layer when a “point of interest” pops up. A point of interest is anything that will fulfill a food, water, or other need of your creature; points of interest also include danger. If a point of interest appears, then the sense that picks it up will create a light overlay on the current sense.
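The layering rule could reduce to a small set operation: every sense that detects a point of interest, other than the one currently active, gets drawn as a light overlay. A sketch with hypothetical names:

```python
def overlays_for(points_of_interest, active_sense, layering_enabled=True):
    """Return the set of senses to draw as light overlays.

    `points_of_interest` is a list of (detecting_sense, kind) pairs,
    where kind is e.g. 'food', 'water', or 'danger'. The active sense
    never overlays itself, and layering can be toggled off entirely.
    """
    if not layering_enabled:
        return set()
    return {sense for sense, _kind in points_of_interest
            if sense != active_sense}
```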

Sight and smell with sound
Sight and smell will be overlapped by sound in the same way, with the surroundings darkening and the soundwaves being seen.

Sight with smell
When sight gets overlapped by smell, something similar to sound will happen, where everything will be lightly colored with smells.

Sound & touch with smell
When sound &amp; touch get overlapped by smell, I think only the “point of interest” should be seen, in order to keep the surroundings from being cluttered.

Everything and sight
I think anything that gets overlapped by sight will simply have anything that is “seen” (i.e. smelled, heard, touched) shown in its real colors. I don’t think it would work well with smell, though.

Layering should also be able to be turned off or toggled.

Sense mixing
This is about how the senses could mix in unique ways, different from layering.
Here are the senses that I think can mix:

Sight & sound
If your creature’s depth perception is below max, you can use sound to fill the gap, allowing you to see creatures in detail without having to max out your depth perception.

Touch & sight

Same as with sound, except it uses touch instead.

Sound & touch

Maybe soundwaves that come from the player are thicker?
The illuminated areas that your creature left could last longer when a soundwave passes over them? But this requires more thought.


I like these visualisations!

I’m reminded of the puzzle game Sensorium, where (spoiler alert!)

Sensorium spoilers

Each of the senses is tied to a puzzle mechanic, and “taste” is represented by the crosshair changing shape.

If each of the senses is incorporated, I think some care needs to be taken so as not to overwhelm the player with too much information. Each of the senses providing information to the player might quickly clutter the screen. I like the sense layering solution.

The only thing I’m a bit unsure of is touch: what mechanical reason is there for touch in-game? Sight, hearing, and smell all provide some form of spatial awareness to the player that goes much further than the reach of their limbs. I could reasonably see it used by creatures in (for example) cave systems where input for the other senses might be limited, but at first glance it seems circumstantial (in-game, that is). Maybe touch could be expanded to include general sensory information about the environment, like humidity and temperature?


My idea for touch was that it would be the strongest sense the player would start out with on average. Because every other sense (sound, sight, taste, and smell) would logically be either too weak or non-existent to work for the player. Then, as time goes on, they rely on touch less and less as other senses become more effective.
So, pretty much I was thinking that touch would be more of a tutorial sense to get you used to how other senses work, then something you would use later on.
Having touch convey things like temperature, humidity, and other things you feel through your skin would be a good idea, but how would you put those together in a way that’s not too cluttered?


Why not show sound using… the headphones or speakers that most people will have? Adding a whole new overlay seems like it will only overcomplicate something that is already well integrated into computer output. In addition, the soundwave overlay and all of what you seem to be describing really feels better suited to specialized echolocation and superior sound recognition, rather than hearing in general.


Additional information display is required, otherwise the player’s own sensory limitations will become a soft limit.

There is also the matter of the screen: depending on the position of the eyes, organisms will have different sizes of visual field. The human perspective is about 180 degrees, the horse can reach 350 degrees, and there is also the unilateral vision of flatfish; and not all organisms have the ability to zoom.

How do we present these on the screen? Is it achieved through limitations and fog from a third-person perspective?

Color is only a subjective feeling of a person, not an objective attribute of an object. Objects are only emitting or reflecting electromagnetic waves.

The general population (excluding color blindness) has three types of cone cells and obtains complex perception through brain processing based on tricolor vision. The lens filters out ultraviolet rays, and the cornea filters out infrared rays.

Non-primate mammals only have dichromatic vision, while Sauropsida animals have tetrachromatic vision, and birds can see ultraviolet light.

Although some invertebrates have more types of cone cells, their simple brains are unable to perform complex color discrimination. Although octopuses have eye structures similar to vertebrates and powerful brains, they do not have color vision.

So we may need a color system and filter based on the wavelength of light?
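One way such a wavelength-based system could work: give each cone type a simple sensitivity curve and define what a creature “sees” as the vector of cone responses; two wavelengths are then distinguishable only if those vectors differ. The Gaussian curves below are a crude stand-in for real photopsin absorption spectra, and all values are illustrative:

```python
import math


def cone_response(wavelength_nm, peak_nm, sigma_nm=50.0):
    """Gaussian sensitivity curve for one cone type (a rough stand-in
    for a real photopsin absorption spectrum)."""
    return math.exp(-((wavelength_nm - peak_nm) ** 2) / (2 * sigma_nm ** 2))


def perceive(wavelength_nm, cone_peaks):
    """What a creature 'sees': one response value per cone type.

    Two wavelengths are distinguishable only if these vectors differ,
    so more cone types means finer wavelength discrimination.
    """
    return tuple(cone_response(wavelength_nm, p) for p in cone_peaks)
```

A monochromat (one peak at 550 nm) cannot tell 500 nm from 600 nm here, while a dichromat with a second cone at 450 nm can; adding cone types refines discrimination exactly as described above.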

Vision cannot exist without the brain. Where the configuration is insufficient, an algorithm could compensate.


Definitely sound can be played through the speakers or headphones of the player, but I think some extra visualization can help the player in interpreting the meaning of the sound. It might not always be easy for the player to be able to pinpoint where a sound is coming from, whereas- in theory at least- a creature with extremely accurate hearing would be able to do so easily. As Lan mentioned, the player might bump into their own sensory limitations, so there might need to be some assistance in that regard.

I do think it needs to be a toggle though, to prevent the screen from getting too cluttered (so: sound should always play of course, but the visualization of sound could be a separate overlay).

As an aside, from an accessibility perspective visualizing sound could also be useful.

I think this should not be too complicated, especially not considering the plethora of information the other senses give you. I think these could easily be simple indicators in the UI, perhaps simultaneously showing the tolerance zone for each environmental parameter (outside of which the player would start to get damaged).


That’s true, but I think a lot of what you were saying would provide a lot more information than can realistically be gained through hearing.

Well, maybe, but most hearing won’t be extremely accurate. Human hearing is rather flawed; using reflections and blocking, you can easily be tricked into hearing something from a completely different direction than it actually is. I think that as you begin with hearing, it should have many of those same weaknesses.

I do agree that human sensory limitations are a problem, and quite a hard one to avoid. Honestly, I’m not sure if it is completely possible to overcome, even when just looking at any one sense, let alone the many senses that there are, and especially not the ones that humans don’t even have.

Yeah, that seems reasonable. I might be overestimating what is, location-wise, theoretically possible to detect from hearing.

Of course, I reckon I overlooked the main use of hearing: communication! Maybe rather than visualising where sounds are coming from, the focus should lie on making clear the meaning of those sounds (and leaving localisation of objects mainly to other senses). Although this can just be limited to creature vocalisations (to distinguish, e.g., a warning call from other sounds), I do not think every sound needs an explanation (e.g. a branch breaking).

What advantage is there to seeing different colors? Differentiating more types of flowers? Is this the reason for showing the game in black and white for players who haven’t upgraded their eyes? I could live with my color vision removed and wouldn’t have any problems in my daily life; if a word is written in a different color, it is still the same word, etc. It does withhold some information, but it would mostly just be annoying.

A better way to add an advantage to better eyes would be to color the fruits green so that the player would have a hard time seeing them, but not the sky or anything else that isn’t important; those should look how we would see them. If you upgrade your species’s eyes, the color contrast of species that are using camouflage can be increased. And there can be different types of upgrades, such as light sensitivity, as you said.

Those lines can be added over the visuals if the species has both vision and echolocation. It would add a different aesthetic. If it only has echolocation, here is what I think.

The world can have dull colors, but it should look normal. Moving objects should have their positions updated every once in a while instead of seeing them in real time.

What advantage does echolocation have? Depth perception. There are two types of depth perception, and echolocation helps with both. The first is related to the resolution of the eye and is restricted by the number of pixels on the screen, unless a zooming feature is added.

Showing creatures in low resolution would just lower the gameplay experience for players. They should be shown in perfect quality within the seeing range, and disappear once they exit the range.

The second one is about our brains being able to pinpoint the location of an object.

Our eyes have an intersecting region so that we could grab the next branch when brachiating. We later used it to catch prey (or throw spear) like a normal predator.

We can look at the 2D screen of the game and understand that this tree is closer and that tree is farther away. How can the game promote the evolution of depth perception? By telling how many meters or feet an object is away from you?

That would hardly be useful. But showing time instead of length would be more intuitive. When a predator is running towards you, if it is in your field of vision and you have depth perception, how soon it will reach you can be shown in seconds. When you look and walk towards a mountain in the distance and you have echolocation (which is basically sonar), the game may show you how many minutes it will take to reach it.

In Terraria, a day lasts 24 minutes, so if Thrive has simple ratios like that, the player can look at the sun, see that it is noon, realize that by the time they reach the mountain it will be dark, and pack their stuff while keeping that in mind.

The time-showing feature could be activated by pressing a button on the keyboard. It can show the times for the object being looked at and for all the objects moving towards the player that are in the depth perception field. If there are multiple flying creatures attacking from all directions, for example, seeing countdowns with the soonest one written in red would help the player, unless it is a third-person view, in which case that could already be guessed. If you aren’t moving towards an object fast enough, it may state “never” as the time to reach it. If you are at a cliff and start running to jump to the other side, in real life you would feel whether you can make it; this feature could help with that.
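The countdown idea boils down to distance divided by closing speed, with “never” when nothing is closing in. A sketch, with invented names:

```python
def time_to_reach(distance_m, closing_speed_mps):
    """Seconds until an object reaches you (or you reach it), or
    'never' if the gap isn't shrinking."""
    if closing_speed_mps <= 0:
        return "never"
    return distance_m / closing_speed_mps


def soonest_threat(threats):
    """threats: list of (name, distance_m, closing_speed_mps).
    Returns the name to highlight in red, i.e. the soonest arrival,
    or None if nothing is closing in."""
    times = [(name, time_to_reach(d, v)) for name, d, v in threats]
    closing = [(name, t) for name, t in times if t != "never"]
    if not closing:
        return None
    return min(closing, key=lambda nt: nt[1])[0]
```

A fast hawk 60 m away (closing at 30 m/s) gets the red countdown over a slow wolf 40 m away, because it arrives sooner.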

The image doesn’t load. I agree that the closest objects should be shown to the player even if the creature lacks sight. But it shouldn’t be limited to the things you are touching, which would be nearly useless.

If there is a creature behind you, you can feel its breath, etc. The default perception radius should be small, constant, and non-upgradeable; it would be the one used by trees. Other senses would increase this radius. For example,

a small region should still be shown in the third-person perspective, even in directions you don’t have an eye in. The eye should expand that region but not be qualitatively different; it shouldn’t be black and white, because that would be annoying.

There is also memory: there should be a minimap showing you the places you have been before, or that your ancestors have been, in the case of migration paths decided by the game. If you have no senses, the places you have never been can be shown in black (except the direction of the sun; you can feel its presence due to heat) or with a fog; this is an artistic choice. As you walk, the land features can be added permanently, not just to the map but also to your vision. Sessile species can be shown, but there is no reason to remember the places of moving creatures, unless they are sleeping, hibernating, or stuck in quicksand. Also, it would be nice if we could click the minimap and move in it, like using the free cam in Spore.

The information you gather from smell (such as: a creature passed here 20 minutes ago) can be shown with a line of text. What else could there be? Adding colors everywhere adds no functionality; it just blinds normal vision.

It is possible to find the location of things from farther away using smell. For that, the wind needs to be blowing from that direction towards you. So maybe the wind direction can be shown by giving the places you can’t reach with smell a slightly darker color. And if you can smell a creature that is behind a mountain or a hill, it can be shown similarly to the waypoints in Xaero’s Minimap mod. If you want to smell out only certain creatures, what you are looking for should be toggleable.

Underwater, taste is smell. On land, it can be used to tell if a fruit is poisonous or not. If you evolve taste receptors on your hands, like butterflies have, it would be possible to do that without putting the fruit in your mouth.

All the senses should show the world on the same layer so that they don’t require switching. For example, you may not see a creature due to tall grass, but if you can smell it, it should look visible as if you could see it.

There can be a hotness or coldness bar, just like the health bar. You don’t need to know the temperature except when it hurts you anyway.

Good idea, but everything conveyed through headphones should also be shown in the game, unless it is toggled off.

Well said. There is no reason to choose which visible spectrum you see. We can only improve the color perception ability; having no color perception would make everything look normal and make camouflaged species look invisible. Upgrading it for the first time would make the poisonous species that try to look threatening appear more colorful, and with every upgrade, camouflaged species would become easier for human players to spot; there can be an arms race and infinite upgrades. Also, camouflage patterns should be procedurally generated.

Here is a fun fact about infrared vision: you can see hot things as glowing, such as living things, so it has a usefulness that some other wavelengths don’t. There are some snakes and bats that do this; the snakes have a separate organ for it, and the bats have the sensor in their noses. In order to see another creature in thermal vision, your sensor needs to be colder than it: the target needs to be warm-blooded, or at least hotter than your thermoreceptor. The bats keep their noses 9 degrees colder than the rest of their bodies. The reason is that your own body also emits heat, so if you were the same temperature as your prey, your thermal eye would see the light coming from itself and you wouldn’t see anything outside. Also, if a heatwave occurs and the ambient temperature rises to body temperature, infrared vision would just be disabled. This ability may be more suited to nocturnal creatures.

Echolocation is good hearing in the absence of external sound sources.

For red-green color blindness, red and green look the same.

Primates have evolved from dichromatic vision to trichromatic vision, allowing them to visually identify whether the fruit is ripe.

Protective coloration is an effective way for organisms to avoid being discovered. In the eyes of humans, the orange tiger is very prominent in the forest, but its prey only has dichromatic vision and cannot distinguish between orange and green.




For a monochromatic species, the tiger should look green in front of green leaves and grey in front of grey rocks. I think this is better than having just one color in the game and camouflaging the leaves as well.


Color Blindness Simulator

The ability to distinguish colors comes from the cross peak of the cone cells stimulated by light wavelength, and the brain determines hue by the proportion of stimuli it receives.

The Achromatopsia species has no cone cells, and rod cells can only work when light is weak; such species are nocturnal and photophobic.

Monochromatic vision species have only one type of cone cell and can only see the single color corresponding to that cone cell, plus white. With the brain’s processing, other colors can be judged as dark monochrome or light monochrome. (This discrimination is different from light and dark: unlike Achromatopsia, monochromatic vision can determine whether a color difference comes from different light wavelengths or from brightness.)

Dichromatic vision species have the ability to distinguish between two colors (light wavelengths) and, with the help of the brain, distinguish more colors (light wavelengths) through the proportion of cone cells stimulated.

The brains of arthropods lack the ability to determine the corresponding color (light wavelength) of each cone cell, despite their wider variety of cone cells. In their eyes, the world is filled with spots of black and white.

Vertebrates benefit from a well-developed brain and can distinguish colors (light wavelengths) through a small number of cone cells. Of course, the more types of cone cells there are, the finer their resolution (primates’ trichromatic vision allows them to better distinguish between red, green, and yellow than other mammals).


What if you’re trying to find a green plant in a stony biome?

Assuming we have a species that only has 1 type of photoreceptor (and thus only sees in one ‘colour’), and the value of both the green leaves and the grey rocks in this one ‘colour’ appear the same in this lighting condition, then we wouldn’t know we’re looking at 2 different things (ignoring shape, depth, etc).

Given this, our species wouldn’t be able to distinguish between the two. So how would you represent this by your method? Change the colour of the green leaves, or the grey rocks? Or maybe you’d change both of their colours? What decides which colours are changed? And what if there’s a third thing e.g. a tiger (with the same value in that specific ‘colour’) mixed in with the green leaves and the grey rocks? Do the parts of the tiger get their colour changed based on what they obstruct? Would you just make everything the same colour? If you did, then what’s the point of even having different colours in the first place?

It’d not only be inaccurate; it also just sounds like a complete headache to implement. I see many potential cons, but no benefits. Though I’m not sure how accurate it is, a filter is an easier solution that also avoids the issue of potentially confusing the player into thinking everything has camouflage abilities.
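To make the disagreement concrete: under a single-photoreceptor filter, many distinct colors genuinely collapse to one value, which is what a plain filter would show automatically, with no per-object color swapping needed. A toy projection (the sensitivity weights and color swatches are made up):

```python
def mono_value(rgb, sensitivity=(0.2, 0.7, 0.1)):
    """Project a full RGB color onto one photoreceptor response.

    The sensitivity weights are invented; the point is that distinct
    colors can land on the exact same single value.
    """
    return round(sum(c * w for c, w in zip(rgb, sensitivity)), 6)


# Hypothetical swatches for the scene discussed above:
GREEN_LEAVES = (40, 120, 30)
GREY_ROCKS = (95, 95, 95)
ORANGE_TIGER = (200, 120, 20)
```

Here the leaves and the rocks project to the same value, so a monochromat filter renders them identically with no special-case logic, while the tiger happens to differ; which objects collapse together falls out of the lighting and the weights rather than any hand-picked recoloring.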

There has to be some deep misunderstanding somewhere. You seem to be confused as to why life has repeatedly evolved colour vision, yet just a paragraph later you explain exactly why it is useful. Colour vision allows animals to more accurately distinguish between different colours. Maybe B&amp;W would ‘be mostly just annoying’ to a modern human that spends most of their time writing, but that small difference can mean life or death to an animal that needs to detect predators (or an animal that needs to find prey).

There is a Wikipedia article that talks about the topic of colour vision ―there’s even a Simple English one.


Most plants don’t try to hide from herbivores because they would be found eventually. There is no reason to make them harder to find for the player.

Things that want to hide => their color could be changed to the majority color behind them, or they could be made invisible.

Would you want the game to have worse graphics when you progress beyond cell stage?

What I meant by simplifying the visuals is simplifying them in a way that still looks good but gives less detail; like making the textures on a creature look smooth, or not being able to see the mouth while the creature still has a defined head shape, etc. But this is a difficult balancing act and might not be worth it, though it is a fun concept.

This is an interesting concept, but what if your creature doesn’t have eyes, or is in a pitch-black area? I think the player shouldn’t be able to see any color if they have no way to interpret colors. I think positions should only be updated if the player interacts with objects, hears them, or if they get hit with soundwaves.

Fixed! ^-^
We can change the size of the illuminated area until it feels right. Keeping the areas you touch illuminated would also help; the area could fade after some time, leaving an outline before disappearing completely.

Well, what I was thinking was that this version of smell would ideally be used if you didn’t have sight, or if this was the strongest sense you had; that’s how I designed all the senses, with the assumption that each is the main one. Giving information on all the smells in an area would help the player find things other than just food. The surroundings could change color depending on the concentration of smells, so you could see at a glance that an area is dangerous just by the color, or follow a trail to a nearby stream; you could also follow the smell of a material you were looking for.

Good point.
To keep it from getting cluttered, a simpler version of the rest of your senses (UI or visual) could accompany whatever sense you’re using at the moment, and if you need more detail from a certain sense, you can just toggle it. The problem with this approach, though, is that you would have to make five different UIs, plus whatever other senses could be added, for this to work.

Color filter =/= worse graphics
Also, this topic, A more stylized representation of sightless 'vision' in organisms, shows how the microbe stage could later better represent how a microbe would “see”.


I wasn’t aware that computer graphics are holy and may not be interfered with.
To me, computer graphics are meant to communicate ideas, not to just show the player a slide-show of every 10000 € asset there is. I often prefer graphics in games that are stylised, or ‘worse’, than ones that just aim to achieve a window into your backyard ―especially if that ‘worsening’ or stylisation leads to communicating things that would otherwise just be a number.
This is genuinely so silly.

This is actually a really cool post! I love this idea! I really hope we get to see a mod like this ―it also sounds like something that wouldn’t be hard to implement at all. You could probably even do it with ReShade; the only problem would be the UI getting blurred.
I wanna see this idea tested and similar ideas discussed, because I love the idea of your character affecting how you experience the world, and not only how you interact with it.


Of course not! We would make the cell stage even worse!

Okay, that’s kind of a joke, but I do think that, overall, changes have to be made to the senses equally in all stages, and it doesn’t make much sense for the graphics we currently have in the cell stage to exist as they are. I think there are quite a few changes that could be made to the cell stage graphics. The thread the confused protist linked regarding different cell stage graphics is a good start, but I think more can also be added, such as light-sensing organelles and their corresponding effects on what you can “see”.


The cellular stage is 2D and microscopic, while the biological stage is 3D and macroscopic. I think the performance of visual and other senses should be different in these two stages.

The cellular stage

Consider the cell’s senses: contact, photosensitivity, and chemical sense.

Introduce fog. The player’s field of view is divided into clear areas, foggy areas, and black areas.

The basic field of view is a range that includes the cell itself and extends outward for a fixed length, representing the contact sense. This fixed length does not increase with cell size: it is sufficient for small cells, while larger cells require other senses to sense further.

The clear area starts from the basic field of view and can be expanded with other senses. The most effective is photosensitive vision, which can provide a large clear-area distance in well-lit patches; on the dark seabed, bioluminescence can also help. Chemical sense is superior in stability, as it can provide some clear area even on the dark seabed. The fields of view provided by different types of additional senses cannot be superimposed; the maximum value is taken.

The foggy area is the transition from the clear area to the black area. The fog gradually thickens, and only colors and blurry outlines can be seen (a purely visual effect). The length of the foggy area is proportional to the length of the clear area, and information within it can be viewed with the mouse. By upgrading chemoreceptors, you can highlight cell contours or toxins within a certain range of the foggy area. If you have photosensitive organelles, other cells’ bioluminescence will create clear areas within the foggy area.

The black area is beyond the limit of your field of view; chemical receptors can still search for compounds there.
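The three-zone proposal above could be sketched as a pair of radii, with the best additional sense taken as a max rather than a sum. All constants here are placeholder tuning values:

```python
def view_radii(cell_radius, sense_ranges, fog_ratio=0.5):
    """Compute (clear_radius, fog_outer_radius) for the cellular stage.

    clear = cell body + a fixed contact reach + the best (max, not
    summed) additional sense range; the foggy ring's width is
    proportional to the clear radius. Beyond the fog lies the black area.
    """
    CONTACT_REACH = 10.0  # fixed length; does not grow with cell size
    extra = max(sense_ranges) if sense_ranges else 0.0
    clear = cell_radius + CONTACT_REACH + extra
    return clear, clear * (1.0 + fog_ratio)
```

Taking the max rather than the sum matches the rule that different additional senses cannot be superimposed.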


I have heard people say that chemoreception is the main sense for small or simple species.


I do agree that, because the objectives and overall context of the cellular stages are different than those of the late multicellular, there should be different implementations based on which stage the player is on.
Speaking on the discussion/example given by «Lan», I want to mention that, if the devs were to implement «mechanics that change how the player experiences the world, based on their organism’s senses», then I think this is sort of the ‘minimum viable implementation’ (if that makes sense).

It’s obviously important to discuss multiple implementations (even the ‘minimum’ or ‘basic’ ones), and because of that, I think one could elaborate more on this idea. I think view range can definitely be implemented for some or all senses, but I think there is more that can be done with this idea than just view range/fog.

I want to point towards the thread that «the confused protist» mentioned, and say that you should definitely discuss your ideas about the cellular stage there, because in that thread you can build on previous discussion. I’m also mentioning this because I think I’ll try to experiment with some concepts in the current game using «ReShade» ― if I can get it working in Thrive. Getting some discussion revived there could be good, is what I’m saying. I just love the concept of stylising how the player sees the world/game.
