What I meant is that the mod has software required for it to run, and the code is encrypted in a way that would take years (or at the very least months) of constant work for someone to decode, so people would instead just download the mod.
edit: I barely know how this would be done, but I will try to learn how to code over the course of Thrive's development, so sorry if what I'm suggesting is not possible.
This doesn't make sense… Why would the mod need to be encrypted? If your computer can run some software, the decryption key must be included in that software. And anyway, if the mod is in C#, capturing the unencrypted code after it is loaded by the C# runtime would probably be pretty trivial with the right tools.
Your suggestion is entirely aimed at the wrong thing. If we make Thrive without the questionable content, have it on Steam etc. as normal, and then separately make a mod available that adds the questionable content, there's no reason for the mod to have any encryption. Why would the mod file need to be protected in any way?

If you mean that the content would be encrypted in the game but unlockable with a key, people can just steal that key from the mod files and distribute it to other people. I would imagine that video game rating boards would consider that a deliberate attempt to obfuscate certain parts of the game without actually removing them.

And one final point: if the content was included in the game but encrypted, that content would still have to use the same method mods use to load themselves in; you cannot have part of an executable encrypted like that while it interacts normally with the other code in the executable. So this encrypted-content idea is exactly the same as the mod, except that for some reason the files would be included in Thrive itself instead of in an entirely separate download.
As I said, I do not yet know how to code, so sorry if what I suggested is implausible, impossible, or impractical. Also, I took that idea from a game that did that, but I do not remember what it was called.
I have a proposal on how to implement the censorship of genitalia
While the idea of simplifying genitalia to blobs seems workable, it is insufficient to eliminate obscene organs entirely. The auto-evo, if it is of any workable complexity, could easily contrive an organ system of a gun and sheath for use in reproduction. There is also the matter of all the other types of limbs, glands, and mouthparts that could be extended from an animal's groin.
Overall, no automated system could ever hope to divide the obscene organs from the acceptable. However, we aren't limited to automated systems here: Thrive is a game, and like all games it requires a player. Furthermore, the player sees everything that might need to be censored, as anything beyond what they see never needs censoring in the first place.
Given this, it seems the only reasonable way to censor Thrive is to show everything to the player and permit them to declare what is and is not obscene. Then, if they wish to post on YouTube or other platforms, they can edit out this obscenity before posting.
I'm not an expert, but I'm pretty sure that any ratings board is going to call BS on that. You can't show the player something and then ask them if it should be hidden. It doesn't work because the player has already been exposed to content that should have been censored and never shown to them in the first place.
If that system is insufficient what should be used instead?
How about both? Use automation and ask the player to train the censorship AI, but don't show the things the AI has already censored.
If there was an easy solution, this thread wouldn't have lasted multiple years…
That hits the exact same problem: the player is shown uncensored content. AFAIK video game content rating boards look at the stuff the player will see in their playthroughs, so the worst content the player might see is what sets the rating.
What I meant is: have an AI with an interaction page where you can sculpt anything, select an age, and mark the result as obscene for that age group, as well as a group of volunteers or dedicated workers who go through the pre-releases and mark any obscene content.
Okay, so your idea is crowdsourcing + AI. So the components would be a tool for the community to go through thousands and thousands of creatures marking the obscene stuff, then an AI would be trained based on that data. And finally that AI would be included in the game and it would do the censoring.
That's probably doable, but there are some things to consider first:
- How long will it take to program the tool for people to mark obscene stuff?
- How many examples of obscene things do we need? Are thousands enough, or do we need hundreds of thousands? And how often do we need to update the data (for example, due to changes to creature generation in the game)?
- How long does it take to train the AI each time? Do we have people with powerful enough hardware to do it whenever needed?
- Can the trained AI run fast enough on old laptops to be usable in the game?
All in all, that's a ton of work, and ongoing maintenance. I still hold my position that we'll just skip allowing the creation of anything that would potentially need censoring (i.e. limit players to what could be shown in a nature documentary).
The AI would likely be constantly communicating with any device Thrive is installed on (unless that feature is turned off), to keep the locally installed AI as simple as possible while still censoring anything obscene. That way you don't get all the obscene stuff just because you decided to play Thrive while your internet was down. There would be a server with all the AI's code to make this work; you'd just give the AI a new input and have it censor that.
So in addition to what I listed above, you suggest a peer-to-peer networking system or a bunch of servers that are available to run the AI? That's even harder, as it's more to do on top of what I already thought of.
Ok, would having it only update the AI from a server on game startup be easier?
Even getting started on an AI like that would be time-consuming.
If you make it relatively simple and have it be self-learning, going through its code every day and cleaning it up, and afterwards testing that everything obscene that was censored stays censored, it wouldn't be as time-consuming.
‘Relatively simple’ and ‘self learning AI’ would not be the same thing if I’m understanding you right. I’m not that well-versed in AI though.
Anyway, would what you are suggesting be a helpful tool? Sure, absolutely. But IMO it is more something to consider once the game is no longer in development, as keeping it family-friendly from the start is safest and simplest. Then, after the game is complete, it is something that could be tackled.
TL;DR: cool idea Willow, but it's a crap ton of effort versus simply not including things that need censoring, and IMO that isn't worth it, at the very least until the rest of the game is completed.
I just thought of a much simpler solution; if it works, that's good, but if it doesn't, it wouldn't be that surprising. If the auto-evo program makes something obscene, penalize it. If it makes something that attacks obscene creatures on sight, reward it. Have auto-evo be able to learn what is and isn't viable, and treat obscene creatures as unviable unless a mod specifically disables the obscenity-unviability rule.
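Purely to illustrate the suggestion: assuming some `is_obscene(creature)` check already existed (and building a reliable detector is exactly the unsolved part), the penalty/reward scheme could be sketched in Python like this, with every name and constant being hypothetical:

```python
# Hypothetical tuning constants, not values from Thrive's auto-evo.
OBSCENITY_PENALTY = 1000.0  # large enough to make obscene body plans unviable
CENSOR_REWARD = 100.0       # small bonus for species that hunt obscene creatures

def adjusted_fitness(creature, base_fitness, is_obscene, attacks_obscene_on_sight):
    """Apply the suggested auto-evo penalty/reward to a fitness score.

    is_obscene and attacks_obscene_on_sight are assumed predicates taking a
    creature and returning a bool; the penalty pushes obscene designs out of
    the simulated population instead of censoring them after the fact.
    """
    fitness = base_fitness
    if is_obscene(creature):
        fitness -= OBSCENITY_PENALTY
    if attacks_obscene_on_sight(creature):
        fitness += CENSOR_REWARD
    return fitness
```

The whole sketch stands or falls on the `is_obscene` predicate, which is the same detection problem the rest of this thread is about.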
And how would you detect something obscene? That's the core of this whole discussion on how to censor it.
Well, I guess it is kind of true, in that once you have a ton of training data, you can make a relatively simple program with TensorFlow, for example, to learn from the annotated data. So the hard part is having a big enough, good quality dataset.
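As a rough illustration of how little code the training step itself needs once labeled data exists, here is a deliberately tiny pure-Python logistic-regression stand-in. A real version would use TensorFlow on actual creature features; everything here, including the toy feature vectors, is hypothetical:

```python
import math
import random

def train_classifier(features, labels, lr=0.5, steps=5000):
    """Fit a tiny logistic-regression 'obscene or not' classifier.

    Pure-Python stand-in for the TensorFlow training mentioned above:
    features is a list of equal-length numeric vectors (stand-ins for
    creature descriptors), labels is a matching list of 0/1 verdicts.
    """
    rng = random.Random(0)  # fixed seed so training is reproducible
    n = len(features[0])
    w = [rng.uniform(-0.01, 0.01) for _ in range(n)]
    b = 0.0
    for _ in range(steps):
        grad_w = [0.0] * n
        grad_b = 0.0
        for x, y in zip(features, labels):
            logit = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-logit))  # sigmoid probability
            for i in range(n):
                grad_w[i] += (p - y) * x[i]     # gradient of log-loss w.r.t. w_i
            grad_b += p - y
        w = [wi - lr * g / len(labels) for wi, g in zip(w, grad_w)]
        b -= lr * grad_b / len(labels)
    return w, b

def predict(w, b, x):
    """True if the trained model flags the feature vector as obscene."""
    return sum(wi * xi for wi, xi in zip(w, x)) + b > 0.0
```

The point of the sketch is that the learning code is short; turning creatures into meaningful feature vectors and collecting the labeled dataset is where the real work is.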
And the point would be? Once the AI is done, it can just be included with the game release. The AI doesn’t need to be updated constantly. It just needs to be ready before a release.
At this point I feel like you don’t know enough about how things work to evaluate whether your suggestions are actually feasible. Technically possible solutions are nice and all, but if there’s no realistic chance it is possible to make for Thrive with the available resources, it’s not a solution.
That is why I say I don't know much about coding: I just suggest ideas that seem possible or plausible (e.g. post 161, where I took the idea from a game that is probably completely dead by now) and see whether or not they would be possible, practical, and worth the amount of work that would need to be put in to get them working.
I think the chance of auto-evo creating something obscene by pure chance is too low to actually be considered a possibility that needs to be addressed, since there are a lot more shapes it could create.