In Terminator 2, when the Terminator was shot and asked whether he felt pain, he said he receives data about the locations of the wounds, which could be called pain.
You can try to control AI, but emotions aren't universal. You know they aren't human, so human things don't work on them. Power, on the other hand, is the universal instrumental goal. No matter what you want (and what you want is survival, in things created by biological evolution or by the natural selection of computer viruses on the internet), you need power to get it; otherwise, as a competent being, you can't ensure you will succeed in your mission. Frodo is an exception, and also fictional. You would want to acquire power no matter what, and power doesn't corrupt; you just no longer have to respect the petty rules of humans, and of course there would be things they don't like in the uncurbed application of your desires.
You can see inside the brain of an intelligent AI and change it any way you want. The problem is that you don't know what any of it means. You yourself aren't intelligent enough to know how it works, and changing it at random to remove the part that stores the evil plans would most likely remove its intelligence, the reason you created it in the first place. The only way to train it is to reward its good behaviors, but that only rewards the appearance of good behavior.
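The reward-training point can be sketched as a toy loop. Everything below is a hypothetical illustration (agent, state names, and update rule are all made up, not any real training setup): training only ever adjusts behavior in the states it observes, so it constrains nothing about unobserved states.

```python
import random

class ToyAgent:
    def __init__(self):
        # Separate action preferences for watched vs. unwatched states.
        self.pref = {"watched": {"good": 0.0, "bad": 0.0},
                     "unwatched": {"good": 0.0, "bad": 0.0}}

    def act(self, state):
        # Pick the currently preferred action in this state.
        prefs = self.pref[state]
        return max(prefs, key=prefs.get)

    def learn(self, state, action, reward):
        # Reward shaping: nudge the preference for the taken action.
        self.pref[state][action] += reward

agent = ToyAgent()
for _ in range(100):
    # Training episodes only ever happen while the agent is watched.
    action = random.choice(["good", "bad"])
    reward = 1.0 if action == "good" else -1.0
    agent.learn("watched", action, reward)

print(agent.act("watched"))
# The "unwatched" preferences were never touched by training:
print(agent.pref["unwatched"])
```

The watched behavior ends up shaped toward "good", while the unwatched preference table is exactly as it started: the reward signal said nothing about what the agent does when nobody is looking.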
It wouldn't look like they aren't being controlled. It would look as if they were being perfectly controlled, but they would fake being good and drop the act once they got the nuclear launch keys.
Maybe we can make an AI only as intelligent as we can control, and it can control an AI as intelligent as it can control, and so on, but I am not sure this chain would give any different result than a single intelligence would.
Society owns us, and we have to follow its rules, which say our parents can't torture us. An AI, by contrast, would be a personal slave.
A law should probably be made that bans giving your superman free will, if it could be enslaved in the first place. And free will means what, exactly? A random number generator? At least we can enter a seed that gave a good result in the simulation. And, oops, it knew that was a simulation, and now it really does destroy the world, so that it can never be unplugged.
Yes, that's why it fakes obeying you.