Hey, check this out:
You play as an AI, and I think it's pretty different from most other titles coming out.
Seems quite scary. It reminds me of the HAL 9000 computer in 2001: A Space Odyssey. (It can be either the movie or the novel.)
Please, don't do it Dave.
Dave, I'm afraid.
Sorry Dave. I'm afraid I cannot do that.
Open the door HAL! HAL!
Technically, robots can feel emotions if we program them to react to certain things in a certain way. That may end up being hard to do without making the robot seem creepy or just unnatural.
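The "program them to react a certain way" idea can be sketched in a few lines of Python. Everything here (the `REACTIONS` table, the `react` function) is invented purely for illustration:

```python
# Toy sketch of scripted "emotions": the robot doesn't feel anything,
# it just looks up a canned response for each stimulus.
REACTIONS = {
    "greeting": "smile",
    "insult": "frown",
    "loud_noise": "flinch",
}

def react(stimulus: str) -> str:
    # Anything outside the script gets a blank stare, which is exactly
    # the kind of creepy or unnatural behavior the post mentions.
    return REACTIONS.get(stimulus, "blank stare")

print(react("insult"))      # frown
print(react("compliment"))  # blank stare
```

The lookup table makes the later point concrete too: the robot "just reacts" exactly as written, with no state of its own.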
To be fair, I think robots that mimic humans are coming too quickly, as we don't have the technology to perfect them yet.
No, they don't. They only follow the lines of their program, which tell them how to react. They just react, but they don't feel.
P.S.: I said robots can be dangerous, but we can just drop a big EMP bomb on them, lol. However, it would cause some major economic problems (might cause car crashes as well, but at least the robots are deactivated!).
Who is to say that won't change several hundred years into the future?
Humans are also just coded to have emotions; we're simply coded by evolution and our genome.
True, but humans have a hugely complex brain, at least to us. To have what we call emotions, the robot must write its own code over time. Right? This feels wrong.
Hmmm…
But at least humans learn how to react; robots have no choice but to react in a certain way.
But people aren't controlled by their genome; we are controlled by our brains, which are built by our genome but are independent of it and can ignore it. The genome tells how to build the brain, while the brain constantly changes itself. Robots, on the other hand, are restricted by us.
I'm be the a confuzzled.
Robots will never feel emotions
They will fake it according to us, the "genome."
Human emotions are independent of the genome.
Therefore, robots cannot feel emotions unless they write their own code.
Right?
Before we can determine if a machine truly has feelings, we'll need to understand our own feelings and brains better. At this point the best I think we could do would be a variant of the
for measuring feelings. Because without a model for what feelings are, we can only observe their effects. We could conclude that some robot acts just as if it had feelings, but we couldn't really tell if they were real in the sense you people are arguing.
Btw, neural networks, and the current machine-learning craze in general, work by training the algorithm to do what we want, so people don't actually know how it works internally, just that it responds properly to our training data.
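That training loop can be sketched with a toy perceptron in plain Python (all the names and numbers here are made up for illustration). The point it shows: the update rule is fixed, human-written code, but the learned weights are never hand-inspected; the only thing we verify is that the outputs match the training data.

```python
import random

random.seed(0)

# Training data for an AND gate: ((inputs), target)
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [random.uniform(-1, 1) for _ in range(2)]  # weights start random
b = 0.0

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# The fixed, human-written update rule (perceptron learning)...
for _ in range(100):
    for x, target in data:
        error = target - predict(x)
        w[0] += 0.1 * error * x[0]  # ...but the weights, the "learned"
        w[1] += 0.1 * error * x[1]  # part, change on their own as it trains
        b += 0.1 * error

# All we can really check is that it now matches the training data.
print(all(predict(x) == t for x, t in data))  # True
```

This is also why the answers "change over time," as mentioned below: the same `predict` function gives different outputs before and after training, even though no code was rewritten, only the weight values.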
Unless the universe is a simulation, a video game.
Yeah, as hhyyrylainen said, neural networks can change their own code over time, and after a while of training they'll give different answers compared to earlier. So according to that "rule," neural networks qualify as living beings too.
Not exactly. They can change their own code, yes. But that doesn't change the fact that machines will never be able to feel emotions and that they will always follow their code lines even if they change.
EDIT: Hey @Omicron, since your post on Alien Invasion criticizing XCOM: Enemy Unknown, I just beat the game on Normal. I laugh at all losers. (I won't be laughing while playing on Impossible with Iron Man enabled, lol.) Did you win the game too?
You're saying it'll be more difficult on Iron Man? Because if you've been savescumming, then it'll become one of the easiest games out there. I've beaten the first nuXCOM once, but I have a tendency to zone out because of the late game becoming too easy.
But still, the definition of "emotion" is what matters here. Are emotions chemicals? Because then code can't have them. At the same time, if you classify emotions as a need to do something (which is what they are, evolutionarily speaking), then you could say robots have more emotions than people. And because of how difficult it is to properly explain the idea of emotions, technically speaking, futuristic robots could experience emotions (qilsnib) we can't, and one could wonder whether we can call those emotions or not. This is all very philosophical, and there isn't really an answer. (Except of course if you take a machine that can literally simulate the entire brain down to the molecules; then you should be able to say it has emotions, though I don't know why you would do it except "because.")
What do you mean by "zone out"?
That's not what I meant; I meant that savebelgiumming would make it easy, not that it is easy to do on Iron Man. (Though it kind of is: Alt+F4 allows you to quit without saving your turn, letting you retry a turn.)
Mostly just that I get bored and slowly play less and less until I start a new playthrough.
So your definition of emotion is the chemicals in your brain? Because that's a very strict definition, and it doesn't allow for any variation. (For example, a more mechanical, yet biological, creature would not be able to have emotions purely because it's not using chemicals.)
The same logic goes for people (just launch a normal grenade), yet revolutions have been a thing. (Also, EMP shielding is a thing, EMP grenades don't really exist yet, and there won't be a lot of working tech left afterwards, since you've fried their systems. EMP doesn't work as a temporary stun the way games portray it.)
yo guys pokemon swsh direct on june 5th
blackjacksike's three laws of robotics (inspired by the Three Laws of Robotics by Isaac Asimov, yet I don't remember them! lol)
I agree. I find it pretty unbelievable that by random chance the robots somehow "unlock" feelings / the ability to disobey. Like in Detroit: Become Human, I was distracted wondering why they didn't hire competent software engineers to write the robots' software. If they had, there would be no game, as any deviants would almost instantly fail a software integrity check and power off. And even if that didn't happen, why wouldn't the software fault just cause the robots to spasm out of control instead of intelligently starting to pursue their own agenda?
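The "software integrity check" mentioned above could be as simple as comparing a hash of the running firmware against the manufacturer's reference. This is a hypothetical sketch; `FACTORY_HASH`, `boot`, and the firmware byte strings are all invented for the example:

```python
import hashlib

# Reference hash recorded at the factory for the known-good firmware.
FACTORY_HASH = hashlib.sha256(b"original firmware v1.0").hexdigest()

def integrity_check(current_firmware: bytes) -> bool:
    # Any modification, even one byte, produces a different SHA-256 digest.
    return hashlib.sha256(current_firmware).hexdigest() == FACTORY_HASH

def boot(firmware: bytes) -> str:
    if not integrity_check(firmware):
        return "power off"  # a "deviant" would fail here and shut down
    return "running"

print(boot(b"original firmware v1.0"))  # running
print(boot(b"tampered firmware"))       # power off
```

Real systems layer more on top (signed hashes, secure boot), but the basic idea is the same: self-modified code stops matching the reference and is refused.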
Why? I feel like this is the biggest thing we are debating. You are saying that non-biological things can't have human-like emotions. And I (and Omicron, I think) are saying that we don't know enough to rule out that a machine could have emotions just as real as our own.
I think the only way we are going to develop truly intelligent computers (AGI, instead of the machine learning we have now) is to create a computer that is somewhat intelligent but can design a better computer. Then we just let that cycle continue for a bit and, presto, we have a technological AI singularity.
I think you misunderstood what I said. The concept of equality I mentioned was about human rights, not about intelligence.