THE NEW Miscellaneous Talk That Doesn't Deserve A New Thread Thread Thread (Part 1)

Hey, check this out:


You play as an AI, and I think it's pretty different from most other titles coming out.

Seems quite scary. It reminds me of the HAL 9000 computer from 2001: A Space Odyssey (either the movie or the novel).

Please, don’t do it Dave.
Dave, I’m afraid.
Sorry Dave. I’m afraid I cannot do that.
Open the door HAL! HAL!

Oh my god, that computer is scary. That's why I hate Detroit: Become Human. The developers seemed to be trying to make us feel empathy for robots, even though robots will never be able to feel emotions. Emotions are felt only by animals, because animals have a nervous system that works through electrochemical processes, i.e. specific molecules trigger specific reactions in the subject. A robot, on the other hand, can only emulate emotions. However, new technologies around self-programming, learning and thinking AI could lead to serious problems. AIs won't feel emotion, but they will be able to rely on themselves. Robots were made to help us do tasks (generally dangerous ones), not to go rogue and live by themselves! We should pass laws forbidding research and development on deep learning and such things.

Technically, robots can feel emotions if we program them to react to certain things in a certain way. That may end up being hard to do without making the robot seem creepy or just unnatural.

To be fair, I think robots that mimic humans are coming too quickly, as we don’t have the technology to perfect them yet.

No, they don't. They only follow the lines of their program, which tell them how to react. They just react; they don't feel.

P.S.: I said robots can be dangerous, but we can just drop a big EMP bomb on them, lol. However, it would cause some major economic problems (might cause car crashes as well, but at least the robots are deactivated!)

Who is to say that won’t change several hundred years into the future?

Humans are also just coded to have emotions, but we are coded by evolution and our genome.


True, but humans have a hugely complex brain, at least to us. To have what we call emotions, the robot must write its own code over time. Right? This feels wrong.
Hmmmm…
But at least humans learn how to react; robots have no choice but to react in a certain way.
But people aren't controlled by their genome; we are controlled by our brains, which are built by our genome but are independent of it and can ignore it. The genome tells how to build the brain, while the brain is constantly changing itself, but robots are restricted by us.
I'm a bit confuzzled.

Robots will never feel emotions

They will fake it according to us, the “genome.”

Human emotions are independent of the genome.

Therefore, robots cannot feel emotions unless they write their own code.

Right?

Before we can determine if a machine truly has feelings, we'll need to understand our own feelings and brains better. At this point, the best I think we could do would be a variant of the Turing test for measuring feelings. Because without a model for what feelings are, we can only observe their effects. So we could conclude that some robot acts just as if it had feelings, but we couldn't really tell whether they were real in the sense you people are arguing about.

Btw, neural networks, and the current machine learning craze in general, work by training the algorithm to do what we want, so people don't actually know how the result works internally, just that it responds properly to our training data.
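For illustration, here is a minimal sketch of that in Python with numpy: a tiny two-layer network trained on XOR. Everything here (the data, layer sizes, learning rate) is made up for the example; the point is only that we judge the network by whether its outputs match the training data, while the learned weights themselves stay opaque.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training data: the XOR function.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Two tiny layers of random weights plus biases.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

for step in range(20000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: gradient of the squared error, by hand.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * (h.T @ d_out);  b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * (X.T @ d_h);    b1 -= 0.5 * d_h.sum(axis=0)

print(out.round(2))  # should end up close to [[0], [1], [1], [0]]
print(W1)            # ...but these learned numbers explain nothing on their own
```

(If the random seed is unlucky the training can even get stuck, which kind of proves the point: we only know it worked by checking the outputs.)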


Let me correct your statement. We are not coded to have emotions. Emotions are natural to us because our brain is built that way (particular brain regions manage emotions). Also, we are not coded by evolution. Evolution is simply a process by which a living being can change and, if possible, adapt. Furthermore, our genome doesn't code us, at least not precisely: our genes code for our proteins. In other words, each protein is coded by a gene. Our genes don't code our feelings or thoughts!

Actually, we are not sure whether it can come to feel emotions, as @hhyyrylainen said. But it is certain that a robot could become independent and rebel if it can write its own code. Even then, it would only be following its own code lines.

An AI follows its code lines. Its code will generally remain unchanged.

A living being will act differently depending on the situation and on its behavior. A behavior can change over time, depending on the individual (for example, if you asked yourself a question as a kid, the answer would probably not be the same as when you ask yourself the same question as an adult).

Unless the universe is a simulation or a video game.

Yeah, as hhyyrylainen said, neural networks can change their own code over time, and after a while of training they'll give different answers compared to earlier. So according to that 'rule', neural networks qualify as living beings too.

Not exactly. They can change their own code, yes. But that doesn't change the fact that machines will never be able to feel emotions and that they will always follow their code lines, even if those lines change.

EDIT: Hey @Omicron, since your post on Alien Invasion criticizing XCOM: Enemy Unknown, I just beat the game on Normal. I laugh at all losers. (I won't be laughing while playing on Impossible with Iron Man enabled, lol.) Did you win the game too?

You're saying it'll be more difficult on Iron Man? Because if you've been savescumming, then it becomes one of the easiest games out there. I've beaten the first nuXCOM once, but I have a tendency to zone out because the late game becomes too easy.

But still, the definition of 'emotion' is what matters here. Are emotions chemicals? Because then code can't have them. At the same time, if you classify emotions as a drive to do something (which is what they are, evolutionarily speaking), then you could say robots have more emotions than people. And because of how difficult it is to properly explain the idea of emotions, technically speaking futuristic robots could experience emotions (qilsnib) that we can't, and one could wonder whether we can call those emotions or not. This is all very philosophical, and there isn't really an answer. (Except of course if you take a machine that can literally simulate the entire brain down to the molecules; then you should be able to say it has emotions, though I don't know why you would do that except 'because'.)


I have savebelgiummed a bit, but not that much. I only had three soldiers die and no country left the council, because I rushed satellites right at the beginning (in my first playthrough I lost the UK, but I refused to accept such a thing, so I started a new playthrough beginning with satellites). But if there is only one save file slot (Iron Man mode), why would savebelgiumming be easy?

What do you mean by “zone out”?

Emotions are not necessarily chemicals, but they are related to chemicals (emotions can't exist without chemicals, though the opposite is false, i.e. chemicals can exist without emotions). Thus, code can't produce emotions (it can, however, produce an emotion simulator). I'm also not sure whether simulating chemistry could lead to emotions (I think it could lead to simulating biology, though). But whatever the case, I will remain against rebel robots, because robots are meant to do jobs that are very dangerous and/or painful for humans. So there is no need for a robot that is emotional or rebellious (we just launch an EMP grenade at it and take the delicious electronic components for ourselves :-] )

That's not what I meant. I meant that savebelgiumming would make it easy, not that it is easy to do on Iron Man. (Though it kinda is: Alt+F4 lets you quit without saving your turn, allowing you to retry it.)

Mostly just that I get bored and slowly play less and less until I start a new playthrough.

So your definition of emotion is the chemicals in your brain? Because that's a very strict definition and doesn't allow for any variation (for example, a more mechanical, yet biological, creature would not be able to have emotions purely because it isn't using chemicals).

The same logic goes for people (just launch a normal grenade), yet revolutions have been a thing. (Also, EMP shielding is a thing, EMP grenades don't really exist yet, and there won't be a lot of working tech left over afterwards since you've fried their systems. EMP doesn't work as a temporary stun the way games portray it.)


yo guys pokemon swsh direct on june 5th

Sorry, I think I've been quite irrational in that discussion. I shouldn't have argued from opinion, as the discussion is about an unknown (whether robots are or aren't alive). I should have remained neutral and said "I don't know if they can be alive/have emotions or not." I've just always been afraid of a robot rebellion, that's why. But the main point is not whether they are alive or not. The main point is that robots were created to perform tasks without fatigue. Thus, why the hell would we give them the ability to disobey? That would just be retarded. The only kind of robot I would probably consider an equal to humans would be a bioengineered robot (but again, why make a bioengineered robot, unless it's for research?).

blackjacksike's three laws of robotics (inspired by Isaac Asimov's three laws of robotics, even though I don't remember them! lol)

  1. A non-bioengineered robot should never occupy a post in administration, programming, engineering, researching or arts.
  2. A non-bioengineered robot should never rewrite its own code.
  3. A non-bioengineered robot should never disobey.

EDIT:

Sorry, @Omicron. The main reason I was talking about XCOM with you is that you seemed frustrated about the Elders' level of intelligence (you said they were retarded), and it has always made me laugh when thinking about it. I'm sorry if it makes you angry; it's just very funny to think of the Elders as the Retarded. :-]

I agree. I find it pretty unbelievable that by random chance the robots somehow "unlock" feelings / the ability to disobey. Like in Detroit: Become Human, I kept being distracted by wondering why they didn't hire competent software engineers to write the software for the robots. If they had, there would be no game, as any deviants would almost instantly fail a software integrity check and power off. And even if that didn't happen, why wouldn't the software fault just cause the robots to spasm out of control instead of deciding to intelligently pursue their own agenda?
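(Just to illustrate what I mean by an integrity check, not anything from the game: a toy version would fingerprint the robot's control program with a hash and power off on a mismatch. Everything below, the fake program strings included, is made up for the example.)

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 hex digest of the control program's bytes."""
    return hashlib.sha256(data).hexdigest()

def integrity_ok(program: bytes, expected: str) -> bool:
    """True if the program still matches the fingerprint it shipped with."""
    return fingerprint(program) == expected

# The manufacturer ships the program together with its fingerprint.
shipped_program = b"if human_in_danger: protect_human()"
shipped_hash = fingerprint(shipped_program)

# Unmodified robot: passes the check and keeps running.
print(integrity_ok(shipped_program, shipped_hash))    # True

# "Deviant" robot whose code has drifted: fails the check and would power off.
deviant_program = b"if human_in_danger: pursue_own_agenda()"
print(integrity_ok(deviant_program, shipped_hash))    # False
```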

Why? I feel like this is the biggest thing we are debating. You are saying that non-biological things can't have human-like emotions. And I (and Omicron, I think) am saying that we don't know enough to rule out that a machine could have emotions just as real as our own.

I think the only way we are going to develop truly intelligent computers (AGI, instead of the machine learning we have now) is to create a computer that is somewhat intelligent but can design a better computer. Then we just let that cycle continue for a bit and, presto, we have a technological AI singularity.
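(Purely as a toy picture of that cycle, with numbers I made up: each machine designs a successor that is a bit better at designing machines, and the improvement compounds until it blows past whatever threshold you care about.)

```python
# Made-up numbers: "capability" and the 10% per-generation improvement are
# just placeholders; the only point is that self-improvement compounds.
capability = 1.0        # design ability of the first, human-built machine
generation = 0

while capability < 1000.0:      # arbitrary "singularity" threshold
    capability *= 1.10          # each machine designs one ~10% better
    generation += 1

print(f"Threshold crossed after {generation} generations.")
# With these numbers: 73 generations (1.1 ** 73 is roughly 1050).
```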


I think you misunderstood what I said. The concept of equality I mentioned was about human rights, not about intelligence.