Did you feel sorry for the robot?

#1


This refers to the Boston Dynamics parody video.

> Are machines really deserving of empathy? Do we need to worry about people fighting for robot rights? Will this footage one day be used against humanity? These are big questions that are only going to become more relevant.
source

4 Likes
#2

This is probably the biggest theme in Black Mirror in recent seasons - showing a cookie or robots that share the same thoughts, ideologies, and emotions as humans (@linh can relate!). There is a whole debate in the BM community about whether we should treat robots humanely, whether torturing a cookie for information is illegal, stuff like that. My philosophy on this: since I believe that machines are slowly becoming more human-like and may one day be developed well enough to replace humans, we should not treat them badly. Not with empathy, but at least not badly. I always have a bad feeling whenever I say “Shut up, Alexa” to pause a song. You never know when a robot might rebel against you haha!

2 Likes
#3

As a coder, I know that they don’t feel emotions. As long as we don’t build in a system similar to our pain network - i.e. you drop your iPhone and it feels pain - this is just another tool that we have perfected over the centuries.
The Greeks used stone to create perfect sculptures, others write books, but it is still just a tool of expression.

I’m more focused on not doing harm to living creatures - not just animals, but our grandkids as well.
https://www.nature.com/articles/d41586-019-00861-z

2 Likes
#4

OTOH, I think it’s time to rethink our sympathies for humans (collectively).

  • We ended up screwing the planet

  • We ended up creating robots that destroy us

  • We are the culprits of endless atrocities inflicted upon humans and the biosphere

  • Our creations are far too inferior to deserve our sympathy.

Should we fear them? Definitely yes.
Sympathize with them? Not for us to decide.

2 Likes
#5

Just published: https://hackernoon.com/what-i-learned-during-my-brief-existence-as-a-robot-ddccd4cb769f

TL;DR

Children approach getting to know a robot like they would approach a new friend.

They asked it what its favourite food was, if it had friends, what its family was like. Robot-specific questions — like what was the square root of a gigantic number — came later. In essence, children were extending empathy to the robot, treating it like an equal. This is positive: the children didn’t have impulses to treat the robot unfairly. This human tendency for empathy toward robots has been shown and discussed in previous studies (Darling et al., 2015; Scheutz, 2011). This is an interesting design question: what types of robots do we want to design, given that we feel empathy for them by default?

1 Like
#6

yes.

#7

Let’s put it here.
If he makes a breakthrough that will change everything (by his calculations, he is 10-20 years ahead), then we’ll need an equal policy.

I hope he makes it, because I only have one life.

1 Like
#8

They may get revenge soon.

1 Like
#9
1 Like
#10

In another context, in machine learning (if a machine counts as a subset of a robot): I might start off with a smaller dataset and get good prediction results. So I move on to the full training dataset, and for some reason (there could be many), the model does not converge (fails to learn). That is wasted resources (if running on a cluster, someone else could have used it), my time, and energy (from power plants, from mother nature).
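To make that concrete, here is a minimal sketch of the "prototype on a subset before committing the full run" habit - the dataset, model, and accuracy threshold are all made-up choices for illustration, not anyone's actual pipeline:

```python
# Sanity-check a model on a small slice before spending cluster time on the full run.
# Synthetic data and a simple classifier stand in for a real workload.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Pretend this is the full training set that would normally run on a cluster.
X, y = make_classification(n_samples=100_000, n_features=50, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

# Step 1: cheap probe on a small subset. If the model can't beat a trivial
# baseline here, the full run would likely just waste compute and electricity.
subset = 2_000
probe = LogisticRegression(max_iter=1000)
probe.fit(X_train[:subset], y_train[:subset])
probe_acc = accuracy_score(y_val, probe.predict(X_val))
print(f"subset accuracy: {probe_acc:.3f}")

if probe_acc < 0.6:  # arbitrary "is it learning at all?" threshold
    raise SystemExit("Model fails to learn on the subset; fix it before the full run.")

# Step 2: only now commit to the expensive full-dataset training.
full = LogisticRegression(max_iter=1000)
full.fit(X_train, y_train)
print(f"full-data accuracy: {accuracy_score(y_val, full.predict(X_val)):.3f}")
```

The point of the gate is simply that a failed cheap run costs a few seconds of my laptop's time instead of hours of shared cluster time.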

1 Like