Did you feel sorry for the robot?

This refers to the Boston Dynamics parody video.

> Are machines really deserving of empathy? Do we need to worry about people fighting for robot rights? Will this footage one day be used against humanity? These are big questions that are only going to become more relevant.


This is probably the biggest theme in Black Mirror’s recent seasons: showing a “cookie” or robots that share the same thoughts, ideologies, and emotions as humans (@linh can relate!). There is a whole debate in the BM community around whether we should treat robots humanely, whether torturing a cookie for information is illegal, and so on. My take: since machines are slowly becoming more human-like and may eventually be developed well enough to replace humans, we should not treat them badly. Not necessarily with empathy, but at least not badly. I always get a bad feeling whenever I say “Shut up, Alexa” to pause a song. You never know when a robot might rebel against you haha!


As a coder, I know that they don’t feel emotions. As long as we don’t build in a system similar to our pain network (i.e., you drop your iPhone and it feels pain), this is just another tool that we have perfected over the centuries.
The Greeks used stone to create perfect sculptures; others wrote books. These are all still just tools of expression.

I’m more focused on not doing harm to living creatures: not just animals, but our grandkids as well.


OTOH, I think it’s time to rethink our sympathies for humans (collectively).

  • We ended up screwing the planet

  • We ended up creating robots that may destroy us

  • We are the culprits of endless human atrocities exercised upon humans and biosphere

  • Our creations are far too inferior to deserve our sympathy.

Should we fear them? Definitely yes.
Sympathize with them? Not for us to decide.


Just published: https://hackernoon.com/what-i-learned-during-my-brief-existence-as-a-robot-ddccd4cb769f


Children approached getting to know a robot like they would approach a new friend.

They asked it what its favourite food was, if it had friends, what its family was like. Robot-specific questions — like what was the square root of a gigantic number — came later. In essence, children were extending empathy to the robot, treating it like an equal. This is positive: the children didn’t have impulses to treat the robot unfairly. This human tendency for empathy toward robots has been shown and discussed in previous studies (Darling et al., 2015; Scheutz, 2011). This is an interesting design question: what types of robots do we want to design, given that we feel empathy for them by default?



Let’s put it here.
If he makes a breakthrough that changes everything (by his calculations, he is 10–20 years ahead), then we’ll need an equal policy.

I hope he makes it, because I only have one life.


They may get revenge soon.


In another context, machine learning (if a machine counts as a subset of robot): I might start off with a smaller dataset and get good prediction results, so I move on to the full training dataset, and then, for some reason (there could be many), the model does not converge (fails to learn). That’s wasted resources: cluster time someone else could have used, my own time, and energy (from power plants, from mother nature).
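One cheap way to limit that waste is roughly what the post describes: do a pilot run on a small slice first, and abort early if the loss diverges, before spending the full compute budget. Here is a minimal sketch of that idea (the `train_linear` helper and the divergence threshold are illustrative assumptions, not from any particular library):

```python
import numpy as np

def train_linear(X, y, lr=0.1, epochs=100):
    """Plain gradient-descent linear regression; returns (weights, converged)."""
    w = np.zeros(X.shape[1])
    prev_loss = np.inf
    for _ in range(epochs):
        pred = X @ w
        loss = np.mean((pred - y) ** 2)
        # Bail out early if the loss blows up: a diverging run only wastes compute.
        if not np.isfinite(loss) or loss > prev_loss * 10:
            return w, False
        prev_loss = loss
        w -= lr * (X.T @ (pred - y) / len(y))
    return w, True

# Synthetic data standing in for the real dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.1, size=1000)

# Cheap pilot run on a small subset first...
_, ok_small = train_linear(X[:100], y[:100])
# ...and only commit the full budget if the pilot behaved.
if ok_small:
    w, ok_full = train_linear(X, y)
```

The same pattern applies to any trainer: the pilot run costs a fraction of the full job, and a hyperparameter problem (say, a too-large learning rate) usually shows up there instead of after hours on a shared cluster.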


Oh, sure! :roll_eyes:
That reminds me of one of my favorite films, “I, Robot” with Will Smith.


So good.

To this day I still can’t look at a shipping yard full of containers without thinking “Yep, that’s where they keep the robots.”


:laugh: :grin: :laugh: :grin: :laugh: :grin: :laugh:

OTOH, I felt so much respect for Sophia the Robot that I wrote an article humanizing her…published yesterday.


nice, @niravbhatt.cpp; pity I can’t read it behind the paywall :catsmirk:

@natasha thanks!

Here goes the friend link


I think it should be posted here

Oh, wow :crazy_face: