The European: Why are we so fascinated by robots?
Sharkey: We like puppets as well; it’s not a dissimilar phenomenon. Animated things are very appealing to us, and a robot is in many ways a big animated puppet. This fascination goes back a long way. Around 60 AD, Hero of Alexandria was already building automata. He kept all the mechanics hidden so that people thought it was magic. Even Homer wrote about self-moving tripods.
The European: Surely, they were fairly different to the robots we know today.
Sharkey: True. Karel Čapek was the first to use the term “robot,” in his 1920 science-fiction play “Rossum’s Universal Robots” – although it features what we would now call androids rather than robots. He also introduced the dystopian view that robots would at some point extinguish the human race and take over the world. Other works, like Fritz Lang’s film “Metropolis,” amplified the growing public interest in and fascination with robots. Westinghouse, an American manufacturing company, caught on to this idea and started to build show robots. They invented the metal-man style of robot.
The European: What did they do with it?
Sharkey: Showing off, basically. You have to put it in context. In the 1920s, technology was developing exponentially and people were bewildered by all these innovations, like cars and planes. It frightened and amazed people at the same time. Building a robot was a sign of power, proof that you were at the peak of technology. The robot became the face of technological change and innovation.
The European: Just like today.
Sharkey: Exactly, Honda does the same thing with their ASIMO robot. They don’t sell it, they just like to show that they’ve got the know-how and means to build it. It’s a good publicity stunt. It appeals to people because humans personify and empathize.
The European: We like to compare robots to humans?
Sharkey: We like to project some humanity onto machines, and in that respect there is nothing better than a robot. Let me give you an example: during war missions, soldiers often bond very strongly with the remote-controlled robots that are used to dispose of bombs. They start to treat them like fellow soldiers. In one case, they started to call a garage maintenance place the “droid hospital.” They brought in their damaged robots, and when they were offered new ones, they declined because they wouldn’t accept any other robot. They would even take them on fishing trips on their day off.
“Bonding with robots is pure deception”
The European: Do you think that we will start to bond more strongly with robots in the future?
Sharkey: There is a lot of evidence for it. Children and elderly people often bond very strongly with robots, which are increasingly used in child and elderly care. However, that will result in a huge problem.
The European: Why?
Sharkey: We’re facing a moral dilemma here. Ethically, bonding with robots is pure deception. Small children fall for robots. Studies show that they believe them to have greater cognitive abilities than their pet dogs. Sherry Turkle puts it like this: “You’re getting a child to love an object that can’t love it back.” Robots should be used for the tasks of care, not the practice of care.
The European: But robots could be good companions for the elderly – a distraction from the realities of their solitary lives.
Sharkey: That’s the dilemma. Robots are often used for therapy and do have some success in that respect. It’s an ethical trade-off. In a way, you’re taking away the dignity of old people because they might not have the sense to realize that it is just a robot. On the other hand, if robots improve the quality of their lives, then you have to balance it.
The European: Could you bond with a robot?
Sharkey: No. I know that it’s just wires and metal. How could I bond with that? But I find it very interesting how other people engage with robots, for example with sex robots.
The European: Sex robots?
Sharkey: Yes, it’s an emerging industry. There’s one called “Roxxxy” and it’s just such a bad robot. It just sits there and if you touch it, it goes “I know what you can do with that hand.” It can talk about sports as well. It’s very bizarre and I initially didn’t think that anyone would really use them.
The European: But you were wrong.
Sharkey: Some people prefer sex robots to actual human partners because robots won’t leave or disappoint you. I know about a case where a guy took his girlfriend to an erotic hypnotist so that she would behave like a robot. He fantasised about making love to a robot and refused to have sex with her unless she would act like one. In South Korea, brothels with dolls instead of prostitutes are booming. Oddly enough, clients start to bond with these dolls. So why shouldn’t that happen with robots as well? For some people, the line between robots and humans is becoming very blurry.
“Robots won’t be the main problem”
The European: Why do we want robots to look and behave like humans?
Sharkey: It depends how you define “we.” Studies show that there is great variation between countries. Generally, Asians like their robots to be humanlike, we in the West prefer them to be mechanical. The first American robot was a metal-man; in Japan it was a big, fat, laughing Buddha.
The European: There is growing public concern that robots will replace human labor and that thousands of people will hence be thrown out of work. Do you believe that?
Sharkey: I am not sure. It takes an awful lot of people to design, build, and maintain these robots. But it will eliminate certain jobs; that’s inevitable. However, I don’t think that robots will be the main problem.
The European: How do you mean that?
Sharkey: We tend to label all mechanical devices as robots. In South Africa, traffic lights are called robots. But algorithms and assembly devices are actually very different from robots. It’s hard to define the term, so we should use it with greater care.
The European: What’s your definition?
Sharkey: I like Joseph Engelberger’s definition: “I can’t define a robot, but I know one when I see one.”
The European: Not only factory owners but also military generals are tempted to replace humans with robots. Drones and armed robots will shape the future of warfare.
Sharkey: There’s nothing wrong with the technology as such, but we use it in a very improper way. The same goes for guns; they are neutral as well. Of course, if we talk about autonomous robots, it’s quite another story.
The European: Because they are no longer a mere tool but an effective decision-maker.
Sharkey: Yes, but one that lacks the skills to discriminate and differentiate. They can’t tell the difference between a combatant with a weapon and a little girl pointing an ice-cream cone at them. That makes them very dangerous.
“Robots could start to slaughter humans”
The European: Do you believe that because of drones and robots, there will be bloodless wars at some point in the future?
Sharkey: That’s just fantasy. War is far too complex for such a science-fiction scenario to happen. There are many human feelings involved: anger, fear, revenge. I often use the example of the Irish fighting the English. Do you think that if Irish robots had beaten English robots, the English would have said: “Alright mate, we surrender, you win”? No chance. If we build robots for those purposes, why don’t we just abolish war altogether and play chess or football to resolve our quarrels?
The European: Would it even be desirable to have a bloodless war? Body bags are a strong argument to stop the fighting or to avoid it in the first place.
Sharkey: Unfortunately, body bags do have a vital role in warfare. It’s a major inhibitor because it can cost politicians many votes. The wars in Vietnam and Iraq have proven that point. Without body bags there is no end to war. Drones for example prevent soldiers from being killed but they foster a continuous state of war all around the globe.
The European: That sounds very dystopian.
Sharkey: The trouble with drones and autonomous robots is that they create a significant amount of uncertainty. If one nation’s autonomous robots were to fight another’s, you could not foresee what would happen – that’s the reality of computer science. The robots could just start to slaughter humans. They could easily trigger an unintentional war by attacking a target nobody had commanded them to attack.
The European: The question of responsibility becomes quite important in this context.
Sharkey: You certainly can’t blame or sentence a robot; that’s for sure. That would amount to giving the military carte blanche, because they could claim that the machine is accountable. That would be very stupid and dangerous. Besides, how would you sentence a robot? By switching it off? It’s not human, so it can’t be sentenced.
The European: Under corporate law, you can also punish corporations.
Sharkey: True, but there are people behind that corporation and they will feel the repercussions. The same cannot be true for robots. Those who control or command them must be held accountable. Robots can’t use proportional force; they don’t care about international law or civilian suffering. There has to be a clear chain of command because so many things can go wrong with robots: they could be hacked, damaged, or misused. It’s scary to think about the consequences; therefore it’s vital to prohibit the use of autonomous war robots through an international legally binding treaty if we want to avoid a global catastrophe.