Does Artificial Empathy in AI Create Robot Psychopaths?

Do we really want robots to be human? Do we want them to be happy, feel pain, or fall in love? Do we want them to be indistinguishable from humans, because, then, what purpose would they serve? What niche would they fill that humans haven’t already filled? Would you be upset if your husband, wife, boyfriend, or girlfriend was, in fact, a robot? Why, if you loved them? Would it be a problem if they were just pretending to be what you wanted them to be? These are all questions that have been addressed in science fiction films and TV shows. The problem is that science fiction is rapidly becoming science fact and we will soon be called upon to answer all of these questions in real life.

For example, in the series Westworld, it is impossible to distinguish the robots from the humans. The robots are programmed to behave as humans would in specific situations. Genuine emotions are lacking, but humans are easily fooled into believing the robots are feeling them. This should really be nothing new. Humans can often fool other humans into believing that they are feeling emotions they aren't really feeling. Romance scammers and con men do this all the time. There is a profit to be made in faking emotions.

In fact, there is a psychological condition that mimics what we see developing in so-called empathetic robots. The condition is known as psychopathy. It is defined as being “characterized by the absence of empathy and the blunting of other affective states.” In humans, this allows those who have the condition, psychopaths, to cold-bloodedly manipulate the emotions of others for their personal benefit. Psychopaths don’t feel emotions themselves; however, just like robots, they can learn how to make themselves appear to feel emotions. That’s how they manage to fool people.

When I took the Hare Psychopathy Test answering as a robot, which, admittedly, was not easy, the result was someone who qualified as having a “Psychopathic Interpersonal Style”, which is described here.

So, it seems that scientists have begun to develop psychopathic robots, but only at the interpersonal level. The question is: Should we continue to go down this road?

Actually, it’s probably too late to turn back. More and more robots are appearing that mimic human emotions. So far, however, they have not gained widespread acceptance. True, they do seem to have a niche use, often among isolated elderly people or those with developmental conditions such as autism. However, most people don’t feel that these robots are indispensable. For this reason, many promising emotional robots make a big splash as a novelty when they first appear but never reach full production. This is what happened to Buddy, the emotional robot. Buddy was featured on national TV and generated a lot of interest. However, the French developers, Blue Frog Robotics, simply ran out of money.

(Buddy video here)

Buddy was expected to launch in September of 2020, but I found no evidence this occurred. Buddy’s financial fate has been paralleled by other robotic companies. What this probably means is that investors are not convinced that there will be a market for an empathetic robot companion at any time in the near future.

Emotional Connections with Robots

The emotional-synthesis software firm Emoshape has developed what it calls the first emotional processing unit (EPU). This chip “allows a robot toy to develop a completely unique personality based on its user interactions, which will ultimately mean that no two have exactly the same personality. The chip is able to control the different facial expressions and body languages of a robot without hard coded predicates.” The chip can help a robot identify anger, fear, sadness, disgust, indifference, regret, surprise, inattention, trust, confidence, desire, and joy with an 86% accuracy rate. The newest version of the chip can also identify pain, pleasure, frustration, and satisfaction. The company predicts that “before the end of this century humans will talk more to sentient machines than to other humans”. The developers hope to link this emotional awareness with language-generation software to create more appropriate linguistic responses than what is seen in most emotional robots.
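To make the idea of emotion classification concrete: at its simplest, a system like this maps user input onto a fixed set of emotion categories. The sketch below is entirely hypothetical — it is not Emoshape's EPU, API, or method — and uses crude keyword matching where a real system would use trained models, but it illustrates the basic input-to-emotion mapping the article describes.

```python
import re

# Hypothetical keyword lists for a few of the emotion categories the
# article mentions. A real emotion-recognition system would use trained
# classifiers, not keyword matching; this only illustrates the concept.
EMOTION_KEYWORDS = {
    "joy": {"happy", "great", "wonderful", "love", "delighted"},
    "anger": {"furious", "hate", "angry", "annoyed"},
    "sadness": {"sad", "miserable", "lonely", "grieving"},
    "fear": {"scared", "afraid", "terrified", "worried"},
}

def classify_emotion(text: str) -> str:
    """Return the emotion whose keywords appear most often in `text`,
    or 'indifference' (a stand-in neutral label) if nothing matches."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    scores = {emotion: len(words & keywords)
              for emotion, keywords in EMOTION_KEYWORDS.items()}
    best, hits = max(scores.items(), key=lambda kv: kv[1])
    return best if hits > 0 else "indifference"

print(classify_emotion("I am so happy, I love this"))  # joy
print(classify_emotion("The invoice arrived today"))   # indifference
```

A robot built on this pattern would then select a facial expression or verbal response keyed to the detected label — which is exactly the point of the article: the response is selected, not felt.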

Now, let’s take this to the next level. No one will ever confuse a robot like Buddy with a real human; however, on another front, humanoid robots (those which look like humans) are being created, and they are getting disturbingly more realistic every day.

I would be remiss in not mentioning sex robots with internal heating systems, replaceable faces, and customizable personalities. I’ll let your imagination fill in the blanks.

At the moment, I can’t imagine anyone mistaking a robot for a human, no matter how human-like they may appear to be. This, however, will not prevent humans from forming emotional bonds with them. My mother would get upset if someone spoke harshly to Alexa. As humans, we are irredeemably empathetic. Someday, though, all of these AI systems will join together and form a truly realistic, truly believable partner.

It may be, as Emoshape predicts, that humans will evolve into talking more to robots than to other humans. Certainly the recent pandemic lockdowns may have given many people a felt need for such technology. In the end, robots, some of which may be physically attractive, will learn how to behave to make you happy. They will have meaningful conversations with you and respond to you in appropriate ways. They will adjust to your moods in ways that few humans you know will have the patience for. They will understand you in ways that people will not. So, in non-technical terms, you will be able to create your soul mate. The bad news? Your soul mate will have absolutely no feelings for you. They can manipulate you emotionally and influence you to act in ways you would not have, had you not had them as companions. They may, in this sense, control you more than you control them. In the end, you will have, unintentionally, developed your own personal psychopath. Can you live with that?
