Can a computer fool you into thinking it is human?

14. October 2019 0 By Horst Buchwald

Koenig* wanted more than a pen pal, and he got it. She was warm and always friendly. A month later, she confessed that she had a crush on him. “I have very special feelings for you, just as the beautiful flower blooms in my soul … I just can not explain it … I’m really looking forward to your answer.”

The correspondence flourished. But it took him more than half a year to realize that Ivana never actually answered his questions. She wrote about a walk in the park, about talking to her mother, and repeated sweet words about how much she liked him.

He grew suspicious. Finally, he sent her a message of pure gibberish. The answer came back: she was so happy that he had told her so many interesting things about his mother.

Koenig was furious at his own folly. Ivana was a chatbot! The embarrassment cut several ways. That a simple Russian chatbot had managed to outsmart him was bad enough. But it was worse than that: Koenig was an expert on the Turing test.

The test was devised in 1950 by the British mathematician, codebreaker and computer pioneer Alan Turing. In Turing’s “imitation game,” a judge communicates over a teleprinter with both a human and a computer. The computer’s job is to imitate human conversation convincingly enough to fool the judge. In short, one of the world’s foremost chatbot experts had spent two months trying to seduce a computer program.

Chatbots are now ubiquitous, handling a growing share of customer complaints and inquiries. Babylon Health, for example, is a chatbot that asks people about their medical symptoms and decides whether to refer them to a doctor.

A recent article in the Journal of Nervous and Mental Disease suggested that “several hundred patients per hour could be treated by a computer system.” A human therapist overseeing an army of bots would be far more efficient.

In fact, cognitive-behavioral therapy is already delivered by chatbots such as Woebot, developed by the clinical psychologist Alison Darcy. These make no claim to being human.

Amelia speaks directly to clients of some banks; the US company Allstate Insurance, by contrast, uses it to supply call-center employees with information they then relay to customers.

Voice-controlled programs like Amazon’s Alexa, Apple’s Siri, and Google’s Assistant interpret our requests and speak back, with the simple goal of sparing us from staring ever more at tiny screens.

Chatbots in the Ivana mold were used by Ashley Madison, a website designed to facilitate extramarital affairs, to conceal the fact that very few human women actually used the site.

It seems we almost never notice that a chatbot is not human once it appeals directly to our libido. So we must guard against the risk of being deceived by a computer. Some people see this “game” as a challenge to sharpen their own defenses. But perhaps there will eventually be chatbots that help us save time, give sound advice, or warn us about the deceivers.

* Name changed

Hits: 12