Monday, May 14, 2018

Google's Duplex: Fraud or Helpful Assistant?


Duplex is a new technology announced by Google last week in a presentation by Google CEO Sundar Pichai.  He played some recordings of what sounded to the uninitiated ear like humdrum phone calls to a restaurant and a hair salon to make reservations.  In both cases, the business service providers heard a voice on the other end of the line that sounded, to all intents and purposes, like a human being calling on behalf of somebody who was too busy to make the call herself.  There were natural-sounding pauses and "hmm"s, and the information about appointments was conveyed efficiently and without undue confusion. 

The catch was that only one human was talking in each conversation.  The "agent" making the call was Duplex, an AI system that Google plans to offer to businesses as a giant step forward in robo-calls and related phone activities. 

I happened to hear a couple of these calls on a radio program, and I must admit the computer-generated audio sounded natural enough to fool anyone who wasn't clued in.  Now, nobody happened to ask the computer's name or try to start up a conversation with it about, say, existentialism, and I don't know what would have happened in those cases.  But for routine, specific tasks such as making appointments, I suppose Google now has just what we want.  Or is this something we really want?

Google thinks so, obviously.  As this example shows, we are rapidly approaching a time when companies will field AI systems that make or receive phone calls with such a good imitation of a live person that the live person on the other end will not realize he or she is not talking to another human being.  An Associated Press article about Duplex focuses on some narrow concerns such as state laws against recording phone conversations without notification.  These laws explain why you so often call a business and first hear something like the phrase, "For quality-assurance purposes, this call may be recorded or monitored."  Because it's so easy to include that phrase, I see this as a non-issue.

What wasn't addressed in the reports is a more fundamental question that relates, believe it or not, to a philosopher named Martin Buber who died in 1965. 

Buber's claim to fame is a book called I and Thou which explores the philosophical implications of two kinds of interactions we can have with the world:  the "I-it" interaction and the "I-Thou" interaction. 

A very oversimplified version of these ideas is the following.  When you are interacting with the world as an I to an it, you are experiencing part of the world, or maybe using it.  You have an I-it relationship to a vacuum cleaner, for instance. 

But take two lovers, or a father and a son, or even an employee and an employer.  The I-Thou interaction is always possible in these exchanges, in which each person acknowledges that the other is a living being with infinite possibilities, and ultimately the relationship has a mystical meaning that is fully known only to God. 

It's also possible, and happens all too often, that you can deal with another person using the I-it mode:  treating them as merely a means to some goal, for example.  But this isn't the best way to relate to others, and generally speaking, treating everyone as a Thou respects their humanity and is the way we want to be treated ourselves.

The problem that facile human-voice-imitation systems like Duplex can lead to is that they can convince you they're human when they're not.  As the AP article points out, this could lead to all sorts of problems if Duplex falls into the wrong hands.  And who is to say whose hands are wrong?  At this point it's up to Google to decide who gets to buy the still-experimental service when they think it is ready for prime time.  But Google is in business to make a profit, so a customer's ability to pay will rank high on its list of desirable characteristics, well ahead of that customer's likelihood not to abuse the service.

At some level, Pichai is aware of these potential problems, because he emphasized that part of a good experience with the technology will be "transparency."  Transparency is one of those words that sounds positive, but can have many meanings, most of them pretty vague. 

In this case, does it mean that any Duplex robot has to identify itself as such at the beginning of the conversation?  Starting off a phone call with, "Hi, I'm a robot," isn't going to take you very far.  The plain fact of the matter is that the phone calls Pichai played recordings of were remarkable precisely because the people taking the calls gave no clue that they thought they were talking to anything other than a fellow human.  And while it might not have been Google's intention to deceive people, it is a deception nonetheless.  A benign one, perhaps, but still a deception.

Even if this particular system doesn't get deployed, something like it will.  And the problem I see is that the very obvious and Day-Glo-painted line we now have between human beings, on the one hand, and robots, on the other hand, will start to dim and get blurry.  And this won't be because some philosophers start talking about robot rights and humans who are less than human.  No, it will be the silent argument from experience—as we deal with robots that are indistinguishable from humans over the phone, we may start to get used to the idea that maybe there isn't such a big distinction between the two species after all.

The movie Her is about a man who falls in love with a computer voice named Samantha.  I won't summarize the plot here, but the relationship ends badly (for the man, anyway).  The film was made only five years ago, but already events have progressed to a point where the film's thesis has moved from completely impossible to merely implausible.  Maybe something like a computer identity badge or some other signal isn't such a bad idea.  But before we wholeheartedly embrace technologies like Duplex, we should run some worst-case scenarios in detail and think about ways to forestall some of the worst things that could happen—before they do.

Sources:  The AP article by Matt O'Brien I referred to, "What happens when the robots sound too much like humans?", was carried on the KLBJ radio station website at http://www.newsradioklbj.com/news/technology/high-tech/what-happens-when-robots-sound-too-much-humans-0.  I also referred to Wikipedia articles on Martin Buber, Her, and I and Thou, Buber's book published in Germany in 1923.
