The shape of our technological future is already coming into view, judging by Sherry Turkle’s recent book Alone Together. To be more accurate, if one takes into consideration the latest developments in electronic communication, internet activity and robotics, together with changing attitudes on the part of especially (but not exclusively) young users, it becomes apparent that a subtle shift has been taking place all around us.

With the advent of computer technology, the one-on-one relationship between human and intelligent machine gave rise to novel reflections on the nature of the self, a process that was taken to a different level with the invention of the internet and its impact on notions and experiences of social identity. Turkle traced these developments in “The Second Self: Computers and the Human Spirit” and “Life on the Screen”, respectively. In Alone Together she reports on her research of the last 15 years, which brings her investigations into the relationship between humans and technology up to date.

The fascinating thing about the book is this: if Turkle is right, then attitudes that we take for granted concerning what is “real” are receding, especially among young people. For example, there is a perceptible shift from valuing living beings above artificially constructed ones to its reverse, as indicated by many children’s stated preference for intelligent robotic “beings” over real animals as pets. Even the aged sometimes seem to value the predictable behaviour of robotic pets — which don’t die — above that of real pets.

The most interesting area of current artificial intelligence research is undoubtedly that of technological progress towards the construction of human simulations in the guise of robots, and the responses of people to this prospect. Turkle recounts her utter surprise, if not disbelief, at a young woman’s explanation of why she had inquired about the likelihood that a robot lover might be developed in the near future: she would much rather settle for such a robotic companion and lover than for a human boyfriend, given all the sometimes frustrating complications of a relationship with the latter.

And even more perplexing: when Turkle, in an interview with a science journal reporter on the future of love and sexual relations between humans and robots, expressed her doubts about the desirability of human-robot love relationships supplementing (if not replacing) such relationships between humans, she was promptly accused of belonging in the same category as those people who still cannot countenance same-sex marriages. In other words, for this reporter — following David Levy in his book Love and Sex with Robots — it was only a matter of time before we would be able to enter into intimate relationships with robots, and even … marry them if we so wished, and anyone who does not accept this is a kind of “speciesist” bigot.

The reporter evidently agreed wholeheartedly with Levy, who maintains that, although robots are very different (“other”) from humans, this is an advantage, because they would be utterly dependable — unlike humans, they would not cheat, and they would teach humans things about friendship, love and sex that humans could never otherwise imagine. This resonates with the young woman’s sentiments about the preferability of a robot lover to a human one.

Here I should quote Turkle, who articulates the reasons for her misgivings about these developments as follows (pp. 5-6):

“I am a psychoanalytically trained psychologist. Both by temperament and profession, I place high value on relationships of intimacy and authenticity. Granting that an AI might develop its own origami of lovemaking positions, I am troubled by the idea of seeking intimacy with a machine that has no feelings, can have no feelings, and is really just a clever collection of ‘as if’ performances, behaving as if it cared, as if it understood us. Authenticity, for me, follows from the ability to put oneself in the place of another, to relate to the other because of a shared store of human experiences: we are born, have families, and know loss and the reality of death. A robot, however sophisticated, is patently out of this loop… The virtue of Levy’s bold position is that it forces reflection: What kinds of relationships with robots are possible, or ethical? What does it mean to love a robot? As I read Love and Sex, my feelings on these matters were clear. A love relationship involves coming to savor the surprises and the rough patches of looking at the world from another’s point of view, shaped by history, biology, trauma, and joy. Computers and robots do not have these experiences to share. We look at mass media and worry about our culture being intellectually ‘dumbed down’. Love and Sex seems to celebrate an emotional dumbing down, a wilful turning away from the complexities of human partnerships — the inauthentic as a new aesthetic.”

Do Turkle’s misgivings reflect those of most reflective people? My guess would be that they probably do, but I am also willing to bet that these attitudes are changing, and will change on a larger scale, as more robotic beings enter our lives. Her experience with an elderly woman, whose relationship with her son had been severed and who had acquired a robot “pet”, seems to me telling here. While the woman was talking to Turkle, she was stroking the electronic device, fashioned like an animal, which looked at her and emitted a purring sound, to her evident reassurance. It was, to use Turkle’s words, “performing” a pre-programmed response to the way it was being handled.

This is the crucial thing, in my view: people judge others — not only robotic devices, as in this case, but other people (and animals) too — in terms of “performance”, always assuming that “there is someone home”, and in the vast majority of cases this is probably true. But “performance” is what matters, whether it takes the form of facial expressions, or laughter, or language — we do not have “direct access” to anyone’s inner feelings, although we always assume, by analogy with our own feelings, emotions and anxieties accompanying what we say or show, that this is the case. (This dilemma is related to the philosophical problem of solipsism, or monadism — based on the curious fact that, in a certain sense, no one can step outside of their own immediate experiences to validate the experiences of others.)

And because we are all dependent on linguistic behaviour or some other kind of “performance” as affirmation of the presence of a conscious being commensurate with our own state of being, I am convinced that, when in the presence of a being which “performs” in a way that resembles or imitates the behaviour of other human beings, most people would be quite happy to act “as if” this being were a true human simulation (whether there is someone “home” or not).

What is in store for human beings in the future, in the light of these startling findings by Sherry Turkle? One thing seems certain: the way in which technological devices are judged is changing, to the point where they are deemed worthy substitutes for other people in human relationships. And this should give us pause.

(For a more thoroughgoing treatment of the question concerning the requirements that have to be satisfied for robotic beings to count as true simulations of human beings, see my paper, “When robots would really be human simulacra: Love and the ethical in Spielberg’s AI and Proyas’s I, Robot”, Film-Philosophy, September 2008.)
