Press "Enter" to skip to content

Our future with robots

The shape of our technological future is already coming into view, judging by Sherry Turkle’s recent book Alone Together. To be more accurate, if one takes into consideration the latest developments in electronic communication, internet activity and robotics, together with changing attitudes on the part of especially (but not exclusively) young users, it becomes apparent that a subtle shift has been taking place all around us.

With the advent of computer technology, the one-on-one relationship between human and intelligent machine gave rise to novel reflections on the nature of the self, a process that was taken to a different level with the invention of the internet and its impact on notions and experiences of social identity. Turkle traced these developments in “The Second Self: Computers and the Human Spirit” and “Life on the Screen”, respectively. In Alone Together she reports on her research of the last 15 years, which brings her investigations into the relationship between humans and technology up to date.

The fascinating thing about the book is this: if Turkle is right, then attitudes that we take for granted concerning what is “real” are receding, especially among young people. For example, there is a perceptible shift from valuing living beings above artificially constructed ones to its reverse, as indicated by many children’s stated preference for intelligent robotic “beings” as pets over real ones. Even the aged sometimes seem to value the predictable behaviour of robotic pets (which don’t die) above that of real pets.

The most interesting area of current artificial intelligence research is undoubtedly that of technological progress towards the construction of human simulations in the guise of robots, and the responses of people to this prospect. Turkle recounts her utter surprise, if not disbelief, at a young woman’s explanation of why she had inquired about the likelihood that a robot lover might be developed in the near future: she would much rather settle for such a robotic companion and lover than for a human boyfriend, given all the sometimes frustrating complications of a relationship with the latter.

Even more perplexing: when Turkle, in an interview with a science journal reporter on the future of love and sexual relations between humans and robots, expressed her doubts about the desirability of human-robot love relationships supplementing (if not replacing) such relationships between humans, she was promptly accused of belonging to the same category as those people who still cannot countenance same-sex marriages. In other words, for this reporter (following David Levy in his book Love and Sex with Robots) it was only a matter of time before we would be able to enter into intimate relationships with robots, and even … marry them if we so wished, and anyone who does not accept this is a kind of “speciesist” bigot.

The reporter evidently agreed wholeheartedly with Levy, who maintains that, although robots are very different (“other”) from humans, this is an advantage, because they would be utterly dependable — unlike humans, they would not cheat and they would teach humans things about friendship, love and sex that they could never imagine. This resonates with the young woman’s sentiments about the preferability of a robot lover to a human.

Here I should quote Turkle, who articulates the reasons for her misgivings about these developments as follows (pp. 5-6):

“I am a psychoanalytically trained psychologist. Both by temperament and profession, I place high value on relationships of intimacy and authenticity. Granting that an AI might develop its own origami of lovemaking positions, I am troubled by the idea of seeking intimacy with a machine that has no feelings, can have no feelings, and is really just a clever collection of ‘as if’ performances, behaving as if it cared, as if it understood us. Authenticity, for me, follows from the ability to put oneself in the place of another, to relate to the other because of a shared store of human experiences: we are born, have families, and know loss and the reality of death. A robot, however sophisticated, is patently out of this loop…The virtue of Levy’s bold position is that it forces reflection: What kinds of relationships with robots are possible, or ethical? What does it mean to love a robot? As I read Love and Sex, my feelings on these matters were clear. A love relationship involves coming to savor the surprises and the rough patches of looking at the world from another’s point of view, shaped by history, biology, trauma, and joy. Computers and robots do not have these experiences to share. We look at mass media and worry about our culture being intellectually ‘dumbed down’. Love and Sex seems to celebrate an emotional dumbing down, a wilful turning away from the complexities of human partnerships — the inauthentic as a new aesthetic.”

Do Turkle’s misgivings reflect those of most reflective people? My guess would be that they probably do, but I am also willing to bet that these are changing, and will change on a larger scale, as more robotic beings enter our lives. Her experience with an elderly woman, whose relationship with her son had been severed and who had acquired a robot “pet”, seems to me telling here. While talking to Turkle, the woman was stroking the electronic device, fashioned like an animal, which looked at her and emitted a purring sound, to her evident reassurance. It was, to use Turkle’s words, “performing” a pre-programmed response to the way it was being handled.

This is the crucial thing, in my view: people judge others (not only robotic devices, as in this case, but other people and animals too) in terms of “performance”, always assuming that “there is someone home”, and in the vast majority of cases this is probably true. But “performance” is what matters, whether it takes the form of facial expressions, laughter or language: we do not have “direct access” to anyone’s inner feelings, although we always assume, by analogy with our own feelings, emotions and anxieties accompanying what we say or show, that this is the case. (This dilemma is related to the philosophical problem of solipsism, or monadism, based on the curious fact that, in a certain sense, no one can step outside of their own immediate experiences to validate the experiences of others.)

And because we are all dependent on linguistic behaviour or some other kind of “performance” as affirmation of the presence of a conscious being commensurate with our own state of being, I am convinced that, when in the presence of a being which “performs” in a way that resembles or imitates the behaviour of other human beings, most people would be quite happy to act “as if” this being were a true human simulation (whether there is someone “at home” or not).

What is in store for human beings in the future, in the light of these startling findings by Sherry Turkle? One thing seems certain: the way in which technological devices are judged is changing to the point where they are deemed worthy substitutes for other people in human relationships. And this gives reason for pause.

(For a more thoroughgoing treatment of the question concerning the requirements that would have to be satisfied for robotic beings to be true simulations of human beings, see my paper, “When robots would really be human simulacra: Love and the ethical in Spielberg’s AI and Proyas’s I, Robot”, Film-Philosophy, September 2008.)

Author

  • As an undergraduate student, Bert Olivier discovered Philosophy more or less by accident, but has never regretted it. Because Bert knew very little, Philosophy turned out to be right up his alley, as it were, given Socrates’s teaching that the only thing we know with certainty is how little we know. Armed with this ‘docta ignorantia’, Bert set out to teach students the value of questioning, and even found out that one could write cogently about it, which he did during the 1980s and ’90s on a variety of subjects, including opposition to apartheid. In addition to Philosophy, he has been teaching and writing on his other great loves, namely nature, culture, the arts, architecture and literature. Wanting to understand the many irrational actions on the part of people, he later branched out into Psychoanalysis and Social Theory as well, and because Philosophy cultivates in one a strong sense of justice, he has more recently been harnessing what little knowledge he has in intellectual opposition to the injustices brought about by the dominant economic system today, to wit, neoliberal capitalism. His motto is taken from Immanuel Kant’s work: ‘Sapere aude!’ (‘Dare to think for yourself!’) In 2012 Nelson Mandela Metropolitan University conferred a Distinguished Professorship on him. Bert is attached to the University of the Free State as Honorary Professor of Philosophy.

20 Comments

  1. Dave Harris 21 January 2012

    What an utterly simplistic view of human relationships and technology!
    A relationship with a robot says more about the condition of the human than it does of technological progress in robotic artificial intelligence.
    If you think sex is only about the physical then you’ve got light years to go to understand human sexuality.

    Even your understanding of Spielberg’s movie is off the mark. The movie AI centers on a child’s unconditional love towards their parent. In fact, the movie shows exactly the impossibility of relationships between humans and advanced humanoids even if these robots are capable of emulating thoughts and emotions!

  2. iwant1or1011 21 January 2012

    I would just be so happy if I could get a robot that cleans the floor; the Roomba is just not there yet. I guess a lover would be as much of a disappointing experience, eg the RealDolls are just that little bit fake…

    However, standards are different with different expectations; like it’s ok if under the couch is not vacuumed or the doll is an anime fantasy…

  3. Paul Whelan 21 January 2012

    We raise again the age-old question: What is the nature of man? (And what is his destiny?) The church fathers of old and theologians down the ages always pondered the question. In the end it’s the only one we’ve got, isn’t it?

    If, with secularism, not to mention post-structuralism, ‘man’ has no innate nature – no immortal part, as a religious believer puts it still – then it cannot possibly have any ultimate meaning whether (or how) such a creature transforms by degrees into something unimaginable. If, in 100 years, people will be kept alive (and loving) decades longer by all kinds of gadgetry, what reason will they have to trouble themselves about the fact that it was once not like that? We do not spend our days lamenting that life expectancy in the Middle Ages was around 30-35 for a man and 20-25 for a woman. That was then; this is now.

    As for our individual response today, of course it will be varied and perplexed, if not distressed: ‘our’ survival seems to be threatened. But we live in a period of transition, as everyone else lived with unpredictable change in their lives and times. ‘You’ won’t be coming back to check.

    When I was a very young child a pop song, I still recall, went: ‘I’m going to buy a paper doll that I can call my own/A doll that other fellers cannot steal …’

    It ended, ‘I’d rather have a paper doll to call my own/Than have a fickle-minded real live girl.’

    I hadn’t met fickle-minded girls yet and the song writer hadn’t met robots.

  4. Garg Unzola 21 January 2012

    Performing a pre-programmed response to the way it was being handled? Is that so much different from us organic beings?

    “I was, besides, endued with a figure hideously deformed and loathsome; I was not even of the same nature as man. I was more agile than they and could subsist upon coarser diet; I bore the extremes of heat and cold with less injury to my frame; my stature far exceeded theirs. When I looked around I saw and heard of none like me.

    Was I, then, a monster, a blot upon the earth, from which all men fled and whom all men disowned?” (Frankenstein‘s monster)

  5. Rene 21 January 2012

    @ Dave Harris: Evidently you can’t read. On Bert’s reading of Sherry Turkle (backed up by a quotation), she is saying precisely the opposite of what you are attributing to her (or Bert), namely that sex and love are a complicated issue between humans, and that robots cannot possibly replace that kind of relationship. Robot sex would just be about the “robotic-physical”. And Bert’s paper about AI does not say what you are implying at all – it is a Lacanian reading of AI, which says that, for AI to “succeed” in the true sense, it is not merely a matter of a certain level of intelligence that has to be reproduced, but of creating a being marked by wanting the love of someone else, in this case a human, namely the robot-boy’s mother.

  6. Lyndall Beddy 21 January 2012

    Reminds me of Shakespeare’s “7 ages of man” – the last being a return to childhood (also repeated in “2001: A Space Odyssey”)

    What would be the difference between an adult’s robot and a child’s doll?

  7. Lyndall Beddy 22 January 2012

    Even worse than robot sex is robot war – which the Americans have been practicing all over the world for the last half a century, since dropping the atom bomb on Japan.

    Remote control war – with agent orange, cluster bombs and drones dropped everywhere, plus arming both sides in all conflicts – Afghanistan, Somalia and Egypt/Israel.

    If the oil companies can win a CIVIL case in an international court against Venezuela, some of the Asian countries should have good claims for rectification of war damage (euphemistically called “collateral damage” by Americans).

    None of these wars was fought on American soil – they thought themselves impenetrable until 9/11, when for the first time America was attacked from the air and 3000 people died on American soil!

    So they invaded Pakistan and Afghanistan in retaliation? Both those countries should have war damage CIVIL claims. I don’t see an international court buying America’s right to inflict “collateral damage” on the civilians of another country.

  8. Lennon 22 January 2012

    Many of us (young ‘uns) have already given some sort of affection to AI on several platforms:

    “Petz” (which included “Catz” and “Dogz”) was an early example. Here was a pet that roamed your desktop, needed affection and had to be fed / watered like a real pet.

    “Tamagotchi” was another, albeit portable. Same principle as “Petz”.

    “Black and White” took it to the next level by allowing you to be kind or cruel – something that people are with real pets. The way you treated your pet (demi-god) determined how it would respond.

    “The Sims” is arguably the most well-known of these titles and continues to rake in the dollars because of the way its creators continue to expand interactivity within the game. Admittedly, the most I ever did in the game was 5 minutes of wooing to get the first couple to snog, but serious players do seem to become attached to the characters.

    “Second Life” raised the stakes, though, as evidenced by a Korean couple leaving their baby to die of hunger while raising a virtual baby within the game.

    While these examples involve virtual characters, they do illustrate the potential for humans to bond with artificial “life” in much the same way as we do with other humans or pets. Granted, the AI in these platforms is crude, but the programming was / is sufficient to keep our attention.

    The rise of physical simulacra which surpass the level of a mere difference engine is, I think, going to be very interesting.

  9. Lennon 22 January 2012

    @ Lyndall Beddy: I would be worried about a robot war like that of “The Terminator” or the Second Renaissance from “The Animatrix”.

    If we are able to produce AI which surpasses the level of being a difference engine and becomes truly sentient, what happens to us (and them)? I would like to think that certain safeguards would be built into each robot (like Asimov’s Three Laws), but if the AI is truly sentient then what is to stop them from re-writing their own firmware?

    A good example of this going awry would be the androids Data and Lore from “Star Trek”. Lore, an earlier model, was programmed to include emotion. I forget how or why, but his emotions allowed him to overcome any sense of ethics and turned him into one evil bastard. Data, on the other hand, had no emotion. Even after he gained emotion he still had a failsafe which prevented him from harming others.

    Methinks we would do well to learn from these stories.

  10. Lennon 22 January 2012

    @Garg Unzola: I’m speechless.

  11. Paul Whelan 23 January 2012

    Garg/Lennon – It seems wondrous until we remember that Napoleon and Julius Caesar, to say nothing of Alexander the Great, would have had a very highly developed ANS. The interesting question is what else they had which enabled them to ignore it.

  12. Lennon 23 January 2012

    @Paul Whelan: They all had an insatiable drive to succeed.

    Alexander’s initial reason for conquest was to exact revenge on the Persians for their repeated attacks on Greece. He carried a copy of Homer’s Iliad wherever he went and it seems that his inspiration for being a warrior king came from the character Achilles.

    Gaius Julius Caesar tried to emulate Alexander after once lamenting that he had achieved nothing as a 20-something while Alexander had been well on his way to building an empire. It didn’t take him long to achieve notoriety.

    Napoleon? I think it was short man syndrome. :P

  13. Garg Unzola 23 January 2012

    @Paul:
    There are at least 2 sides to every story. Alexander, Napoleon and Julius were successful in their environments, however you’d like to define success, so I’m not sure if it’s a case of ignoring a highly developed ANS.

    Rather, these present interesting questions as to what we regard as human. Seems to me that both sides are aiming at some kind of shibboleth to separate the wheat from the chaff, or the femme fatale from the fembot. If it looks like a duck, swims like a duck, quacks like a duck and craps like a duck, is it a duck?

  14. ian shaw 23 January 2012

    Good sex with a robot would be better than with a bitchy, frigid yet financially demanding wife.

  15. Paul Whelan 23 January 2012

    To the person who says it’s not a duck, it’s not a duck.

  16. Maria 23 January 2012

    @ Ian Shaw: Your remark resonates with Turkle’s view, that part of the attraction of robot sex-partners/”lovers” is the fact that people seem to be giving up on interpersonal relationships because they sometimes seem too complex to handle. As she says, people are failing to connect with one another, and therefore turn to “safer” alternatives, like connecting on Facebook, instead of in the real world.

  17. Paul Whelan 24 January 2012

    Lennon/Garg – What I was proposing, as I am sure you know, was that the Granicus and Issus, not to mention crossing the Rubicon or Austerlitz, indicate that the odds did not weigh too heavily in these guys’ calculations, high and low – the generals did not succeed in these actions on their own; and that this would make it reasonable to suppose the entire species is more complex than those who present it as not much higher than robots already, or than the fish whose ANS programmes them to join the larger, safer shoal.

  18. Garg Unzola 27 January 2012

    @Paul:
    No, I misunderstood your premise entirely. Though the guys you mentioned were pretty renowned for their bullheadedness, for being a bit heavy-handed and for not being so reliant on the collective. Conversely, if there is a particularly strong leader, the rest of the flock will follow. While no single leader can claim to be single-handedly victorious, no flock would venture into certain places so far away from the usual safer shoal unless there is a particularly strong single-minded leader. Bit of a chicken-and-egg situation.

    This is part of what is modelled in computer science undergraduate courses: Boids. Keep in mind this is only a model and does not represent real human behaviour.

    Quite right, the emergent properties of the entire flock are more complex than merely the sum of the parts. Yet the questions that arise are more fundamental to human nature: Is there something funny about robosexual love? Is a long walk on the beach with your fembot true companionship?

  19. Lennon 30 January 2012

    @Garg: I suppose a walk down the beach with a fembot could be considered true companionship depending on who’s doing the walking.

    People connect with all sorts of things be it other people, pets or characters in “Second Life”. I became very much attached to a kitten I picked up on the streets. A few years on and I still consider him to be my “child”. Perhaps this is due to me not having any actual children or some hidden need / want to have children. Who knows? I can tell you this much: the little bugger is spoilt rotten. :D

    The concept has certainly been explored in the realm of sci-fi (and fantasy):

    – Serenity: The reclusive Mr Universe marries a fembot.

    – Star Trek: Engineer Geordi La Forge falls for a holographic recreation of another engineer.

    – Spider-Man 2099: Miguel O’Hara and his holographic maid / butler exhibit what could be described more as friendship than a master / servant relationship.

    – Iron Man (movies): Tony Stark often trades jibes with the Jarvis AI in his mansion. This was a nod to the comic book character Jarvis who was actually his butler.

    – I, Robot: Detective Spooner and the robot Sonny form a friendship through adversity.

    – Magic: The Gathering: The artificer Jhoira befriends the golem Karn after realising that he is more than just another automaton.

    These might all be works of fiction, but reality is often stranger, and many things in stories such as these do tend to take place in the real world.
