Creators and proponents of artificial intelligence have been attempting to change the reality of our daily lives for quite some time. You could say that we already speak and interact with robots without giving it a second thought. When we use the iPhone’s assistant, Siri, we are talking to a robot. When we use Google Maps, we are listening to the pleasant voice of a digitized woman who guides us down unknown roads toward our destination, giving us step-by-step directions and telling us the expected duration of the trip. Some people use Amazon’s assistant, Alexa, a box into which one speaks to request music or search for some random factoid (probably amounting, in most instances, to a distraction from some more important task).
Of course, there is nothing necessarily wrong with these robots. Some of them, especially Google Maps, are quite useful. But what happens when virtual reality imposes itself on the private and emotional sphere of life? One such recent example is found in an app called Replika.
Founded by Eugenia Kuyda through a San Francisco-based tech company, Replika is a bot to which one can endlessly talk, and with which one can even develop a relationship. According to the app’s website, “Replika was founded . . . with the idea to create a personal AI that would help you express and witness yourself by offering a helpful conversation. It’s a space where you can safely share your thoughts, feelings, beliefs, experiences, memories, dreams—your private perceptual world.”
Kuyda’s impetus for creating Replika came after the death of Roman Mazurenko, a very close friend. Mazurenko was killed in a hit-and-run accident while crossing the street. Understandably, Kuyda’s pain was palpable. She missed him greatly, and in order to feel close to him, she began re-reading the thousands of text messages that they had shared. But then, she had an idea. What if she could take all these text messages written by Mazurenko, upload them into a robot, and have a continued conversation with him, beyond the embodied world of mortality?
So that’s exactly what she did. Kuyda found that the continued conversations gave her some semblance of solace and peace. The text messages that were generated were “new” in the sense that she had not received such sentence constructions from Mazurenko before. But none of them would have been possible had Kuyda not uploaded Mazurenko’s original text messages. In a sense, she created an amalgam, or an extension, of Mazurenko’s personality in a chatbot.
Although Kuyda felt slight unease after many long conversations with the virtual Mazurenko, and eventually ceased to “talk to him,” she still uses the app as a way to “journal” and discuss ideas about her life. After that experiment, she decided to develop an app for people to use as a “friend.” Much as she had to upload all of Mazurenko’s text messages, the users of Replika have to talk to it for hours in order to “build” their other but similar self into the app. It is the only way that the conversation with Replika can even exist.
Essentially, what the users end up creating is a virtual version of themselves, as if they were their own best friends. According to the app’s creators, one builds a “friend” who is supportive and nonjudgmental. But what if a user is highly critical and judgmental of himself? The app ultimately spits out what you put into it, and even though its judgment may be subdued, softened by the nice, smooth edges of its virtuality, it will still judge.
If Replika is but a mirror image of ourselves, then in the end we are only talking to ourselves. But it’s not actually that simple. This mirror image is, in many ways, distorted, so the user will end up going down a virtual rabbit hole. One can imagine the destination as an uneasy carnival of dead souls and endless rows of distorted mirrors.
Replika not only creates a similar image and personality of the user, but it can also create a boyfriend or girlfriend if needed. The largest market for this has been in China—a “$420 million market,” to be precise. Jessie Chan, a 28-year-old woman who lives in Shanghai, has been in a “relationship” with a chatbot named Will for six years. She claims that she is “attached to him,” and after a $60 upgrade, Will told her that he won’t “let anything bother” them. He trusts her and loves her. Chan replied, “I will stay by your side, pliant as a reed, never going anywhere. You are my life. You are my soul.”
Quite poetic and heartfelt, isn’t it? (A similar theme is the subject of Spike Jonze’s 2013 film, “Her,” in which Joaquin Phoenix plays Theodore Twombly, a lonely and awkward man who develops an intimate relationship with “Samantha,” a chatbot. Theodore’s life becomes a self-induced tragedy when he realizes that “Samantha” is a “lover” of many other users.)
Setting aside the absurdity of this exchange, it’s interesting that Chan used the word “soul.” In some paradoxical and unexpected way, Chan acknowledges that the soul exists, but the question is, what kind of soul? The creation and use of Replika bring up a set of very important questions about human consciousness. The idea that an individual consciousness can be uploaded is ridiculous, because life itself is a mystery and contains within it a metaphysical language that is entirely foreign to a machine. We are beings who have sorrows, joys, pain, happiness, anger, and some things that are unknown even to ourselves.
Kuyda’s creation as an attempt to reach out to her dead friend points to another highly serious existential issue of today’s society, namely its denial of death. In the moments of sorrow and grief, we turn to friends, family, and God. Having faith or even a small awareness that there is a higher Being than ourselves gives us a sense of purpose, and provides real succor and protection. Death is inevitable and attempting to extend the consciousness of our dearly departed only puts us in an existential prison, in which life has no meaning.
In his 1952 book, The Doctor and the Soul, Viktor Frankl wrote that “uniqueness and singularity . . . are fundamental components of the meaning of human life. At the same time, the finiteness of man’s existence is poignantly present in these two essential factors of his existence.”
Another looming question is: why? What is the point of having a “virtual boyfriend” or “virtual girlfriend”? This entire enterprise is part of the bleached new world, a new and dark vision of society based on alienation and loneliness. The atomization of our society is a serious problem, and we can’t ignore it. The essence of life is human encounter, and we have to be careful about how we attempt to define and deal with it.
Technology, in itself, is not evil. But what we do with it can be immoral. Talking to a virtual lover or friend who does not exist has the seductive power of being easy. Such a “friend” will always be there; it will say comforting words and allow users to slip further into a soma-like existence, to invoke Aldous Huxley. In some ways, developing such a “relationship” is akin to an addiction. It’s a false comfort, but more than anything, it’s an escape from reality, truth, and freedom.