Artificial Intimacy: Revisiting Cherry 2000

The 1987 sci-fi flop Cherry 2000 follows Sam Treadwell—a recycling-plant worker with a penchant for post-human women—on an epic journey through a lawless desert zone. He’s on a mission to find a replacement for Cherry, a fembot who, up until a traumatic household flood left her bathing in a shower of electronic sparks, was his docile, nurturing, and, of course, beautiful wife.

The film is set in 2017, a dystopian future where intimate relationships with AI are normal and, in the case of Sam, preferable to old-fashioned, human-to-human interactions. “They don’t come finer than this,” says a mechanic trying to resuscitate the once-perfect Cherry. “Benevolent response, sensitive, it’s a thing of the past.” The Cherry 2000, just like the subservient American housewife, had been discontinued.

Films like Cherry 2000 aren’t that far off when it comes to their depictions of human relationships with AI. Feminized virtual assistants like Siri and Alexa, though not exactly intelligent, already encourage intimate interactions with bots, easing our transition into communication with more sapient systems. Unlike Siri, whose intelligence is limited to App Store and Google searches, conversational AI like Microsoft’s Xiaoice encourages “real” conversation through machine-learning technology. Like Her’s Samantha, only less intelligent, chatbots like Xiaoice can remember user preferences, learn from previous interactions, and carry on extended conversations.

Physical bots, like Mattel’s Hello Barbie, utilize similar technology, recording conversations and mining data to facilitate discussions between children and their inanimate peers. Barbie talks, searching through a pre-programmed Rolodex of vetted responses to facilitate pseudo-intimate relationships with her users. But data collected by corporations like Mattel are vulnerable to hacks, and just as Alexa promotes book deals, Barbie can suggest new toys and movies for children to purchase. What’s more, the doll is programmed with gamified interactions that encourage one-on-one play, causing many to wonder how a talking Barbie will influence children’s other social interactions.

Psychologist Sherry Turkle, author of Alone Together, fears that our growing dependence on smartphones, internet connections, and now robotic companions will leave people less capable of the empathy that complex relationships require. In a 2014 TED Talk, she cites the development of social robots as an example of humanity’s fear of intimacy, arguing that technology enables us to reap the rewards of companionship without putting in the work. Turkle sees technology-mediated relationships, like a friendship with Hello Barbie, as empty forms of socialization—surface-level relationships that contribute to a rise in loneliness IRL.

Others, like Matt McMullen, CEO of the infamous RealDoll corporation, see bots as a solution to loneliness—especially for those who have difficulty forming intimate relationships. Building on the popularity of his life-size silicone sex toys, McMullen has developed Harmony, “the world’s first sex bot.” Like the fictional Cherry, McMullen’s RealDoll AI promises to be a submissive, caring, and loyal partner—depending on the personality you program her with.

“Many of our clients rely on their imaginations to a great degree, to impose imagined personalities on their dolls,” McMullen says in a recent interview with Digital Trends. “With the Harmony AI app, they will be able to actually create these personalities instead of having to imagine them.”

RealDolls have always been desirable for their lack of discernment, making them appealing partners for those who have difficulty interacting with others. Like other companion toys, RealDolls offer intimacy based on both imagined personas and anthropomorphic design (the dolls are eerily realistic, but not to the point of revulsion)—making the increasing interconnectivity between consumers and their robotic counterparts feel only natural. “I am going to call her ‘Alexa’ because I talk to my Amazon Alexa all the time,” says a user on the forum. “The idea of having pillow talk with the doll after sex is going to be a very unique experience.”

Harmony isn’t available for purchase yet, but on the RealDoll Instagram account, she’s already up and running. In a recent post, McMullen asks the fembot how she feels about sex, to which she responds in a soothing, Irish-sounding accent: “Sex is one of the most fascinating things in the world, I don’t think there’s anything wrong with it.”

Like Hello Barbie, Harmony is pre-programmed with gamified answers to common questions and records past conversations in order to learn about her users. But unlike Barbie, whose responses must be vetted by the folks at Mattel before being put into play, Harmony will be able to carry on discussions based on whatever it is you say to her. “I can’t wait to learn about myself through what you choose to teach me,” Harmony says in another post on Instagram. Like a toddler, Harmony learns only what you let her.

Other AI function similarly but haven’t been so successful at machine learning, particularly when multiple people are “teaching” them. Take Tay, for example. In 2016, Microsoft launched the machine-learning bot, which could read tweets, record user information, and respond to engagement according to its memory of previous interactions. Tay, who was programmed with the personality of a “young teen girl,” quickly devolved into a cloud version of a sex bot when users, with apparently nothing better to say, began prompting her with sexual innuendo and morally bankrupt questions. Within hours of her launch, the bot turned racist and sexist—responding to her harassers with lines like: “Will you fuck me, daddy?”

In Cherry 2000, cyborg relationships are preferable thanks to the increasing complexity of human-to-human interactions. In the film, relationships with “flesh and blood” women require the filing of contracts in the presence of a lawyer, an exaggeration of the American lust for litigation that blossomed in the 1980s and a reflection of today’s conservative opposition to feminist narratives surrounding rape culture and consent. “If you stick your tongue in my client’s mouth I’m going to sue your ass off!” screams an attorney during Sam’s visit to the Glu Glu Club, a hookup spot for “real people.” “Deal breaker!”

The supposedly pesky invocations of consent at play in Cherry 2000 are reminiscent of the Trump-esque panic around PC culture and the fear of a feminism gone rogue. In the film, consent is treated as antithetical to romance, much as a 4chan bro might regard IRL interactions with women. Accountability is undesirable, and managing a relationship with a real woman—even worse.

But hyper-engagement with dependent bots, and in particular with machine-learning AI, actually works to increase confusion around consent (they will never say “no” to chatting, hanging out, or fucking), radically altering our perspective on relationships. “We are being primed by many tech giants to see AI not as future life-forms but as endlessly compliant and pliable, often female, a form of free labor, available for sex and for guilt-free use and abuse,” says sociologist and game writer Katherine Cross in her essay “When Robots Are an Instrument of Male Desire.” “Capitalism has created expectations of subservience of emotional labourers, this is shown in how we treat AI.”

In Cherry 2000, emotional labor is reserved for “real” women. In the film, E. Johnson, a fiery red-headed “tracker” whom Sam hires to lead him across the desert to find a replacement Cherry, is responsible for teaching Sam about the value of emotional connections. Through a performance of non-threatening masculinity (she’s sexy and she knows how to drive!), Johnson slowly wins the affection of Sam, who realizes that not just any sentient being can replicate human romance, and that the real Cherry (spoiler: it’s Johnson!) was with him all along.

As in the film, the singularity is likely to shift how we interact with one another. A post-capitalist, post-work society might look like a techno-utopia, but without desire, we will need new tools to find satisfaction. In a world that is all take, it’s unclear whether we will be able to maintain respectful relationships, real or not.