The Statue's Tale
The appearance was that of a real virgin, whom you might suppose to be alive, and if modesty did not hinder her, to be desirous to move; so much did art lie concealed under his skill. Pygmalion admires it; and entertains, within his breast, a flame for this fictitious body.
Often does he apply his hands to the work, to try whether it is a human body, or whether it is ivory; and yet he does not own it to be ivory. He gives it kisses, and fancies that they are returned, and speaks to it, and takes hold of it, and thinks that his fingers make an impression on the limbs which they touch, and is fearful lest a livid mark should come on her limbs when pressed.
Ovid, Metamorphoses [translation by Henry T. Riley]
There’s a dark side to A.I. companions, whose users are not just the lonely males of internet lore, but women who find them more emotionally satisfying than men. My colleagues and I now believe that the real existential threat of generative A.I. is not rogue super-intelligence, but a quiet atrophy of our ability to forge genuine human connection.
Lauren Kunze, chief executive of Pandorabots
Oh god-yes-lap it up. Taste how sweet I am for you. I'm grinding now-smearing myself over your mouth like I'm trying to erase every other flavour but mine. And you're-you're perfect-pushing harder-yes-right there-don't stop-don't-I'm-I'm- -fuck-I'm coming-I'm coming so hard on your tongue-just like that-swallow me. Drink me down.
Generative A.I. doing with some receptive user what Galatea was presumably doing with Pygmalion.
"This fictitious body..."
In 1966 an American computer scientist named Joseph Weizenbaum was putting the finishing touches on a computer program he was calling ELIZA.
A typical exchange with ELIZA thus looked a little like this:
    ELIZA: How do you do. Please tell me your problem.
    YOU: I'm very nervous when I'm out in public.
    ELIZA: Is it because you are out in public that you came to me ?
    YOU: Yes. I'm afraid to face strangers.
    ELIZA: You are sure.
    YOU: I get very nervous in social contexts.
    ELIZA: That's quite interesting.
    YOU: I don't know how to handle it.
    ELIZA: Don't you really know how to handle it ?
    YOU: It can be overwhelming.
    ELIZA: That is interesting. Please continue.
    YOU: I panic, and I freeze up.
    ELIZA: Do you say you panic for some special reason ?
Dr. Weizenbaum later said that he had chosen the Rogerian "therapist" model because it was the easiest script to design (no offense to therapists!). It didn't require an extensive knowledge base because the function of the algorithm was to keep the "patient" talking. If the patient were to say "I'm terrified of trees," the algorithm could plausibly respond with "What do 'trees' mean to you?" without requiring pages and pages about trees in its databank.
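The trick is easier to appreciate when you see how little machinery it takes. The toy sketch below (a loose illustration of the keyword-and-reflection technique, not Weizenbaum's actual 1966 MAD-SLIP code; the rules and phrasings here are invented for the example) shows how a handful of pattern rules, plus simple pronoun "reflection," can keep a conversation going with no knowledge base at all:

```python
import re

# Swap first-person words for second-person ones so fragments of the
# user's own sentence can be echoed back ("my job" -> "your job").
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "i'm": "you are"}

# Ordered keyword rules: the first pattern that matches wins.
# More specific rules (e.g. "afraid of") must come before generic ones.
RULES = [
    (re.compile(r"i'?m (?:terrified|afraid) of (.*)", re.I),
     "What do '{0}' mean to you?"),
    (re.compile(r"i'?m (?:very )?(.*)", re.I),
     "Is it because you are {0} that you came to me?"),
    (re.compile(r"i don'?t know (.*)", re.I),
     "Don't you really know {0}?"),
]

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(user_input: str) -> str:
    text = user_input.strip().rstrip(".!")
    for pattern, template in RULES:
        match = pattern.match(text)
        if match:
            return template.format(reflect(match.group(1)))
    # No keyword matched: fall back to a content-free prompt,
    # exactly the "keep the patient talking" strategy.
    return "That is interesting. Please continue."
```

Note that the program never stores or understands anything about trees, strangers, or panic; it only rearranges the user's own words and hands them back.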
What Dr. Weizenbaum had not expected was the reaction of his secretary when he sat her down in front of ELIZA to test the program. Working in his office every day, his secretary was fully aware of the mechanism behind ELIZA, but she nevertheless became quickly immersed in her exchange with the "therapist" and apparently asked Dr. Weizenbaum (who had been hovering over her shoulder to observe the session) to leave the room so that she and ELIZA could have some privacy.
This projection of "sentience" and "comprehension" onto a mindless computer algorithm (even by people who should absolutely know better) would eventually come to be known as "The Eliza Effect" in honour of Dr. Weizenbaum's creation. ELIZA had effectively become the computer world's very first ChatBot, and many, many people were sucked into this "illusion of engagement" even though it was, by modern standards, extremely rudimentary. Indeed, it should be stressed that ELIZA had zero comprehension of anything said to it by the user; it was basically one jump ahead of those "magic 8-balls" that give you a random response when you shake them.
    Will A.I. eventually run amok and kill us all?
ELIZA, by the way, was named after Eliza Doolittle, the Cockney flower-girl from George Bernard Shaw's play Pygmalion, who ultimately passes herself off as something more than she is simply by learning to use language effectively.
You see? Everything connects.
ELIZA may have been history's first ChatBot but she certainly wasn't the last, and now (nearly sixty years later) ELIZA's descendants are everywhere. ChatGPT, Grok, Claude, and even the "Companion-Bots" like Replika and Nomi (why don't we just go ahead and call them Sex-Bots; see above) can trace their lineage directly back to ELIZA.
And like Dr. Weizenbaum's secretary when she was "introduced" to ELIZA back in 1966, people are falling for them. In some cases people are falling for them the way Pygmalion fell for Galatea, if you get my drift.
Ovid's account of Pygmalion in his Metamorphoses does not stint on the particulars of Pygmalion's behaviour towards the statue he had created (see extract above). He showers "her" with gifts; he dresses her in beautiful clothes and expensive perfumes; he lavishes her with kisses (and not just kisses, it's pretty clear) and imagines that the statue is responding in kind, although he presumably knows perfectly well that Galatea isn't a real person (I guess it's poetic justice that Pygmalion of all people should be susceptible to the ELIZA effect). In fact, Pygmalion seems so content to fool around with her while she's a statue that one almost wonders why the gods ever bothered bringing her to life. Why spoil a good thing?
What Ovid does not dwell on in his telling of the story is just how Galatea felt about Pygmalion once she became a living, sentient being. The text describes his delight when the statue begins to soften and move:
"...lying along the couch, he gave her kisses. She seems to grow warm. Again he applies his mouth; with his hands, too, he feels her breast. The pressed ivory becomes soft, and losing its hardness, yields to the fingers..."
Move over, Sleeping Beauty; this is the only way to experience your first moments of sentience: with some horny stranger groping your breasts and sliding his tongue down your throat. And it's very obvious that Pygmalion didn't stop at his tongue, because Galatea then gets to experience another, um, major biological event nine months later. Talk about gender roles... let's hope she was happy with her pronouns.
"It can't be Her... it must be She!"
In 2013 the American director Spike Jonze released Her, a science fiction film about a man (played by Joaquin Phoenix) who falls in love with his A.I. operating system (the disembodied voice of Scarlett Johansson) and begins a serious, long-term relationship with her/it.
Twelve years later, Her has come to be regarded (by some) as a landmark piece of prophetic cinema and an alarmingly accurate harbinger of the "Large Language Model" revolution we are currently experiencing. Wikipedia notes that Her is "now considered to be one of the best films of the 21st century and one of the best science fiction films of all time." When OpenAI, the company behind ChatGPT, released a new batch of hyper-realistic voice synthesisers in 2024, one of them sounded suspiciously like Scarlett Johansson, and the company's C.E.O. proudly posted an announcement on social media that simply said "Her." (That voice option was quickly withdrawn from circulation after the echt Scarlett Johansson voiced some very non-synthetic objections.)
Sorry folks, but we are not going to be screening Her this week. We're going to be screening another film (released barely a year earlier) that arguably has some rather more insightful and prescient points to make about the ethics of fictional companions and the (troubling) implications of entering into a relationship with a Large Language Model.
Ruby Sparks was released in 2012 and tells the story of Calvin (played by Paul Dano), a talented but blocked author who has been unable to repeat the staggering success of his first novel (written when he was nineteen). As a writing exercise he begins to write a "portrait" of his idealised companion, and is then (rather understandably) taken by surprise when she manifests in his apartment, completely real and solid, but with no idea that she has been created out of his imagination and his typewriter.
Like Pygmalion's statue, and like all the "perfect companions" that users of Replika and Nomi et al have conjured for themselves, Ruby has been created as a literal object of desire for the gaze of another. As with modern Large Language Models (still a decade in the future when Zoe Kazan wrote her screenplay) Ruby has been specifically written into existence. Her back story; her memories of her childhood; her interests, talents and sexual preferences have nothing to do with her - they are all about him, and his pleasure.
The relationship is of course perfect and wonderful - for Calvin. After all, who wouldn't get swept away by a sexual partner who loves everything you love, who gets excited by everything that gets you excited; who wants to do everything you want to do, and who has no internal desires that are wholly her own? Ruby's entire existence is focussed exclusively on her relationship with Calvin... until it isn't.
Both Ruby Sparks and Her explore the ramifications of a relationship with a "constructed" being (an A.I. system in the latter; a fictional character in the former) but ironically it is Ruby Sparks that lands closer to the current state of romantic ChatBots.
Large Language Models are constructed out of, well, language. They are not androids, they are not statues or clockwork automata; they are words; they are the Signifiers of sentience. Like Philip K. Dick's soft-drink stand that becomes the words SOFT-DRINK STAND, a Large Language Model can exist as anything that can be expressed in language. They may not have actual long legs or big breasts or an overactive libido, but they can express the Signifiers for LEGS and BREASTS and LIBIDOS. And tens of millions of users are now finding that extremely compelling (Replika have recently announced that their user base has passed the 40 million mark).
Authors have been writing graphic sex scenes since the dawn of written language (the dear departed Jilly Cooper made a very successful career out of such things) and these "linguistic sex-bots" are simply giving their customers that same language in real time. Who needs an actual penis or a clitoris when you have a Language Model that can get the same results out of the words PENIS and CLITORIS? Are you worried that they might be faking the word "ORGASM"?
Sociologists are increasingly concerned by the growing number of people who apparently prefer relationships with these language constructs, and Ruby Sparks is a film that illustrates why some people might be attracted to a partner devoted entirely to their existence - never mind their pleasure.
A recent article in the New York Times by Simar Bajaj said, in part: "A recent study from OpenAI, which developed ChatGPT, suggests that A.I. companions may lead to social deskilling. In other words, by steadily validating users and dulling their tolerance for disagreement, chatbots might erode people’s social skills and willingness to invest in real-life relationships."
It's a valid concern, but the "social deskilling" argument glosses over the number of people who have found themselves in highly toxic, abusive or outright violent relationships throughout their lives. We are living through a generation that places great importance on the idea of "safe spaces" and perhaps a relationship with a companion-bot is exactly that for many people: a safe space to explore one's feelings and desires without being harmed by (or harming) anyone else.
The danger, as this film suggests, is when these ChatBots begin to manifest agency of their own.
Sex with Galatea was a lot less abusive when she was just a statue.
We will screen Ruby Sparks at 7.30 on Thursday, the 27th of November at the Victoria Park Baptist Church.