I find that my thoughts are sometimes drawn to things for reasons I don’t entirely understand at first. In his beautiful and pungent little book, Drawing Life (now out of print, I’m afraid), David Gelernter described a similar phenomenon he found in himself while convalescing from having been blown up by the Unabomber. He found that, for reasons he only later understood, his thoughts kept being drawn to American culture of the 1930s. In the fog of pain and recovery from his injuries, he found himself devouring anything he could get his hands on related to 1930s Americana. Only later, as he emerged from the physical pain and emotional turmoil of having been permanently injured by a serial killer, did he realize that somewhere in the recesses of his mind he was researching a book he intended to write, but hadn’t yet realized that was what he was doing.
In my own case, I have found myself repeatedly returning, in my thoughts, to a plot device introduced by J.K. Rowling in her Harry Potter series. The “Mirror of Erised” was a magical mirror, housed at Hogwarts School of Witchcraft and Wizardry, which would, the headmaster said, reflect to us the “deepest, most desperate desire of our hearts.” Erised, of course, is the word desire spelled backward.
The danger represented by the mirror lay in its ability to transfix those who looked into it by showing them their deepest desires made manifest. The propensity for being consumed by one’s own desires is so strong that some had squandered their entire lives standing transfixed in front of the mirror.
I’ve written elsewhere about what seems to be the palantir-esque vibe of the information saturation that attends our online experience. But I have come to suspect that I keep returning to the Mirror of Erised because it seems ever more prescient regarding how technology is being intentionally applied to manipulate our desire. It is, alas, hard to avoid the suspicion that manipulating our desire is merely a prelude to eating out our substance, as Thomas Jefferson so memorably complained in the Declaration of Independence. Such suspicion is hardly original to me.
Then it came burning hot into my mind, whatever he said, and however he flattered, when he got me home to his house, he would sell me for a slave. - John Bunyan, Pilgrim's Progress.
Natasha Dow Schüll has written a fascinating, though quite disturbing, book entitled Addiction by Design: Machine Gambling in Las Vegas. In it, she recounts that the explicit goal casinos have for machine gambling design is to entice users to “play to extinction”, by which casinos mean that gamblers will continue to play until their financial resources have been completely exhausted. And the players are conscious of it, much as social media users are conscious of the compulsiveness of their own attraction to social media. As one poker player sadly described how playing machines differs from playing against other human beings: “It’s a more direct way to your destiny.”
It is hard not to suspect, if you know anything at all about the business models of the largest online companies, that these companies will at best be driven by necessity - at worst by malevolence - to adopt casino-like tactics in an effort to utterly exhaust the reservoirs of attention of their users. For it is the attention of others that they covet, because it is the thing for which they are paid. This leads me to believe that we would be wise to be awake to the manipulation of our desires by the attention merchants of social media.
I have long had a Facebook account, though several years ago I stopped posting anything on Facebook myself. Any online posting I have done has been on my personal web site or, more recently, here on Substack. But I retain my Facebook account because there are a handful of groups I participate in which post information only on Facebook (e.g., the school my 3rd grader attends). I do not install social media apps on my phone, so I use only my web browser to access the Facebook groups I read. But I have recently observed that, on my phone and only on my phone, the ads I’m receiving in my newsfeed feature young women in ever-increasing states of undress. I even suspect that at least some of these images are actually being generated by artificial intelligence and are not photographs of actual human women. I have also observed that these ads do not show up when accessing Facebook from my laptop. I conclude from this that even Facebook regards these ads as NSFW, fit to display only within the comparative privacy of a phone screen. My own reaction to this state of affairs has been to stop accessing Facebook from my phone.
What I take from this is that Facebook advertisers have made the decision to be more aggressive in manipulating desire as a way of sustaining attention. Even when I have actively flagged and blocked these advertisers, eventually similar ads start showing up in my newsfeed again.
None of this is particularly surprising. In fact, advertisers have, for generations, preyed upon the gullibility of their targets by using images to plant assumptions in our minds that we would laugh at if they were actually put into words. To offer just one example, there is an obvious and hilarious improbability to the idea that the practice of beer drinking makes men irresistible to women. Nevertheless, large advertising budgets have been spent using visual imagery to impress, unexamined, that very proposition into the minds of many men.
There is an emerging convergence of technologies that we should expect will further empower the effort to manipulate human desire in the service of monopolizing attention.
Consider this interesting video from the robotics company Boston Dynamics.
The combining of AI language models with robotics may seem innocuous when used to facilitate tour-guiding dogs. (As an aside, I note the employment of a British accent for the “dog” in this video. I have written before about my suspicions regarding the underlying motivations for using British accents when vocalizing AI-generated responses.) But I can’t help but notice that the illusion of consciousness is amplified when a language model is integrated with an embodiment, even if that embodiment is merely robotic. We must ask ourselves: how might this kind of technology redirect and manipulate human desire when language models are integrated with, say, increasingly lifelike sex robots?
Already we are seeing the early emergence of AI “girlfriends”. We should expect this technology to be aggressively integrated with “sexual” robotics to produce alternatives that relieve both men and women of the awkward dynamics that invariably attend actual flesh-and-blood relationships.
“But Keith”, you may say, “surely you aren’t suggesting that sex robots are going to show ads to their owners?” Well, never say never. But I will just observe that personalized online ads are fueled by data collected on the ad’s target. Privacy intrusion is the stock-in-trade of social media - smartphones are, like it or not, surveillance devices. And there is perhaps nothing more private than one’s sexuality. It would be prudent to assume that advertisers would like nothing more than to acquire insight into the sexual proclivities of whomever they are targeting with ads. AI-enabled sex robots are likely to be used for smartphone-like surveillance. They are wifi-connected devices packed with sensors and microphones and maybe even cameras. They will likely become veritable gold mines of information in service to human manipulation. To say nothing of the possibilities for sextortion.
The thoroughgoing anti-humanism of this convergence of technologies should be obvious, but I will make two observations that are top-of-mind for me as I contemplate this state of affairs. First, there is the unrelenting pathos which flows from picturing someone sitting alone at home, conversing with, or having sex with, a robot. It reflects a dismal and dehumanized kind of life, and it is sobering to contemplate that companies are nevertheless eagerly working away on technology to facilitate just this kind of scenario.
The second observation regards the human dissipation this facilitates. It is striking to me that the predominant approaches to manipulating human desire that can be observed in our time are consistently manipulations away from the procreative benefits of human sexuality. Mary Harrington has shrewdly observed that “the pill” marked the beginning of transhumanism in the West, because it was the first culturally accepted medical treatment that undermined the natural functioning of the female human body in pursuit of the unnatural. But the pill, while undermining fertility, did not fundamentally redirect the desire of men away from women. (There is some evidence that the pill does, in fact, play very curious games with the desires of women, but that’s probably a different post.) I will observe, however, that the sexual enthusiasms of the culture over the last generation, at least, have been entirely focused on redirecting male and female desire away from each other as opposites. This is a central feature of homosexuality, transgenderism, and now robotic sexuality. In each of these enthusiasms, human desire is being prodded away from even the potentiality of having a procreative effect.
And yet we feign surprise at impending demographic collapse.
The manipulation of human desire is a powerful vehicle for keeping us transfixed and diverting us from more constructive, or especially procreative, pursuits. Yuval Harari describes those who are not part of the technocratic elite as “useless people” who will probably be given drugs and video games as a means of dissipating their presumptively useless lives. Perhaps, if they are also given AI-enabled sex robots, they will stop reproducing altogether. One suspects such notions, even if left unsaid, are nevertheless present in some very sinister minds.