WORDS by HOLLY COLEMAN
Desire, historically a private act of self-discovery, has been turned inside out in the digital age. What was once a matter of interpersonal exchange is now mediated, captured, and repackaged by algorithms, with digital platforms offering us the illusion of infinite choice. The current landscape of sexual identity and intimate preference no longer takes shape through organic experience, but through the machinery of data collection and commodification. From Bumble to Grindr, AI now shapes the ways we swipe, flirt, and fantasize.
Image: a person discussing desire with a chatbot. Generated with DALL·E 3.
The very structures of these apps, powered by machine learning, work to fine-tune suggestions based on a constant stream of personal data—our preferences, photos, likes, and even time spent lingering on particular profiles. But these algorithms, designed to predict what might spark our attraction, are trained on historical data sets riddled with social biases. This means that, even as AI claims to adapt to us, it’s often feeding us back the same stereotypes we’re already systemically steeped in—and the more we swipe, the more we conform to the patterns of selection encoded in these systems. Which is to say, the app is not decoding your desires; it’s feeding you a pre-made identity, one reinforced by centuries of colonization, gendered repression, and the commodification of bodies. Desire, in this context, is not discovered but sold.
The implications of this mechanized mediation are significant. Think of the phenomenon of racial fetishization, which now isn't merely a social dynamic—it’s also algorithmic. When data suggests that certain racial profiles get more likes or matches, the algorithm shifts, favoring these characteristics and presenting them to other users with increased frequency. Suddenly, your deepest attractions are no longer yours alone, but the results of a feedback loop designed to maximize clicks, engagement, and, ultimately, profit. The line between human connection and capitalist exploitation has been blurred by companies whose earnings are built on the commercialization of desire.
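To make that loop concrete, consider a deliberately simplified simulation (the group labels, like rate, and weighting below are illustrative assumptions, not figures or code from any actual platform): two groups of profiles are liked at exactly the same underlying rate, yet a ranker that rewards already-accumulated likes with extra exposure turns a small inherited gap into near-total dominance within a handful of rounds.

```python
# Toy sketch of the feedback loop described above: not any platform's real code.
# Two groups of profiles start with a small gap in accumulated likes (a stand-in
# for bias in the historical training data), while simulated users like both
# groups at exactly the same rate. The ranker allocates exposure superlinearly
# in accumulated likes (a crude proxy for position bias: top-ranked profiles get
# seen far more), so the small initial gap compounds round after round.

import random

random.seed(42)

likes = {"group_a": 55, "group_b": 45}   # slightly biased historical signal
TRUE_LIKE_RATE = 0.50                    # users actually like both groups equally
IMPRESSIONS_PER_ROUND = 1_000            # profile cards dealt out each round
POSITION_BIAS_EXPONENT = 2               # >1: popular profiles get outsized exposure

for round_number in range(1, 9):
    # Rank by accumulated likes, then hand out exposure superlinearly.
    weights = {g: n ** POSITION_BIAS_EXPONENT for g, n in likes.items()}
    total_weight = sum(weights.values())
    share_a = weights["group_a"] / total_weight

    for group, weight in weights.items():
        shown = round(IMPRESSIONS_PER_ROUND * weight / total_weight)
        # New likes arrive at the SAME true rate for both groups...
        likes[group] += sum(random.random() < TRUE_LIKE_RATE for _ in range(shown))

    # ...yet the favored group's share of exposure keeps climbing.
    print(f"round {round_number}: group_a receives {share_a:.0%} of all exposure")
```

The numbers are arbitrary; the shape is the point. Nothing about what the simulated users actually want changes over the run, yet what the system shows them does.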
What’s perhaps most insidious is the way these systems are shaping not just who we desire but how we experience desire. The rise of AI-powered sexting bots and chat algorithms further complicates our understanding of intimacy. Programs like Replika, designed to emulate real human interaction, promise to adapt to your sexual preferences, even learning how best to arouse you. But the eroticism here is hollow—it’s a simulation designed to play into our insecurities, our fear of loneliness. AI doesn’t “know” your fantasies; it’s simply repurposing collective fantasies, molded from countless data points of previous interactions. It’s not intimacy—it’s industry.
The ripple effects of AI's intervention into human desire extend beyond individual relationships, threatening to reshape the very framework of intimacy itself. Attachment theory, which has gained widespread cultural currency in recent years, argues that our earliest relationships shape our romantic bonds as adults. (1) But what happens when, as adults, our most substantial romantic relationships are with an algorithm? We risk losing control not only over whom we desire, but over how we express that desire altogether. The algorithmic structure of modern intimacy may well be training us to seek fulfillment not from human connection, but from the systems that mediate it.
The structures of contemporary capitalism push us to conform, not just economically but emotionally and aesthetically. Under the guise of freedom, we are led into passivity. The consumer, in this context, becomes the consumed—trapped in a cycle where desire is both commodified and repackaged back to us. What we mistake for agency is often the result of carefully designed systems that narrow our options while giving the illusion of infinite choice. What we are actually being offered is not freedom, but a more sophisticated form of control.
So, how do we reclaim agency over our own desires and resist this mechanization of passion? The answer may lie not in refusing to engage with these platforms altogether, but in cultivating a critical awareness of how they may be shaping our preferences. If we’ve learned anything from body-positivity movements, queer activism, and the reclamation of sexual autonomy, it’s that desire is inherently political. It’s not just about who we love, but how we love, and how that love challenges—or conforms to—the structures of power around us. (2)
We must begin by questioning the systems that seek to control our intimate lives. Desire, in its purest form, is unruly. It cannot be neatly categorized into data points or preferences. Desire is fluid, contradictory, and often inexplicable. It’s a force that resists easy quantification. To reclaim desire, we must embrace its ambiguity and complexity. We must resist the temptation to reduce it to something that can be understood, managed, and ultimately, controlled.
In the end, the fight against algorithmic desire is not just a personal one—it’s a collective struggle. It’s a struggle against a system that seeks to turn us into passive consumers of our own sexuality. But as long as we remain aware, as long as we continue to push back, there’s hope that we can reclaim our most intimate selves from the grip of the machine. And in that reclamation, we might just rediscover the radical potential of desire itself.
Holly Coleman is an adjunct instructor at the University of North Florida and a Ph.D. student at Old Dominion University. Her research centers on literature of the long 19th century, with a focus on British Romanticism and Gothic, examining its contemporary resonances in areas of gender, sexuality, and political resistance.
References
(1) Levine, A., & Heller, R. (2010). Attached: The new science of adult attachment and how it can help you find—and keep—love. Penguin Random House, p. 25.
(2) Paasonen, S. (2011). Carnal resonance: Affect and online pornography. MIT Press, p. 156.
Ngai, S. (2020). Interview with Sianne Ngai. The White Review. https://www.thewhitereview.org/feature/interview-with-sianne-ngai/
Thurston, B. (2024). How dating sites automate sexual racism. Harvard Gazette. https://news.harvard.edu/gazette/story/2024/04/how-dating-sites-automate-sexual-racism/