FEATURE

You Don’t Own Your Likeness

The Impact of AI Porn Generation on Sex Workers

Phoebe Robertson

In November 2017, an anonymous user shared an algorithm on Reddit, introducing the world to deepfake technology. This algorithm utilised existing AI technology to create realistic fake videos. The internet, being its usual self, swiftly shared and further developed the code.

I first became aware of this technology in 2018, when BuzzFeed released a deepfake video titled “You Won’t Believe What Obama Says In This Video!” The video featured former US president Barack Obama delivering a message about the need for caution when trusting online content. It wasn’t him; it was a deepfake of his likeness.

Over the following six years, deepfake technology continued to evolve, revealing its most significant threat: the proliferation of explicit deepfake pornography. It became evident that deepfakes posed a greater danger in the realm of pornographic content than in political or celebrity contexts.

Celebrities such as Kristen Bell, Jenna Ortega, and Emma Watson became victims of deepfake videos. In 2023, Twitter searches for TikTok creators Addison Rae Easterling, Charli D’Amelio, and Bella Poarch revealed sexually explicit deepfakes.

Kristen Bell, in an interview with Vox, expressed her shock and distress upon discovering deepfakes of herself circulating on the internet. She emphasised that seeing her own face exploited without consent was deeply unsettling.

A simple Google search for “deepfake porn” yields over 100 million results, including titles like “Best Celebrity DeepFake Porn Videos.” Celebrities are particularly vulnerable targets because of the sheer volume of videos and images of their faces and bodies online; that wealth of material allows AI programs to generate higher-quality deepfakes. There is also significant demand for celebrity porn, as demonstrated by the fame Kim Kardashian achieved following the leak of her sex tape in 2007.

Deepfake technology also has severe consequences for ordinary people. The 2023 documentary My Blonde GF explores the experience of writer Helen Mort, who discovered her own face in deepfaked images on a pornographic website. The film delves into the profound psychological impact this had on Helen and sheds light on the countless women facing the same ordeal.

According to a 2019 report by Sensity AI, a company that monitors deepfakes, 96% of deepfake videos online were non-consensual sexual content, and 99% of them targeted women. Efforts are now being made to ban non-consensual deepfake pornography, and platforms like Pornhub have already prohibited AI-generated pornographic material.

However, the exploitation extends beyond celebrities and the individuals whose faces are manipulated. Deepfake pornography denies women the right to consent to the use of their likeness, placing the power solely in the hands of those seeking to exploit them for personal gratification. None of the women featured in pornographic deepfakes have given their consent, and that includes the sex workers whose bodies are used alongside someone else’s face. Neither the sex workers, nor the celebrities, nor ordinary women have consented to having their images transformed into pornographic material and spread across the internet.

Online sex workers already face challenges such as increasing sanctions and content blacklisting. Tumblr banned pornographic content outright in 2018, and OnlyFans announced, then retracted, a similar ban in 2021. Stolen content has always been an issue in online sex work.

Now, sex workers face an additional threat: their bodies and images can be stolen, repackaged, and distributed with someone else’s face through deepfakes. The income sex workers take home is crucial to their financial stability, and, like workers in any creative industry, they hold rights to the content they create. Deepfakes unabashedly steal this content, objectify sex workers by removing or replacing their faces, and even profit from it.

I had the opportunity to speak with Kit, a sex worker with three years of experience, who sells pornographic content on Twitter and engages in individualised sessions. She acknowledged that her content is already being stolen and monetised without the need for deepfakes. However, she expressed concern that deepfakes would make it easier for her content to be misused and harder to have it removed.

An article from Vice titled “Deepfakes Were Created As a Way to Own Women’s Bodies—We Can’t Forget That” summarises this idea effectively. Taken alongside Kit’s experience, it is a reminder that deepfake technology was built, from the start, as a means of exploiting women’s bodies. That fact should not be forgotten, whether we are considering the manipulation of individual faces or the theft of sex workers’ bodies.

The issue at hand extends beyond the exploitation of sex workers’ bodies; it also pertains to the emergence of AI-generated pornography in an already competitive and stigmatised industry. When discussing the future, Kit believes that AI-driven pornography will find its place in the industry, regardless of ethical concerns. The internet has already created spaces for animated porn, audio porn, cam streamers, and other forms of erotic content. These alternatives have existed since the early days of the internet, with genres like hentai gaining significant attention. Additionally, sites like Lessons of Passion offer animated porn games that allow users to make choices impacting the content they receive.

However, unlike these alternatives, deepfakes and AI algorithms exploit existing content and individuals. They bring nothing new to the ecosystem.

Platforms like OnlyFans, Pornhub, and ManyVids require creators to verify their age and identity before uploading content. This acts as a barrier that AI-generated porn models cannot overcome because they are not real.

Yet it is disconcerting that generating realistic AI porn of one’s own takes just minutes with a simple Google search or a mobile app download. In the future, it is conceivable that sites similar to OnlyFans may emerge, featuring AI models capable of responding to messages instantaneously, as if they were real people.

Kit, whose work primarily involves calling, FaceTime, and texting sessions, is not overly concerned about the impact of AI on her own career, since it revolves around personal relationships with clients. She does, however, worry about the ethical dilemmas AI poses. Her primary concern is AI’s capacity for customisation, which could be used to create lifelike replicas of real people for malicious purposes, or to produce “illegal and immoral porn,” normalising those acts in the process.

This discussion leads us back to a consistent critique of AI: it can only draw from the content it is given. If AI models are built from the images, videos, and conversations of real-life sex workers, those models further exploit the workers and undermine their platforms.

AI perpetuates society’s existing biases, including racism, sexism, and misogyny, which are deeply embedded in the fabric of the internet. Those biases carry through into the pornographic material it generates and circulates.

The main takeaway from this article is the recognition that deepfake technology was born from men’s desire for absolute control over women’s bodies, and that it has now become yet another means of exploiting sex workers’ bodies and content. Addressing this issue means more than seeking to control the development of the technology or preventing the use of our images in pornographic material. It also means working towards destigmatising sex work, supporting those affected by deepfakes, and urging local governments to protect citizens’ online identities and sex workers’ intellectual property rights.