Generative AI pornography, or simply AI pornography, is digitally created pornography produced through generative artificial intelligence (AI) technologies. [1] Unlike traditional pornography, which involves real actors and cameras, this content is synthesized entirely by AI algorithms. [2] These algorithms, including generative adversarial networks (GANs) and text-to-image models, generate lifelike images, videos, or animations from textual descriptions or datasets.
The use of generative AI in the adult industry began in the late 2010s, initially focusing on AI-generated art, music, and visual content. [3] This trend accelerated in 2022 with Stability AI's release of Stable Diffusion (SD), an open-source text-to-image model that enables users to generate images, including NSFW content, from text prompts using the LAION-Aesthetics subset of the LAION-5B dataset. [4] [5] [6] Despite Stability AI's warnings against sexual imagery, SD's public release led to dedicated communities exploring both artistic and explicit content, sparking ethical debates over open-access AI and its use in adult media. [7] [8] [9] AI tools soon advanced to generate highly realistic adult content, amplifying calls for regulation. [2] [10]
One application of generative AI technology is the creation of AI-generated influencers on platforms such as OnlyFans and Instagram. [11] [12] [3] These AI personas interact with users in ways that can mimic real human engagement, offering an entirely synthetic but convincing experience. [13] While popular among niche audiences, these virtual influencers have prompted discussions about authenticity, consent, and the blurring line between human and AI-generated content, especially in adult entertainment. [14]
By 2023, websites dedicated to AI-generated adult content had gained traction, catering to audiences seeking customizable experiences. [10] [11] These platforms let users create or view AI-generated pornography tailored to their preferences through prompts and tags, customizing body type, facial features, and art styles. [2] [15] [16] [17] Tags further refine the output, producing niche and diverse content. Many sites feature extensive image libraries and continuous content feeds, combining personalization with discovery and enhancing user engagement. AI porn sites therefore attract users seeking unique or niche experiences, sparking debates over creativity and the ethical boundaries of AI in adult media. [18] [10]
Popular sites such as civitai.com allow users to create, upload, and download fine-tuned versions of open-source models such as SDXL and Flux that are specifically designed for generating various pornographic scenes or effects.
The growth of generative AI pornography has also drawn criticism. [10] [11] [15] AI technology can be exploited to create non-consensual pornographic material, posing risks similar to those of deepfake revenge porn and AI-generated non-consensual intimate images (NCII). [19] A 2023 analysis found that 98% of deepfake videos online are pornographic, and that 99% of the victims are women. [20] Celebrity victims of deepfakes include Scarlett Johansson, Taylor Swift, and Maisie Williams. [13]
OpenAI is exploring whether NSFW content, such as erotica, can be responsibly generated in age-appropriate contexts while maintaining its ban on deepfakes. [21] This proposal has attracted criticism from child safety campaigners who argue it undermines OpenAI's mission to develop "safe and beneficial" AI. [8] Additionally, the Internet Watch Foundation has raised concerns about AI being used to generate sexual abuse content involving children. [22]
Several US states have taken action against the use of deepfake apps and the sharing of such content on the internet. [23] [24] In 2024, San Francisco filed a landmark lawsuit to shut down "undress" apps that allow users to generate non-consensual AI nude images, citing violations of state laws. [25] The case aligns with California's recent legislation—SB 926, SB 942, and SB 981—championed by Senators Aisha Wahab and Josh Becker and signed by Governor Gavin Newsom. These bills aim to protect individuals from AI-generated explicit images by criminalizing non-consensual distribution, mandating disclosures, and empowering victims to report and remove harmful content from platforms. [24] [26]
While both generative AI pornography and deepfake pornography rely on synthetic media, they differ in their methods and ethical considerations. [13] Deepfake pornography typically involves altering existing footage of real individuals, often without their consent, using AI to superimpose faces or modify scenes. [18] [20] In contrast, generative AI pornography is created entirely by algorithms, producing hyper-realistic content without requiring real images of people. [9] [8] Hany Farid, a digital image analysis expert, has also described the difference between "AI porn" and "deepfake porn." [14]
The legality of generative AI pornography varies widely by jurisdiction and remains an evolving issue. In some countries, laws addressing digital impersonation, obscenity, or deepfake technologies may indirectly apply, particularly when AI-generated content involves the likeness of real individuals without consent. The absence of a physical performer further complicates traditional regulatory frameworks, which are often grounded in performer protection and distribution laws. [27]
In the United States, legal responses have primarily focused on non-consensual deepfakes and impersonation. [28] Some states, such as Virginia, California, and Texas, have enacted legislation criminalising the creation or distribution of non-consensual explicit deepfake content. However, there is no comprehensive federal law addressing AI-generated pornography, leaving a patchwork of legal interpretations and enforcement standards across different jurisdictions. [29]
According to a 2023 report, South Korea accounts for approximately 53% of global deepfake pornography production. [30] In September 2024, South Korea’s National Assembly amended the Act on Special Cases Concerning the Punishment of Sexual Crimes, introducing two significant reforms related to deepfake content. [31] The first criminalises the possession, viewing, purchase, and storage of non-consensual deepfake material, with penalties of up to three years in prison or fines of up to 30 million won (approximately USD 20,000). The second reform specifically addresses the exploitation of minors, establishing that individuals who use deepfakes to threaten or blackmail minors face a minimum of three years’ imprisonment, and at least five years if they coerce minors into unwanted acts. [32]