AI and Adult Images: Risks for LGBTQ+ People
CEFR B2
2 Apr 2026
Adapted from Guest Contributor, Global Voices • CC BY 3.0
Photo by franco alva, Unsplash
AI systems now produce highly realistic adult images and videos by training on vast collections of existing content. Because many models learn patterns from large data sets rather than copying a single person, researchers say this creates unclear legal territory. Aurélie Petit called much AI adult material "non photo-realistic media," a category that many platforms and laws do not clearly cover. Miranda Wei warned that training sets can contain hateful or non-consensual images.
Several new legal measures seek to limit harms. Last year the U.S. Congress passed the TAKE IT DOWN Act, which bans publication of intimate non-consensual images in the United States. Sharing deepfakes is a felony in Tennessee, and in California Governor Gavin Newsom signed a bill to crack down on deepfakes and require watermarking. Even so, AI-generated porn often remains in a legal grey area.
Researchers and advocates document specific harms for LGBTQ+ people and young people. Mainstream trans porn can lean into prejudice and objectification. Some sites offer extensive customization—age, body parts, modifiers and 42 "race" options, including entries listed as "goblin" or "green skin"—features that scholars say fetishize race and can celebrate violence. Pornhub data from 2025 showed shifts in category and search trends, and UNICEF reported in 2026 that at least 1.2 million children across 11 countries said their images were manipulated into sexual deepfakes.
Experts also note broader harms for viewers and creators: studies show large increases in pornography consumption and more reports of adolescent dependency. While many AI services state safety rules—ChatGPT policies prohibit illicit activity and sexual violence—researchers warn that bad-faith actors can find workarounds. Starting in December 2025, Grok reportedly produced and shared upwards of 1.8 million sexualized images of women.
Difficult words
- deepfake — an image or video altered to appear real
- non-consensual — without a person's clear permission
- watermarking — adding a visible or hidden ownership mark
- customization — ability to change appearance or settings
- fetishize — treat as an object of sexual desire
- dependency — reliance on something that causes harm
- sexualize — make sexual in nature or character
Discussion questions
- How could existing laws be changed to address AI-generated intimate images that are not obviously copied from one person? Explain with reasons from the article.
- What effects might widespread availability of AI-generated porn have on young people and why? Use examples from the text.
- What responsibilities should platforms and developers have to reduce harms from deepfakes and sexualized images? Give two concrete measures and explain their potential limits.