AI and Adult Images: Risks for LGBTQ+ People
CEFR B1
2 Apr 2026
Adapted from Guest Contributor, Global Voices • CC BY 3.0
Photo by franco alva, Unsplash
Artificial intelligence can generate realistic adult images and videos on demand. Many tools teach models using large data sets of existing content, so the output is often not based on one person’s likeness. Researchers warn this leaves a legal and ethical gray area.
Aurélie Petit described much AI adult media as "non photo-realistic media," and Miranda Wei warned that data sets may include hateful or non-consensual images. Advocates say mainstream trans porn online often uses slurs and harmful tropes, and some sites offer extensive customization that can fetishize trans people.
In 2025, Pornhub reported that top gay searches included "femboy" and "twink," and in 2026 UNICEF reported that at least 1.2 million children across 11 countries said their images had been manipulated into sexual deepfakes. Experts note rising pornography consumption, adolescent dependency, and the limits of platform safety rules, while some new laws target deepfakes.
Difficult words
- generate — to produce or create something, especially content
- data set — a collection of data used to teach models
- likeness — the appearance or look of a particular person
- gray area — an unclear legal or moral situation
- non-consensual — done without a person's clear and voluntary permission
- fetishize — to treat someone as an object of sexual desire
- deepfake — an altered image or video made to seem real
- dependency — a strong need for something that causes harm
Tip: hover, focus or tap highlighted words in the article to see quick definitions while you read or listen.
Discussion questions
- What changes could laws or platforms make to reduce harmful sexual deepfakes?
- How might widespread customizable adult media affect public attitudes toward transgender people?
- What can families or schools do to address adolescent dependency on pornography?