Research groups and non-government organisations across Latin America are experimenting with generative AI to expand access to sexual and reproductive health information for young and marginalised populations, especially where traditional services have fallen short. In Peru, TeleNanu — designed by obstetrician Ana Miluzka Baca Gamarra — is a Quechua-language chatbot built around a five-step counselling model that aims to build trust, identify needs, check understanding and keep communication open. Midwives trained the system using World Health Organization and Peru Ministry of Health guidelines, peer-reviewed literature and their professional experience. The platform logged more than 88,000 queries in Quechua and Spanish over the past year.
The Peruvian non-profit APROPO launched NOA, a generative AI service available on WhatsApp, the web and social media, and says it used accurate local and international data to train the tool. These initiatives come as Peru faces rising sexual health challenges: more than 8,000 new HIV cases were reported in 2024, young adults in their 20s were most affected, and 12 per cent of births were to mothers aged ten to 19. APROPO aims to reach 100,000 adolescents by 2026.
At the same time researchers warn of real harms. Argentina’s 2022 census counted almost 200,000 people as transgender or non-binary, and a 2021 report documented a life expectancy of 35-40 years alongside high discrimination in health centres. Tests by CIECTI found stigmatising responses and clinical errors in large language models, including denial of appropriate procedures depending on whether someone was identified as transgender or cisgender. The team created an assessment tool and plans to add data to reduce bias.
Experts and project groups such as CLIAS, which promoted 15 AI projects between 2023 and 2024 and produced a guide for high-quality health datasets, stress the need for better data, regulation, public-private coordination and community involvement. They conclude that AI can support sexual and reproductive health when tools are trained with representative information and used with timely human oversight.
Difficult words
- generative AI — computer systems that create new text or images
- marginalised — people pushed to the social or economic margins
- counselling model — a structured set of steps for providing advice
- peer-reviewed literature — research checked by other experts before publication
- stigmatising — treating someone as socially unacceptable
- bias — a systematic unfair preference or tendency
- representative — showing the variety of a whole group
- oversight — careful supervision or monitoring of actions
Discussion questions
- What benefits can AI chatbots bring to young and marginalised people seeking sexual and reproductive health information? Give reasons from the article.
- What risks related to bias and discrimination are mentioned, and how could projects reduce them?
- How might public-private coordination and community involvement improve AI health projects in your region?