Across Latin America, research teams and NGOs are deploying AI to widen access to sexual and reproductive health information, aiming in particular at young people and marginalised groups such as Indigenous and transgender communities. In Peru, obstetrician Ana Miluzka Baca Gamarra led the design of TeleNanu, a Quechua-language chatbot that uses generative AI and a five-step counselling model to build trust, identify needs, check understanding and keep communication open. Midwives trained the system using World Health Organization and Peru Ministry of Health guidelines, peer-reviewed literature and professional knowledge. TeleNanu has received more than 88,000 queries in Quechua and Spanish over the past year.
The Peruvian non-profit APROPO launched NOA, a generative AI platform available on WhatsApp, the web and social media, which it says was trained with accurate local and international data. The use of AI comes as Peru faces rising sexual health challenges: more than 8,000 new HIV cases were reported in 2024, and 12 per cent of births were to mothers aged 10 to 19. APROPO aims to reach 100,000 adolescents by 2026.
Researchers are also examining risks for transgender people. Argentina's 2022 census counted almost 200,000 people as transgender or non-binary, and a 2021 report found a life expectancy of 35 to 40 years and high levels of discrimination in health centres. CIECTI tested large language models and found stigmatising responses and clinical errors, including denial of appropriate procedures depending on whether a person was identified as transgender or cisgender. The team built a tool to assess such harms and plans to add data to reduce bias. Experts stress the need for better data, regulation, public-private coordination and community involvement, and say AI can support sexual and reproductive health when paired with timely human oversight.
Difficult words
- obstetrician — doctor who cares for pregnant women
- chatbot — computer program that chats with people
- generative AI — technology that creates new text or content
- counselling — advice and support for personal problems
- midwife — health worker who assists childbirth
- guideline — official rules or advice for action
- query — question or request for information
- discrimination — unfair treatment of a person or group
- bias — unfair preference that affects judgement
- marginalised — kept separate and given less support
Discussion questions
- Do you think chatbots in local languages can make young people trust sexual health information more? Why or why not?
- What risks for transgender people does the article describe about AI tools and health services?
- How could organisations involve communities when they build AI tools for health information?
Related articles
AI to stop tobacco targeting young people
At a World Conference in Dublin (23–25 June), experts said artificial intelligence can help stop tobacco companies targeting young people online. They warned social media and new nicotine products draw youth into addiction, and poorer countries carry the heaviest burden.