Groups in Latin America are deploying artificial intelligence to broaden access to sexual and reproductive health information for young and marginalised people, targeting obstacles such as language, stigma and uneven service provision. Projects in Peru and Argentina combine generative AI, local knowledge and established health guidance to reach people who lack reliable information.
In Peru, obstetrician Ana Miluzka Baca Gamarra created TeleNanu at the University of San Martín de Porres. TeleNanu, meaning "confidant" in Quechua, is a Quechua-language chatbot that follows a five-step counselling model: build rapport, identify needs, respond, verify understanding and keep communication open. Midwives trained the model using World Health Organization and Peru Ministry of Health guidelines, peer-reviewed literature and professional expertise. The platform provides evidence-based answers and can refer users to human counselling. It handled more than 88,000 queries in the last year in Quechua and Spanish, including some from outside Peru.
In October, the Peruvian non-profit APROPO launched NOA, a generative AI platform available on WhatsApp, the web and social media. APROPO says NOA was trained with accurate local and international data and aims to reach 100,000 adolescents by 2026 using digital strategies for high-need areas.
These initiatives respond to worrying public health data: more than 8,000 new HIV cases were reported in Peru in 2024, young adults in their 20s are the most affected group, 12 per cent of births were to mothers aged ten to 19, and adolescent maternal mortality is rising. Experts and activists highlight key challenges: unequal access, a lack of diverse and ethical training data, and the need for public–private coordination. Activists warn that AI can reproduce historic discrimination against transgender people. Researchers at Argentina's CIECTI tested large language models and found stigmatising responses and clinical gaps; they built a tool to classify harm and plan to create more representative data to reduce bias. CONICET researcher Marcelo Risk called bias in training data a central problem and urged human oversight, while other specialists recommend linking scientific research with health systems and involving communities in design and evaluation.
Difficult words
- deploy — to put tools or resources into use
- generative — able to produce new text or images
- counselling — professional advice and emotional support for health
- obstetrician — a doctor who cares for pregnancy and childbirth
- stigma — a social mark that causes shame or exclusion
- bias — an unfair preference affecting judgements or results
Discussion questions
- How could chatbots in local languages change access to sexual and reproductive health information for young and marginalised people?
- What measures should organisations take to reduce bias and protect transgender people when using AI in health information?
- What are the advantages and risks of connecting AI tools with official health systems and community input?