
AI expands sexual and reproductive health information in Peru and Argentina

8 Dec 2025

Adapted from Agustín Gulman, SciDev, CC BY 2.0

Photo by Nikolay Likomanov, Unsplash

Level B2 – Upper-intermediate
6 min
334 words

Groups in Latin America are deploying artificial intelligence to broaden sexual and reproductive health information for young and marginalised people, targeting obstacles such as language, stigma and uneven service provision. Projects in Peru and Argentina combine generative AI, local knowledge and established health guidance to reach people who lack reliable information.

In Peru, obstetrician Ana Miluzka Baca Gamarra created TeleNanu at the University of San Martín de Porres. TeleNanu, meaning "confidant" in Quechua, is a Quechua-language chatbot that follows a five-step counselling model: build rapport, identify needs, respond, verify understanding and keep communication open. Midwives trained the model using World Health Organization and Peru Ministry of Health guidelines, peer-reviewed literature and professional expertise. The platform provides evidence-based answers and can refer users to human counselling. It handled more than 88,000 queries in the last year in Quechua and Spanish, including some from outside Peru.

In October, the Peruvian non-profit APROPO launched NOA, a generative AI platform available on WhatsApp, the web and social media. APROPO says NOA was trained with accurate local and international data and aims to reach 100,000 adolescents by 2026 using digital strategies for high-need areas.

These initiatives respond to worrying public health data from Peru: more than 8,000 new HIV cases were reported in 2024, young adults in their twenties are the most affected group, 12 per cent of births were to mothers aged ten to 19, and adolescent maternal mortality is rising.

Experts and activists highlight key challenges: unequal access, a lack of diverse and ethical training data, and the need for public–private coordination. Activists warn that AI can reproduce historic discrimination against transgender people. Researchers at Argentina’s CIECTI tested large language models and found stigmatising responses and clinical gaps; they built a tool to classify harm and plan to create more representative data to reduce bias. CONICET researcher Marcelo Risk called bias in training data a central problem and urged human oversight, while other specialists recommend linking scientific research with health systems and involving communities in design and evaluation.

Difficult words

  • deploy (deploying) – to put tools or resources into use
  • generative – able to produce new text or images
  • counselling – professional advice and emotional support for health
  • obstetrician – a doctor who cares for pregnancy and childbirth
  • stigma – a social mark that causes shame or exclusion
  • bias – an unfair preference affecting judgements or results


Discussion questions

  • How could chatbots in local languages change access to sexual and reproductive health information for young and marginalised people?
  • What measures should organisations take to reduce bias and protect transgender people when using AI in health information?
  • What are the advantages and risks of connecting AI tools with official health systems and community input?
