Health groups and researchers in Latin America are using artificial intelligence to share sexual and reproductive health information. The work aims to help young and marginalised people who face language barriers, stigma and limited services.
In Peru, TeleNanu is a Quechua-language chatbot that combines generative AI with simple counselling steps. Midwives trained it using World Health Organization and Peruvian health guidelines. The system gives evidence-based answers and can suggest human counselling when needed.
In October, a Peruvian non-profit launched NOA, an AI platform on WhatsApp, the web and social media. It aims to reach 100,000 adolescents by 2026. Experts say access, biased data and coordination are important challenges.
Difficult words
- artificial intelligence — Computer systems that can do tasks that usually need human thinking.
- marginalised — People with little social or economic power.
- stigma — A negative social idea about a person or group.
- chatbot — A computer program that chats with people.
- counselling — Help and advice about personal or health problems.
- evidence-based — Based on reliable research or trusted information.
Discussion questions
- Do you think AI chatbots can help people who speak different languages? Why?
- Would you use a WhatsApp health platform like NOA? Why or why not?
- What problems could make these AI health projects hard to use?
Related articles
Teen drug use in the US stays near pandemic low
For the fifth year in a row, use of most substances among US teenagers remains close to the low point reached in 2021, according to the University of Michigan's Monitoring the Future survey. Some drugs rose slightly, and researchers say monitoring must continue.