Level A1 – Beginner (CEFR A1)
2 min
71 words
- Research groups use AI to share health information.
- They work across Latin America with local partners.
- In Peru a Quechua chatbot helps young people.
- Midwives trained the chatbot with health guidelines.
- Another group made a tool for WhatsApp and web.
- Many young and marginalised people can ask it.
- Researchers say AI can cause problems for trans people.
- Tests found biased answers and clinical errors.
- Experts want better data, rules and oversight.
Difficult words
- research — work to find new knowledge or facts
- partner — a person or group who helps with work
- midwife — a health worker who helps with birth and babies
- chatbot — a computer program that talks with people
- marginalised — people left out of normal services or society
- biased — not fair or showing a strong opinion
Discussion questions
- Have you ever used a chatbot?
- Would you ask a chatbot about health?
- Do you trust information on WhatsApp?
Related articles
10 Mar 2026
Rumeen Farhana wins Brahmanbaria-2 as an independent
Rumeen Farhana, a Bangladeshi barrister long linked to the BNP, ran as an independent in Brahmanbaria-2 after losing her party nomination. She used a duck symbol, won by 38,000 votes, and says she faced harassment and party conflict.
17 Apr 2026
Indonesia tightens rules for digital platforms
Indonesia is increasing regulation of global digital platforms to curb misinformation and protect public safety. Officials inspected a major company's office, require platform registration, and use takedown systems, which has drawn criticism over unclear rules and rights.