- Research groups use AI to share health information.
- They work across Latin America with local partners.
- In Peru, a Quechua chatbot helps young people.
- Midwives trained the chatbot with health guidelines.
- Another group made a tool for WhatsApp and web.
- Many young and marginalised people can ask it questions.
- Researchers say AI can cause problems for trans people.
- Tests found biased answers and clinical errors.
- Experts want better data, rules and oversight.
Difficult words
- research — work to find new knowledge or facts
- partner — a person or group who helps with work
- midwife — a health worker who helps with birth and babies
- chatbot — a computer program that talks with people
- marginalised — people left out of normal services or society
- biased — not fair or showing a strong opinion
Tip: hover, focus or tap highlighted words in the article to see quick definitions while you read or listen.
Discussion questions
- Have you ever used a chatbot?
- Would you ask a chatbot about health?
- Do you trust information on WhatsApp?
Related articles
Two studies find mixed results on grouping English learners
Two recent studies compared grouping English learners together in school classes. A high school study of 31,303 students linked higher EL concentrations to lower graduation and college entry; an elementary trial found no overall difference but different benefits by skill level.
Wearable 10‑Minute Antibody Sensors from University of Pittsburgh
Researchers at the University of Pittsburgh made a wearable biosensor that detects antibodies in interstitial fluid in 10 minutes without a blood draw. The tiny carbon nanotube sensors are highly sensitive and the work appears in Analytical Chemistry.
New acid-free way to recycle lithium-ion batteries
Researchers at Rice University developed a two-step FJH-ClO process that separates lithium and other metals from spent batteries. The lab-scale method recovers valuable materials with less energy, fewer chemicals and less wastewater.