AI and racial bias at US borders
25 Apr 2026
Adapted from UntoldMag, Global Voices • CC BY 3.0
Photo by Mathias Reding, Unsplash
Level A1 – Beginner (CEFR A1)
2 min
71 words
- AI is used at many international borders.
- Rights groups warn it can harm migrants.
- Towers and drones watch people near borders.
- Apps ask for selfies to check identity.
- Face software fails on darker skin.
- Some systems mark migrants as security threats.
- Groups ask for notice and opt-out.
- They want governments to stop biased AI.
- They call for diverse voices in design.
- Until then they say AI should not be used.
Difficult words
- migrant — a person who moves to another country
- drone — a small flying machine without a pilot
- identity — information that shows who a person is
- biased — unfair and showing one side more
- notice — official information that tells people about something
- threat — a possible danger to safety or people
Discussion questions
- Have you ever taken a selfie for an app?
- Would you choose to opt out if AI watched people at a border?