Artificial intelligence tools are now used by many people, but they can cause harm when they are biased. Reports say biased systems gave some groups of patients different medical treatment and that hiring tools discriminated against female and Black candidates. New research from the University of Texas at Austin studied 363 algorithms from a public repository and compared each flagged algorithm with a similar one that was not called out.
The researchers identified three main risks. First, there is no clear ground truth for some tasks, for example guessing a person's bone age from an X‑ray or judging hate speech when people disagree. Second, models can miss important real‑world details; one change in Arkansas replaced nurse visits with an algorithm and cut off help for disabled people. Third, systems built by one group of people can miss other perspectives. The study says developers must look inside algorithms and include diverse input.
Difficult words
- biased — showing unfair treatment of some people
- discriminate — treat people unfairly because of their group
- algorithm — a set of rules a computer follows
- repository — a place where many files or data are kept
- ground truth — the real correct answer for a task
- hate speech — words that show strong dislike or attack
- developer — a person who makes computer systems or software
- diverse — including many different kinds of people
Discussion questions
- Do you think it is important for developers to include people from different backgrounds? Why?
- Can you name a task that has no clear ground truth? Give an example.
- How could a computer model miss real-world details where you live?