Artificial intelligence tools are becoming widely used; in April, OpenAI's ChatGPT reached a billion active weekly users. Alongside this rapid adoption, reports have documented concrete harms from biased AI, including different medical treatments across demographic groups and hiring systems that disadvantaged female and Black candidates. New research from the University of Texas at Austin aims to explain one important source of such bias: the failure of models to represent complex real-world conditions.
Hüseyin Tanriverdi and John-Patrick Akinyemi studied a set of 363 algorithms collected in the AI Algorithmic and Automation Incidents and Controversies repository. They compared each flagged algorithm with a similar algorithm that had not been criticized, examining both the technical models and the organizations that deployed them. The study identifies three interrelated factors that increase the risk of unfair outcomes:
- Ground truth: some tasks lack an established objective answer, for example estimating bone age from an X‑ray or treating contested social media judgments as facts.
- Real‑world complexity: models often omit important variables; in one Arkansas example, automated rulings replaced nurse home visits and cut off disabled people's help with eating and showering.
- Stakeholder involvement: systems serving diverse groups can become biased when designed mainly by one demographic, so broader input can reveal conflicting goals and possible compromises.
The authors conclude that addressing AI bias requires more than improving accuracy. Developers need to open the black box, include diverse inputs, and ensure clear ground truths so models better reflect real conditions. The research appears in MIS Quarterly. Source: UT Austin.
Difficult words
- algorithm — a set of rules a computer follows
- bias — an unfair preference affecting decisions or results
- ground truth — an objective correct answer or factual reference
- stakeholder — a person or group affected by a decision
- deploy — to put a system or tool into use
- demographic — a population group defined by shared characteristics
- repository — a place where records or data are stored
Discussion questions
- How could including more diverse stakeholders change the design of an AI system you know about?
- What problems can arise when a task lacks a clear ground truth? Give an example.
- Do you think improving accuracy is enough to prevent unfair outcomes? Why or why not?
Related articles
Dopamine helps lock in new skills during sleep
A study from the University of Michigan finds that dopamine neurons become active during NREM sleep soon after a person learns a movement. Their activity, together with sleep spindles, strengthens motor memories and improves skills after sleep.
Leather waste turned into coffee fertiliser in Uganda
Researchers in Uganda have turned leather production waste into an organic fertiliser for coffee. Trials showed strong results, and the team plans a market-ready product by November to sell in several East and Central African countries.
Small pause to slow misinformation on social media
Researchers at the University of Copenhagen propose a small pause before sharing on platforms like X, Bluesky and Mastodon. A computer model shows that a short delay plus a brief learning step can reduce reshares and improve shared content quality.