LingVo.club
[Image: a sticker on the side of a wall]

When AI Favors Profit over People (CEFR B2)

21 Apr 2026

Level B2 – Upper-intermediate
6 min
303 words

The piece appears in the series "Don't ask AI, ask a peer," a collaboration among Global Voices, the Association for Progressive Communications (APC) and GenderIT, and is part of Global Voices' April 2026 Spotlight series, "Human perspectives on AI." The author, Hija Kamran, says her work makes her cautious about new technologies and that she has often been a late adopter.

Kamran argues technology is not neutral and that tech companies repeatedly show their primary commitment is to business models rather than to people. To illustrate this, she cites the remark "Senator, we run ads," attributed to Mark Zuckerberg, and reports a company representative told her, "I encourage people to read our terms of service." She sees these responses as evidence of limited transparency and accountability.

Training data for AI systems is drawn from the internet and public records, and it reflects histories of exclusion, racism, sexism and economic inequality. When models learn from that data, they can encode and amplify existing harms while presenting outputs as neutral. Corporate incentives — profit motives, shareholders and growth targets — influence which problems are prioritised, how fast products are deployed, and whose land, knowledge or lives become "collateral damage." Kamran also highlights the risk of dehumanisation, especially in militarised contexts where people can be reduced to data points and treated as targets.

AI systems generate outputs from probabilities and do not understand context, history or responsibility; they can imitate human patterns but cannot be human, care or hold relationships. Kamran calls for a human rights approach that shifts accountability to those with power and urges scepticism early in the development and commercialisation of technologies. She recommends asking:

  • Who built the system?
  • How does the system work?
  • Who benefits from it?

Hija Kamran is the lead editor of GenderIT.org and an advocacy strategist within APC’s Women’s Rights Programme.

Difficult words

  • accountability: duty to explain or accept responsibility
  • transparency: open and clear information about actions
  • encode: convert information into a coded form
  • amplify: make something stronger or more noticeable
  • collateral damage: unintended harm to people or things
  • dehumanisation: treating people as less than human
  • probability (plural: probabilities): measure of how likely something is


Discussion questions

  • How might asking "Who benefits from it?" change the development or use of AI systems?
  • What risks come from dehumanisation in militarised contexts, and how could they be reduced?
  • Do you think a human rights approach could make companies more accountable? Why or why not?
