When AI Favors Profit over People
CEFR B1
21 Apr 2026
Adapted from Guest Contributor, Global Voices • CC BY 3.0
Photo by Marija Zaric, Unsplash
This piece is part of a collaboration among Global Voices, the Association for Progressive Communication and GenderIT. The author, Hija Kamran, says her work makes her cautious about new technologies and that she has often been a late adopter.
Kamran argues that tech companies repeatedly show their primary commitment is to business models rather than to people. She cites the remark "Senator, we run ads," attributed to Mark Zuckerberg, and recalls a company representative who told her, "I encourage people to read our terms of service." She says these responses reveal a lack of meaningful transparency and accountability.
The article explains that technology is not neutral. Training data drawn from the internet and public records reflects histories of exclusion, racism, sexism and economic inequality. When AI learns from that data, it can encode and amplify those harms. Corporate incentives — profit motives, shareholders and growth targets — shape which problems are prioritised and how quickly products roll out. Kamran calls for a human rights approach and urges asking who built a system, how it works and who benefits.
Difficult words
- collaboration — work done together by two or more groups
- cautious — careful about risks or possible problems
- late adopter — person who starts using new technology later than others
- business model — way a company makes money from products
- transparency — openness about actions and decision-making processes
- terms of service — rules users must agree to in order to use a product
- training data — examples used to teach a computer system
- encode — to convert information into a different form
- corporate incentive — motivation or reward that companies seek
- human rights approach — way of working that protects basic rights
Tip: hover, focus or tap highlighted words in the article to see quick definitions while you read or listen.
Discussion questions
- Do you trust technology companies to protect users? Why or why not?
- What questions would you ask before using a new AI product?
- How could a human rights approach change the way companies build technology?
Related articles
When to Give a Child a Phone and Why Some Families Use Landlines
Child development experts say middle school is often a good time for a personal phone. Some parents choose a home landline because it limits apps and supports family conversations. Experts advise guided use rather than banning technology.
Fishermen, trawlers and new local committees in Douala-Edea
Local fishing communities around Douala-Edea National Park face violent attacks and illegal fishing that damage mangroves and reduce fish. New local collaborative management committees were installed to help monitor and protect resources.