Activists and researchers in Latin America are developing open, local AI tools to study gender inequalities and violence. DataGénero, founded by Ivana Feldfeber, created AymurAI to search court case documents for material relevant to gender-based violence. The tool runs on local servers so data stays secure; it collects texts without interpreting them and sends exactly what it finds to a database. AymurAI was developed before the rise of ChatGPT and has been used in courts in Argentina, Chile and Costa Rica since 2021, drawing on data from more than 10,000 rulings.
Other groups take different approaches. Derechos Digitales, led by Jamila Venturini, analyses technology policy and warns that many AI systems are built far from the region and carry biases. In Mexico, PIT Policy Lab worked with the state of Guanajuato to predict school dropouts and found that 4,000 young people had been misidentified as not at risk; the team added open-source bias detection tools and provided training on human rights and gender in AI.
Computer scientist Daniel Yankelevich of Fundar stresses the need for local models and data because behaviour varies by culture. Common next steps are improving training data, adding audio transcription, strengthening protection frameworks and promoting public policies to limit biased or opaque algorithms.
Difficult words
- initiative (initiatives) — A new plan to achieve a goal.
- reshape — To change the shape or form of something.
- violence — Behavior causing physical harm or pain.
- implement (implemented) — To put a plan into action.
- security — Protection from danger or loss.
- expert (experts) — A person with special knowledge.
- population (populations) — Groups of people living in an area.
Tip: hover, focus or tap highlighted words in the article to see quick definitions while you read or listen.
Discussion questions
- Why do you think local contexts are important for AI solutions?
- How can technology protect vulnerable populations?
- What impact could AI have in addressing gender-based violence?