LingVo.club

How AI and Drones Are Changing Conflict in Colombia
CEFR B2

25 Apr 2026

Adapted from Liam Anderson, Global Voices, CC BY 3.0

Photo by Jaime Maldonado, Unsplash

Level B2 – Upper-intermediate
5 min
300 words

Artificial intelligence and related technologies are reshaping Colombia’s long-running armed conflict. Since 2024, several non-state armed groups have attacked police stations and military positions with modified commercial drones carrying explosives, injuring or killing hundreds of uniformed personnel. These low-cost devices do not rely on AI, but they illustrate how armed actors with limited resources can adapt commercial tools; in other conflicts, drones appear as more complex, mass-produced systems.

In 2025 the government began building a national anti-drone shield: a hybrid technological platform that combines specialized sensors, micro-Doppler and radio-frequency systems, signal jamming and physical neutralizing mechanisms. By mixing sensors, data-processing algorithms and human decision-making, this architecture changes how threats are detected, evaluated and countered. Security institutions are also adopting algorithmic tools to guide operations. The Police Service Model, adopted in 2024, seeks focused deployment using real-time data, while the Colombian Aerospace Force reports using surveillance and reconnaissance systems that combine advanced sensors and data processing to build risk models for conflict zones. These systems shape the operational maps used for action, but they do not act alone.

Digital manipulation and social control are growing problems. In 2023, AI-generated audio and video were used during and after regional elections to impersonate candidates, medical staff and military personnel, pushing messages and calls for protest. Community social media groups meant to share security alerts have been co-opted by armed actors to circulate photographs and profiles of local leaders. Closed WhatsApp groups and anonymous Facebook profiles amplify accusations that become digital blacklists tied to threats, forced displacement and targeted killings. The immediate risk is not only a shift toward more autonomous systems, but that these tools will make surveillance more efficient, disinformation faster, and state intervention potentially more unequal.

Difficult words

  • non-state – not controlled by a national government
  • drone (drones) – an unmanned aircraft controlled remotely
  • hybrid – made from two different types combined
  • neutralize (neutralizing) – make something harmless or ineffective
  • algorithmic – relating to computer rules that process data
  • impersonate – pretend to be someone else
  • co-opt (co-opted) – take over for a different purpose
  • disinformation – false information meant to deceive people


Discussion questions

  • How might a national anti-drone shield change how security forces operate in conflict zones?
  • What steps could reduce the harm caused by AI-generated impersonations during elections?
  • How can communities protect local leaders when social media groups are co-opted by armed actors?
