What some would classify as “science fiction” has actually taken place in Africa.
According to a report from the UN Security Council’s Panel of Experts on Libya, published in March 2021, an autonomous weaponized drone “hunted down a human target” last year and attacked them without being specifically ordered to do so.
The “killer drone” acted entirely on its own, without human intervention, hunting down and killing its human target without being instructed to do so.
Lethal autonomous weapons systems, sometimes called “killer robots,” are weapon systems that use artificial intelligence, software, and algorithms to identify, select, and kill human targets without human control. This means the decision to kill a human target is no longer made by humans, but by software.
The incident took place during clashes in Libya last year, a country that has witnessed the largest drone war in human history.
In the incident, a KARGU-2 quadcopter drone manufactured by STM, a Turkish company, autonomously attacked Libyan National Army (LNA) personnel during a conflict.
The Turkish-built KARGU-2 targeted, hunted down, and killed one of Haftar’s soldiers as he tried to retreat.
The killer drone was fitted with an explosive charge, and it nosedived onto the human target, detonating its charge and killing the unsuspecting victim.
According to STM, the KARGU is a rotary-wing attack drone designed for asymmetric warfare and anti-terrorist operations. It can be carried by a single soldier and operated in both autonomous and manual modes.
This, experts believe, is likely the first time a drone has attacked humans on its own, without instructions to do so.
The UN report states: “The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability.”