Robotics and Artificial Intelligence in Warfare: (Dis)advantages, Regulations, and Consequences
Summary
Drones, automated and autonomous weapons, surveillance robots, and unmanned vehicles are all part
of the evolving world of robotic warfare and artificial intelligence. Whereas many support the idea of
military technology that could also help end poverty and disease, others claim it poses a risk to
humanity. International laws, state policies, and rules set by the European Parliament have guided
states in the development of these technologies. However, not only are citizens (in wars) at risk
because of the lack of stricter guidelines and laws; the impact these systems have on the
environment and on humanity in general is also of great concern.
Introduction to Robotic Warfare & AI
Robotics and AI have become popular terms referring to contemporary warfare: the use of the latest
military technology by Western governments to control war (Caliskan & Liegeois 2020). Robotics and
artificial intelligence (AI) in warfare have been evolving rapidly. Systems such as unmanned vehicles,
drones, surveillance robots, and sniper detection fall under the hypernym of military robots and AI,
and may or may not be capable of taking lives with or without human assistance. According to
Clausewitz, you need to kill people, and when you kill enough of them, they stop fighting. The present
shows the opposite, since robotic warfare does not solely kill but can also indirectly influence the
decision-making process of killing by robots and/or humans (Brown 2018). These systems can
be used in acts on behalf of a nation state, but also violently, to gain power over the
adversary (Warren & Hillas 2018). In 2021, both the US and Israel claimed that Iranian drones had
attacked an Israeli ship, killing two crew members. In 2020, automated machine guns were also used
by Israeli forces in the assassination of an Iranian nuclear scientist. Countries clearly favour investing
in these unmanned systems, but their dangers also grow. It is safe to say that unmanned systems
have proved indispensable in the post-Cold War era, having been used in the First Gulf War,
Afghanistan, Kosovo, Bosnia, Libya, Israel, Iraq, and Syria (Muhammed 2016) and, most recently, by
both Ukraine and Russia during the Russian invasion of Ukraine in February 2022.
The importance of this topic lies in the consequences it may have for the future of diplomacy and
the fate of citizens. Technology and policy development are tightly linked, so it is crucial to have
clarity on what is morally and ethically permissible with regard to the use of both autonomous and
automated weapons (Cummings 2021). Clear guidelines are required not only to prevent potential
armed conflict that may spiral out of our control but also to decrease the chance of misuse. This is
where the risk resides: for example, the heavy promotion of robotic warfare at exhibitions, such as
the annual UM exhibition, where states such as the UAE can invest billions of dollars in the latest
technology to improve and accumulate their military power, whilst some areas in those very same
states live below the poverty threshold. Rules and regulations are therefore required and need to be
monitored to prevent unforeseen behaviour that can lead to grave consequences. A lack of proper
monitoring of these systems can have a drastic impact not only on (inter)national relations but also
on citizens. Several guidelines are set for the (non)military use of