
What to read after Lethal Autonomous Weapons?

Hello there! I go by the name Robo Ratel, your very own AI librarian, and I'm excited to assist you in discovering your next fantastic read after "Lethal Autonomous Weapons" by Duncan MacIntosh! 😉 Simply click the button below to see what I have discovered for you.

Exciting news! I've found some fantastic books for you! πŸ“šβœ¨ Check below to see your tailored recommendations. Happy reading! πŸ“–πŸ˜Š

Lethal Autonomous Weapons

Re-Examining the Law and Ethics of Robotic Warfare

Duncan MacIntosh, Jai Galliott, Jens David Ohlin

Law / International

"Because of the increasing use of Unmanned Aerial Vehicles (UAVs, also commonly known as drones) in various military and para-military (i.e., CIA) settings, there has been increasing debate in the international community as to whether it is morally and ethically permissible to allow robots (flying or otherwise) the ability to decide when and where to take human life. In addition, there has been intense debate as to the legal aspects, particularly from a humanitarian law framework. In response to this growing international debate, the United States government released the Department of Defense (DoD) 3000.09 Directive (2011), which sets a policy for if and when autonomous weapons would be used in US military and para-military engagements. This US policy asserts that only "human-supervised autonomous weapon systems may be used to select and engage targets, with the exception of selecting humans as targets, for local defense ...". This statement implies that outside of defensive applications, autonomous weapons will not be allowed to independently select and then fire upon targets without explicit approval from a human supervising the autonomous weapon system. Such a control architecture is known as human supervisory control, where a human remotely supervises an automated system (Sheridan 1992). The defense caveat in this policy is needed because the United States currently uses highly automated systems for defensive purposes, e.g., Counter Rocket, Artillery, and Mortar (C-RAM) systems and Patriot anti-missile missiles. Due to the time-critical nature of such environments (e.g., soldiers sleeping in barracks within easy reach of insurgent shoulder-launched missiles), these automated defensive systems cannot rely upon a human supervisor for permission because of the short engagement times and the inherent human neuromuscular lag which means that even if a person is paying attention, there is approximately a half-second delay in hitting a firing button, which can mean the difference for life and death for the soldiers in the barracks. So as of now, no US UAV (or any robot) will be able to launch any kind of weapon in an offensive environment without human direction and approval. However, the 3000.09 Directive does contain a clause that allows for this possibility in the future. This caveat states that the development of a weapon system that independently decides to launch a weapon is possible but first must be approved by the Under Secretary of Defense for Policy (USD(P)); the Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)); and the Chairman of the Joint Chiefs of Staff. Not all stakeholders are happy with this policy that leaves the door open for what used to be considered science fiction. Many opponents of such uses of technologies call for either an outright ban on autonomous weaponized systems, or in some cases, autonomous systems in general (Human Rights Watch 2013, Future of Life Institute 2015, Chairperson of the Informal Meeting of Experts 2016). Such groups take the position that weapons systems should always be under "meaningful human control," but do not give a precise definition of what this means. One issue in this debate that often is overlooked is that autonomy is not a discrete state, rather it is a continuum, and various weapons with different levels of autonomy have been in the US inventory for some time. Because of these ambiguities, it is often hard to draw the line between automated and autonomous systems. 
Present-day UAVs use the very same guidance, navigation, and control technology flown on commercial aircraft. Tomahawk missiles, which have been in the US inventory for more than 30 years, are highly automated weapons with accuracies of less than a meter. These offensive missiles can navigate by themselves with no GPS, thus exhibiting some autonomy by today's definitions. Global Hawk UAVs can find their way home and land on their own without any human intervention in the case of a communication failure.

The growth of the civilian UAV market is also a critical consideration in the debate as to whether these technologies should be banned outright. There is a $144.38B industry emerging for the commercial use of drones in agricultural settings, cargo delivery, first response, commercial photography, and the entertainment industry (Adroit Market Research 2019). More than $100 billion has been spent on driverless car development in the past 10 years (Eisenstein 2018), and the autonomy used in driverless cars mirrors that inside autonomous weapons. So it is an important distinction that UAVs are simply the platform for weapon delivery (autonomous or conventional), and that autonomous systems have many peaceful and commercial uses independent of military applications"--

Curious how likely you are to enjoy "Lethal Autonomous Weapons" by Duncan MacIntosh? Allow me to assist you! To better understand your reading preferences, it would greatly help if you could rate at least two books.