Should Humans Create "Killer Robots"?
"War...Has Changed" - Solid Snake
Summary
Fully autonomous weapons, also known as “killer robots,” raise serious moral and legal concerns because they would possess the ability to select and engage their targets without meaningful human control. Many people question whether the decision to kill a human being should be left to a machine. There are also grave doubts that fully autonomous weapons would ever be able to replicate human judgment and comply with the legal requirement to distinguish civilian from military targets. Other potential threats include the prospect of an arms race and proliferation to armed forces with little regard for the law.
These concerns are compounded by the obstacles to accountability that would exist for unlawful harm caused by fully autonomous weapons. This report analyzes in depth the hurdles to holding anyone responsible for the actions of this type of weapon. It also shows that even if a case succeeded in assigning liability, the nature of the accountability that resulted might not realize the aims of deterring future harm and providing retributive justice to victims.
Fully autonomous weapons themselves cannot substitute for responsible humans as defendants in any legal proceeding that seeks to achieve deterrence and retribution. Furthermore, a variety of legal obstacles make it likely that humans associated with the use or production of these weapons—notably operators and commanders, programmers and manufacturers—would escape liability for the suffering caused by fully autonomous weapons. Neither criminal law nor civil law guarantees adequate accountability for individuals directly or indirectly involved in the use of fully autonomous weapons.
The need for personal accountability derives from the goals of criminal law and the specific duties that international humanitarian and human rights law impose. Regarding goals, punishment of past unlawful acts aims to deter the commission of future ones by both perpetrators and observers aware of the consequences. In addition, holding a perpetrator responsible serves a retributive function: it gives victims the satisfaction that a guilty party was condemned and punished for the harm they suffered, and it helps avoid collective blame and promote reconciliation. Regarding duties, international humanitarian law mandates personal accountability for grave breaches, also known as war crimes. International human rights law, moreover, establishes a right to a remedy, which encompasses various forms of redress; for example, it obliges states to investigate and prosecute gross violations of human rights law and to enforce judgments in victims’ civil suits against private actors.
Existing mechanisms for legal accountability are ill suited and inadequate to address the unlawful harms fully autonomous weapons might cause. These weapons have the potential to commit criminal acts—unlawful acts that would constitute a crime if done with intent—for which no one could be held responsible.[1] A fully autonomous weapon itself could not be found accountable for criminal acts that it might commit because it would lack intentionality. In addition, such a robot would not fall within the “natural person” jurisdiction of international courts. Even if such jurisdiction were amended to encompass a machine, a judgment would not fulfill the purposes of punishment for society or the victim because the robot could neither be deterred by condemnation nor perceive or appreciate being “punished.”
Human commanders or operators could not be assigned direct responsibility for the wrongful actions of a fully autonomous weapon, except in rare circumstances when those people could be shown to have possessed the specific intention and capability to commit criminal acts through the misuse of fully autonomous weapons. In most cases, it would also be unreasonable to impose criminal punishment on the programmer or manufacturer, who might not specifically intend, or even foresee, the robot’s commission of wrongful acts.[2]
The autonomous nature of killer robots would make them legally analogous to human soldiers in some ways, which could trigger the doctrine of indirect responsibility, known as command responsibility. A commander would nevertheless still escape liability in most cases. Command responsibility holds superiors accountable only if they knew or should have known of a subordinate’s criminal act and failed to prevent or punish it. These criteria set a high bar for accountability for the actions of a fully autonomous weapon.
Command responsibility concerns the prevention and punishment of a subordinate’s crime; since robots could not have the mental state to commit an underlying crime, command responsibility would never be available in situations involving these weapons. Even setting that issue aside, because the weapons are designed to operate independently, a commander would not always have sufficient reason or technological knowledge to anticipate that a robot would commit a specific unlawful act. Even if he or she knew of a possible unlawful act, the commander would often be unable to prevent it, for example, if communications had broken down, the robot acted too fast to be stopped, or reprogramming was too difficult for all but specialists. In addition, “punishing” the robot after the fact would not make sense. In the end, fully autonomous weapons would not fit well into the scheme of criminal liability designed for humans, and their use would create the risk of unlawful acts and significant civilian harm for which no one could be held criminally responsible.
An alternative approach would be to hold a commander or programmer civilly liable for negligence if, for example, the unlawful acts brought about by robots were reasonably foreseeable, even if not intended. Such civil liability can be a useful tool for compensating victims, and it provides a degree of deterrence and some sense of justice for those harmed. It imposes lesser penalties than criminal law, however, and thus does not carry the level of social condemnation associated with punishment of a crime.
Regardless of the nature of the penalties, attempts to use civil liability mechanisms to establish accountability for harm caused by fully autonomous weapons would be equally unlikely to succeed. On a practical level, even in a functional legal system, most victims would find it difficult to sue a user or manufacturer because such lawsuits would likely be expensive, time-consuming, and dependent on experts able to navigate the complex legal and technical issues raised by fully autonomous weapons. The legal barriers to civil accountability are even more imposing than the practical ones. They are exemplified by the limitations of the civil liability system of the United States, a country that is generally friendly to litigation and a leader in the development of autonomous technology.
Immunity for the US military and its defense contractors presents an almost insurmountable hurdle to civil accountability for users or producers of fully autonomous weapons. The military is immune from lawsuits related to (1) its policy determinations, which would likely include a choice of weapons; (2) the wartime combat activities of military forces; and (3) acts committed in a foreign country. Manufacturers contracted by the military are similarly immune from suit when they design a weapon in accordance with government specifications and without deliberately misleading the military. These same manufacturers are also immune from civil claims relating to acts committed during wartime.
Even without these rules of immunity, a plaintiff would find it challenging to establish that a fully autonomous weapon was legally defective for the purposes of a product liability suit. The complexity of an autonomous robot’s software would make it difficult to prove a manufacturing defect, that is, a production flaw that prevented the weapon from operating as designed. The fact that a fully autonomous weapon killed civilians would also not necessarily indicate a manufacturing defect: the robot could have acted within the bounds of international humanitarian law, or the deaths could have been attributable to a programmer who failed to foresee and plan for the situation. A plaintiff’s ability to show that the weapon’s design was defective would be impeded by the complexity of the technology, the unavailability of existing alternative weapons to serve as points of comparison, and the limited utility of warnings where the hazards inherent in a weapon that operates independently are generally apparent but unpredictable in their specifics.
A system of providing compensation without establishing fault has been proposed for other autonomous technologies. Under such a scheme, victims would have to provide only proof that they had been harmed, not proof that the product was defective. This approach would not, however, fill the accountability gap that would exist were fully autonomous weapons used. No-fault compensation is not the same as accountability, and victims of fully autonomous weapons are entitled to a system that punishes those responsible for grave harm, deters further harm, and shows that justice has been done.
Some proponents of fully autonomous weapons argue that their use would be acceptable in limited circumstances; once the weapons were developed and deployed, however, it would be difficult to restrict them to such situations. Proponents also note that a programmer or operator could be held accountable in certain cases, such as when criminal intent is proven. As this report explains, however, there are many other foreseeable cases involving fully autonomous weapons in which neither criminal nor civil liability would attach. Even if the law adopted a strict liability regime that allowed compensation for victims, such a regime would not serve the purposes of deterrence and retribution that international humanitarian and human rights law seek to achieve. This report argues that states should close this accountability gap by adopting an international ban on fully autonomous weapons.
______________________________________________________________
Click the link to read the full report!