Slaughterbots | |
---|---|
Directed by | Stewart Sugg |
Written by | Matt Wood |
Produced by | Matt Nelson |
Narrated by | Stuart Russell |
Production company | Space Digital |
Release date | November 2017 |
Running time | 8 minutes |
Language | English |
Slaughterbots is a 2017 arms-control advocacy video presenting a dramatized near-future scenario in which swarms of inexpensive microdrones use artificial intelligence and facial-recognition software to assassinate political opponents based on preprogrammed criteria. It was released by the Future of Life Institute and Stuart Russell, a professor of computer science at the University of California, Berkeley. [1] The video quickly went viral on YouTube, garnering over two million views, [2] [3] and was screened at the United Nations Convention on Certain Conventional Weapons meeting in Geneva in the same month as its release. [4]
The film's implication that swarms of such "slaughterbots" — miniature, flying lethal autonomous weapons — could become real weapons of mass destruction in the near future proved controversial. [2] [5] [6]
A sequel, Slaughterbots – if human: kill() (2021), presented additional hypothetical scenarios of attacks on civilians, and again called on the UN to ban autonomous weapons that target people. [7]
The dramatization, seven minutes in length, is set in a Black Mirror-style near future. [8] [9] Small, palm-sized autonomous drones using facial recognition and shaped explosives can be programmed to seek out and eliminate known individuals or classes of individuals (such as individuals wearing an enemy military uniform). A tech executive pitches that nuclear weapons are now "obsolete": a $25 million order of "unstoppable" drones can kill half a city. As the video unfolds, the technology gets repurposed by unknown parties to assassinate political opponents, from sitting congressmen to student activists identified via their Facebook profiles. In one scene, the swarming drones coordinate with each other to gain entrance to a building: a larger drone blasts a hole in a wall to give access to smaller ones. [1] [10] [11]
The dramatization is followed by a forty-second entreaty by Russell: "This short film is more than just speculation; it shows the results of integrating and miniaturizing technologies that we already have... AI's potential to benefit humanity is enormous, even in defense, but allowing machines to choose to kill humans will be devastating to our security and freedom." [10] [12]
According to Russell, "What we were trying to show was the property of autonomous weapons to turn into weapons of mass destruction automatically because you can launch as many as you want... and so we thought a video would make it very clear." Russell also expressed a desire to displace the unrealistic and unhelpful Hollywood Terminator conception of autonomous weapons with something more realistic. [13] The video was produced by Space Digital at MediaCityUK and directed by Stewart Sugg, with location shots at the University of Hertfordshire [14] and in Edinburgh. Edinburgh was chosen because the filmmakers "needed streets that would be empty on a Sunday morning" for the shots of armed police patrolling deserted streets, and because the location is recognizable to international audiences. [15] All of the drones were added in post-production. [13] [16]
In December 2017, The Economist assessed the feasibility of Slaughterbots in relation to the U.S. MAST and DCIST microdrone programs. MAST has a cyclocopter weighing less than 30 grams, though it has the drawback of being easily disturbed by its own reflected turbulence when too close to a wall. Another candidate is Salto, a 98-gram hopping robot, which performs better than cyclocopters in confined spaces. The level of autonomous inter-drone coordination shown in Slaughterbots was not yet available, but that was starting to change, with drone swarms being used for aerial displays. Overall, The Economist agreed that "slaughterbots" may become feasible in the foreseeable future: "In 2008, a spy drone that you could hold in the palm of your hand was an idea from science fiction. Such drones are now commonplace... When DCIST wraps up in 2022, the idea of Slaughterbots may seem a lot less fictional than it does now." The magazine was skeptical that arms control could prevent such a militarization of drone swarms: "As someone said of nuclear weapons after the first one was detonated, the only secret worth keeping is now out: the damn things work". [1]
In April 2018 the governmental Swiss Drones and Robotics Centre, referencing Slaughterbots, tested a 3-gram shaped charge on a head model and concluded that "injuries are so severe that the chances of survival are very small". [17] [18]
DARPA is actively working on making swarms of autonomous lethal drones available to the US military. [19]
In December 2017, Paul Scharre of the Center for a New American Security disputed the feasibility of the video's scenario, stating that "Every military technology has a countermeasure, and countermeasures against small drones aren't even hypothetical. The U.S. government is actively working on ways to shoot down, jam, fry, hack, ensnare, or otherwise defeat small drones. The microdrones in the video could be defeated by something as simple as chicken wire. The video shows heavier-payload drones blasting holes through walls so that other drones can get inside, but the solution is simply layered defenses." Scharre also stated that Russell's implied proposal, a legally binding treaty banning autonomous weapons, "won't solve the real problems humanity faces as autonomy advances in weapons. A ban won't stop terrorists from fashioning crude DIY robotic weapons... In fact, it's not even clear whether a ban would prohibit the weapons shown in the video, which are actually fairly discriminate." [2]
In January 2018, Stuart Russell and three other authors responded to Scharre in detail. Their disagreement centered primarily on the question of whether "slaughterbots", as presented in the video, were "potentially scalable weapons of mass destruction (WMDs)". They concluded that "We, and many other experts, continue to find plausible the view that autonomous weapons can become scalable weapons of mass destruction. Scharre's claim that a ban will be ineffective or counterproductive is inconsistent with the historical record. Finally, the idea that human security will be enhanced by an unregulated arms race in autonomous weapons is, at best, wishful thinking." [5]
Matt McFarland of CNN opined that "Perhaps the most nightmarish, dystopian film of 2017 didn't come from Hollywood". McFarland also stated that the debate over banning killer robots had taken a "sensationalistic" turn: in 2015, advocates "relied on open letters and petitions with academic language" and dry terms like 'armed quadcopters'; in 2017, "they are warning of 'slaughterbots'". [20]
Andrew Yang linked to Slaughterbots from a tweet during his 2020 U.S. Presidential primary candidacy. [21]
The sequel video, published 30 November 2021, had over two million views on YouTube by 8 December. [22]
A robot is a machine—especially one programmable by a computer—capable of carrying out a complex series of actions automatically. A robot can be guided by an external control device, or the control may be embedded within. Robots may be constructed to evoke human form, but most robots are task-performing machines, designed with an emphasis on stark functionality, rather than expressive aesthetics.
An autonomous robot is a robot that acts without recourse to human control. The first autonomous robots, known as Elmer and Elsie, were constructed in the late 1940s by W. Grey Walter. They were the first robots in history programmed to "think" the way biological brains do and were meant to have free will. Elmer and Elsie were often labeled tortoises because of their shape and the manner in which they moved. They were capable of phototaxis, movement that occurs in response to a light stimulus.
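Phototaxis of this kind can be illustrated with a minimal two-sensor simulation in the spirit of a Braitenberg vehicle: the robot turns toward whichever side senses more light and drives forward. The sensor geometry, gains, and inverse-square intensity model here are illustrative assumptions, not a reconstruction of Walter's actual analogue circuitry.

```python
import math

def sensor_reading(pos, heading, offset, light):
    """Light intensity at a sensor mounted `offset` radians off the heading."""
    sx = pos[0] + math.cos(heading + offset)
    sy = pos[1] + math.sin(heading + offset)
    d2 = (light[0] - sx) ** 2 + (light[1] - sy) ** 2
    return 1.0 / (1.0 + d2)  # intensity falls off with distance squared

def phototaxis_step(pos, heading, light, speed=0.2, gain=2.0):
    """Turn toward the brighter sensor, then move forward."""
    left = sensor_reading(pos, heading, +0.5, light)
    right = sensor_reading(pos, heading, -0.5, light)
    heading += gain * (left - right)  # brighter left sensor steers the robot left
    return (pos[0] + speed * math.cos(heading),
            pos[1] + speed * math.sin(heading)), heading

pos, heading, light = (0.0, 0.0), 0.0, (5.0, 5.0)
closest = math.dist(pos, light)
for _ in range(300):
    pos, heading = phototaxis_step(pos, heading, light)
    closest = min(closest, math.dist(pos, light))
# `closest` shrinks as the robot curves toward, then circles, the light source.
```

The feedback loop needs no map or planner: the difference between two local intensity readings is enough to home in on the source, which is why such behaviour was achievable with 1940s analogue hardware.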
An unmanned aerial vehicle (UAV), commonly known as a drone, is an aircraft without any human pilot, crew, or passengers on board. UAVs were originally developed through the twentieth century for military missions too "dull, dirty or dangerous" for humans, and by the twenty-first, they had become essential assets to most militaries. As control technologies improved and costs fell, their use expanded to many non-military applications. These include aerial photography, precision agriculture, forest fire monitoring, river monitoring, environmental monitoring, policing and surveillance, infrastructure inspections, smuggling, product deliveries, entertainment, and drone racing.
Military robots are autonomous robots or remote-controlled mobile robots designed for military applications, from transport to search & rescue and attack.
An unmanned combat aerial vehicle (UCAV), also known as a combat drone, colloquially shortened as drone or battlefield UAV, is an unmanned aerial vehicle (UAV) that is used for intelligence, surveillance, target acquisition, and reconnaissance and carries aircraft ordnance such as missiles, ATGMs, and/or bombs in hardpoints for drone strikes. These drones are usually under real-time human control, with varying levels of autonomy. Unlike unmanned surveillance and reconnaissance aerial vehicles, UCAVs are used for both drone strikes and battlefield intelligence.
Stuart Jonathan Russell is a British computer scientist known for his contributions to artificial intelligence (AI). He is a professor of computer science at the University of California, Berkeley and was from 2008 to 2011 an adjunct professor of neurological surgery at the University of California, San Francisco. He holds the Smith-Zadeh Chair in Engineering at the University of California, Berkeley. He founded and leads the Center for Human-Compatible Artificial Intelligence (CHAI) at UC Berkeley. Russell is the co-author, with Peter Norvig, of the authoritative textbook of the field of AI, Artificial Intelligence: A Modern Approach, used in more than 1,500 universities in 135 countries.
Swarm robotics is an approach to the coordination of multiple robots as a system consisting of large numbers of mostly simple physical robots. "In a robot swarm, the collective behavior of the robots results from local interactions between the robots and between the robots and the environment in which they act." The desired collective behavior is supposed to emerge from these interactions. The approach emerged from the field of artificial swarm intelligence, as well as from biological studies of insects, ants, and other natural systems in which swarm behaviour occurs.
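The quoted principle (collective behaviour arising purely from local interactions) can be sketched with a toy aggregation model: each simulated robot senses only neighbours within a fixed radius and applies simple cohesion and separation rules, yet the swarm as a whole clusters. All parameters and rules here are illustrative assumptions, not drawn from any particular swarm-robotics system.

```python
import random

RADIUS = 5.0      # a robot only "sees" neighbours this close (illustrative)
COHESION = 0.05   # gain pulling a robot toward its local centre of mass
SEPARATION = 0.2  # gain pushing apart robots closer than 1 unit

def step(positions):
    """One synchronous update; every robot uses only local information."""
    new = []
    for i, (x, y) in enumerate(positions):
        nbrs = [(px, py) for j, (px, py) in enumerate(positions)
                if j != i and (px - x) ** 2 + (py - y) ** 2 < RADIUS ** 2]
        dx = dy = 0.0
        if nbrs:
            cx = sum(p[0] for p in nbrs) / len(nbrs)
            cy = sum(p[1] for p in nbrs) / len(nbrs)
            dx += COHESION * (cx - x)      # cohesion: steer toward local centre
            dy += COHESION * (cy - y)
            for px, py in nbrs:            # separation: avoid collisions
                if (px - x) ** 2 + (py - y) ** 2 < 1.0:
                    dx -= SEPARATION * (px - x)
                    dy -= SEPARATION * (py - y)
        new.append((x + dx, y + dy))
    return new

random.seed(1)
robots = [(random.uniform(0, 8), random.uniform(0, 8)) for _ in range(30)]
start = list(robots)
for _ in range(100):
    robots = step(robots)
# Aggregation emerges: no robot knows the global layout, yet the spread of the
# swarm around its centroid shrinks relative to `start`.
```

No robot in this sketch holds global state or receives central commands; varying RADIUS or the gains changes how tightly (and into how many clumps) the group gathers, which is the scalability property swarm approaches exploit.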
A sentry gun is a weapon that is automatically aimed and fired at targets detected by sensors. The earliest functioning military sentry guns were point-defense close-in weapon systems such as the Phalanx CIWS, used for detecting and destroying short-range incoming missiles and enemy aircraft; they were first used exclusively on naval assets and now also serve as land-based defenses.
The Modular Advanced Armed Robotic System (MAARS) is a robot being developed by QinetiQ. A member of the TALON family, it will be the successor to the armed SWORDS robot. It has a different, larger chassis than the SWORDS robot, so it has little physically in common with the SWORDS and TALON.
Lethal autonomous weapons (LAWs) are a type of autonomous military system that can independently search for and engage targets based on programmed constraints and descriptions. LAWs are also known as lethal autonomous weapon systems (LAWS), autonomous weapon systems (AWS), robotic weapons or killer robots. LAWs may operate in the air, on land, on water, underwater, or in space. The autonomy of current systems as of 2018 was restricted in the sense that a human gives the final command to attack—though there are exceptions with certain "defensive" systems.
Drone warfare is a form of aerial warfare using unmanned combat aerial vehicles (UCAV) or weaponized commercial unmanned aerial vehicles (UAV). The United States, United Kingdom, Israel, China, South Korea, Iran, Italy, France, India, Pakistan, Russia, Turkey, and Poland are known to have manufactured operational UCAVs as of 2019. As of 2022, the Ukrainian enterprise Ukroboronprom and NGO group Aerorozvidka have built strike-capable drones and used them in combat.
The Campaign to Stop Killer Robots is a coalition of non-governmental organizations who seek to pre-emptively ban lethal autonomous weapons.
An autonomous aircraft is an aircraft that flies under the control of automatic systems and needs no intervention from a human pilot. Most autonomous aircraft are unmanned aerial vehicles, or drones. However, autonomous control systems are reaching a point where several air taxis and associated regulatory regimes are being developed.
Lily was a California-based drone brand that shut down after filing for bankruptcy in 2017. The brand is owned by Mota Group, Inc., headquartered in San Jose, California.
A loitering munition is a kind of aerial weapon with a built-in munition (warhead), which can loiter around the target area until a target is located; it then attacks the target by crashing into it. Loitering munitions enable faster reaction times against hidden targets that emerge for short periods without placing high-value platforms near the target area, and also allow more selective targeting as the attack can be changed midflight or aborted.
The International Committee for Robot Arms Control (ICRAC) is a "not-for-profit association committed to the peaceful use of robotics in the service of humanity and the regulation of robot weapons." It is concerned about the dangers that autonomous military robots, or lethal autonomous weapons, pose to peace and international security and to civilians in war.
A military artificial intelligence arms race is an arms race between two or more states to develop and deploy lethal autonomous weapons systems (LAWS). Since the mid-2010s, many analysts have noted the emergence of such an arms race between global superpowers for better military AI, driven by increasing geopolitical and military tensions. An AI arms race is sometimes placed in the context of an AI Cold War between the US and China.
The HAL Combat Air Teaming System (CATS) is an Indian unmanned and manned combat aircraft air teaming system being developed by Hindustan Aeronautics Limited (HAL). The system will consist of a manned fighter aircraft acting as "mothership" of the system and a set of swarming UAVs and UCAVs governed by the mothership aircraft. A twin-seated HAL Tejas is likely to be the mothership aircraft. Various other sub components of the system are currently under development and will be jointly produced by HAL, National Aerospace Laboratories (NAL), Defence Research and Development Organisation (DRDO) and Newspace Research & Technologies.