Slaughterbots

Directed by: Stewart Sugg
Written by: Matt Wood
Produced by: Matt Nelson
Narrated by: Stuart Russell
Production company: Space Digital
Release date: 12 November 2017
Running time: 8 minutes
Language: English

Slaughterbots is a 2017 arms-control advocacy video presenting a dramatized near-future scenario in which swarms of inexpensive microdrones use artificial intelligence and facial recognition software to assassinate political opponents based on preprogrammed criteria. It was released by the Future of Life Institute and Stuart Russell, a professor of computer science at Berkeley. [1] The video quickly went viral on YouTube, garnering over two million views, [2] [3] and was screened at the United Nations Convention on Certain Conventional Weapons meeting in Geneva the same month. [4]


The film's implication that swarms of such "slaughterbots" — miniature, flying lethal autonomous weapons — could become real weapons of mass destruction in the near future proved controversial. [2] [5] [6]

A sequel, Slaughterbots if human: kill() (2021), presented additional hypothetical scenarios of attacks on civilians, and again called on the UN to ban autonomous weapons that target people. [7]

Synopsis

Students attempt to flee lethal microdrones

The dramatization, seven minutes in length, is set in a Black Mirror-style near future. [8] [9] Small, palm-sized autonomous drones using facial recognition and shaped explosive charges can be programmed to seek out and eliminate known individuals or classes of individuals (such as anyone wearing an enemy military uniform). A tech executive pitches that nuclear weapons are now "obsolete": a $25 million order of "unstoppable" drones can kill half a city. As the video unfolds, the technology is repurposed by unknown parties to assassinate political opponents, from sitting congressmen to student activists identified via their Facebook profiles. In one scene, the swarming drones coordinate with each other to gain entrance to a building: a larger drone blasts a hole in a wall to give the smaller ones access. [1] [10] [11]

The dramatization is followed by a forty-second entreaty by Russell: "This short film is more than just speculation; it shows the results of integrating and miniaturizing technologies that we already have... AI's potential to benefit humanity is enormous, even in defense, but allowing machines to choose to kill humans will be devastating to our security and freedom." [10] [12]

Production

According to Russell, "What we were trying to show was the property of autonomous weapons to turn into weapons of mass destruction automatically because you can launch as many as you want... and so we thought a video would make it very clear." Russell also expressed a desire to displace the unrealistic and unhelpful Hollywood Terminator conception of autonomous weapons with something more realistic. [13] The video was produced by Space Digital at MediaCityUK and directed by Stewart Sugg with location shots at Hertfordshire University [14] and in Edinburgh. Edinburgh was chosen because the filmmakers "needed streets that would be empty on a Sunday morning" for the shots of armed police patrolling deserted streets, and because the location is recognizable to international audiences. [15] All of the drones were added in post-production. [13] [16]

Reception

Technical feasibility

In December 2017 The Economist assessed the feasibility of Slaughterbots in relation to the U.S. MAST and DCIST microdrone programs. MAST had by then produced a cyclocopter weighing less than 30 grams, though it has the downside of being easily disturbed by its own reflected turbulence when flying too close to a wall. Another candidate is Salto, a 98-gram hopping robot, which performs better than cyclocopters in confined spaces. The level of autonomous inter-drone coordination shown in Slaughterbots was not yet available, but that was starting to change, with drone swarms already being used for aerial displays. Overall The Economist agreed that "slaughterbots" may become feasible in the foreseeable future: "In 2008, a spy drone that you could hold in the palm of your hand was an idea from science fiction. Such drones are now commonplace... When DCIST wraps up in 2022, the idea of Slaughterbots may seem a lot less fictional than it does now." The Economist was skeptical that arms control could prevent such a militarization of drone swarms: "As someone said of nuclear weapons after the first one was detonated, the only secret worth keeping is now out: the damn things work". [1]

In April 2018 the governmental Swiss Drones and Robotics Centre, referencing Slaughterbots, tested a 3-gram shaped charge on a head model and concluded that "injuries are so severe that the chances of survival are very small". [17] [18]

DARPA is actively working on making swarms of autonomous lethal drones available to the US military. [19]

Threat plausibility

In December 2017, Paul Scharre of the Center for a New American Security disputed the feasibility of the video's scenario, stating that "Every military technology has a countermeasure, and countermeasures against small drones aren't even hypothetical. The U.S. government is actively working on ways to shoot down, jam, fry, hack, ensnare, or otherwise defeat small drones. The microdrones in the video could be defeated by something as simple as chicken wire. The video shows heavier-payload drones blasting holes through walls so that other drones can get inside, but the solution is simply layered defenses." Scharre also stated that Russell's implied proposal, a legally binding treaty banning autonomous weapons, "won't solve the real problems humanity faces as autonomy advances in weapons. A ban won't stop terrorists from fashioning crude DIY robotic weapons... In fact, it's not even clear whether a ban would prohibit the weapons shown in the video, which are actually fairly discriminate." [2]

In January 2018, Stuart Russell and three other authors responded to Scharre in detail. Their disagreement centered primarily on the question of whether "slaughterbots", as presented in the video, were "potentially scalable weapons of mass destruction (WMDs)". They concluded that "We, and many other experts, continue to find plausible the view that autonomous weapons can become scalable weapons of mass destruction. Scharre's claim that a ban will be ineffective or counterproductive is inconsistent with the historical record. Finally, the idea that human security will be enhanced by an unregulated arms race in autonomous weapons is, at best, wishful thinking." [5]

Cultural reception

Matt McFarland of CNN opined that "Perhaps the most nightmarish, dystopian film of 2017 didn't come from Hollywood". McFarland also stated that the debate over banning killer robots had taken a "sensationalistic" turn: in 2015, advocates "relied on open letters and petitions" written in dry academic language about "armed quadcopters"; by 2017, "they are warning of 'slaughterbots'". [20]

Andrew Yang linked to Slaughterbots from a tweet during his 2020 U.S. Presidential primary candidacy. [21]

The sequel video, published 30 November 2021, had over two million views on YouTube by 8 December. [22]

See also

  • Lethal autonomous weapon
  • Unmanned combat aerial vehicle
  • Swarm robotics
  • Loitering munition
  • Campaign to Stop Killer Robots
  • Future of Life Institute
  • Military artificial intelligence arms race

References

  1. "Military robots are getting smaller and more capable". The Economist. 14 December 2017. Retrieved 21 January 2018.
  2. Scharre, Paul (22 December 2017). "Why You Shouldn't Fear 'Slaughterbots'". IEEE Spectrum. Retrieved 21 January 2018.
  3. "Slaughterbots". YouTube. 12 November 2017. Retrieved 21 January 2018.
  4. Ting, Eric (18 November 2017). "UC Berkeley professor's eerie lethal drone video goes viral". SFGate. Retrieved 21 January 2018.
  5. Russell, Stuart; Aguirre, Anthony; Conn, Ariel; Tegmark, Max (23 January 2018). "Why You Should Fear 'Slaughterbots' — A Response". IEEE Spectrum. Retrieved 8 April 2023.
  6. Scharre, Paul (1 February 2018). "Debating Slaughterbots and the Future of Autonomous Weapons". IEEE Spectrum. Retrieved 8 April 2023.
  7. Knight, Will (2021). "Autonomous Weapons Are Here, but the World Isn't Ready for Them". Wired. Retrieved 31 December 2021.
  8. Oberhaus, Daniel (13 November 2017). "Watch 'Slaughterbots,' A Warning About the Future of Killer Bots". Motherboard (Vice Media). Retrieved 21 January 2018.
  9. Dvorsky, George. "Artificially Intelligent Drones Become Terrifying Killing Machines in Dystopian Short Film". Gizmodo. Retrieved 21 January 2018.
  10. Sample, Ian (13 November 2017). "Ban on killer robots urgently needed, say scientists". The Guardian. Retrieved 21 January 2018.
  11. Mikelionis, Lukas (21 November 2017). "UC Berkeley professor's 'slaughterbots' video on killer drones goes viral". Fox News. Retrieved 21 January 2018.
  12. May, Patrick (20 November 2017). "Watch out for 'killer robots,' UC Berkeley professor warns in video". The Mercury News. Retrieved 21 January 2018.
  13. ""As much death as you want": UC Berkeley's Stuart Russell on "Slaughterbots"". Bulletin of the Atomic Scientists. 5 December 2017. Retrieved 21 January 2018.
  14. "Film produced by Hertfordshire University staff and students goes viral". Hertfordshire University. 28 November 2017. Retrieved 21 January 2018.
  15. "Edinburgh used for 'killer drone' film". BBC News. 21 November 2017. Retrieved 21 January 2018.
  16. "Killer drone attacks filmed in Edinburgh to highlight artificial intelligence fears". The Scotsman. 21 November 2017. Retrieved 21 January 2018.
  17. "Fake news? Lethal effect of micro drones". www.ar.admin.ch. Retrieved 31 May 2018.
  18. ""Einstein" bei den Robotern" ["Einstein" among the robots]. Play SRF (in German). Retrieved 31 May 2018.
  19. McMillan, Tim (1 December 2020). "DARPA's Dream of a Tiny Robot Army is Close to Becoming a Reality". The Debrief.
  20. McFarland, Matt (14 November 2017). "'Slaughterbots' film shows potential horrors of killer drones". CNNMoney. Retrieved 21 January 2018.
  21. "Andrew Yang calls for global ban on killer robots". New York Post. 1 February 2020. Retrieved 31 December 2021.
  22. Mizokami, Kyle (8 December 2021). "A New Video Explains, in Graphic Terms, Why the United Nations Must Ban 'Slaughterbots'". Popular Mechanics. Retrieved 31 December 2021.