The International Committee for Robot Arms Control (ICRAC) is a "not-for-profit association committed to the peaceful use of robotics in the service of humanity and the regulation of robot weapons." It is concerned about the dangers that autonomous military robots, or lethal autonomous weapons, pose to peace and international security and to civilians in war. [1]
The international non-governmental organisation was founded in 2009 [2] by Noel Sharkey, [3] Jürgen Altmann, Peter M. Asaro, [4] and Robert Sparrow. [5] Sharkey is its chairman. [6] [7] [8] The committee is composed of people involved in robotics technology, robot ethics, international relations, international security, arms control, international humanitarian law, international human rights law, and public campaigns. [1]
Lethal autonomous weapons are being developed that will be able to select and engage targets without human oversight. [9] ICRAC has argued at the United Nations (UN) about the ramifications of such weapons and has called for them to be banned by bringing them under the UN Convention on Certain Conventional Weapons (CCW). [9] [10] [11]
ICRAC is on the steering committee of the Campaign to Stop Killer Robots. [12]
Nuclear disarmament is the act of reducing or eliminating nuclear weapons. Its end state can also be a nuclear-weapons-free world, in which nuclear weapons are completely eliminated. The term denuclearization is also used to describe the process leading to complete nuclear disarmament.
An autonomous robot is a robot that acts without recourse to human control. The first autonomous robots were known as Elmer and Elsie, which were constructed in the late 1940s by W. Grey Walter. They were the first robots in history that were programmed to "think" the way biological brains do and were meant to have free will. Elmer and Elsie were often labeled as tortoises because of how they were shaped and the manner in which they moved. They were capable of phototaxis, which is movement in response to a light stimulus.
Military robots are autonomous robots or remote-controlled mobile robots designed for military applications, from transport to search & rescue and attack.
Stuart Jonathan Russell is a British computer scientist known for his contributions to artificial intelligence (AI). He is a professor of computer science at the University of California, Berkeley, and was from 2008 to 2011 an adjunct professor of neurological surgery at the University of California, San Francisco. He holds the Smith-Zadeh Chair in Engineering at the University of California, Berkeley, and founded and leads the Center for Human-Compatible Artificial Intelligence (CHAI) there. Russell is the co-author, with Peter Norvig, of the authoritative textbook of the field, Artificial Intelligence: A Modern Approach, used in more than 1,500 universities in 135 countries.
Noel Sharkey is a computer scientist born in Belfast, Northern Ireland. He is best known to the British public for his appearances on television as an expert on robotics, including the BBC Two television series Robot Wars and Techno Games, and for co-hosting Bright Sparks for BBC Northern Ireland. He is emeritus professor of artificial intelligence and robotics at the University of Sheffield.
The United Nations Convention on Certain Conventional Weapons, concluded at Geneva on October 10, 1980, and entered into force in December 1983, seeks to prohibit or restrict the use of certain conventional weapons which are considered excessively injurious or whose effects are indiscriminate. The full title is Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects. The convention covers land mines, booby traps, incendiary devices, blinding laser weapons and clearance of explosive remnants of war.
United Nations Security Council Resolution 1718 was adopted unanimously by the United Nations Security Council on October 14, 2006. The resolution, passed under Chapter VII, Article 41, of the UN Charter, imposes a series of economic and commercial sanctions on the Democratic People's Republic of Korea in the aftermath of that nation's claimed nuclear test of October 9, 2006.
Robot ethics, sometimes known as "roboethics", concerns ethical problems that occur with robots, such as whether robots pose a threat to humans in the long or short run, whether some uses of robots are problematic, and how robots should be designed such that they act 'ethically'. Alternatively, roboethics refers specifically to the ethics of human behavior towards robots, as robots become increasingly advanced. Robot ethics is a sub-field of ethics of technology, specifically information technology, and it has close links to legal as well as socio-economic concerns. Researchers from diverse areas are beginning to tackle ethical questions about creating robotic technology and implementing it in societies, in a way that will still ensure the safety of the human race.
The Arms Trade Treaty (ATT) is a multilateral treaty that regulates the international trade in conventional weapons.
The SGR-A1 is a type of autonomous sentry gun that was jointly developed by Samsung Techwin and Korea University to assist South Korean troops in the Korean Demilitarized Zone. It is widely considered the first unit of its kind to have an integrated system that includes surveillance, tracking, firing, and voice recognition. While units of the SGR-A1 have reportedly been deployed, their number is unknown because the project is "highly classified".
Lethal autonomous weapons (LAWs) are a type of autonomous military system that can independently search for and engage targets based on programmed constraints and descriptions. LAWs are also known as lethal autonomous weapon systems (LAWS), autonomous weapon systems (AWS), robotic weapons or killer robots. LAWs may operate in the air, on land, on water, underwater, or in space. As of 2018, the autonomy of deployed systems was restricted in the sense that a human gives the final command to attack, though there are exceptions with certain "defensive" systems.
The Campaign to Stop Killer Robots is a coalition of non-governmental organizations that seek to pre-emptively ban lethal autonomous weapons.
Existential risk from artificial general intelligence is the idea that substantial progress in artificial general intelligence (AGI) could result in human extinction or an irreversible global catastrophe.
A number of countries and international bodies have imposed international sanctions against North Korea. Currently, many sanctions are concerned with North Korea's nuclear weapons program and were imposed after its first nuclear test in 2006.
A loitering munition is a kind of aerial weapon with a built-in munition (warhead), which can loiter around the target area until a target is located; it then attacks the target by crashing into it. Loitering munitions enable faster reaction times against hidden targets that emerge for short periods without placing high-value platforms near the target area and also allow more selective targeting as the attack can be changed mid-flight or aborted.
Slaughterbots is a 2017 arms-control advocacy video presenting a dramatized near-future scenario in which swarms of inexpensive microdrones use artificial intelligence and facial recognition software to assassinate political opponents based on preprogrammed criteria. It was released by the Future of Life Institute and Stuart Russell, a professor of computer science at Berkeley. On YouTube, the video quickly went viral, garnering over two million views, and it was screened at the United Nations Convention on Certain Conventional Weapons meeting in Geneva the same month.
A military artificial intelligence arms race is an arms race between two or more states to develop and deploy lethal autonomous weapons systems (LAWS). Since the mid-2010s, many analysts have noted the emergence of such an arms race between global superpowers for better military AI, driven by increasing geopolitical and military tensions.
Ajung Moon is a Korean-Canadian experimental roboticist specializing in ethics and responsible design of interactive robots and autonomous intelligent systems. She is an assistant professor of electrical and computer engineering at McGill University and the Director of the McGill Responsible Autonomy & Intelligent System Ethics (RAISE) lab. Her research interests lie in human-robot interaction, AI ethics, and robot ethics.
Convention on Certain Conventional Weapons – Group of Governmental Experts on Lethal Autonomous Weapons Systems refers to a group of experts created by the United Nations to study the legal, ethical, societal and moral questions raised by the increasing use of autonomous robots that carry weapons and are programmed to engage in combat, whether in battles between countries, in patrolling borders or sensitive areas, or in similar roles.