A self-driving car, also known as an autonomous car, driverless car, robotic car or robo-car, is a car that is capable of operating with reduced or no human input. They are sometimes called robotaxis, though this term refers specifically to self-driving cars operated for a ridesharing company.
The term "self-driving" currrently lacks an agreed standard definition and is also subject to commercial advertising and branding considerations. In 2020, Waymo was the first to offer rides in driverless taxis in the operational design domain (ODD) of limited geographic areas, but as of late2025 [update] , no system has achieved full autonomy in all domains - sometimes referred to as "Level 5" on a scale of 0 to 5 levels of automation defined by the global standards organisation SAE International, or simply "no driver" as given by the classification system proposed by Mobileye in the US.
Following a history of experimentation and development of advanced driver assistance systems (ADAS) after WWII, two main technologies are now primarily used: LiDAR (Light Detection and Ranging), and visual sensors (cameras) which capture images and video like human eyes. These are combined with systems such as GPS, neural networks, artificial intelligence, and established ADAS engineering to deliver levels of driving autonomy.
With more self-driving cars on public roads, an increasing number of safety incidents, collisions and even deaths have been recorded around the world. The primary obstacle to self-driving is the advanced software and mapping required to make cars work safely across the wide variety of conditions that drivers experience. Other issues include security of over-the-air updates, legal and regulatory issues, ethics and consumer confidence. Methods of testing and monitoring the reliability of cars have evolved in parallel with the deployment of cars with self-driving capabilities, with various standards for this being proposed. Should autonomous cars gain mass adoption, wider implications for urban infrastructure and the economy have also been discussed.
Public perception and acceptance of autonomous cars has been found to be mixed. A 2014 telephone poll in the US found 31.7% would not continue to drive once an automated car was available to them, while a survey in 2022 found only a quarter (27%) of the world's population would feel safe in one.
Self-driving cars were anticipated by experiments in radio control during the 1920s, and the development of advanced driver assistance systems (ADAS) after WWII. Trials of self-driving vehicles began in the 1950s, with the first semi-autonomous car developed in 1977 by Japan's Tsukuba Mechanical Engineering Laboratory.
In the United States, Carnegie Mellon University's Navlab began semi-autonomous vehicle projects in 1984, funded by the Defense Advanced Research Projects Agency (DARPA). In Europe, similar projects were led by Mercedes-Benz and Bundeswehr University Munich's EUREKA Prometheus Project, beginning in 1987.
The United States allocated US$650 million in 1991 for research on the National Automated Highway System, which demonstrated automated driving combining highway-embedded automation with vehicle technology. Until the second DARPA Grand Challenge in 2005, automated vehicle research in the United States was primarily funded by DARPA, the US Army and the US Navy, producing incremental advances in speed, driving competence, control, and sensor systems.
Since then, numerous private companies and both private and public research organizations around the world have developed working autonomous vehicles. In 2015, the Cruise subsidiary of General Motors began road testing in California. Two years later, Waymo was the first to commercialize a robotaxi service in Phoenix, Arizona, followed by a similar service by DeepRoute.ai in Shenzhen. Cruise later shut down in 2023, with several other manufacturers scaling back plans for self-driving technology in 2022, including Ford and Volkswagen.
Legal and regulatory developments to accommodate the testing and facilitation of self-driving vehicles have also taken place worldwide. In the 2010s and 2020s, some UNECE and EU members developed rules and regulations related to automated vehicles, with various cities planning to operate transport systems for driverless cars and to allow testing of robotic cars in traffic. In 2016, the US National Economic Council and US Department of Transportation (USDOT) released the Federal Automated Vehicles Policy. The first known fatal accident involving a vehicle being driven by itself took place in Williston, Florida, in 2016, while the first reported pedestrian killed by a self-driving car was in 2018.
From the 2010s, increasingly rapid progress in research and development has taken place, often accompanied by inaccurate predictions of complete autonomy, a capability that so far remains confined to driverless taxi services in designated cities. As of early 2024, several manufacturers sell cars with automated driving systems in the US, Japan and Europe.
Organizations such as the global standards body SAE International (SAE) have proposed terminology to describe technical capabilities. However, most terms have no standard definition and are employed variously by vendors and others. Proposals to adopt aviation automation terminology for cars have also not prevailed. [1]
Names such as AutonoDrive, PilotAssist, "Full-Self Driving" or DrivePilot are used even though the products offer an assortment of features that may not match the names. [2] Despite offering a system dubbed Full Self-Driving, Tesla stated that its system did not autonomously handle all driving tasks. [3] In the United Kingdom, a fully self-driving car is defined as a car so registered, rather than one that supports a specific feature set. [4] The Association of British Insurers claimed that the usage of the word autonomous in marketing was dangerous because car ads make motorists think "autonomous" and "autopilot" imply that the driver can rely on the car to control itself, even though they do not. [5]
SAE identifies six levels of driving automation, from Level 0 to Level 5. [6] An ADS is an SAE J3016 Level 3 or higher system.
An ADAS is a system that automates specific driving features, such as Forward Collision Warning (FCW), Automatic Emergency Braking (AEB), Lane Departure Warning (LDW), Lane Keeping Assistance (LKA) or Blind Spot Warning (BSW). [7] An ADAS requires a human driver to handle tasks that the ADAS does not support.
Autonomy implies that an automation system is under the control of the vehicle rather than a driver. Automation is function-specific, handling issues such as speed control, but leaves broader decision-making to the driver. [8]
Euro NCAP defined autonomous as "the system acts independently of the driver to avoid or mitigate the accident". [9]
In Europe, the words automated and autonomous can be used together. For instance, both terms appear in Regulation (EU) 2019/2144. [10]
A remote driver is a driver that operates a vehicle at a distance, using a video and data connection. [11]
According to SAE J3016,
Some driving automation systems may indeed be autonomous if they perform all of their functions independently and self-sufficiently, but if they depend on communication and/or cooperation with outside entities, they should be considered cooperative rather than autonomous.
Operational design domain (ODD) is a term for a particular operating context for an automated system, often used in the field of autonomous vehicles. The context is defined by a set of conditions, including environmental, geographical, time of day, and other conditions. For vehicles, traffic and roadway characteristics are included. Manufacturers use ODD to indicate where/how their product operates safely. A given system may operate differently according to the immediate ODD. [12]
The concept presumes that automated systems have limitations. [13] Relating system function to the ODD it supports is important for developers and regulators to establish and communicate safe operating conditions. Systems should operate within those limitations. Some systems recognize the ODD and modify their behavior accordingly. For example, an autonomous car might recognize that traffic is heavy and disable its automated lane change feature. [13]
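As a rough illustration of how a system might relate a feature to its ODD, the following Python sketch (all names and thresholds are hypothetical assumptions, not any vendor's actual logic) disables an automated lane-change feature once detected conditions fall outside the feature's declared ODD:

```python
from dataclasses import dataclass

@dataclass
class OddLimits:
    """Hypothetical ODD bounds declared for an automated lane-change feature."""
    max_vehicles_per_km: float = 40.0   # assumed traffic-density limit
    min_visibility_m: float = 100.0     # assumed visibility limit
    daylight_only: bool = True

def lane_change_allowed(vehicles_per_km: float, visibility_m: float,
                        is_daylight: bool, odd: OddLimits = OddLimits()) -> bool:
    """Return True only while current conditions fall inside the declared ODD."""
    if vehicles_per_km > odd.max_vehicles_per_km:
        return False                    # heavy traffic: feature disabled
    if visibility_m < odd.min_visibility_m:
        return False                    # poor visibility: feature disabled
    if odd.daylight_only and not is_daylight:
        return False                    # night driving is outside this feature's ODD
    return True

# Example: dense traffic takes the feature out of its ODD, so it is disabled.
print(lane_change_allowed(vehicles_per_km=55, visibility_m=300, is_daylight=True))  # False
```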
Vendors have taken a variety of approaches to the self-driving problem. Tesla's approach is to allow its "full self-driving" (FSD) system to be used in all ODDs as a Level 2 (hands-on, eyes-on) ADAS. [14] Waymo picked specific ODDs (city streets in Phoenix and San Francisco) for its Level 4 robotaxi service. [15] Mercedes-Benz offers a Level 3 service in Las Vegas in highway traffic jams at speeds up to 40 miles per hour (64 km/h). [16] Mobileye's SuperVision system offers hands-off/eyes-on driving on all road types at speeds up to 130 km/h (81 mph). [17] GM's hands-free Super Cruise operates on specific roads in specific conditions, stopping or returning control to the driver when the ODD changes. In 2024 the company announced plans to expand road coverage from 400,000 miles to 750,000 miles (1,210,000 km). [18] Ford's BlueCruise hands-off system operates on 130,000 miles (210,000 km) of US divided highways. [19]
The Union of Concerned Scientists defined self-driving as "cars or trucks in which human drivers are never required to take control to safely operate the vehicle. Also known as autonomous or 'driverless' cars, they combine sensors and software to control, navigate, and drive the vehicle." [20]
The British Automated and Electric Vehicles Act 2018 law defines a vehicle as "driving itself" if the vehicle is "not being controlled, and does not need to be monitored, by an individual". [21]
Another British government definition stated, "Self-driving vehicles are vehicles that can safely and lawfully drive themselves". [22]
In British English, the word automated alone has several meanings, such as in the sentence: "Thatcham also found that the automated lane keeping systems could only meet two out of the twelve principles required to guarantee safety, going on to say they cannot, therefore, be classed as 'automated driving', preferring 'assisted driving'". [23] The first occurrence of the word "automated" refers to a UNECE automated system, while the second refers to the British legal definition of an automated vehicle. British law interprets the meaning of "automated vehicle" based on the interpretation section related to a vehicle "driving itself" and an insured vehicle. [24]
In November 2023 the British Government introduced the Automated Vehicles Bill, which proposed definitions for related terms. [25]
A six-level classification system – ranging from fully manual to fully automated – was published in 2014 by SAE International as J3016, Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems; the details are revised occasionally. [28] This classification is based on the role of the driver, rather than the vehicle's capabilities, although these are related. After SAE updated its classification in 2016, (J3016_201609), [29] the National Highway Traffic Safety Administration (NHTSA) adopted the SAE standard. [30] [31]
The classification is a topic of debate, having been criticized for its technological focus, with various revisions proposed. [32] [33] It has been argued that the structure of the levels suggests that automation increases linearly and that more automation is better, which may not be the case. [34] SAE Levels also do not account for changes that may be required to infrastructure [35] and road user behavior. [36] [37]
A "driving mode", aka driving scenario, combines an operational design domain (ODD) with matched driving requirements (e.g., expressway merging, traffic jam). [38] [39] Cars may switch levels in accord with the driving mode.
Above Level 1, level differences are related to how responsibility for safe movement is divided/shared between the advanced driver-assistance system (ADAS) and driver rather than specific driving features.
| Level | Name | Narrative | Responsibility for vehicle direction & speed | Responsibility for monitoring environment | Responsibility for fallback | Mode coverage |
|---|---|---|---|---|---|---|
| 0 | No Automation | Full-time performance by the driver of all aspects of driving, even when "enhanced by warning or intervention systems" | Driver | Driver | Driver | n/a |
| 1 | Driver Assistance | Driving mode-specific control by an ADAS of either steering or speed. The ADAS uses information about the driving environment; the driver is expected to perform all other driving tasks. | Driver and system | Driver | Driver | Some |
| 2 | Partial Automation | Driving mode-specific execution by one or more ADAS of both steering and speed | System | Driver | Driver | Some |
| 3 | Conditional Automation | Driving mode-specific control by an ADAS of all aspects of driving. The driver must appropriately respond to a request to intervene. | System | System | Driver | Some |
| 4 | High Automation | If a driver does not respond appropriately to a request to intervene, the car can stop safely. | System | System | System | Many |
| 5 | Full Automation | The system controls the vehicle under all conditions and circumstances. | System | System | System | All |
Mobileye CEO Amnon Shashua and CTO Shai Shalev-Shwartz proposed an alternative taxonomy for autonomous driving systems, claiming that a more consumer-friendly approach was needed. Its categories reflect the amount of driver engagement that is required. [40] [41] Some vehicle makers have informally adopted some of the terminology involved, while not formally committing to it. [42] [43] [44] [45]
The first level, hands-on/eyes-on, implies that the driver is fully engaged in operating the vehicle, but is supervised by the system, which intervenes according to the features it supports (e.g., adaptive cruise control, automatic emergency braking). The driver is entirely responsible, with hands on the wheel and eyes on the road. [41]
Eyes-on/hands-off allows the driver to let go of the wheel. The system drives, the driver monitors, and remains prepared to resume control as needed. [41]
Eyes-off/hands-off means that the driver can stop monitoring the system, leaving the system in full control. Eyes-off requires that no errors be reproducible (not triggered by exotic transitory conditions) or frequent, that speeds are contextually appropriate (e.g., 80 mph (130 km/h) on limited-access roads), and that the system handles typical maneuvers (e.g., getting cut off by another vehicle). The automation level could vary according to the road (e.g., eyes-off on freeways, eyes-on on side streets). [41]
The highest level does not require a human driver in the car: monitoring is done either remotely (telepresence) or not at all. [41]
A critical requirement for the higher two levels is that the vehicle be able to conduct a Minimum Risk Maneuver and stop safely out of traffic without driver intervention. [41]
The perception system processes visual and audio data from outside and inside the car to create a local model of the vehicle, the road, traffic, traffic controls and other observable objects, and their relative motion. The control system then takes actions to move the vehicle, considering the local model, road map, and driving regulations. [46] [47] [48] [49]
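A minimal sketch of this perceive-then-act loop in Python, using placeholder stage functions rather than any real system's API (every name below is an illustrative assumption):

```python
from typing import Any

# Placeholder stages; a real system implements each with maps, trained models, and controllers.
def perceive(sensor_frame: Any) -> dict:
    """Build a local model of lanes, objects, and traffic controls from sensor data."""
    return {"obstacles": [], "lanes": []}

def next_waypoint(route_graph: Any, vehicle_state: dict) -> tuple:
    """Navigation: pick the next point along the planned route."""
    return (0.0, 0.0)

def plan_path(world_model: dict, vehicle_state: dict, waypoint: tuple) -> list:
    """Path planning: choose a collision-free trajectory toward the waypoint."""
    return [waypoint]

def compute_controls(trajectory: list, vehicle_state: dict) -> dict:
    """Car control: convert the trajectory into steering/throttle/brake commands."""
    return {"steering": 0.0, "throttle": 0.0, "brake": 0.0}

def drive_step(sensor_frame: Any, route_graph: Any, vehicle_state: dict) -> dict:
    """One iteration of the simplified perception -> navigation -> planning -> control loop."""
    world_model = perceive(sensor_frame)
    waypoint = next_waypoint(route_graph, vehicle_state)
    trajectory = plan_path(world_model, vehicle_state, waypoint)
    return compute_controls(trajectory, vehicle_state)
```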
Several classifications have been proposed to describe ADAS technology. One proposal is to adopt these categories: navigation, path planning, perception, and car control. [50]
Navigation involves the use of maps to define a path between origin and destination. Hybrid navigation is the use of multiple navigation systems. Some systems use basic maps, relying on perception to deal with anomalies. Such a map records which roads lead to which others, whether a road is a freeway, a highway, or one-way, etc. Other systems require highly detailed maps, including lane maps, obstacles, traffic controls, etc.
ACs need to be able to perceive the world around them. Supporting technologies include combinations of cameras, LiDAR, radar, audio, and ultrasound, [51] GPS, and inertial measurement. [52] [53] [54] Deep neural networks are used to analyse inputs from these sensors to detect and identify objects and their trajectories. [55] Some systems use Bayesian simultaneous localization and mapping (SLAM) algorithms. Another technique is detection and tracking of other moving objects (DATMO), used to handle potential obstacles. [56] [57] Other systems use roadside real-time locating system (RTLS) technologies to aid localization. Tesla's "vision only" system uses eight cameras, without LiDAR or radar, to create its bird's-eye view of the environment. [58]
Path planning finds a sequence of segments that a vehicle can use to move from origin to destination. Techniques used for path planning include graph-based search and variational-based optimization techniques. Graph-based techniques can make harder decisions such as how to pass another vehicle/obstacle. Variational-based optimization techniques require more stringent restrictions on the vehicle's path to prevent collisions. [59] The large-scale path of the vehicle can be determined by using a Voronoi diagram, occupancy grid mapping, or a driving corridor algorithm. The latter allows the vehicle to locate and drive within open space that is bounded by lanes or barriers. [60]
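As a toy illustration of the graph-based search mentioned above, the following sketch runs A* over a small occupancy grid; production planners additionally account for vehicle kinematics, lane structure, costs, and moving obstacles:

```python
import heapq

def plan_on_grid(grid, start, goal):
    """A* search on a 2D occupancy grid (0 = free cell, 1 = occupied cell).

    Returns a list of (row, col) cells from start to goal, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda cell: abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])
    open_set = [(heuristic(start), 0, start, None)]
    came_from, best_cost = {}, {start: 0}
    while open_set:
        _, cost, cell, parent = heapq.heappop(open_set)
        if cell in came_from:
            continue                       # already expanded via a cheaper route
        came_from[cell] = parent
        if cell == goal:                   # reconstruct the path back to start
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                new_cost = cost + 1
                if new_cost < best_cost.get(nxt, float("inf")):
                    best_cost[nxt] = new_cost
                    heapq.heappush(open_set, (new_cost + heuristic(nxt), new_cost, nxt, cell))
    return None

# Example: route around a single occupied cell in a 3x3 grid.
grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(plan_on_grid(grid, (0, 0), (2, 2)))
```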
Maps are necessary for navigation. Map sophistication varies from simple graphs that show which roads connect to each other, with details such as one-way vs two-way, to those that are highly detailed, with information about lanes, traffic controls, roadworks, and more. [51] Researchers at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) developed a system called MapLite, which allows self-driving cars to drive with simple maps. The system combines the GPS position of the vehicle and a "sparse topological map" such as OpenStreetMap (which has only 2D road features) with sensors that observe road conditions. [61] One issue with highly detailed maps is updating them as the world changes. Vehicles that can operate with less-detailed maps do not require frequent updates or geo-fencing.
Sensors are necessary for the vehicle to properly respond to the driving environment. Sensor types include cameras, LiDAR, ultrasound, and radar. Control systems typically combine data from multiple sensors. [62] Multiple sensors can provide a more complete view of the surroundings and can be used to cross-check each other to correct errors. [63] For example, radar can image a scene, such as a nighttime snowstorm, that defeats cameras and LiDAR, albeit at reduced precision. After experimenting with radar and ultrasound, Tesla adopted a vision-only approach, asserting that humans drive using only vision, and that cars should be able to do the same, while citing the lower cost of cameras versus other sensor types. [64] By contrast, Waymo makes use of the higher resolution of LiDAR sensors and cites the declining cost of that technology. [65]
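A minimal sketch of the cross-checking idea, assuming each sensor reports a single range estimate together with an uncertainty (all numbers below are hypothetical); real systems fuse full object tracks, typically with Kalman-style filters:

```python
def fuse_range_estimates(measurements):
    """Inverse-variance weighted fusion of range estimates from several sensors.

    `measurements` maps sensor name -> (range_m, variance_m2). Returns the fused
    range and the sensors that disagree strongly with the consensus.
    """
    weights = {name: 1.0 / var for name, (_, var) in measurements.items()}
    total = sum(weights.values())
    fused = sum(w * measurements[name][0] for name, w in weights.items()) / total
    # Cross-check: flag any sensor more than 3 standard deviations from the fused value.
    outliers = [name for name, (rng, var) in measurements.items()
                if abs(rng - fused) > 3 * var ** 0.5]
    return fused, outliers

# Example (hypothetical values): a camera degraded at night; radar and LiDAR agree.
print(fuse_range_estimates({"camera": (48.0, 4.0), "radar": (41.0, 1.0), "lidar": (40.5, 0.25)}))
```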
Drive by wire is the use of electrical or electro-mechanical systems for performing vehicle functions such as steering or speed control that are traditionally achieved by mechanical linkages.
Driver monitoring is used to assess the driver's attention and alertness. Techniques in use include eye monitoring, and requiring the driver to maintain torque on the steering wheel. [66] It attempts to understand driver status and identify dangerous driving behaviors. [67]
Vehicles can potentially benefit from communicating with one another to share information about traffic and road obstacles, to receive map and software updates, etc. [68] [69] [51]
ISO/TC 22 specifies in-vehicle transport information and control systems, [70] while ISO/TC 204 specifies information, communication and control systems in surface transport. [71] International standards have been developed for ADAS functions, connectivity, human interaction, in-vehicle systems, management/engineering, dynamic map and positioning, privacy and security. [72]
Rather than communicating with each other, vehicles can communicate with road-based systems to receive similar information.
Software controls the vehicle, and can provide entertainment and other services. Over-the-air updates can deliver bug fixes and additional features over the internet. Software updates are one way to accomplish recalls that in the past required a visit to a service center. In March 2021, the UNECE regulation on software update and software update management systems was published. [73]
A safety model is software that attempts to formalize rules that ensure that ACs operate safely. [74]
IEEE is attempting to forge a standard for safety models as "IEEE P2846: A Formal Model for Safety Considerations in Automated Vehicle Decision Making". [75] In 2022, a research group at the National Institute of Informatics (NII, Japan) enhanced Mobileye's Responsibility-Sensitive Safety (RSS) model as "Goal-Aware RSS" to enable RSS rules to deal with complex scenarios via program logic. [76]
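As an illustration of the kind of rule such a safety model formalizes, the following sketch implements the published RSS longitudinal safe-distance formula; the parameter values used here are illustrative assumptions, not standardized figures:

```python
def rss_min_longitudinal_gap(v_rear, v_front, rho=1.0,
                             a_accel_max=3.0, a_brake_min=4.0, a_brake_max=8.0):
    """Minimum safe following gap (metres) under the RSS longitudinal rule.

    v_rear / v_front are the rear and front vehicle speeds in m/s, rho is the rear
    vehicle's response time in seconds, and the accelerations (m/s^2) are assumed bounds.
    """
    v_rear_after = v_rear + rho * a_accel_max       # worst case: rear car accelerates during the response time
    gap = (v_rear * rho
           + 0.5 * a_accel_max * rho ** 2
           + v_rear_after ** 2 / (2 * a_brake_min)  # rear car then brakes only gently
           - v_front ** 2 / (2 * a_brake_max))      # while the front car brakes as hard as possible
    return max(gap, 0.0)

# Example: both cars travelling at about 100 km/h (27.8 m/s) needs roughly a 100 m gap.
print(round(rss_min_longitudinal_gap(27.8, 27.8), 1))
```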
The US has standardized the use of turquoise lights to inform other drivers that a vehicle is driving autonomously. They will be used in the 2026 Mercedes-Benz EQS and S-Class sedans with Drive Pilot, an SAE Level 3 driving system.[citation needed]
As of 2023, the turquoise light had not been standardized by the People's Republic of China or the UNECE. [77]
Artificial intelligence (AI) plays a pivotal role in the development and operation of autonomous vehicles (AVs), enabling them to perceive their surroundings, make decisions, and navigate safely without human intervention. AI algorithms empower AVs to interpret sensory data from various onboard sensors, such as cameras, LiDAR, radar, and GPS, to understand their environment and improve their capability and overall safety over time. [78]
The primary obstacle to ACs is the advanced software and mapping required to make them work safely across the wide variety of conditions that drivers experience. [79] In addition to handling day/night driving in good and bad weather [80] on roads of arbitrary quality, ACs must cope with other vehicles, road obstacles, poor/missing traffic controls, flawed maps, and handle endless edge cases, such as following the instructions of a police officer managing traffic at a crash site.
Other obstacles include cost, liability, [81] [82] consumer reluctance, [83] ethical dilemmas, [84] [85] security, [86] [87] [88] [89] privacy, [80] and legal/regulatory framework. [90] Further, AVs could automate the work of professional drivers, eliminating many jobs, which could slow acceptance. [91]
Tesla calls its Level 2 ADAS "Full Self-Driving (FSD) Beta". [92] US Senators Richard Blumenthal and Edward Markey called on the Federal Trade Commission (FTC) to investigate this marketing in 2021. [93] In December 2021 in Japan, Mercedes-Benz was punished by the Consumer Affairs Agency for misleading product descriptions. [94]
Mercedes-Benz was criticized for a misleading US commercial advertising E-Class models. [95] At the time, Mercedes-Benz rejected the claims and stopped its "self-driving car" ad campaign. [96] [97] In August 2022, the California Department of Motor Vehicles (DMV) accused Tesla of deceptive marketing practices. [98]
Under the Automated Vehicles Bill (AVB), self-driving carmakers could face prison for misleading adverts in the United Kingdom. [99]
In the 2020s, concerns over ACs' vulnerability to cyberattacks and data theft emerged. [100]
In 2018 and 2019, former Apple engineers were charged with stealing information related to Apple's self-driving car project. [101] [102] [103] In 2021 the United States Department of Justice (DOJ) accused Chinese security officials of coordinating a hacking campaign to steal information from government entities, including research related to autonomous vehicles. [104] [105] China has prepared the "Provisions on Management of Automotive Data Security (Trial)" to protect its own data. [106] [107]
Cellular Vehicle-to-Everything technologies are based on 5G wireless networks. [108] As of November 2022, the US Congress was considering the possibility that imported Chinese AC technology could facilitate espionage. [109]
Testing of Chinese automated cars in the US has raised concern over which US data are collected by Chinese vehicles and stored in China, and over any links with the Chinese Communist Party. [110]
ACs complicate the ways road users communicate with each other, e.g., to decide which car enters an intersection first. In an AC without a driver, traditional means such as hand signals do not work (no driver, no hands). [111]
ACs must be able to predict the behavior of possibly moving vehicles, pedestrians, etc., in real time in order to proceed safely. [48] The task becomes more challenging the further into the future the prediction extends, requiring rapid revisions to the estimate to cope with unpredicted behavior. One approach is to wholly recompute the position and trajectory of each object many times per second. Another is to cache the results of an earlier prediction for use in the next one to reduce computational complexity. [112] [113]
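A highly simplified sketch of the two approaches just described: a constant-velocity predictor that can be recomputed every cycle, and a check that reuses a cached prediction while new observations stay close to it (names and tolerances are illustrative assumptions):

```python
def predict_positions(position, velocity, horizon_s=3.0, dt=0.1):
    """Constant-velocity extrapolation of an object's future (x, y) positions.

    Real predictors use richer motion and interaction models; this is only a sketch.
    """
    x, y = position
    vx, vy = velocity
    steps = int(horizon_s / dt)
    return [(x + vx * dt * k, y + vy * dt * k) for k in range(1, steps + 1)]

def needs_recompute(cached_prediction, observed_position, tolerance_m=0.5):
    """Reuse the cached prediction while the object stays close to it; otherwise recompute."""
    ex, ey = cached_prediction[0]
    ox, oy = observed_position
    return ((ex - ox) ** 2 + (ey - oy) ** 2) ** 0.5 > tolerance_m

# Example: a pedestrian walking at 1.4 m/s along +x.
path = predict_positions((0.0, 0.0), (1.4, 0.0))
print(path[:3], needs_recompute(path, (0.1, 0.3)))
```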
The ADAS has to be able to safely accept control from and return control to the driver. [114]
Consumers will avoid ACs unless they trust them as safe. [115] [116] Robotaxis operating in San Francisco received pushback over perceived safety risks. [117] Automatic elevators were invented in 1900, but did not become common until operator strikes occurred and trust was built through advertising and features such as an emergency stop button. [118] [119] However, with repeated use of autonomous driving functions, drivers' behavior and trust in autonomous vehicles gradually improved and both entered a more stable state. At the same time, this also improved the performance and reliability of the vehicle in complex conditions, thereby increasing public trust. [120]
Autonomy also presents various political and economic implications. The transportation sector holds significant sway in many political and economic landscapes. For instance, many US states generate much annual revenue from transportation fees and taxes. [121] The advent of self-driving cars could profoundly affect the economy by potentially altering state tax revenue streams. Furthermore, the transition to autonomous vehicles might disrupt employment patterns and labor markets, particularly in industries heavily reliant on driving professions. [121] Data from the US Bureau of Labor Statistics indicates that in 2019, the sector employed over two million individuals as tractor-trailer truck drivers. [122] Additionally, taxi and delivery drivers represented approximately 370,400 positions, and bus drivers constituted a workforce of over 680,000. [123] [124] [125] Collectively, this amounts to a conceivable displacement of nearly 2.9 million jobs, surpassing the job losses experienced in the 2008 Great Recession. [126]
The prominence of certain demographic groups within the tech industry inevitably shapes the trajectory of autonomous vehicle (AV) development, potentially perpetuating existing inequalities. [127]
Research from Georgia Tech revealed that autonomous vehicle detection systems were generally five percent less effective at recognizing darker-skinned individuals. This accuracy gap persisted despite adjustments for environmental variables like lighting and visual obstructions. [128]
Standards for liability have yet to be adopted to address crashes and other incidents. Liability could rest with the vehicle occupant, its owner, the vehicle manufacturer, or even the ADAS technology supplier, possibly depending on the circumstances of the crash. [129] Additionally, the infusion of artificial intelligence technology in autonomous vehicles adds layers of complexity to ownership and ethical dynamics. Given that AI systems are inherently self-learning, a question arises of whether accountability should rest with the vehicle owner, the manufacturer, or the AI developer. [130]
The trolley problem is a thought experiment in ethics. Adapted for ACs, it considers an AC carrying one passenger confronting a pedestrian who steps in its way. The ADAS notionally has to choose between killing the pedestrian or swerving into a wall, killing the passenger. [131] Possible frameworks include deontology (formal rules) and utilitarianism (harm reduction). [48] [132] [133]
One public opinion survey reported that harm reduction was preferred, except that passengers wanted the vehicle to prefer them, while pedestrians took the opposite view. Utilitarian regulations were unpopular. [134] Additionally, cultural viewpoints exert substantial influence on shaping responses to these ethical quandaries. Another study found that cultural biases impact preferences in prioritizing the rescue of certain individuals over others in car accident scenarios. [130]
Some ACs require an internet connection to function, opening the possibility that a hacker might gain access to private information such as destinations, routes, camera recordings, media preferences, and/or behavioral patterns, although this is true of any internet-connected device. [135] [136] [137]
ACs make use of road infrastructure (e.g., traffic signs, turn lanes) and may require modifications to that infrastructure to fully achieve their safety and other goals. [138] In March 2023, the Japanese government unveiled a plan to set up a dedicated highway lane for ACs. [139] In April 2023, JR East announced plans to raise the self-driving level of its Kesennuma Line bus rapid transit (BRT) service in a rural area from the current Level 2 to Level 4 at 60 km/h (37 mph). [140]
ACs can be tested via digital simulations, [141] [142] in a controlled test environment, [143] and/or on public roads. Road testing typically requires some form of permit [144] or a commitment to adhere to acceptable operating principles. [145] For example, New York requires a test driver to be in the vehicle, prepared to override the ADAS as necessary. [146]
In California, self-driving car manufacturers are required to submit annual reports describing how often their vehicles disengaged from autonomous mode. [147] This is one measure of system robustness (ideally, the system should never disengage). [148]
In 2017, Waymo reported 63 disengagements over 352,545 mi (567,366 km) of testing, an average distance of 5,596 mi (9,006 km) between disengagements, the highest (best) among companies reporting such figures. Waymo also logged more autonomous miles than other companies. Their 2017 rate of 0.18 disengagements per 1,000 mi (1,600 km) was an improvement over the 0.2 disengagements per 1,000 mi (1,600 km) in 2016, and 0.8 in 2015. In March 2017, Uber reported an average of 0.67 mi (1.08 km) per disengagement. In the final three months of 2017, Cruise (owned by GM) averaged 5,224 mi (8,407 km) per disengagement over 62,689 mi (100,888 km). [149]
| Car maker | California 2016: distance between disengagements [149] | California 2016: total distance traveled | California 2018: distance between disengagements[citation needed] | California 2018: total distance traveled | California 2019: distance between disengagements [150] | California 2019: total distance traveled |
|---|---|---|---|---|---|---|
| Waymo | 5,128 mi (8,253 km) | 635,868 mi (1,023,330 km) | 11,154 mi (17,951 km) | 1,271,587 mi (2,046,421 km) | 11,017 mi (17,730 km) | 1,450,000 mi (2,330,000 km) |
| BMW | 638 mi (1,027 km) | 638 mi (1,027 km) | ||||
| Nissan | 263 mi (423 km) | 6,056 mi (9,746 km) | 210 mi (340 km) | 5,473 mi (8,808 km) | ||
| Ford | 197 mi (317 km) | 590 mi (950 km) | ||||
| General Motors | 55 mi (89 km) | 8,156 mi (13,126 km) | 5,205 mi (8,377 km) | 447,621 mi (720,376 km) | 12,221 mi (19,668 km) | 831,040 mi (1,337,430 km) |
| Aptiv | 15 mi (24 km) | 2,658 mi (4,278 km) | ||||
| Tesla | 3 mi (4.8 km) | 550 mi (890 km) | ||||
| Mercedes-Benz | 2 mi (3.2 km) | 673 mi (1,083 km) | 1.5 mi (2.4 km) | 1,749 mi (2,815 km) | ||
| Bosch | 7 mi (11 km) | 983 mi (1,582 km) | ||||
| Zoox | 1,923 mi (3,095 km) | 30,764 mi (49,510 km) | 1,595 mi (2,567 km) | 67,015 mi (107,850 km) | ||
| Nuro | 1,028 mi (1,654 km) | 24,680 mi (39,720 km) | 2,022 mi (3,254 km) | 68,762 mi (110,662 km) | ||
| Pony.ai | 1,022 mi (1,645 km) | 16,356 mi (26,322 km) | 6,476 mi (10,422 km) | 174,845 mi (281,386 km) | ||
| Baidu (Apolong) | 206 mi (332 km) | 18,093 mi (29,118 km) | 18,050 mi (29,050 km) | 108,300 mi (174,300 km) | ||
| Aurora | 100 mi (160 km) | 32,858 mi (52,880 km) | 280 mi (450 km) | 39,729 mi (63,938 km) | ||
| Apple | 1.1 mi (1.8 km) | 79,745 mi (128,337 km) | 118 mi (190 km) | 7,544 mi (12,141 km) | ||
| Uber | 0.4 mi (0.64 km) | 26,899 mi (43,290 km) | 0 mi (0 km) | |||
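The reported figures reduce to a simple rate calculation; for example, using Waymo's published 2017 California numbers cited above:

```python
def disengagement_stats(disengagements, miles):
    """Return (disengagements per 1,000 miles, average miles between disengagements)."""
    return disengagements / miles * 1000.0, miles / disengagements

# Waymo's reported 2017 California figures: 63 disengagements over 352,545 miles.
rate, miles_between = disengagement_stats(63, 352_545)
print(round(rate, 2), round(miles_between))   # ~0.18 per 1,000 mi, ~5,596 mi between
```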
Reporting companies use varying definitions of what qualifies as a disengagement, and such definitions can change over time. [151] [148] Executives of self-driving car companies have criticized disengagements as a deceptive metric, because it does not consider varying road conditions. [152]
In April 2021, WP.29 GRVA proposed a "New Assessment/Test Method for Automated Driving" (NATM). [153]
In October 2021, Europe's pilot test, L3Pilot, demonstrated ADAS for cars in Hamburg, Germany, in conjunction with ITS World Congress 2021. SAE Level 3 and 4 functions were tested on ordinary roads. [154] [155] [156]
In November 2022, an International Standard ISO 34502 on "Scenario based safety evaluation framework" was published. [157] [158]
In April 2022, collision avoidance testing was demonstrated by Nissan. [159] [160] Waymo published a document about collision avoidance testing in December 2022. [161]
In September 2022, Biprogy released Driving Intelligence Validation Platform (DIVP) as part of Japanese national project "SIP-adus", which is interoperable with Open Simulation Interface (OSI) of ASAM. [162] [163] [164]
In November 2022, Toyota demonstrated one of its GR Yaris test cars, which had been trained using professional rally drivers. [165] Toyota drew on its collaboration with Microsoft in the FIA World Rally Championship, which had run since the 2017 season. [166]
In 2023 David R. Large, senior research fellow with the Human Factors Research Group at the University of Nottingham, disguised himself as a car seat in a study to test people's reactions to driverless cars. He said, "We wanted to explore how pedestrians would interact with a driverless car and developed this unique methodology to explore their reactions." The study found that, in the absence of someone in the driving seat, pedestrians trust certain visual prompts more than others when deciding whether to cross the road. [167]
A meta-analysis published in Nature Communications in 2024 compared various sources of safety data for autonomous vehicles (AV) and human-driven vehicles (HDV), collecting 2,100 AV and 35,133 HDV incident records with detailed incident information. [168] Some AVs in the comparisons (such as robotaxis) were effectively autonomous, while others were equipped with Automated Driving Systems (ADS) or Advanced Driver Assistance Systems (ADAS).
The study concluded that AVs are safer in most circumstances, [169] and AVs had far fewer crashes involving pedestrians (3% against 15%) per mile travelled. [170] The comparison also highlighted some other disparities: while AVs were found to be significantly less likely to crash in heavy rain or fog than an unassisted human, they were more than five times more vulnerable to collisions at dawn and dusk. [170]
As of 2023, Tesla's ADAS Autopilot/Full Self Driving (beta) was classified as Level 2 ADAS. [171]
On 20 January 2016, the first of five known fatal crashes of a Tesla with Autopilot occurred, in China's Hubei province. [172] Initially, Tesla stated that the vehicle was so badly damaged from the impact that their recorder was not able to determine whether the car had been on Autopilot at the time. However, the car failed to take evasive action.
Another fatal Autopilot crash occurred in May in Florida in a Tesla Model S [173] [174] that crashed into a tractor-trailer. In a civil suit between the father of the driver killed and Tesla, Tesla documented that the car had been on Autopilot. [175] According to Tesla, "neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied." Tesla claimed that this was Tesla's first known Autopilot death in over 130 million miles (210 million kilometers) with Autopilot engaged. Tesla claimed that on average one fatality occurs every 94 million miles (151 million kilometers) across all vehicle types in the US. [176] [177] [178] However, this number also includes motorcycle/pedestrian fatalities. [179] [180] The ultimate National Transportation Safety Board (NTSB) report concluded Tesla was not at fault; the investigation revealed that for Tesla cars, the crash rate dropped by 40 percent after Autopilot was installed. [181]
In February 2025, a Tesla Cybertruck crashed while in Full Self-Driving mode, raising concerns about autonomous driving and prompting an investigation from Tesla, which said the crash would be probed "in line with standard protocol when any of our electric vehicles are involved in an accident while in FSD mode." [182] [183]
In June 2015, Google confirmed that 12 of its vehicles had suffered collisions as of that date. Eight involved rear-end collisions at a stop sign or traffic light, in two the vehicle was side-swiped by another driver, in one another driver rolled through a stop sign, and in one a driver was controlling the car manually. [184] In July 2015, three employees suffered minor injuries when their vehicle was rear-ended by a car whose driver failed to brake. This was the first collision that resulted in injuries. [185]
According to Google's accident reports as of early 2016, their test cars had been involved in 14 collisions, of which other drivers were at fault 13 times, although in 2016 the car's software caused a crash. [186] On 14 February 2016 a Google vehicle attempted to avoid sandbags blocking its path. During the maneuver it struck a bus. Google stated, "In this case, we clearly bear some responsibility, because if our car hadn't moved, there wouldn't have been a collision." [187] [188] Google characterized the crash as a misunderstanding and a learning experience. No injuries were reported. [186]
In March 2018, Elaine Herzberg died after she was hit by an AC tested by Uber's Advanced Technologies Group (ATG) in Arizona. A safety driver was in the car. Herzberg was crossing the road about 400 feet from an intersection. [189] Some experts said a human driver could have avoided the crash. [190] Arizona governor Doug Ducey suspended the company's ability to test its ACs citing an "unquestionable failure" of Uber to protect public safety. [191] Uber also stopped testing in California until receiving a new permit in 2020. [192] [193]
NTSB's final report determined that the immediate cause of the accident was that safety driver Rafaela Vasquez failed to monitor the road, because she was distracted by her phone, but that Uber's "inadequate safety culture" contributed. The report noted that the victim had "a very high level" of methamphetamine in her body. [194] The board called on federal regulators to carry out a review before allowing automated test vehicles to operate on public roads. [195] [196]
Vasquez was charged in September 2020; she later pled guilty to endangerment and was sentenced to three years' probation. [197] [198]
On 12 August 2021, a 31-year-old Chinese man was killed after his NIO ES8 crashed in a tunnel. [199] NIO's self-driving feature was in beta and could not deal with static obstacles. [200] The vehicle's manual clearly stated that the driver must take over near construction sites. Lawyers of the deceased's family questioned NIO's private access to the vehicle, which they argued did not guarantee the integrity of the data. [201]
In November 2021, the California Department of Motor Vehicles (DMV) notified Pony.ai that it was suspending its testing permit following a reported collision in Fremont on 28 October. [202] In May 2022, DMV revoked Pony.ai's permit for failing to monitor the driving records of its safety drivers. [203]
In April 2022, a Cruise test vehicle was reported to have blocked a fire engine on an emergency call, sparking questions about its ability to handle unexpected circumstances. [204] [205]
In February 2024, a driver using the Ford BlueCruise hands-free driving feature struck and killed the driver of a stationary car with no lights on in the middle lane of a freeway in Texas. [206]
In March 2024, a drunk driver who was speeding, holding her cell phone, and using BlueCruise on a Pennsylvania freeway struck and killed two people, the drivers of two stopped cars. [207] The first car had become disabled and was on the left shoulder with part of the car in the left driving lane. [207] The second driver had parked his car behind the first, presumably to help the first driver. [207]
The NTSB is investigating both incidents. [208]
The NHTSA began mandating incident reports from autonomous vehicle companies in June 2021. Some reports cite incidents from as early as August 2019, with current data available through June 17, 2024. [209]
There have been a total of 3,979 autonomous vehicle incidents (both ADS and ADAS) reported during this timeframe. 2,146 of those incidents (53.9%) involved Tesla vehicles. [210]
In a 2011 online survey of 2,006 US and UK consumers, 49% said they would be comfortable using a "driverless car". [211]
A 2012 survey of 17,400 vehicle owners found 37% who initially said they would be interested in purchasing a "fully autonomous car". However, that figure dropped to 20% if told the technology would cost US$3,000 more. [212]
In a 2012 survey of about 1,000 German drivers, 22% had a positive attitude, 10% were undecided, 44% were skeptical and 24% were hostile. [213]
A 2013 survey of 1,500 consumers across 10 countries found 57% "stated they would be likely to ride in a car controlled entirely by technology that does not require a human driver", with Brazil, India and China the most willing to trust automated technology. [214]
In a 2014 US telephone survey, over three-quarters of licensed drivers said they would consider buying a self-driving car, rising to 86% if car insurance were cheaper. 31.7% said they would not continue to drive once an automated car was available. [215]
In 2015, a survey of 5,000 people from 109 countries reported that average respondents found manual driving the most enjoyable. 22% did not want to pay more money for autonomy. Respondents were found to be most concerned about hacking/misuse, and were also concerned about legal issues and safety. Finally, respondents from more developed countries were less comfortable with their vehicle sharing data. [216] The survey reported consumer interest in purchasing an AC, stating that 37% of surveyed current owners were either "definitely" or "probably" interested. [216]
In 2016, a survey of 1,603 people in Germany that controlled for age, gender, and education reported that men felt less anxiety and more enthusiasm, whereas women showed the opposite. The difference was pronounced between young men and women and decreased with age. [217]
In a 2016 US survey of 1,584 people, "66 percent of respondents said they think autonomous cars are probably smarter than the average human driver". People were worried about safety and hacking risk. Nevertheless, only 13% of the interviewees saw no advantages in this new kind of car. [218]
A 2017 survey of 4,135 US adults found that many Americans anticipated significant impacts from various automation technologies, including the widespread adoption of automated vehicles. [219]
In 2019, results from two opinion surveys of 54 and 187 US adults respectively were published. The questionnaire was termed the autonomous vehicle acceptance model (AVAM), including additional description to help respondents better understand the implications of various automation levels. Users were less accepting of high autonomy levels and displayed significantly lower intention to use autonomous vehicles. Additionally, partial autonomy (regardless of level) was perceived as requiring uniformly higher driver engagement (usage of hands, feet and eyes) than full autonomy. [220]
In 2022, a survey reported that only a quarter (27%) of the world's population would feel safe in self-driving cars. [221]
In 2024, a study by Saravanos et al. [222] at New York University reported that 87% of their respondents (from a sample of 358) believed that conditionally automated cars (at Level 3) would be easy to use.
Regulation of self-driving cars refers to legislation passed in various jurisdictions around the world to regulate, standardise, test and monitor the use of autonomous vehicles and automated driving systems on public roads. Existing liability laws are also evolving to fairly identify the parties responsible for damage and injury, and to address the potential for conflicts of interest between human occupants, system operators, insurers, and the public purse. Autonomous vehicle regulations may also apply to robotaxis and self-driving trucks, depending upon local legislation.
In June 2011, the US state of Nevada became the first jurisdiction in the world to legislate for self-driving. Since then, many state, national and supranational bodies have introduced similar laws with varying characteristics. Regulations governing the testing of automated cars on public roads (as opposed to full consumer operations) include those introduced by the government of the United Kingdom in 2013, with similar laws coming into effect in France in 2015.
Modification of existing international agreements on the use of vehicles on public roads has also taken place. The 1949 Geneva Convention on Road Traffic and the 1968 Vienna Convention on Road Traffic had assumed that a driver is always fully in control of and responsible for the behaviour of a vehicle in traffic, but the Vienna Convention was amended in 2016 to allow the possibility of automated features in vehicles.
Specific international regulations also began to be formulated in 2018, with the Working Party on Automated/Autonomous and Connected Vehicles (GRVA) recommending safety provisions related to vehicle dynamics (braking, steering), Advanced Driver Assistance Systems, Automated Driving Systems (ADS) and cyber security for the World Forum for Harmonization of Vehicle Regulations (WP.29).
Individual jurisdictions around the world have also set up legal frameworks of various kinds. China introduced testing regulations for autonomous cars in 2018, and in 2020 issued the "Strategy for Innovation and Development of Intelligent Vehicles", a roadmap to 2025 covering road traffic safety and surveying and mapping laws related to intelligent vehicles. In Europe, Regulation 2019/2144, commonly known as the General Safety Regulation (GSR), came into force for all new cars in the European Union after 6 July 2022. Similar legislation followed in Japan, with the law modified to reflect the finalized UNECE WP.29 GRVA regulations.
SAE wording is generally adopted as the de facto standard for definitions in regulations throughout most jurisdictions. This gives descriptions for levels of autonomy ranging from fully manual (Level 0) to fully automated (Level 5). Terminology in ADS, Operational Design Domain (ODD) and Dynamic Driving Task (DDT) is also used.
While various models of car may be described by their manufacturers as being at a certain Level, the SAE technical specification means that a car can move between levels depending on the driving task and the circumstances it is operating in at a given time. [223] This means that, for example, when a manufacturer says they have a "Level 4 car," they mean the vehicle has one or more specific features (like automated valet parking) that are capable of operating at Level 4, but on the open highway they might operate at a lower Level.
As of 2023, most commercially available ADAS vehicles are SAE Level 2. A few companies have reached higher levels, but only in restricted (geofenced) locations. [224] Vehicles operating below Level 5 still offer many advantages. [225]
SAE Level 2 features are available as part of the ADAS systems in many vehicles. In the US, 50% of new cars provide driver assistance for both steering and speed. [226]
Ford started offering BlueCruise service on certain vehicles in 2022; the system is named ActiveGlide in Lincoln vehicles. The system provided features such as lane centering, street sign recognition, and hands-free highway driving on more than 130,000 miles of divided highways. The 2022 1.2 version added features including hands-free lane changing, in-lane repositioning, and predictive speed assist. [227] [228] In April 2023 BlueCruise was approved in the UK for use on certain motorways, starting with 2023 models of Ford's electric Mustang Mach-E SUV. [229]
Tesla's Autopilot and its Full Self-Driving (FSD) ADAS suites have been available on all Tesla cars since 2016. FSD offers highway and street driving (without geofencing), navigation/turn management, steering, dynamic cruise control, collision avoidance, lane-keeping/switching, emergency braking, and obstacle avoidance, but still requires the driver to remain ready to control the vehicle at any moment. Its driver management system combines eye tracking with monitoring pressure on the steering wheel to ensure that drivers are both eyes-on and hands-on. [230] [231]
Tesla's FSD rewrite V12 (released in March 2024) uses a single deep learning transformer model for all aspects of perception, monitoring, and control. [232] [233] It relies on its eight cameras for its vision-only perception system, without use of LiDAR, radar, or ultrasound. [233] As of January 2024, Tesla has not initiated requests for Level 3 status for its systems and has not disclosed its reason for not doing so. [231]
General Motors is developing the "Ultra Cruise" ADAS system, which the company says will be a substantial improvement over its current "Super Cruise" system. Ultra Cruise will cover "95 percent" of driving scenarios on 2 million miles of roads in the US, according to the company. The system hardware in and around the car includes multiple cameras, short- and long-range radar, and a LiDAR sensor, and will be powered by the Qualcomm Snapdragon Ride Platform. The luxury Cadillac Celestiq electric vehicle will be one of the first vehicles to feature Ultra Cruise. [234]
As of April 2024, two car manufacturers have sold or leased Level 3 cars: Honda in Japan, and Mercedes in Germany, Nevada and California. [235]
Mercedes Drive Pilot has been available on the EQS and S-Class sedans in Germany since 2022, and in California and Nevada since 2023. [16] A subscription costs between €5,000 and €7,000 for three years in Germany and $2,500 for one year in the United States. [236] Drive Pilot can only be used when the vehicle is traveling under 40 miles per hour (64 km/h), there is a vehicle in front, lane markings are readable, it is daytime with clear weather, and the car is on freeways mapped by Mercedes down to the centimeter (100,000 miles in California). [236] [16] As of April 2024, one Mercedes vehicle with this capability had been sold in California. [236]
Honda continued to enhance its Level 3 technology. [237] [238] As of 2023, 80 vehicles with Level 3 support had been sold. [239]
Mercedes-Benz received authorization in early 2023 to pilot its Level 3 software in Las Vegas. [240] California also authorized Drive Pilot in 2023. [241]
BMW commercialized its AC in 2021. [242] In 2023 BMW stated that its Level 3 technology was nearing release. It would be the second manufacturer to deliver Level 3 technology, but the only one whose Level 3 technology works in the dark. [243]
In 2023, in China, IM Motors, Mercedes, and BMW obtained authorization to test vehicles with Level 3 systems on motorways. [244] [245]
In September 2021, Stellantis presented its findings from its Level 3 pilot testing on Italian highways. Stellantis's Highway Chauffeur claimed Level 3 capabilities, as tested on the Maserati Ghibli and Fiat 500X prototypes. [246]
Polestar, a Volvo Cars brand, announced in January 2022 its plan to offer a Level 3 autonomous driving system in the Polestar 3 SUV, a Volvo XC90 successor, with technologies from Luminar Technologies, Nvidia, and Zenseact. [247]
In January 2022, Bosch and the Volkswagen Group subsidiary CARIAD announced a collaboration on automated driving up to Level 3, with the joint development targeting Level 4 capabilities. [248]
Hyundai Motor Company is enhancing the cybersecurity of connected cars to offer a Level 3 self-driving Genesis G90. [249] Korean carmakers Kia and Hyundai delayed their Level 3 plans and did not deliver Level 3 vehicles in 2023. [250]
In 2024, companies such as Waymo began offering robotaxi services in parts of the US with fully autonomous vehicles without safety drivers. [251] These services all operate at a loss as of 2025, with operating costs of about $4.5–$5.5/km ($7–$9 per mile), compared to $0.6/km ($1 per mile) for personal cars. The consultancy McKinsey estimated that bringing costs down to less than $1.2/km ($2 per mile) would take until 2035. [252]
In April 2023 in Japan, a Level 4 protocol became part of the amended Road Traffic Act. [253] The Level 4 "ZEN drive Pilot" system made by AIST operates there. [254]
In July 2020, Toyota started public demonstration rides on the TRI-P4, based on the fifth-generation Lexus LS, with Level 4 capability. [255] In August 2021, Toyota operated a potentially Level 4 service using the e-Palette around the Tokyo 2020 Olympic Village. [256]
In September 2020, Mercedes-Benz introduced the world's first commercial Level 4 Automated Valet Parking (AVP) system, named Intelligent Park Pilot, for its new S-Class. [257] [258] In November 2022, Germany's Federal Motor Transport Authority (KBA) approved the system for use at Stuttgart Airport. [259]
In September 2021, Cruise, General Motors, and Honda started a joint testing programme using the Cruise AV. [260] In 2023, the Cruise Origin was put on indefinite hold following Cruise's loss of its operating permit. [261]
In January 2023, Holon announced an autonomous shuttle during the 2023 Consumer Electronics Show (CES). The company claimed that the vehicle is the world's first Level 4 shuttle built to automotive standard. [262]
Media related to Autonomous automobiles at Wikimedia Commons