Increases in the use of autonomous car technologies (e.g., advanced driver-assistance systems) are causing incremental shifts in the responsibility of driving, with the primary motivation of reducing the frequency of traffic collisions. [1] Liability for incidents involving self-driving cars is a developing area of law and policy that will determine who is liable when a car causes physical damage to persons or property. [2] As autonomous cars shift the responsibility of driving from humans to autonomous car technology, there is a need for existing liability laws to evolve to reasonably identify the appropriate remedies for damage and injury. [3] As higher levels of autonomy are commercially introduced (SAE automation levels 3 and 4), the insurance industry stands to see higher proportions of commercial and product liability lines of business, while the personal automobile insurance line of business shrinks. [4]
Self-driving car and self-driving vehicle liability may also be affected by the regulation of self-driving vehicles being developed in some countries.
Self-driving car liability is a developing area of law and policy that will determine who is liable when an automated car causes physical damage to persons or property, or breaks road rules. [5] [2]
Similar considerations may also be raised with other automated vehicles and also with damages other than damage to persons.
When automated cars shift the control of driving from humans to automated car technology, the driver will need to consent to share operational responsibility, [6] which will require a legal framework.
There may be a need for existing liability laws to evolve in order to fairly identify the parties responsible for damage and injury, and to address the potential for conflicts of interest between human occupants, system operators, insurers, and the public purse. [3] Increases in the use of automated car technologies (e.g. advanced driver-assistance systems) may prompt incremental shifts in this responsibility for driving. Proponents claim the technology has the potential to reduce the frequency of road accidents, although this claim is difficult to assess in the absence of data from substantial actual use. [7] If safety improved dramatically, operators might seek to pass their liability for the remaining accidents onto others as part of their reward for the improvement. However, there is no obvious reason why they should escape liability if any such effects were found to be modest or nonexistent, since part of the purpose of such liability is to give the party in control an incentive to do whatever is necessary to avoid causing harm. Potential users may be reluctant to trust an operator that seeks to pass its normal liability on to others.
In any case, a well-advised person who is not controlling a car at all (Level 5) would understandably be reluctant to accept liability for something outside their control. And where some degree of shared control is possible (Level 3 or 4), a well-advised person would be concerned that the vehicle might try to hand back control in the last seconds before an accident, passing responsibility and liability back too, in circumstances where the potential driver has no better prospect of avoiding the crash than the vehicle: they have not necessarily been paying close attention, and a situation too hard for a very capable automated system may be too hard for a human. Since operators, especially those accustomed to disregarding existing legal obligations (under a motto like 'seek forgiveness, not permission'), such as Waymo or Uber, can normally be expected to try to avoid responsibility to the maximum degree possible, there is potential for attempts to let operators evade liability for accidents that occur while their systems are in control.
As higher levels of automation are commercially introduced (Level 3 and 4), the insurance industry may see a greater proportion of commercial and product liability lines while personal automobile insurance shrinks. [4]
When it comes to fully autonomous car liability, torts cannot be ignored. In any car accident the issue of negligence usually arises. In the case of autonomous cars, negligence would most likely fall on the manufacturer, because it would be hard to pin a breach of the duty of care on a user who is not in control of the vehicle. The only time negligence has been raised in an autonomous car lawsuit, the case ended in a settlement between the person struck by the autonomous vehicle and the manufacturer (General Motors). Second, product liability would most likely place liability on the manufacturer: for an accident to fall under product liability, there must be a defect, a failure to provide adequate warnings, or foreseeability on the part of the manufacturer. [8] Third is strict liability, which in this context resembles product liability based on a design defect. Based on a Nevada Supreme Court ruling (Ford v. Trejo), the plaintiff needs to prove that the manufacturer failed the consumer expectation test. [9] That is potentially how the three major torts could function when it comes to autonomous car liability.
Existing tort liability for drivers and insurers, and product liability for manufacturers, provide the current basis for governing crashes.
There are three basic theories of tort liability: traditional negligence, no-fault liability and strict liability. [3]
Theory | Description |
---|---|
Traditional negligence | Driver is held liable for harms caused when reasonable care was not taken while in operation of the vehicle |
No-fault | Crash victims are not permitted to sue the driver of the vehicle, unless the injuries resulting from the crash are of a certain severity. Victims are compensated through their own insurance |
Strict liability | Applies for abnormally dangerous or “ultrahazardous” activities. The actors involved consequently bear the associated costs regardless of whether they are legally at fault |
According to a National Motor Vehicle Crash Causation Survey, in over 90% of crashes (an estimated 2 million crashes in the USA) the driver was the critical reason for the crash. [10] Meanwhile, research from the Insurance Institute for Highway Safety (IIHS) shows that advanced driver-assistance systems, seen as stepping stones to Level 3 and 4 autonomy, have helped reduce collisions through forward-collision warnings and automatic braking. [11] Given these trends, the increased use of autonomous vehicle technology could reduce the number of collisions and prevent crash-related deaths. [12] Consequently, cases of traditional negligence will likely fall, which will in turn reduce automobile-insurance costs. [3]
With the onset of fully autonomous cars, the need for specialized automobile insurance may disappear, with health insurance and homeowner's liability insurance instead covering automobile crashes, much as they cover bicycle collisions. [3] Moreover, as cases of traditional negligence decrease, no-fault insurance systems appear attractive given their benefits. [3] They would compensate victims relatively quickly, and compensation would not depend on identifying an at-fault party. Such systems would protect individual drivers well and could encourage the adoption of autonomous cars for their safety and cost-related benefits.
Negligence was the basis for the lawsuit Nilsson v. General Motors. Nilsson was knocked off his motorcycle when a Chevy Bolt switched into his lane while it was in self-driving mode. Nilsson sued for negligence on the grounds that the self-driving car (1) had a duty to follow traffic laws and regulations, (2) breached that duty by switching lanes while he was passing, and (3) injured his neck. [13] The case was settled before it went to court, so the answer to the question "was the error of the self-driving car foreseeable?" remains unclear.
Product liability governs the liability of manufacturers in terms of negligence and strict liability. [3]
Theory | Description |
---|---|
Negligence | Manufacturers must exercise reasonable care in designing their products to be safe under potential use cases |
Strict liability | Manufacturer is held strictly liable for damages even when the manufacturer has exercised all possible care to remove defects |
Autonomous car manufacturers are incentivized by potential product liability lawsuits to reduce the danger of their products as much as they can within a reasonable cost structure. Strict liability covers an expansive range of potential harms that manufacturers may find challenging to protect against; instead of reducing less cost-effective risks, manufacturers may, to some degree, pass the potential costs of liability on to consumers through higher prices.
Furthermore, product liability cases distinguish among various types of defects.
Defect type | Description |
---|---|
Manufacturing defects | When the product does not meet the manufacturer's specifications and standards |
Design defects | When foreseeable risks of harm could have been reduced by use of an alternative design |
Failure to warn | When the manufacturer fails in its duty to provide instruction about how the product can be safely used and does not provide adequate warning of its risks |
Under manufacturing defects, a plaintiff needs to show that the autonomous car failed to work as specified by the manufacturer. For autonomous cars, however, this presents a significant hurdle, because no court has applied manufacturing-defect doctrine to software, which is not something tangible that is manufactured. [14] Incorrect behavior of the technology system is called a “malfunction”, meaning a coding error within the system caused the collision. When there is a coding error, the controlling software may not have functioned as its authors originally intended. [15] If a crash stems from a software error, traditional product liability law on manufacturing defects may not suffice. How software will be treated under this liability law, particularly when a software error causes physical parts to malfunction, needs further exploration.
Historically, courts have used two tests for the defectiveness of design: consumer-expectations and cost-benefit.
Consumer-expectations: "A product is defective in design or formulation when it is more dangerous than an ordinary consumer would expect when used in an intended or reasonably foreseeable manner. Moreover, the question of what an ordinary consumer expects in terms of the risks posed by the product is generally one for the trier of fact." [16]
On the other hand, the cost-benefit test weighs the benefits against the costs of a product in determining whether a design is defective. With autonomous cars, the plaintiff could make the argument that a different design, whether in the physical features of the vehicle or in the software that controls the movements of the vehicle, could have made the vehicle safer. [17] For plaintiffs, this creates a high burden of proof and also makes it challenging to find qualified experts. [14]
In asking "who do I sue," a plaintiff in a traditional car crash would assign blame to the driver or the car manufacturer, depending on the cause of the crash. In a crash involving an autonomous car, a plaintiff may have four options to pursue. [14]
In defense against such liabilities, autonomous vehicle manufacturers could raise the arguments of comparative negligence, product misuse, and state of the art. [3] Under comparative negligence, driver or passenger interference is treated as part of the cause of the harm and injury. Under product misuse, the driver or passenger may be at fault for disregarding directions or altering the vehicle in a way that affects its proper performance. Under state of the art, manufacturers could argue that no safer alternative design existed at the time of manufacture. [3]
As cars become more interconnected and autonomous, the potential for hacking a car system to acquire data and cause harm poses a serious risk. For manufacturers and developers of autonomous technology, liability exposures arise from the collection and storage of data and personal information in the vehicle and the cloud. [20] Currently, manufacturers require indemnification from vendors and subcontractors (dealerships, repair/installation facilities, etc.), and this practice will likely be extended to autonomous technology developers.
Transportation systems are vital to autonomous vehicles because they serve as the coordinating authority, and as multiple autonomous vehicle systems are networked to increase efficiency, exposure to malicious attacks will increase dramatically. To protect these systems, cyber-physical systems must be implemented with autonomous dynamical subsystems that govern decision-making, interaction, and control. [21]
In 2018, the United Kingdom passed the Automated and Electric Vehicles Act 2018, which defines some cases of automated vehicle liability.
(1) Where—
- (a) an accident is caused by an automated vehicle when driving itself on a road or other public place in Great Britain,
- (b) the vehicle is insured at the time of the accident, and
- (c) an insured person or any other person suffers damage as a result of the accident,
- the insurer is liable for that damage.
(2) Where—
- (a) an accident is caused by an automated vehicle when driving itself on a road or other public place in Great Britain,
- (b) the vehicle is not insured at the time of the accident,
- (c) section 143 of the Road Traffic Act 1988 (users of motor vehicles to be insured or secured against third-party risks) does not apply to the vehicle at that time—
- (i) because of section 144(2) of that Act (exemption for public bodies etc), or
- (ii) because the vehicle is in the public service of the Crown, and
- (d) a person suffers damage as a result of the accident,
- the owner of the vehicle is liable for that damage.
— Automated and Electric Vehicles Act 2018 [22]
For instance:
An insurance policy in respect of an automated vehicle may exclude or limit the insurer's liability under section 2(1) for damage suffered by an insured person arising from an accident occurring as a direct result of—
- (a) software alterations made by the insured person, or with the insured person's knowledge, that are prohibited under the policy, or
- (b) a failure to install safety-critical software updates that the insured person knows, or ought reasonably to know, are safety-critical.
— Automated and Electric Vehicles Act 2018 [22]
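Read together, sections 2(1), 2(2), and the section 4 exclusion allocate liability along a simple decision path. The sketch below encodes one illustrative reading of those provisions; the function and parameter names are invented for the example, and it is of course not legal advice:

```python
def aev_act_liability(self_driving_accident: bool,
                      insured: bool,
                      s143_exempt: bool,
                      prohibited_software_alteration: bool = False) -> str:
    """Illustrative reading of the Automated and Electric Vehicles Act 2018,
    sections 2(1), 2(2) and 4. Names are invented; not legal advice."""
    if not self_driving_accident:
        # The Act's automated-vehicle provisions do not apply;
        # ordinary motor liability rules govern.
        return "outside the Act"
    if insured:
        if prohibited_software_alteration:
            # Section 4: the policy may exclude the insurer's liability
            # to the insured person for prohibited software alterations.
            return "insurer liability may be excluded"
        return "insurer liable"  # section 2(1)
    if s143_exempt:
        # Section 2(2): Crown or public-body vehicles exempt from the
        # Road Traffic Act 1988 insurance requirement.
        return "owner liable"
    # Uninsured and not exempt: driving uninsured is itself an offence
    # under section 143 of the Road Traffic Act 1988.
    return "uninsured (section 143 offence)"


# A privately insured automated vehicle crashes while driving itself:
print(aev_act_liability(True, True, False))  # insurer liable
```

The key structural point the Act makes, which the sketch preserves, is that the first-instance payer is always an insurer or owner rather than the manufacturer; the insurer may then pursue the manufacturer separately.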
The UK could also have a regulator: when a vehicle operates with no user in charge (NUIC), the police would contact the regulator, which could sanction the automated driving system entity, up to withdrawal of its authorization. [23]
On 14 April 2021, France adopted a text defining the rules for level 3 (véhicule à délégation de conduite, a vehicle with delegated driving) and level 5 (automated road freight transport). The text is titled: ordonnance n° 2021-443 du 14 avril 2021 relative au régime de responsabilité pénale applicable en cas de circulation d'un véhicule à délégation de conduite et à ses conditions d'utilisation. [24]
On 1 July 2021, France became the first European country to update its code de la route (highway code) for automated cars. [25] The update clarifies the roles and responsibilities of the driver and the car, and plans application after the Vienna Convention update and before September 2022. [26]
When Mercedes launched its Drive Pilot in Germany in mid-2021, it was expected that Daimler would have to assume insurance liability, depending on the jurisdiction. [27]
Level 3 means the driver can legally take their eyes off the wheel and the company, Daimler in this case, would have to assume insurance liability, depending on the jurisdiction.
— The Hindu [27]
A German law proposed in 2021 would require a "self-driving vehicle" to have an operating permit in order to be used as an automated vehicle. [28]
As argued in the article “The Coming Collision Between Autonomous Vehicles and the Liability System” by Gary Marchant and Rachel Lindor, a manufacturer cannot anticipate all possible scenarios that an autonomous car will encounter. [29] While the manufacturer will design the system to minimize risks of situations that it does anticipate, the collisions that are most damaging and costly will be those that the manufacturer fails to anticipate. This leaves the manufacturer highly vulnerable to design defects, in particular the cost-benefit test.
In light of this, Marchant and Lindor argue that “the technology is potentially doomed...because the liability burden on the manufacturer may be prohibitive of further development. Thus, even though an autonomous vehicle may be safer overall than a conventional vehicle, it will shift the responsibility for collisions, and hence liability, from drivers to manufacturers. The shift will push the manufacturer away from the socially-optimal outcome—to develop the autonomous vehicle.” [29]
Consequently, policymakers need to be mindful of manufacturers bearing excessive liability costs and of the potential consequences, such as higher consumer prices and delays in introducing autonomous car technology. In the report “Autonomous Vehicle Technology” by the RAND Corporation, the authors recommend that policymakers consider approaches such as tort preemption, a federal insurance backstop, and long-term cost-benefit analysis of the legal standard for reasonableness. [3] These approaches attempt to align the private and public costs of autonomous car technology so that adoption is not unnecessarily delayed and no one party bears a disproportionate share of the costs.
In September 2016, the National Highway Traffic Safety Administration released a policy report to accelerate the adoption of autonomous car technology (or HAVs, highly automated vehicles) and provide guidelines for an initial regulatory framework. The key points are: [30]
On September 6, 2017, the House of Representatives unanimously passed H.R. 3388, the SELF DRIVE Act of 2017. [31] [32]
The Federal Government, with the passing of the SELF DRIVE Act, is limiting the role of States, and this could signal a change in the future of liability laws. With the Federal Government also asserting that consumers will be protected, manufacturers may be at a liability disadvantage and stand to lose surplus. Updating the Federal Motor Vehicle Safety Standards will affect liability law. These laws will continue to protect the consumer while placing stricter standards on producers. The Federal Government has yet to announce any specific autonomous vehicular manslaughter liability laws. [33] [34]
More broadly, any software with access to the real world, including autonomous vehicles and robots, can cause property damage, injury, and death. This raises questions about civil liability or criminal responsibility.
In 2018, University of Brighton researcher John Kingston analyzed three legal theories of criminal liability that could apply to an entity controlled by artificial intelligence. [35] [36]
Possible defenses include unexpected malfunction or infection with malware, which has been successfully used in the United Kingdom in the case of a denial-of-service attack. [35]
Kingston identifies two areas of law, depending on the type of entity: [35]
The NHTSA investigation of a fatal 2016 crash involving Tesla Autopilot proceeded as an automobile product safety inquiry and determined that despite the crash, there were no defects that required a recall (though Tesla is working to improve the software to avoid similar crashes). Autopilot only gives cars limited autonomy, and human drivers are expected to maintain situational awareness and take over as needed. [37]
With fully autonomous vehicles, the software and vehicle manufacturers are expected to be liable for any at-fault collisions (under existing automobile products liability laws), rather than the human occupants, the owner, or the owner's insurance company. [38] Volvo has already announced that it will pay for any injuries or damage caused by its fully autonomous car, which it expects to start selling in 2020. [38] Starting in 2012, some U.S. states have passed laws or regulations specifically addressing autonomous car testing, certification, and sales, with some issuing special driver's licenses; this remains an active area of lawmaking. [39] Human occupants would still be liable for actions they directed, such as choosing where to park (and thus for parking tickets).
University of South Carolina law professor Bryant Walker Smith points out that with automated systems, considerably more data will typically be available than with human-driver crashes, allowing more reliable and detailed assessment of liability. He also predicted that comparisons between how an automated system responds and how a human would have or should have responded would be used to help determine fault. [40]
According to the NHTSA, states retain their responsibility for motor vehicle insurance and liability regimes, among other traditional responsibilities such as vehicle licensing and registration and traffic laws and enforcement. [18] Several states, such as Michigan and Nevada and Washington D.C., have explicitly written provisions for how liability will be treated.
State | Bill Number | Relevant Provisions | Effective Date |
---|---|---|---|
Michigan | SB 663 (2013) | Limits liability of vehicle manufacturer or upfitter for damages in a product liability suit resulting from modifications made by a third party to an automated vehicle or automated vehicle technology under certain circumstances; relates to automated mode conversions | Enacted and chaptered on Dec. 26, 2013 |
Nevada | SB 313 (2013) | Provides that the manufacturer of a vehicle that has been converted to be an autonomous vehicle by a third party is immune from liability for certain injuries | Enacted and chaptered on June 2, 2013 |
Washington D.C. | 2012 DC B 19-0931 | Restricts conversion to recent vehicles, and addresses liability of the original manufacturer of a converted vehicle | Enacted and effective from April 23, 2013. |
Arizona's Republican Gov. Doug Ducey's new rules, implemented March 1, lay out a specific list of licensing and registration requirements for autonomous car operators. Specifically, Ducey's order specifies that a “person” subject to the laws includes any corporation incorporated in Arizona. [41]
In a white paper titled “Marketplace of Change: Automobile Insurance in the Era of Autonomous Vehicles,” KPMG estimated that in 2013 personal auto accounted for 87% of loss insurance, while commercial auto accounted for 13%. [4] By 2040, personal auto is projected to fall to 58%, commercial auto to rise to 28%, and product liability to gain 14%. This reflects the view that personal liability will fall as the responsibility for driving shifts to the vehicle and that mobility on demand will take greater hold. In addition, the overall pool of losses covered by liability policies is expected to shrink as autonomous cars cause fewer collisions. [4]
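As a back-of-the-envelope check, the projected shift in shares can be tabulated directly. In the sketch below, only the percentages come from the KPMG white paper; the dictionary layout is ours:

```python
# Shares of auto insurance losses per KPMG's white paper
# (2013 actual, 2040 projected); each year's shares sum to 100%.
shares_2013 = {"personal auto": 0.87, "commercial auto": 0.13, "product liability": 0.00}
shares_2040 = {"personal auto": 0.58, "commercial auto": 0.28, "product liability": 0.14}

for line in shares_2013:
    delta = shares_2040[line] - shares_2013[line]
    print(f"{line}: {shares_2013[line]:.0%} -> {shares_2040[line]:.0%} ({delta:+.0%})")
```

The comparison makes the paper's point visible at a glance: personal auto drops by 29 percentage points, roughly matching the combined rise of commercial auto and product liability.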
Although KPMG cautions that this elimination of excess capacity will bring about significant changes to the insurance industry, 32% of insurance firm leaders expect that driverless vehicles will have no material effect on the insurance industry over the next 10 years. [4] Inaction by the large players has opened up opportunities for new entrants. For example, Metromile, an insurance provider start-up founded in 2011, has started to offer usage-based insurance for low-mileage drivers and designed a policy to complement the commercial coverage of Uber drivers. [42]
In 2015, Volvo issued a press release stating that Volvo would accept full liability whenever its cars are in autonomous mode. [19] President and chief executive of Volvo Cars Håkan Samuelsson went further, urging "regulators to work closely with car makers to solve controversial outstanding issues such as questions over legal liability in the event that a self-driving car is involved in a crash or hacked by a criminal third party." [19]
In an IEEE article, the senior technical leader for safety and driver support technologies at Volvo echoed a similar sentiment saying, “if we made a mistake in designing the brakes or writing the software, it is not reasonable to put the liability on the customer...we say to the customer, you can spend time on something else, we take responsibility.” [43]
Starting in September 2023 in the United States, Mercedes-Benz takes liability for the Level 3 Drive Pilot as long as the "user operates Drive Pilot as designed," [44] but "the driver must be ready to take control of the vehicle at all times when prompted to intervene by the vehicle." [45]
In August 2023 a General Motors Cruise self-driving car drove into wet concrete in a road construction zone in San Francisco, California, and got stuck. The company will pay to repave the road. [46]