13 Expert Reasons Why Autonomous Cars Are Dangerous


David Kidd, a Senior Research Scientist for the Insurance Institute for Highway Safety (IIHS), recently wrote a report on the pitfalls of autonomous vehicles. The report details how autonomous vehicles are a safety risk because of a lack of federal regulation.

Kidd writes that as our vehicles become increasingly futuristic due to modern technology granting them unprecedented capabilities, it’s essential that people are aware of this technology’s limitations and how they can properly and safely use it. He also highlights the inherent safety risks of not properly regulating them.

A Brave New World for Drivers

Image Credit: Mliu92, CC BY-SA 3.0, via Wikimedia Commons.

Cruise autonomous vehicles are roaming the streets of Phoenix, Arizona. Meanwhile, new cars that populate dealership lots sport driver-assist features that can control a car’s speed, lane position, or even change lanes while the driver’s hands are not on the wheel.

Cruise, as mentioned above, operates fully autonomous vehicles, meaning there is no driver behind the wheel. People who hail a ride from a Cruise can select their destination and lock the car doors with an app. These are not our grandparents’ cars, or even the cars of our parents’ generation. As such, the requirements for road safety need to change.

Is This Technology Safe?

Image Credit: National Transportation Safety Board, Public domain, via Wikimedia Commons.

Kidd states that regulators have yet to determine whether this technology is safe for road users. The partial driving automation systems increasingly available in production cars, and the driverless vehicles that automakers are deploying onto our streets, present safety challenges that federal regulators still need to face.

According to Kidd, no “common sense guardrails” are currently in place to regulate these innovations. Stronger crash-reporting requirements for accidents involving autonomous vehicle systems need to be implemented; their absence prevents researchers from determining how safe these systems are in real-world settings.

What’s Currently Being Done To Address This Issue

Image Credit: Shutterstock.

Kidd states that several United States senators recently confronted the National Highway Traffic Safety Administration (NHTSA) about what he refers to as their “hands-off approach” to autonomous vehicle regulations.

These senators, Edward Markey, Richard Blumenthal, Peter Welch, Elizabeth Warren, Ben Ray Lujan, and Bernie Sanders, wrote a letter to the federal agency citing automakers’ misleading marketing practices, which fuel inappropriate consumer use of these systems and for which Tesla is currently the subject of a federal probe. Addressing this issue is vital to avoid putting numerous road users at risk.

New Safety Standards Are Needed

Image Credit: Shutterstock.

In their letter to the NHTSA, the senators pointed out that these “self-driving cars” are not undergoing sufficient testing. Automakers freely test them on public roads without submitting to additional performance or safety requirements.

Their abilities and functions vastly differ from those of conventional cars, but the NHTSA holds them to the same safety standards. The senators are pushing the federal agency to address this issue by requiring more comprehensive reporting on the role of automation in vehicle crashes.

The Dangers of Vehicle Automation

Image Credit: Shutterstock.

While Kidd notes that vehicle automation can significantly reduce human mistakes behind the wheel, he also says that experts have yet to find consistent evidence that current automated driving systems are making our roads safer.

Unfortunately, mounting evidence suggests that the current automation systems in modern vehicles are introducing new and often “foreseeable” safety risks, according to Kidd. He says that because the NHTSA does not currently require comprehensive data reporting for crashes involving vehicle automation systems, it is more difficult for experts to assess and ultimately reduce the safety risks of these systems.

Increases in Distracted Driving

Image Credit: Shutterstock.

In their letter to the NHTSA, the senators state that significant safety risks arise from drivers misusing partial driving automation systems. Sometimes, drivers even use these systems on roads under circumstances that automakers did not design them for.

The senators add that these autonomous driving systems potentially encourage drivers to be inattentive behind the wheel, leading to higher rates of distracted driving. They stress that this can occur even when drivers use this technology in the “appropriate locations.”

Why Are Drivers Misusing These Systems?

Image Credit: Shutterstock.

Kidd states that in 2016, the IIHS warned the NHTSA that drivers were at significant risk of misusing vehicle automation systems as the agency was developing its Federal Automated Vehicles Policy. The IIHS predicted that the public would be “confused” about what partially automated driving systems are capable of, leading to overreliance and misuse.

At that time, the IIHS suggested to the NHTSA that federal regulations should require automakers to develop methods for decreasing consumer overreliance and misuse of these systems. For example, the IIHS recommended constraining these driving automation systems so drivers could only use them under the specific conditions and roads that automakers engineered them for.

Misleading Advertising

Image Credit: Shutterstock.

Kidd notes that the National Transportation Safety Board made the same recommendation to the NHTSA after it investigated the involvement of Tesla’s Autopilot feature in numerous fatal crashes.

Kidd points out that in their letter, the senators cite how advertising exaggerates these systems’ capabilities, increasing drivers’ likelihood of misusing them. He also mentions that the IIHS has long been aware of this risk. A 2019 survey his organization conducted highlighted the link between consumers’ misunderstanding of autonomous driving system capabilities and names like “Autopilot.”

A Lack of Comprehensive Reporting on “Highly Automated Fleets”

Image Credit: Alexander Migl, CC BY-SA 4.0, via Wikimedia Commons.

According to Kidd, most of the coverage of highly autonomous vehicle fleets comes from social media videos and news reports showing people stymieing “robotaxis” with traffic cones, or the vehicles getting stuck at intersections or in construction zones.

Kidd notes that this is how we are becoming aware of safety failures in these highly autonomous vehicles. Many of these accidents involving automated driving systems would be avoidable with the proper regulations.

A Lack of Transparency

Image Credit: Shutterstock.

He also states that autonomous vehicle companies must be more transparent with the public and regulators. For example, Kidd cites how Cruise withheld information from both California officials and the NHTSA after one of its autonomous vehicles dragged a person through the street following a hit-and-run accident.

Cruise also continued operating its autonomous vehicles even though it knew its systems failed to detect children on the road. Kidd stresses that the IIHS evaluates new passenger vehicles for their ability to automatically apply the brakes to avoid hitting children and adults. He says that although many new production vehicles have this ability, it is “unimaginable” that autonomous vehicle companies have not demonstrated mastery of this “basic crash avoidance function.”

Kidd’s Solution

Image Credit: Insurance Institute for Highway Safety, CC BY-SA 3.0, via Wikimedia Commons.

Kidd endorses the senators’ push for the NHTSA to require more comprehensive reporting on autonomous driving systems involved in crashes. He stresses that the information safety experts gain from this data will allow them to properly understand the full spectrum of safety risks this new technology poses.

Additionally, Kidd states that the NHTSA must guide the creation of new regulations for autonomous vehicle technology. He testified to the federal agency in 2017 about why developing and maintaining a database for cars with partially automated driving systems is vital for road safety. In doing so, the NHTSA could link this database to other crash and insurance claim databases, which could further aid them in determining these vehicles’ real-world safety risks.

According to Kidd, the IIHS has successfully used this approach to document and determine the safety risks and benefits of driver assistance technologies, such as automatic emergency braking.

What the NHTSA Has Done

Image Credit: an NTSB employee, Public domain, via Wikimedia Commons.

Kidd states that in 2021, the NHTSA began requiring car companies to report crashes involving property damage or injuries if a partially automated driving system or an automated driving system was in use within 30 seconds of the accident.

The order also required companies to report whether a vehicle’s airbag deployed during an accident or whether a tow truck was needed to haul the car away. So, the federal agency has taken some action to obtain information about the safety risks of autonomous driving systems.

Dubious Records

Image Credit: Shutterstock.

However, Kidd states that the data this regulation has produced is flawed. According to him, the records are often duplicated, unverified, or incomplete, with vague, “inconsistent descriptions” of pertinent details such as a crash’s severity and the level of damage it caused.

Disturbingly, he also states that companies can have these records redacted because they allegedly contain “confidential business information.” Crash narratives were redacted in over 80 percent of reports involving partially automated driving systems and around 25 percent of those involving highly automated systems.

More Needs To Be Done

Image Credit: Ford Motor Company.

Kidd criticizes the NHTSA’s reporting requirements as “weak.” He believes the federal agency could do more to help researchers better determine these systems’ safety and performance capabilities. He also notes that the NHTSA has not issued any regulations guaranteeing that autonomous driving systems are safe.

Kidd quotes the senators’ letter, which states that the agency has only come up with “after-the-fact responses” to autonomous driving system issues. The NHTSA has primarily investigated defects and recalled cars that already have safety issues, such as those with Ford’s BlueCruise and Tesla’s Autopilot features, after they were in use during multiple fatal crashes.

However, the agency needs to be more proactive in designing and implementing measures to prevent future misuse of, and avoidable crashes involving, autonomous vehicle systems.

What the IIHS Is Doing

Image Credit: Insurance Institute for Highway Safety, CC0, via Wikimedia Commons.

Kidd states that the IIHS has developed a rating program to encourage automakers to build more safeguards into their autonomous driving systems to prevent drivers from misusing this technology. However, the rating program still needs to address many potential safety issues regarding vehicle automation.

Unlike NHTSA regulations, which federal law compels automakers to follow, car companies can choose not to follow IIHS recommendations. It is worth noting that car companies will often use IIHS ratings to improve the safety of their vehicles. However, Kidd states that regulating autonomous driving system safety falls to the NHTSA. It’s the agency’s job to save lives by preventing crashes and developing the necessary regulations for autonomous vehicles.
