Risks of Driverless Cars – Who Is Responsible?


America is moving toward making driverless vehicles the norm, and nearly every major automaker now offers some level of autonomy. With the autonomous vehicle market projected to reach $556.67 billion by 2026, it’s clear driverless automobiles are here to stay.

Today, more people are demanding self-driving vehicles; however, the risks of driverless cars remain a serious concern. The National Highway Traffic Safety Administration (NHTSA) classifies driver-assist technology into six levels of autonomy. Yet the lack of industry rules and legislation blurs how these systems are marketed.

At the same time, technology is evolving to limit the risks of driverless cars. For example, Level 5 AVs will travel without human drivers, relying on sophisticated lidar and radar sensors along with AI software. But how does this work in practice?


Is a Human-driven Car Safer?

Contrary to popular belief, self-driving cars are involved in more accidents than human-driven automobiles, though the injuries in driverless-car crashes are usually less serious. There are 9.1 self-driving accidents per million miles driven, compared with only 4.1 for regular vehicles.

Taking that into account, let’s examine some of the risks of driverless cars:


1. Misplaced Confidence in Driverless Vehicles

When these automobiles are billed as “driverless,” is it any surprise that human drivers act more like passive passengers? The term is deceptive because none of these automobiles are fully self-driving. In fact, driver distraction appears to be the cause of the vast majority of self-driving car accidents.

As in non-autonomous vehicles, drivers must stay vigilant and be ready to take control at any time.


2. Fire Danger

On Saturday, April 17, 2021, a Tesla crashed in Texas, killing both occupants; the wreck burned for four hours. The National Transportation Safety Board is investigating the accident, and authorities say no one was driving at the time of the crash.

Lithium-ion (Li-ion) batteries are notoriously flammable. When lithium burns, it forms a metal fire that can reach 3,632°F (2,000°C).

Fighting such a fire with water may trigger a hydrogen gas explosion. Furthermore, if the battery is damaged, “thermal runaway” might occur, generating poisonous gases, explosions, flames, and projectiles that put emergency personnel in grave danger.


3. Unreliable Technology

A recent study found that active driving assistance systems experienced an issue roughly every eight miles in real-world driving. Researchers also discovered that advanced driver-assistance systems (ADAS) often disengage without warning, requiring the driver to retake control promptly. One can understand how this scenario might go wrong if the driver is distracted or overly reliant on the technology.

In 2016, a Tesla traveling at high speed on a Florida highway collided with an 18-wheeler crossing its path. The Tesla driver was killed.

The investigation determined the car’s Autopilot failed to brake because it couldn’t distinguish the truck’s white side against the bright sky. The driver was found to be at fault because he should have had time to brake before the crash. However, distraction was a major factor.


4. Cyber-attack

Hackers pose a real threat to vehicle operation. In 2015, hackers commandeered a Jeep and forced it to a stop on a St. Louis freeway, gaining access to the car’s braking and steering through its entertainment system.

According to reports, this was a planned experiment, though the driver didn’t know how or when the takeover would occur. Nevertheless, the danger and panic he felt were real. Hackers are cunning and sometimes choose to use their skills in ways that are damaging, even lethal.


5. Real-world Driving Conditions

In his book Normal Accidents: Living with High-Risk Technologies, Charles Perrow argues that alerts and safeguards meant to improve safety inevitably fail because of system complexity. In fact, he says, complexity can create entirely new categories of accidents.

Making eye contact with another driver at an intersection is a real-world judgment call best left to a human. Still, when used appropriately, modern driver-assistance technology can save lives.

However, driving is complicated. Lanes, roads, and situations change. In addition, the same behaviors don’t always work in every situation.


6. Self-driving Car Regulations

Industry trade associations and businesses are lobbying lawmakers to pass legislation allowing “wider deployment of autonomous vehicles,” while also calling for rigorous safety standards for the new technology. Currently, some regulation governs self-driving vehicles, and more states are exploring legislation.

Meanwhile, automakers such as Tesla are free to unleash their autonomous cars on the market.

On January 15, 2021, the federal government announced new rules regarding autonomous vehicles, allowing makers of self-driving vehicles to skip certain federal crash safety requirements for vehicles not designed to carry humans. More laws and regulations are likely to follow as self-driving vehicles become more common.


Is It Too Soon?

Safety advocates have recently criticized NHTSA for relying on voluntary standards for self-driving vehicle makers, which means manufacturers do not have to disclose how their vehicles perform in federal safety tests. In effect, critics say, the firms themselves are left to make these safety judgments.

Cases involving self-driving cars will become more complex as technology and legislation evolve. After a self-driving car accident, you need an attorney who understands the technical, legal, and legislative complexities involved.

By 2035, the US will probably have more than 4.5 million self-driving automobiles. Let’s hope automakers prioritize consumer safety over profit and that the agencies meant to safeguard us do their jobs.


Image Credit: Rodnae Productions; Pexels; Thank you!
