The belief that self-driving cars will eventually make our roadways safer is widely held, but the recent surge of autonomous vehicle (AV) crashes is causing serious concern. Is it too early for AVs to be on the roads? And what is causing all of these crashes? Although most accidents have been minor, there are exceptions, including the self-driving Uber that hit and killed a pedestrian in Tempe, Arizona.
The less “sensational” crashes may not make national news, but reports covering all AV crashes—even minor fender benders—are particularly alarming. They are happening with relative frequency, and most involve rear-end collisions. Take the state of California, for example. In the month of September alone, three AVs were rear-ended and three were sideswiped. Most AV developers do their road testing in California, Arizona, Pennsylvania, Nevada, and Michigan, but California is the only state that requires AV companies to report detailed information about their testing. Since 2014, California has recorded at least 104 collisions involving AVs. Of those, a whopping 49 occurred in 2018.
Critics warn that getting to a point where AVs can all but eliminate the country’s annual 40,000 roadway fatalities may take decades, and that current testing programs amount to a public experiment in AI for which the public never willingly signed up. Considering that possible outcomes include serious injury or death, the concern is understandable. Based on research into recent accident patterns, experts have concluded that AVs drive in ways that human drivers sharing the road may not expect. A Massachusetts auto accident attorney can help you determine how to proceed if you’ve been injured due to another’s negligence.
Rear-End Collisions and AVs—Who’s at Fault?
Analyzing the data found in nationwide reports, researchers have concluded that rear-end accidents account for approximately two-thirds of all AV accidents. Why all the rear-end collisions? And doesn’t that mean they’re the fault of the human driver who hits the AV from behind? Although most states hold that rear-end accidents are the fault of the driver who hits the other vehicle from behind (and there is no denying that today’s human drivers are more distracted than ever), many experts believe that the AVs are at least partially to blame.
Of the 28 rear-end accidents involving self-driving cars reported in California last year, 22 occurred when the vehicle was in full autonomous mode. Such statistics lead experts to believe that AVs must be doing something that increases the likelihood of being involved in a rear-end collision. Although autonomous vehicles may take the “path of least resistance” (e.g. make an illegal left turn to avoid mowing down a pedestrian), they don’t always drive in a way that human drivers expect. That may be the biggest problem facing AV developers, and the general public.
People Expect People to Break Rules
Kyle Vogt, cofounder and CEO at Cruise, believes the reports coming out of California paint a very clear picture: humans expect other humans to break traffic rules when behind the wheel (e.g. speeding up at a yellow light or driving over the speed limit), but AVs don’t bend the rules.
“We’re not going to make vehicles that break laws just to do things like a human would,” says Vogt. “If drivers are aware of the fact that AVs are being lawful, and that’s fundamentally a good thing because it’s going to lead to safer roads, then I think there may be a better interaction between humans and AVs.” A Boston auto accident lawyer can help you recover damages if you’ve been injured due to another’s negligence.
It’s going to be a long time until AVs are universally safe on American roadways. In the meantime, awareness is key. The public would benefit immensely from knowing how self-driving technology works, how—and where—it is being tested, and how AVs behave. Labeling AVs in a fashion similar to driver’s education vehicles could be one way to help human drivers adapt to their artificially intelligent counterparts.