Driverless cars will likely be the main mode of transportation in the future, but that future may be a bit more distant than expected. A self-driving taxi pilot project in Las Vegas ended with less-than-desirable results last week. After only a few hours of shuttling people around the city, the driverless van collided with a delivery truck that was backing into an alley. According to the initial investigation, the human driver did something that the robot car couldn’t have anticipated.
Last Wednesday, the pod-like shuttle was on day one of offering complimentary rides around a small loop in Vegas. No one was injured in the collision, but it is cause for concern. It also brings to light a glaring issue that designers of driverless vehicles have yet to figure out – how can self-driving vehicles effectively interact with those driven by humans?
“This is exactly the kind of real-world scenario that this pilot is attempting to learn from,” said John Moreno, AAA spokesman. “This is one of the most advanced pieces of technology on the planet, and it’s just now learning how to interact with humans and human driving.”
Robots Don’t Understand Nonverbal Communication
The reality is, humans use nonverbal communication signals when driving every day. The truck, which was backing up when it shouldn’t have been, collided with the driverless pod stopped behind it. Had a human been driving the pod, he or she would likely have yielded to the truck’s nonverbal request to “get out of my way” by backing up. Had there not been another vehicle behind the pod, it may have done the same. Instead, the pod appeared to freeze in place, unable to determine how to react – it couldn’t move forward, and it wouldn’t back up.
According to a reporter who was on board at the time of the incident, a human would have probably responded differently. “We had about 20 feet of empty street behind us (I looked) and most human drivers would have thrown the car into reverse and used some of that space to get away from the truck,” wrote Jeff Zurschmeide, a reporter for Digitaltrends.com. “Or at least leaned on the horn and made our presence harder to miss. The shuttle didn’t have those responses in its program.”
Police arrived at the scene, issuing a ticket to the truck driver.
The purpose of the AAA-sponsored pilot program is to expose riders to driverless technology and determine how these vehicles perform in real-world situations. There was a human operator on board during the pilot rides, but the incident simply happened too quickly for the operator to react. A Boston car accident lawyer can help you determine how to proceed if you’ve been injured due to another’s negligence.
The Las Vegas pilot project incident isn’t the first crash involving a driverless car. The National Transportation Safety Board (NTSB) criticized Tesla Inc.’s semi-autonomous systems in September, referencing a fatal 2016 accident involving the Tesla Model S. The Model S allows the driver to go “hands-free” for an extended period – essentially, it can steer itself. Unfortunately, a driver in Florida was killed when his Model S, which was steering itself at the time, crashed into a truck. The NTSB ruled that, although the human drivers were the main cause of the accident, the autopilot design was a contributing factor.
This recent self-driving accident in Las Vegas shows the difficulties that driverless vehicles have when it comes to nonverbal communication. This type of communication occurs with great frequency between human drivers every day. The truck driver may not have seen the pod, but it’s more likely that he expected it to move.
“He probably had an expectation that the shuttle would back off and allow him to do his thing,” said Duke University robotics professor Missy Cummings. “Obviously that doesn’t work. There wasn’t the logic inside this little shuttle to anticipate this.” A MA auto accident attorney can help you recover damages if you’ve been injured due to another’s negligence.