In the past few years, automakers and technology firms have been racing to develop a safe autonomous vehicle. Among the companies involved are well-known automakers such as Mercedes, BMW, and Tesla, as well as tech firms like Google. The concept grew from the premise that computers should be able to operate vehicles more safely than humans, who frequently commit errors and engage in unsafe driving behaviors. That premise is now under scrutiny after a deadly accident involving a self-driving car.

The accident occurred on May 7 in Williston, Florida, and involved a Tesla Model S electric sedan. The driver was killed while the car was in self-driving mode. The National Highway Traffic Safety Administration said in a statement that a tractor-trailer made a left turn in front of the vehicle and the car failed to apply the brakes. This is the first known instance of a fatal crash in which the vehicle was driving itself by means of computer software.

The driver was identified by the Florida Highway Patrol as Joshua Brown, 40, of Canton, Ohio. Brown was a Navy veteran who owned a technology consulting firm. In a statement on Thursday, Tesla described Brown as a man who “spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission.” Brown had previously posted several videos of himself using the autonomous Tesla vehicle; in one, he credited the technology with preventing an accident involving his car.
The story has been a setback for Tesla’s efforts to expand its product line from pricey electric vehicles to more conventional models. It is still unclear whether the car, the driver, or both were to blame for the lethal accident. In a news release, the company said, “Neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied.” Critics of self-driving cars have pointed to the crash as evidence that computers cannot make the “split-second, life-or-death decisions” that human drivers often must. Companies have been testing self-driving vehicles on private courses as well as public roads. However, the technology does not appear to have been tested and developed enough for the government to sign off on autonomous cars. The National Highway Traffic Safety Administration has recently been working on new regulations for testing self-driving cars on public roads, which are anticipated to be released sometime this month.
Karl Brauer, an analyst with the auto research firm Kelley Blue Book, notes that this accident shows autonomous technology may not be as developed as it needs to be to enter the open market. “This is a bit of a wake-up call,” he says. “People who were maybe too aggressive in taking the position that we’re almost there, this technology is going to be in the market very soon, maybe need to reassess that.” Tesla first introduced the self-driving feature in the Model S last fall. In a statement made Thursday, the company reiterated that it was still only a test feature and that its use “requires explicit acknowledgement that the system is new technology.” Tesla noted that when drivers turned on the self-driving component, a warning emphasized that it “is an assist feature that requires you to keep your hands on the steering wheel at all times.” Evidently, many kinks still need to be worked out before autonomous cars are ready for everyday drivers.
Vlasic, Bill, and Neal Boudette. “A Tesla Driver Died in a Crash While Using Autopilot Mode.” Boston.com. The New York Times, 1 July 2016. Web. 12 July 2016.