
The Uber CEO’s Mistaken Notion of What a Mistake Is


Dismissing Uber’s own self-driving errors as mere “mistakes” feels wrong, too (although on a different order of magnitude), especially given the raft of documents released last week by the National Transportation Safety Board, the federal transportation safety watchdog, which has spent the past 20 months investigating the crash in which the car killed the woman, Elaine Herzberg. During Sunday’s interview, Primack asked whether the crash boiled down to a “bad sensor.” “Yes, yeah,” Khosrowshahi responded, before Primack cut him off. But according to the documents, that’s not quite true. In fact, a series of poor decisions appears to have led to that moment on a dark Arizona road. (In May, an Arizona prosecutor said there was “no basis for criminal liability for the Uber corporation arising from” the fatal crash. On November 19, the NTSB will announce the final results of its investigation, saying who and what it believes are at fault.)

According to the NTSB investigation, Uber’s software was not designed to recognize pedestrians outside of crosswalks. “The system design did not include a consideration for jaywalking pedestrians,” one of the documents said. As a result, Uber’s system wasted some 4.4 seconds trying to “classify” Herzberg and use that information to predict her movement.
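That failure mode is easy to illustrate in miniature. Below is a hypothetical Python sketch, not Uber’s code: it assumes, as the NTSB documents describe, a tracker that throws away an object’s position history whenever its classification changes, so a classifier that keeps re-labeling the same object never gives the motion predictor enough observations to extrapolate a path.

```python
from dataclasses import dataclass, field

@dataclass
class TrackedObject:
    label: str
    # Past positions; the predictor needs at least two to extrapolate.
    history: list = field(default_factory=list)

def update_track(track: TrackedObject, new_label: str, position: tuple) -> TrackedObject:
    if new_label != track.label:
        # Reclassification discards the history; prediction starts over.
        return TrackedObject(label=new_label, history=[position])
    track.history.append(position)
    return track

def predict_next_position(track: TrackedObject):
    if len(track.history) < 2:
        return None  # nothing to extrapolate yet
    (x0, y0), (x1, y1) = track.history[-2], track.history[-1]
    return (x1 + (x1 - x0), y1 + (y1 - y0))  # naive linear extrapolation

# If the labels oscillate (vehicle -> bicycle -> other ...), every update
# resets the track, predict_next_position keeps returning None, and the
# 4.4 seconds pass without a usable trajectory.
track = TrackedObject(label="vehicle", history=[(0.0, 0.0)])
for label, pos in [("bicycle", (0.5, 0.1)), ("other", (1.0, 0.2)), ("bicycle", (1.5, 0.3))]:
    track = update_track(track, label, pos)
    print(label, predict_next_position(track))  # None every time
```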

Then, with just 1.2 seconds until impact, the Uber system again did what it was designed to do: It held off braking for one second. This aspect of the system was meant to give the “mission specialist” hired to monitor the self-driving car from behind the wheel time to verify “the nature of the detected hazard” and take action. According to the NTSB documents, Uber created the “action suppression” system because the self-driving program’s developmental software kept raising false alarms (that is, identifying hazards on the road where none existed) and so kept executing unnecessary but “extreme” maneuvers, like swerving or hard braking. But on that night in March, the woman behind the wheel of the car didn’t look up during that second-long window, and the system began to slow the car only 0.2 seconds before impact. In the end, the car was traveling at 43.5 mph when it hit Herzberg.
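The timeline arithmetic is worth spelling out. Here is a minimal sketch using only the timing figures the NTSB reported; the function and constant names are hypothetical, not Uber’s:

```python
MPH_TO_MS = 0.44704

HAZARD_DETECTED_S_BEFORE_IMPACT = 1.2  # system flags an imminent collision
ACTION_SUPPRESSION_S = 1.0             # braking withheld so the operator can verify
SPEED_MPH = 43.5

def braking_window_s(detected_s_before_impact: float) -> float:
    """Seconds left before impact once automated braking is finally allowed."""
    return max(detected_s_before_impact - ACTION_SUPPRESSION_S, 0.0)

window = braking_window_s(HAZARD_DETECTED_S_BEFORE_IMPACT)
distance = window * SPEED_MPH * MPH_TO_MS
print(f"Braking begins {window:.1f} s before impact, covering {distance:.1f} m")
# -> Braking begins 0.2 s before impact, covering 3.9 m
# At ~19.4 m/s, 0.2 seconds is nowhere near enough time or distance to
# stop, which is why the car was still doing 43.5 mph at impact.
```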

And if the self-driving system had flaws, maybe those can be traced to a series of decisions Uber made about its organizational structure. The NTSB documents note that, while Uber’s self-driving unit did have a system safety team, it had no operational safety division or safety manager. Nor did it have a formal safety plan, a standardized operating procedure, or a guiding document for safety—the stuff of a well-thought-out “safety culture.” In fact, the company had only recently decided to depart from industry standards and put just one person in each testing vehicle instead of two. (“We deeply value the thoroughness of the NTSB’s investigation into the crash and look forward to reviewing their recommendations once issued after the NTSB’s board meeting later this month,” an Uber spokesperson said last week in a statement.)

So, does Uber get to be forgiven? That’s probably for Uber and its customers to decide. For part of Monday morning, #BoycottUber trended nationwide on Twitter. Uber says it has completely revamped its self-driving testing procedures since the crash, and has added a second person to each of the vehicles it tests on public roads, to cut down on mistakes.

