In March 2018, a 49-year-old woman named Elaine Herzberg was struck and killed by an Uber-owned self-driving car in Tempe, Arizona. Last week, the National Transportation Safety Board (NTSB) released a summary of its report on the accident, and the results are damning for Uber and call into question the rationale for self-driving cars in the first place.

The report should raise eyebrows here in Massachusetts, where self-driving technology is being tested thanks to a 2016 executive order by Gov. Charlie Baker allowing the practice. Only a few months after Herzberg’s death, 15 Massachusetts communities signed a memorandum of understanding allowing testing on their public roads. Two companies — Ireland-based Aptiv (a Hyundai partner) and MIT spinoff Optimus Ride — are currently running tests, according to the state’s website.

A dashboard video released by Tempe Police a few days after the 2018 Arizona crash showed that the Uber operator had her eyes on her cell phone in the seconds before the vehicle struck Herzberg.

But driver inattention, the NTSB report said, was only part of the story. “The vehicle operator’s prolonged visual distraction, a typical effect of automation complacency, led to her failure to detect the pedestrian in time to avoid the collision,” the report reads.

In other words, automation leads us to become complacent, rendering us less able to react in an emergency (or in this case, to even be aware of the emergency until it is too late).

Compounding this, the report claimed, was Uber’s flamboyantly atrocious safety culture. A 16-page Vehicle Automation Report released by the NTSB earlier this month outlined a host of problems with the automated system’s programming.

To be clear, the nighttime conditions during the accident were exactly the kind in which an automated driving system should perform well. Not only does the car have cameras all around it, it also has a 360-degree LIDAR system that can detect objects even in the dark and gauge their distances.

The Uber system detected Herzberg 5.6 seconds before the crash — plenty of time to stop. But because she was crossing outside of a crosswalk, the system never identified her as a pedestrian; instead it reclassified her several times as vehicle, bicycle (she was pushing a bicycle), and “other.” Each time the system reclassified Herzberg, it discarded the history of its previous observations, as if it were detecting her for the first time. Without that history, it could not predict her path across the road.
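Why does discarding the tracking history matter? Here is a minimal sketch in Python of a simplified object tracker; the class, names, and numbers are hypothetical illustrations, not Uber’s actual code. The point is that a tracker needs an observation history to estimate an object’s motion, and wiping that history on every reclassification throws the motion estimate away.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TrackedObject:
    """Toy tracker: keeps (time, position) observations for one object."""
    label: str                                   # e.g. "bicycle", "other"
    history: list = field(default_factory=list)  # (t_seconds, x_meters) pairs

    def observe(self, t: float, x: float, label: str) -> None:
        if label != self.label:
            # The flaw described in the NTSB report: a new classification
            # wipes the history, as if the object were seen for the first time.
            self.history.clear()
            self.label = label
        self.history.append((t, x))

    def velocity(self) -> Optional[float]:
        # At least two observations are needed to estimate motion.
        if len(self.history) < 2:
            return None
        (t0, x0), (t1, x1) = self.history[0], self.history[-1]
        return (x1 - x0) / (t1 - t0)

obj = TrackedObject(label="other")
obj.observe(0.0, 0.0, "other")
obj.observe(0.5, 0.7, "other")
print(obj.velocity())              # 1.4 m/s: the object is crossing the road
obj.observe(1.0, 1.4, "bicycle")   # reclassification resets the track
print(obj.velocity())              # None: the crossing motion is forgotten
```

A tracker that forgets everything at each relabeling can see an obstacle continuously for more than five seconds and still, at every moment, treat it as something that just appeared.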

Before reacting, the system was designed to impose a 1-second freeze during which it did nothing; apparently, avoiding false alarms was more important to Uber’s programmers than reacting to real emergencies. As the seconds ticked down and the car drew closer, the system finally determined that a collision was imminent 1.2 seconds before impact. It once again instituted its 1-second freeze, and then, rather than braking (which it was not designed to do that close to an object), it sounded an audio warning to the distracted human operator 0.2 seconds before hitting Herzberg at full speed.
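The timeline in the report makes the design flaw easy to quantify. Here is a back-of-the-envelope sketch in Python; the vehicle speed is an assumption for illustration, not a figure from the report:

```python
MPH_TO_MS = 0.44704               # miles per hour to meters per second

speed_mph = 40.0                  # assumed speed, for illustration only
speed_ms = speed_mph * MPH_TO_MS  # about 17.9 m/s

time_to_impact = 1.2              # s: when the system deemed collision imminent
action_suppression = 1.0          # s: the built-in "freeze" before any response

operator_window = time_to_impact - action_suppression
distance_covered = speed_ms * operator_window

print(f"Operator warned {operator_window:.1f} s before impact")  # 0.2 s
print(f"Car travels {distance_covered:.1f} m in that window")    # ~3.6 m
# A commonly cited driver perception-reaction time is about 1.5 seconds,
# several times longer than the warning the system provided.
```

In other words, even a fully attentive driver would have had little chance to react in the window the system allowed.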

Meanwhile, even though the system depended on its human operator in emergency situations, Uber removed the requirement to have two people in a car during test drives, according to the NTSB report. The company also rarely monitored its drivers’ behavior to verify that they were following its own rules of conduct (which one imagines would forbid looking at a phone rather than the road), the report said.

If an automated vehicle is dependent on its human driver to detect emergencies, what is the point? How will such technology possibly prevent more accidents than it causes?

Regrettably, the Yavapai County Attorney’s Office didn’t wait for this report to come out, and concluded in March of this year that Uber bore no criminal liability in Herzberg’s death. So though the programmers and company executives made dangerous decisions that the NTSB report showed contributed to Herzberg’s death, they will face no criminal consequences. The company, which has already settled with Herzberg’s family, is back to test-driving vehicles on Pittsburgh roads and hopes to expand soon to San Francisco and Toronto.

At the time of the crash, many claimed that the problem was Uber rather than automated driving technology itself. It’s true that, company-wide, Uber is notably lax on safety, a pattern that recently cost the ride-hailing company its license to operate in London. But without real oversight, in which those who make criminally bad, fatal decisions are held accountable, dangerously immature technology will continue to be deployed on our roads with potentially disastrous consequences.

A much better way to avoid such accidents is to reduce distracted driving, as Massachusetts has finally done with the hands-free device law Baker signed this week, which prohibits drivers from using their phones behind the wheel except in hands-free mode.

Dave Eisenstadter can be reached at deisen@valleyadvocate.com.