An LA TV station caught the aftermath of a crash involving a Tesla Model S and a firetruck.

Two separate teams of federal safety investigators have been dispatched to California to examine Monday's crash of a Tesla Model S into a firetruck on a suburban Los Angeles freeway, with both agencies aiming to determine whether the battery-electric sedan was operating in its semi-autonomous Autopilot mode at the time.

This marks the second time both the National Highway Traffic Safety Administration and the National Transportation Safety Board have examined a Tesla crash to see if the maker’s self-driving technology may have malfunctioned.

In the first case, 40-year-old Joshua Brown was killed in a Florida crash in May 2016 when his Model S failed to stop as a semi-truck turned in front of it. The former Navy SEAL was ruled at least partially at fault for failing to take evasive action, but the NTSB also determined that flaws in the design of the Tesla Autopilot system contributed to the crash.

The Silicon Valley automaker subsequently announced it would make major revisions to both the hardware and software used by its Autopilot system, updates it began phasing in by late 2016.

A Tesla Model S was completely destroyed in a May 2016 crash that killed Joshua Brown.

Another crash in which Autopilot is found to share the blame would do more than just create headaches for Tesla, however. Congress is currently considering a measure that would allow the auto industry and tech firms like Google spinoff Waymo to place tens of thousands of prototype vehicles on public roads to test their self-driving technologies.

General Motors recently revealed a version of its battery-electric Chevrolet Bolt EV that has no steering wheel or pedals, announcing that it has requested federal permission to begin production in 2019. But the automaker this week was targeted by a lawsuit blaming it for a crash involving a self-driving test vehicle.

In Monday’s crash, a Tesla Model S traveling south on I-405 through Culver City crashed into the back of a firetruck that had been called to the scene of another collision. The California Highway Patrol reported no injuries, but authorities said the Tesla driver told them he had activated Autopilot before the incident occurred.

Determining if that is accurate will be one of the first challenges for the NTSB and NHTSA teams. But Tesla has designed its vehicles to record such information and it also uploads extensive amounts of data from its vehicles to the cloud. That has helped the automaker counter claims by drivers in several other incidents who apparently tried to diminish their own responsibility by claiming an Autopilot malfunction.

In the May 7, 2016 incident, Brown was shown to have switched the system on, but he also was found to have had his hands on the steering wheel for only 25 seconds during the 37.5 minutes that Autopilot was in operation. After an initial review put the blame on the 40-year-old Tesla owner, the NTSB issued a subsequent report that also faulted Tesla’s self-driving system.

GM wants to start production of Chevy Bolt EVs, like this one, with no steering wheel or pedals.

Critics have also charged the automaker with overstating the capabilities of Autopilot, leading many owners to believe it can operate entirely without human intervention in many circumstances. In reality, the system can adjust the car’s speed, including braking, to maintain a safe distance from surrounding traffic, but drivers must still be ready to retake control if it falters. In the Florida crash, it appears Autopilot had trouble distinguishing the white truck Brown hit from the bright sky behind it.

Tesla isn’t the only company that has experienced problems with its semi-autonomous technology. Waymo has experienced around 20 crashes, almost all of them minor, and the vast majority blamed on other drivers. General Motors this week was hit by a lawsuit filed by a motorcyclist who claims he was struck by a Chevrolet Bolt EV operating in autonomous mode. The accident report filed after the December 7 crash appears to contradict some of the claims made in the suit, however.

(For more on the San Francisco Chevy Bolt crash, Click Here.)

But even if GM and Tesla are both cleared of responsibility, proponents of self-driving technology worry about the impact the initial reports could have. Gill Pratt, the head of the Toyota Research Institute, the unit overseeing Toyota’s own autonomous development program, has warned that consumers are likely to be “far less forgiving” of crashes caused by a machine rather than a human driver.

That’s backed up by a pair of new studies, including one by AAA, which this week reported that a solid majority of Americans say they would be “afraid” to ride in a self-driving vehicle.

(Click Here for more on how Americans feel about self-driving vehicles.)

The irony, say autonomous vehicle proponents, is that federal data show about 93% of fatal crashes are the result of human error. Backers like Mark Rosekind, who retired in January 2017 as NHTSA administrator, contend that self-driving vehicles could eventually eliminate vehicle crashes and usher in the era of zero highway fatalities that automakers like Nissan have promised.

But some safety advocates, including Consumer Watchdog and another former NHTSA administrator, Joan Claybrook, want more testing at private facilities like Michigan’s new American Center for Mobility before next-generation driverless technology is allowed on public roads, warning that more crashes involving faulty prototypes could further sour public opinion.

(Safety titan Claybrook wants to rein in autonomous vehicles. Click Here for the story.)
