Federal investigators cleared Tesla and its Autopilot system of any safety defect that may have caused a fatal crash in Florida last year.

Federal safety regulators have determined there is no known safety defect with Tesla’s semi-autonomous Autopilot system, wrapping up an investigation that was launched after a fatal Florida crash last May that resulted in the death of a former Navy SEAL.

A handful of other crashes were tentatively linked to the technology, though the battery-electric carmaker strongly disputed those claims. With Autopilot considered one of the key selling points for Tesla products like the Model S and upcoming Model 3, getting the all-clear from the National Highway Traffic Safety Administration was a significant development for the automaker.

In a Twitter post, Tesla founder and CEO Elon Musk declared the wrap-up of the investigation “very positive,” with Tesla following up with a statement saying “the safety of our customers comes first, and we appreciate the thoroughness of NHTSA’s report and its conclusion.”

The NHTSA investigation began in late June, nearly two months after 40-year-old Joshua Brown was killed when his Tesla slammed into a tractor trailer that had turned left in front of his vehicle on a Florida highway. Initial indications suggested the vehicle’s camera and radar systems – which are supposed to work together to identify and prevent potential collisions – confused the truck with the bright Florida sky.


After a number of subsequent, albeit less serious, accidents were initially linked to Autopilot, several safety groups, including California’s Consumer Watchdog, called on federal regulators to force Tesla to disable the system.

But reports from the scene of the Brown crash also indicated that the Tesla owner had turned full control of the vehicle over to Autopilot, which Tesla stresses is designed only to assist drivers, not to assume full, autonomous control. At the time, according to some reports, Brown was watching a video on his laptop computer.

In its final report, NHTSA determined that had Brown been paying attention he would have spotted the truck “at least” seven seconds before the collision occurred, something it referred to as a “period of extended distraction,” and stressed that he took no action to brake or steer around the truck.

While Tesla’s statement suggested that the investigation was now in the company’s rearview mirror, NHTSA did note that it will continue monitoring reports about Autopilot. “The closing of this investigation does not constitute a finding by NHTSA that no safety-related defect exists,” it cautioned, adding that the agency “reserves the right to take future action if warranted by the circumstances.”


Industry leaders and analysts, as well as government regulators, agree that the era of fully autonomous vehicles is rapidly approaching, with Tesla among the companies promising to have such technology on the road early in the coming decade. But there is debate over the form self-driving systems will take.

That includes the hardware systems that will be required, as well as the extent to which vehicles will be able to operate hands-free. Some experts, such as Gill Pratt, the head of the Toyota Research Institute, contend early autonomous vehicles will be limited to specific geographic areas and weather conditions. Others, such as Tesla’s Musk, have suggested they are aiming for technology that would work anywhere and at all times.

Even now, more and more makers are adding semi-autonomous driver-assistance systems to their vehicles. Tesla this month updated the Autopilot software on vehicles built since last October – and equipped with the company’s latest sensor systems. BMW, meanwhile, says drivers can take their hands off the wheel of the new 5-Series models for up to 50 seconds in certain situations, normally on limited-access highways.


For its part, NHTSA has told automakers they need to make clear to owners the limits of their semi-autonomous systems. In a Thursday conference call with reporters, the agency said it faulted Tesla for choosing the name “Autopilot” for its current technology, suggesting the name might lull motorists into incorrectly assuming the system was capable of doing more than is currently possible.
