Tesla is being sued by three owners claiming that the $5,000 Autopilot option is dangerous and doesn't function as described.

Like Tesla itself, the electric carmaker’s semi-autonomous Autopilot system has had its ups and downs. For many customers, it has been a key reason to buy a Model S or Model X, but it has also been linked to a handful of crashes, including one in which a former Navy SEAL was killed last May.

Now Tesla faces a potential class-action lawsuit filed on behalf of three customers who say that Autopilot, designed to take control of the vehicle on well-marked, limited-access highways, doesn’t live up to its billing as safe and reliable. The complaint calls the $5,000 option “essentially unusable and demonstrably dangerous.”

“When consumers received these pricey vehicles, it became clear that Tesla’s marketing was all smoke and mirrors,” said Steve Berman, managing partner of Seattle’s Hagens Berman, which has been an aggressive practitioner of plaintiff law in recent auto safety cases. “And Tesla knew when it made these promises that it didn’t have the capabilities to follow through on its deal. It knowingly deceived tens of thousands who put their faith in these cars and in Tesla.”

In typically high-tech fashion, Tesla first released the Autopilot system to about 60,000 vehicles – both the Model X and the Model S built after September 2014 – over a wireless communications network. The initial version of the technology relied on both radar and camera sensors, but that design came under close scrutiny when 40-year-old Joshua Brown was killed on a Florida freeway.

It appears the system didn’t properly react when a truck pulled in front of Brown’s Model S, though a federal investigation ultimately blamed Brown for failing to watch the road and be ready to retake control in an emergency.



Nonetheless, Tesla had a public split from the Israel-based in-car camera supplier Mobileye and has since moved to a new version of Autopilot that relies more heavily on an upgraded onboard radar system. But neither that change nor the updates pushed to older models have delivered what was promised, according to the new lawsuit.

A Model S purchased by one of the plaintiffs, Dean Sheikh, was “flashed” with new software through an over-the-air update in February, but it has since “operated in an unpredictable manner, sometimes veering out of lanes, lurching, slamming on the brakes for no reason and failing to slow or stop when approaching other vehicles or obstacles,” according to the complaint filed in U.S. District Court in San Jose, California, the district that includes Tesla’s Palo Alto headquarters.

For its part, Tesla made it clear it wasn’t going to roll over. The company issued a statement dismissing the lawsuit as a “disingenuous attempt to secure attorney’s fees posing as a legitimate legal action,” and said the suit misrepresents many facets of its systems.

Tesla also insisted that it has “always been transparent about” what the original, as well as the Enhanced Autopilot, systems can and can’t do, stressing that they rely on “software … that would roll out incrementally over time, and that features would continue to be introduced as validation is completed, subject to regulatory approval.”


To critics, the lawsuit is a welcome move. Some agree that Tesla has promised more than it can deliver, while others argue that, however carefully Tesla explains the system’s limits, the very name “Autopilot” will lead some drivers to believe it is more capable than it really is. German regulators have, for that reason, been pressuring Tesla to adopt another name for the system.

Proponents of autonomous technology, on the other hand, fear that a lawsuit like this – especially if it succeeds – could have a chilling effect on the push to develop self-driving systems and bring them to market. Regulatory and legal roadblocks could delay that rollout or even convince manufacturers not to offer such technology at all, according to David Cole, director emeritus at the Center for Automotive Research in Ann Arbor, Michigan.

If the suit succeeds, it also could give autonomous technology a black eye with an already wary public. The May crash in Florida, along with incidents involving Uber and Waymo self-driving prototypes, helps explain why a majority of Americans say they don’t trust fully autonomous vehicles, according to the new Tech Choice Study released this week by J.D. Power and Associates.

“In most cases, as technology concepts get closer to becoming reality, consumer curiosity and acceptance increase,” said Kristin Kolodge, an executive director at JDPA. “With autonomous vehicles, we see a pattern where trust…in the technology…is declining.”
