The shuttles, built by Navya, initially operated autonomously; this week they began running without drivers onboard. (Photo credit: Jeff Zurschmeide)

It’s not a good thing to get involved in a crash during your first day on the job. But that’s what happened this week when a fully driverless shuttle bus was making its debut run – complete with dignitaries and journalists onboard – in Las Vegas.

The shuttle was following a predetermined route through the city when it came upon a food delivery truck trying to squeeze down a narrow alley to make a delivery. The autonomous shuttle stopped and waited, but the truck eventually backed out to reposition itself and rammed the bus.

Las Vegas police were called to the scene and determined the truck driver was at fault. And a spokesman for transportation firm Keolis, which is running the trial program, claimed the shuttle couldn’t move out of the way “because there was traffic behind it.” The system, the spokesman said, “operated as it was supposed to.”

Well, maybe not so, at least according to journalist Jeff Zurschmeide, who flew into Sin City to check out the debut event.

“The truck was backing up at a sharp angle, the back end of the tractor portion was coming right at us,” he reports, adding that “We had about 20 seconds of watching it come toward us at about 1 mph. But this was an autonomous van – we couldn’t back up, we couldn’t honk, we couldn’t do anything except watch the inevitable impending crunch.”

An autonomous shuttle in Las Vegas was found to not be at fault for this crash, but riders say it could have been prevented. (Photo credit: Jeff Zurschmeide)


So far, it appears the truck driver bears all the blame. But, Zurschmeide noted, “we had about 20 feet (between the shuttle bus and traffic behind it). A human driver would have thrown it into reverse and backed up 10 feet with plenty of margin. The autonomous van didn’t know to do that – but I bet the next one will.”

This isn’t the first time humans have taken the blame for a crash with an autonomous or driverless vehicle. In fact, all but one of the nearly 20 known incidents involving prototypes being tested by Waymo have been blamed on the other, human driver. But there’s a catch.

The autonomous vehicles were programmed to strictly obey the law, so they came to an immediate stop whenever a light turned yellow. Unfortunately, at crowded intersections, like those near parent Google’s headquarters in Silicon Valley, it’s customary for a driver or two to scoot through the intersection, making left turns, as the light changes. Expecting the Waymo cars to follow that etiquette, the drivers behind them ran into the stopped vehicles.

The driver of this truck was found to be at fault in a collision with an autonomous vehicle in Las Vegas. (Photo credit: Jeff Zurschmeide)


Waymo subsequently began tweaking its software to act more like a human driver – which led to the first crash for which one of its vehicles was blamed. The car expected a bus to let it pull in when its lane ended, but the bus didn’t slow down and the Waymo vehicle wound up sideswiping it.

Incidents like these may be the toughest challenge facing engineers and programmers developing tomorrow’s autonomous vehicles.

It’s frequently noted that more than 90% of crashes are the result of human error. But because autonomous and driverless vehicles will long be required to share the road with human drivers, they are going to have to learn to anticipate what people do behind the wheel – and then respond when a human does something wrong.


Learning how to back up is probably a good first step.
