The idea of being able to turn over all driving responsibilities to an autonomous fleet of vehicles gets major endorsements from federal safety officials, at least half of drivers, automakers, and, last but certainly not least, Google.
In fact, recently Google officials complained they felt like they were being held back in the development of their vehicle. Despite a dozen accidents – none of which were deemed Google’s fault – and most recently, a situation where a policeman pulled over the autonomous vehicle for driving too … wait for it … slowly, the tech giant wants to speed up development.
Mark Rosekind, head of the National Highway Traffic Safety Administration, is a staunch ally in the move toward autonomous vehicles, believing they could virtually eliminate traffic accidents and, by extension, the thousands of fatalities that occur on U.S. highways and byways annually.
Despite the wave of support, there is a small and now vocal minority calling for a slow and steady pace for autonomous vehicle development. Consumer Watchdog, a consumer advocacy group, is pushing back against Google and its efforts to speed up the process of bringing an autonomous vehicle to market.
“Google may have its foot on the accelerator pedal in its mad drive to develop robot cars, but the DMV has admirably served as traffic cop and set reasonable limits that have genuinely protected public safety,” wrote John M. Simpson, Consumer Watchdog Privacy Project director, in a letter to the DMV. “We call on you to stay on this responsible course and put public safety first.”
(Global survey reveals strong interest in autonomous vehicles. For more, Click Here.)
Simpson and the organization felt compelled to appeal to the California Department of Motor Vehicles to not speed up the testing process after Google expressed its frustration and, at the same time, federal safety officials suggested that accelerating the testing process may be in order.
U.S. Department of Transportation spokesperson Suzanne Emmerling told the Associated Press recently that the pace of development is forcing the agency to rethink its testing policies on the federal level.
“Breathtaking progress has been made,” Emmerling wrote. She said Transportation Secretary Anthony Foxx ordered his department’s National Highway Traffic Safety Administration update its 2013 policy “to reflect today’s technology and his sense of urgency to bring innovation to our roads that will make them safer.”
It’s unclear what the new policy will be, though the tone of the statement signaled that Foxx is interested in endorsing the technology. Specific language the traffic safety administration is revisiting holds that states which do permit public access after testing should require that a qualified driver be behind the wheel.
(Click Here for more about the forecast for a new vehicle sales record in 2016.)
Currently, individual states are driving the rules and regulations of testing. However, the federal call for stepping it up caused officials in California to reconsider the state’s testing mandates. As part of that, the state asked NHTSA for advice about how to proceed. That prompted Consumer Watchdog to step up its input in the process, supporting the state’s current tortoise-rather-than-hare approach.
“We commend the DMV for its thoughtful and thorough approach, and urge that you continue to act in the public’s interest, rather than succumbing to corporate pressure. It is imperative that the DMV reject the Internet giant’s self-serving lobbying,” the letter said.
“Quite correctly your department is acting at a deliberate pace to ensure that autonomous vehicle regulations for public use adequately protect our safety. The important thing is getting the regulations right, not rushing them out the door.”
(Forget hybrids and hydrogen cars, some must have a V12. Click Here for the story.)
The testing rules have important safeguards, but should still be improved, Consumer Watchdog said: “As you know, Consumer Watchdog has petitioned the department to amend the testing rules to require that police investigate any robot car crash. We also ask that the rules require copies of any technical data and video associated with a robot car crash be turned over to the department. We look forward to a positive response to our petition.”
So let’s see who could financially benefit from slowing down implementation of autonomous vehicles, hmmm. Particularly electric-powered vehicles. Hmmm. It would be interesting to find out where the money for Consumer Watchdog comes from, and from whom. Hmmm
Could be any company or industry behind in development. Could be any company or industry using a different energy source. Of course it’s going to be about the money. Using safety is the last gasp for lawyers in particular. Nothing new there. The requirement of new rules applying only to autonomous vehicles for accident investigation is interesting for propaganda purposes. Given the other side of the argument – that people are terrible drivers, and that having autonomous electric vehicles would save a lot of lives – waiting just allows more people to die. So much for safety.
Could be, too, that some legitimate groups are not yet convinced of the safety of the technology, DWH. We’ve seen Google Cars crash, and while they were not legally at fault it appears that they are still not properly programmed to operate in a man/machine environment.
Paul E.
It appears it’s the human part of the equation that is the hindrance. Too many times in recent history a supposedly legitimate group has been backed by people who like things just fine the way they are and want to keep them that way. Conservative and religious groups come to mind.
Again I ask, if no one is driving the vehicle and it does not have normal familiar driving controls, does anyone in the vehicle need to possess an operator license? If no one is in or has access to physical controls, how can they be held responsible?
At the moment, the laws do NOT allow for a vehicle to be on the road without traditional controls. And an “operator” must be behind the wheel ready to take control if the vehicle’s autonomous system does not operate properly. That could be taken to mean something as basic as failing to signal a turn, speeding or driving under the minimum speed. Eventually, Google wants to put on the road some versions of the Google Car prototypes without controls, but there is no legal basis to permit that yet. If and when that happens, the laws would have to be revised so, among other things, a decision would be made about whether anyone would be the designated “operator.” For now, someone with a license, and in a position to legally drive — meaning they cannot be under the influence of alcohol or drugs — has to be in the driver’s seat.
Paul A. Eisenstein
Publisher, TheDetroitBureau.com
Let’s see… who could be in favor of rushing AVs to market to cash in, even though they know full well that AVs have not been properly designed, tested or programmed, nor do they include failsafe designs for when the computers crash, hackers take control, electronic sensors fail, or the programmers simply got the code wrong? The answer is everyone with a potential to cash in while endangering society with unsafe vehicles, because they can currently get away with it due to the lack of federally mandated comprehensive safety designs. AVs should have even higher safety requirements than commercial aircraft, because there is no pilot to take control when the AV’s computers malfunction. Have you ever used a Windows-based PC that did not crash, generate a BSOD or lose data? AVs will operate as poorly and dangerously as Windoze if left to the devices of those looking to cash in by being first to market.
Regarding the claims that Google’s twelve (reported) accidents were not the fault of the AV, this is untrue. TWO of the incidents were due to programming errors causing the AV to slam on the brakes as soon as a traffic light turned yellow, causing cars behind the AV to slam into it. In Michigan the AV would be 50% responsible for the accident and the dangerous panic braking when the light turns yellow. This means the owner/designer and operator of such an AV would need to be heavily fined to ensure these types of accidents from improper programming never occur, because they are going to. It’s clear that Microsoft and most other software companies cannot produce reliable, safe operating systems or software code that is bug-free and 100% reliable. As such, this serious issue needs to be addressed before any AVs are allowed to operate on the roadways in autonomous mode.
There will be a day when these personal transportation devices are ready for the roadways but in reality, that is many years off contrary to the rhetoric that we hear or read almost daily.
Well well, set the hook and see what comes out of the shadows. It’s becoming obvious that pilots need more training, as with the Air France Atlantic crash, where the pilots duplicated the autopilot’s error from a malfunctioning airspeed indicator. Same with a yellow light: coming to a stop is not a violation. Neither, for that matter, is waiting at a green light for opposing traffic to pass when turning left at an intersection without a turn lane. Calling for the duplication of bad drivers’ habits is no basis for labeling the alternative a programming error. Running into the back of a vehicle almost always means the rear vehicle is at fault – lack of attention. Time to reprogram/retrain human drivers. Michigan has always been strange, among many things. Autonomous vehicles are better than a vast majority of human drivers. A reduction in accidents would mean a drastic drop in income for the ambulance-chasing law profession.
There is no argument that humans as a group do a lousy job of driving, particularly in the U.S. IME. That doesn’t absolve AV creators of their extreme responsibility to properly design, engineer, manufacture and program AV’s to operate properly and safely. The fact that you want to exempt Google programmers from their most important task – which is to program the AV to drive safely, is inexcusable. That is precisely why AVs should not be allowed on the roadways until they pass all Federal safety testing specifically designed for AVs, which should be mandatory annually or more frequently depending on operational time and environment.
No AV should be rushed to market until they are confirmed to be 100% hacking secure, occupant safe, employ failsafe redundant propulsion management systems, have defect free operational software code and provide for limp modes to get them safely off the roadway when they are hacked, a sensor fails or the computers crash – which is absolutely going to happen just like your desktop PC crashes.
The fact that Google’s programmers would even write such poor code to create an accident should be a big red flag. The reason why it’s 50% the responsibility of the lead car for a rear end collision when the lead car slams on the brakes, is because it’s improper to do so. Even the technically challenged should be able to figure out reaction times and stopping distances, i.e. it takes an average of 192 ft. to stop a car traveling 45 mph from the instant a driver sees a red stop signal and does a panic stop. (From actual industry testing).
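The 192-ft figure can be sanity-checked with a back-of-the-envelope calculation. A minimal sketch, assuming a 1.5-second perception-reaction time and roughly 0.72 g of braking deceleration (these are common textbook-style assumptions, not the parameters of the industry testing the comment cites):

```python
G_FT_S2 = 32.174            # gravitational acceleration, ft/s^2
MPH_TO_FT_S = 5280 / 3600   # 1 mph = ~1.4667 ft/s

def stopping_distance_ft(speed_mph, reaction_s=1.5, decel_g=0.72):
    """Total stopping distance in feet: distance covered during the
    driver's perception-reaction time, plus braking distance v^2 / (2a)."""
    v = speed_mph * MPH_TO_FT_S            # speed in ft/s
    reaction_ft = v * reaction_s           # travel before braking begins
    braking_ft = v ** 2 / (2 * decel_g * G_FT_S2)
    return reaction_ft + braking_ft

print(round(stopping_distance_ft(45)))    # prints 193
```

With these assumed values the total comes out near 193 ft, consistent with the roughly 192 ft cited for a 45-mph panic stop; most of that distance is covered before the brakes are even applied.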
As I previously advised the U.S. should synchronize all of their traffic signals to standard time increments. That should be obvious and simple.
Since the autonomous automobile industry is still not yet fully in force on a global scale, there are still many improvements to be made to the technology underpinning the whole setup. However, I believe that over time, as the market progresses through various advances, the industry will be accepted as one of the major modes of transportation.