The driver had a blood-alcohol content nearly double the legal limit and a tenuous relationship with consciousness when his car slammed into the back of a parked fire truck on San Francisco’s Bay Bridge.
Still, he became maybe the first to add a technologically advanced new entry to the list of drunken driving excuses.
He wasn’t driving, the man told the highway patrolman Monday morning. The car was.
According to the California Highway Patrol, the driver explained that his Tesla electric vehicle “had been set on autopilot,” obviating the need for him to be in control of the vehicle or, well, sober.
He was wrong, of course, and was ultimately jailed on suspicion of driving under the influence. But as word of another Tesla autopilot crash spread, the case of car as designated driver became an interesting thought exercise for anyone with more than a passing interest in vehicles that drive themselves.
If Elon Musk and other forward-thinking automakers have their way, there will soon be a time when there is no more drunken driving, because cars never have to wonder whether they’ve had one too many vodka martinis.
But until we all have our own computer-controlled, two-ton chauffeurs, we’re left with an increasing number of cars with a raft of features that make them semi-autonomous — vehicles that are safer and smarter, if not exactly geniuses.
Carmakers are transparent about the caveat emptor quality of their vehicles.
Tesla, for example, warns that its autopilot system is not fully autonomous; the company instructs drivers to stay alert because they are ultimately responsible for their vehicle and whatever it smacks into. Attempts to reach Tesla for comment were not immediately successful.
But humans can slip into complacency when the car is doing most or all of the work.
For example, a fatal Tesla crash involving the autopilot system drew international scrutiny in spring 2016. The Model S had been set on autopilot and neither the vehicle nor the driver recognized that a tractor-trailer hauling blueberries had turned onto the divided highway.
In its report, the National Transportation Safety Board cited Joshua Brown’s overreliance on the autopilot. He had set the speed at 16 km/h over the posted speed limit and in the final 37 minutes of his drive, he had his hands on the wheel for just 25 seconds. He also ignored seven dashboard warnings and six audible warnings.
For Brown, those mistakes were fatal. But as technology advances, automakers say, they won’t be mistakes at all.
“We aimed for a very simple, clean design, because in the future — really, the future being now — the cars will be increasingly autonomous,” Musk said in July, according to The Washington Post’s Peter Holley. “So you won’t really need to look at an instrument panel all that often. You’ll be able to do whatever you want: You’ll be able to watch a movie, talk to friends, go to sleep.”
And Musk and other autonomous vehicle proponents have disseminated videos and other media that show autopilot at its best, protecting drivers, passengers and even pedestrians from crashes.
Authorities have not identified the driver of the Tesla that crashed into the fire truck on the Bay Bridge. They say no one — not even autonomous vehicle drivers — is allowed to be drunk behind the wheel of a car, no matter how advanced its safety features.
No one was seriously injured in the Bay Area wreck; the firefighters were parked in the emergency lane and car pool lane, responding to a crash on the other side of their truck, according to the San Jose Mercury News.
Tesla can check the vehicle’s data to see whether autopilot was indeed engaged before the crash, but has not released that information.
Firefighters joked that the feature definitely wasn’t engaged after the 105 km/h crash.
The car was towed, they said. “No, it didn’t drive itself to the tow yard.”
Source: The Toronto Star