The auto industry has always been dedicated to providing U.S. motorists with world-class thrills. Takata exploding airbags! Cars that suddenly shut off at 70 miles an hour! More and more crashes here compared to the rest of the world! That’s right, Americans can experience plenty of thrills these days just by getting in a car – even if their destination isn’t an actual amusement park (although your chances of ending up in the ER are clearly better if you’re headed over to one).
So now, imagine this: a new ride, where you strap yourself into your car and slowly – and then quickly – hit the highway in a vehicle over which you have no control. Instead, it’s driven by a machine-learned algorithm that, if it isn’t right 100 percent of the time, will cause you to crash, yet is so vulnerable to error that a small sticker on a stop sign could cause it to read the sign as a speed limit sign instead. Or “a pair of clear glasses with a funky pattern printed on their frames” could tip the algorithm so “it thought it saw what wasn’t there.” Or is so prone to cybersecurity issues that, right now, open source software is being developed to allow anyone to rig your robot car’s automated driving system. (See the talk by George Hotz, aka GeoHot, who is developing that very technology and who, at the 38-minute mark, says “we don’t want to drive by the rules of the road.”)
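To make the stop-sign sticker and funky-glasses tricks a bit more concrete: these are “adversarial examples,” tiny input changes crafted to flip a machine-learned classifier’s answer. The sketch below is a generic illustration of that mechanism, not code from any carmaker or from the systems described above; it uses the well-known Fast Gradient Sign Method against a stock image classifier, and the placeholder image and label are assumptions made purely for demonstration.

```python
# Minimal sketch (illustrative only) of an adversarial perturbation:
# a change too small for a person to notice that can still change
# what a machine-learned classifier "sees."
import torch
import torch.nn.functional as F
import torchvision.models as models

# A stock, pretrained image classifier stands in for a perception system.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

# Placeholder "photo" and a hypothetical correct label (ImageNet class index,
# chosen only for illustration -- a real attack would use a real sign photo).
image = torch.rand(1, 3, 224, 224, requires_grad=True)
true_label = torch.tensor([919])

# Compute how the loss changes with respect to each pixel.
loss = F.cross_entropy(model(image), true_label)
loss.backward()

# Fast Gradient Sign Method: nudge every pixel a tiny step in the
# direction that most increases the classifier's error.
epsilon = 0.01  # far below what a human eye would notice
adversarial = (image + epsilon * image.grad.sign()).clamp(0, 1)

print("original prediction:   ", model(image).argmax(dim=1).item())
print("adversarial prediction:", model(adversarial).argmax(dim=1).item())
```

Run against a real photograph and a trained model, a perturbation of this size is typically invisible to a person yet is often enough to change the model’s top prediction – the same basic mechanism behind the sticker-on-a-sign and patterned-eyewear attacks cited above.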
That’s right, folks, step right up to the nation’s latest thrill ride – robot cars! (aka self-driving cars, aka autonomous vehicles). With this new adventure, your daily routine will start resembling a death-defying (ok, not exactly “defying”) amusement park ride where everyone else is on their own different ride and you hope no one “bumps” into you. Or that your car doesn’t propel you through a stop sign. Or make driving decisions based on eyewear. Or flip and fling you in ways you can’t even imagine right now. Come one, come all, to the scariest show on earth. (And polls seem to agree: “three-quarters of U.S. drivers report feeling afraid to ride in a self-driving car, and only 10 percent report that they’d actually feel safer sharing the roads with driverless vehicles.”)
But wait, there’s more! If you thought things were scary before, check this out: Congress seems to want states and localities to have no say at all as to whether these vehicles (the “drivers”) are safe, yet it wants “manufacturers to sell thousands of vehicles that would be exempt from current safety standards, including those of crashworthiness.” (Safety advocates and state lawmakers are decidedly not thrilled.)
And that’s not all. In areas where states do have a say – liability and insurance – General Motors is now trying to immunize the industry from liability, weakening automakers’ financial incentive to ensure these cars are safe. Writes the Associated Press,
California regulators are embracing a General Motors recommendation that would help makers of self-driving cars avoid paying for accidents and other trouble, raising concerns that the proposal will put an unfair burden on vehicle owners.
If adopted, the regulations drafted by the California Department of Motor Vehicles would protect these carmakers from lawsuits in cases where vehicles haven’t been maintained according to manufacturer specifications.
That could open a loophole for automakers to skirt responsibility for accidents, injuries and deaths caused by defective autonomous vehicles, said Armand Feliciano, vice president for the Association of California Insurance Companies. For instance, manufacturers might avoid liability if the tires on self-driving cars are slightly underinflated or even if the oil hasn’t been changed as regularly as manufacturers suggest, he said.
“When is the last time you followed everything that is listed in your car manual?” Feliciano said.…
The section addressing the limits of automakers’ liability adopts much of the wording proposed in an April 24 letter to the DMV from Paul Hemmersbaugh, formerly chief counsel for the National Highway Traffic Safety Administration and now chief counsel for the General Motors division overseeing self-driving cars.
Consumer Watchdog, an activist group frequently critical of business interests, believes Hemmersbaugh plied the connections he made at the California DMV while working at the National Highway Traffic Safety Administration to insert the clause that could make it easier for self-driving carmakers to avoid liability.
“It is the result of the ongoing and troubling federal revolving door between the National Highway Traffic Safety Administration and the auto industry,” Consumer Watchdog officials wrote in a letter sent Tuesday to the DMV and the head of the transportation agency that oversees it.
And then there’s hacking and other kinds of mischief. Writes one industry observer,
Why would anyone want to hack a self-driving car, knowing that it could result in a death? One reason is that widespread deployment of autonomous vehicles is going to result in a lot of unemployed people, and some of them are going to be angry.
In August 2016, Ford CEO Mark Fields said that he planned to have fully autonomous vehicles operating as urban taxis by 2021. Google, Nissan, and others planned to have similar autonomous cars on the roads as soon as 2020. Those automated taxis or delivery vehicles could be vulnerable to being maliciously dazzled with a high-power laser pointer by an out-of-work Teamster, a former Uber driver who still has car payments to make, or just a pack of bored teenagers.
Lots of angry people with nothing but time on their hands. Car manufacturers – the actual drivers of these cars (with a history of “mistakes, coverups and illegalities that killed or injured millions of people”) – trying to skirt state licensing and liability rules.