Uber has quickly and confidentially compensated the family of the pedestrian killed by its robot car in Tempe, Arizona, earlier this month - before a lawsuit was ever filed. Not surprising. The last thing Uber needs right now is for this family to sue or to talk about what happened.
But what Uber cannot silence are the disturbing questions raised by this crash. This might have been the first innocent pedestrian death, but it clearly won’t be the last. As Consumer Watchdog recently noted in comments to the National Highway Traffic Safety Administration (NHTSA):
A text exchange between former Uber CEO Travis Kalanick and then-lead robot car developer Anthony Levandowski, revealed in the recent Waymo-Uber lawsuit, makes clear the corporation’s lack of concern for public safety. “I just see this as a race and we need to win, second place is first loser” read one text from Levandowski in March 2016. “We do need to think through the strategy to take all the shortcuts we can find,” said another from the engineer on the same day.
Clearly, Uber isn’t unique here. What most people don’t realize is that these cars don’t come out of the factory knowing how to manage roads. They no more know how to do that than would a 16-year-old before a learner’s permit. But at least teenagers have lived in the world and have brains. On top of all the obvious safety problems that come with automation and computer technology (massive malware attacks, anyone?), these vehicles have to “learn to think like humans.” As one expert put it,
Self-driving cars must be taught to understand not only what the surroundings are but the context: A car approaching from the front is not a danger if it’s in the other lane, but if it’s in the car’s own lane circumstances are entirely different. Car designers should test vehicles based on how well they perform difficult tasks, like parking in a crowded lot or changing lanes in a work zone. This may sound a lot like giving a human a driving test – and that’s exactly what it should be, if self-driving cars and people are to coexist safely on the roads.
Except that I don’t know of any driver’s ed course that treats members of the public like guinea pigs, teaching students not to hit pedestrians by having them hit one so they won’t do it again. And as auto and tech companies race to be first on the road with these vehicles, safety isn’t the highest priority for any of them. I’ll tell you what their highest priority is: secrecy, especially when something goes wrong.
We recently wrote about this issue in the context of sexual harassment claims. There are two distinct problems.
First are non-disclosure clauses and the related issue of confidential settlements, like the one Uber just forced onto the family of the pedestrian it killed. Such settlements allow information about the causes of crashes to be covered up. Second are forced arbitration clauses. Companies like Uber currently use these clauses to keep all kinds of disputes out of court and funnel them into secretive, rigged proceedings that the company controls, keeping information hidden from regulators and the public. As long as forced arbitration clauses are considered legal, the at-fault company can decide whether to invoke them in an individual case. There may be debate about the wording and breadth of a particular clause, and about whether it covers a personal injury or wrongful death claim. In the Tempe case, the family settled before a lawsuit was filed, so we don’t know how Uber would have responded to an actual suit. But as frightening crashes multiply and more innocent people are hurt or killed, forced arbitration is bound to come up.
Congress needs to do something. Currently, some members of the U.S. Senate have placed a hold on that chamber’s robot car bill, called the AV Start Act, because of certain safety concerns. In a letter to the leaders of the Senate Commerce Committee, five Democratic members of that committee called for better safety standards in the bill, including a provision that federal regulators have “clear direction to gather and analyze data on the deployment of these vehicles, including their incident data.” As Consumer Watchdog explained, more and better data are critically needed before such cars are further tested, and those data must be collected nationally and released publicly (as is now done, at least in part, only in California). Consumer Watchdog wrote (footnotes omitted):
[T]he tragic [Uber-caused] death should be no surprise to anyone who read the data released recently by the California Department of Motor Vehicles foretelling of the fatal risks posed by robot cars.
The data in these reports proves that this is the first of many human fatalities if corporations continue to have the privilege to drive their underdeveloped and unregulated robot cars on public roads. The reports, which were released by twenty companies to the California DMV and are the only publicly available data about the state of robot car technology, show that so-called self-driving cars cannot go more than 5,596 miles in the best-case scenario without a human driver taking over the wheel. In most cases, the vehicles cannot travel more than a few hundred miles without needing human intervention. The recent fatal incident on March 18, 2018 that took the life of an innocent pedestrian confirms the information revealed in this data.
As to the issue of forced arbitration, the letter from five Senators unfortunately did not mention it, even once. Yet it’s just as critical from a public safety standpoint. As CNN wrote in a recent article:
Critics are concerned that the bill, known as the AV Start Act, does not prohibit forced arbitration between autonomous vehicle manufacturers and consumers. …
Arbitration proceedings are also private, so use of them would mean the public is more likely to be kept in the dark about flaws in self-driving vehicles.
"The nightmare scenario is that someone is hurt because of a defect and it's dealt with through a confidential arbitration proceeding that nobody knows about, and then more people are hurt because no one found out about it," Ed Walters, who teaches robotics law at Georgetown Law and Cornell Tech, told CNN. "Congress could stick up for the right to sue by prohibiting these kind of clauses, but so far they haven't."
Fortunately, however, Senators are doing at least something. As CNN wrote last week:
In a letter sent Thursday to 60 manufacturers of self-driving technology, including Uber CEO Dara Khosrowshahi, 10 Democratic Senators noted a "potentially glaring omission" in the AV Start Act, a Senate bill under consideration. If passed, the bill will pave the way for broad deployment of self-driving cars on US roads.
Consumer advocates warned this month that the bill fails to prohibit forced arbitration clauses in terms of service agreements between autonomous vehicles manufacturers and the consumers that ride in them. The omission would benefit big tech and car companies at the expense of average Americans, the advocates say.…
In the letter, the Senators ask the companies if they will be requiring passengers in self-driving cars to agree to forced arbitration. Uber customers riding in its human-driven vehicles already agree to an arbitration clause in the company's terms of service.
The Senators express concern about ensuring Americans are not deprived of their rights when injury or death occurs.
The letter was signed by Senators Richard Blumenthal, Patrick Leahy, Christopher Coons, Edward Markey, Kirsten Gillibrand, Richard Durbin, Elizabeth Warren, Sheldon Whitehouse, Mazie Hirono and Catherine Cortez Masto.
Senators had raised concerns about the AV Start Act before. But the letter is the first time the issue of forced arbitration regarding self-driving cars has been publicly addressed.
The public on whom this technology is being thrust has a right to know what a company knows when a crash happens. Secret settlements and non-disclosure clauses undermine that right. So do forced arbitration clauses. When people are injured or die, someone needs to be held accountable, which requires full disclosure of information about what went wrong to every party with a claim. But that information must also be disclosed to regulators, and to the public.
Otherwise, I don’t see how the public will accept this technology, no matter what kind of PR pitch industry groups make.