Crashes invite tighter rules for self-driving cars
Consumer advocates, politicians call for more stringent regulations.
By Carolyn Said, San Francisco Chronicle
SAN FRANCISCO – U.S. lawmakers have applied a light touch in regulating robot cars.
At the national level, the Trump administration has proclaimed that driverless-car guidelines should be "entirely voluntary" for automakers, and bills pending in Congress would clear the way to putting tens of thousands of autonomous cars on the road — orders of magnitude more than the few hundred in the country today — before federal safety regulations are set. Forty states have passed various rules, with California's the best defined, but the pending federal bills would void states' ability to regulate driverless cars.
All that may change soon.
The first pedestrian fatality involving a self-driving car, an Uber SUV that hit and killed Elaine Herzberg in Tempe, Ariz., in March, has focused attention on the nascent industry and its safety rules — or lack thereof.
"The one positive thing that may come from Ms. Herzberg's death is that regulators at all levels will start to ask the questions they should have asked before [automated vehicles] were tested in public," said Jim McPherson, a Bay Area attorney who runs SafeSelfDrive to consult on driverless cars.
Consumer advocates have long warned that lax regulation amounts to playing fast and loose with public safety.
"It's crazy that we're letting these things on the road right now, using you and me as human guinea pigs, and letting companies use public roads as private laboratories," said John Simpson from Consumer Watchdog, which has called for a nationwide moratorium on public autonomous testing until there's a report on the Arizona crash. "We're getting too far ahead of ourselves."
The industry counters that self-driving cars — which don't text, drink or get distracted — could end the nation's 40,000 annual traffic fatalities, making it a moral imperative to get them on the roads sooner rather than later.
Tesla, which said that its Autopilot system was engaged in a deadly Silicon Valley crash on Hwy. 101 in March, struck a similar tone in defending the driver-assistance technology. The company said drivers using Autopilot were about a quarter as likely to have an accident as those driving without it. Though driver-assistance features aren't the same as driverless tech, incidents involving them may also shape public opinion.
A Rand Corp. study found that widespread use of autonomous cars before they're perfected would save lives on balance, even though the cars would still cause some crashes, injuries and fatalities.
Testing autonomy on public roads doesn't mean using the public as lab rats, the argument goes. It is a way to bring this potentially life-saving technology to the public more quickly, the industry says.
"It's a Catch-22," said Stephen Beck, founder of management consulting firm CG42. "For the technology to grow, learn and get better, you have to put it in real-world situations. There's only so far you can go in testing environments" such as closed courses and simulations.
Uber suspended its autonomous-car testing program after the crash, in which a backup driver was at the wheel but appeared to be looking down just before impact. Most other developers have continued trials on public roads, generally with backup drivers.
In Washington, lawmakers expressed concerns after the fatality, raising the prospect of stricter oversight.
"This tragic incident makes clear that autonomous vehicle technology has a long way to go before it is truly safe for the passengers, pedestrians and drivers who share America's roads," Sen. Richard Blumenthal, D-Conn., said in a statement.
The Uber crash "underscores the need to adopt laws and policies tailored for self-driving vehicles," Sen. John Thune, R-S.D., co-sponsor of a pending bill in the Senate, said in a statement. "Congress should act to update rules, direct manufacturers to address safety requirements and enhance the technical expertise of regulators."
With human-driven cars, the federal government regulates vehicle design and safety, while states regulate vehicle operations and operators — issuing car registrations and driver's licenses, for instance.
But autonomous vehicles contain both the car and the driver in one robot package. "That blurs the lines between those state-federal responsibilities," said Bryant Walker Smith, a law professor at the University of South Carolina who studies autonomous-car rule-making.
The problem, he and others said, is that the pending federal bills would quash states' ability to address autonomous-vehicle safety concerns — even though it could take two years for the National Highway Traffic Safety Administration (NHTSA) to hammer out federal safety standards for autonomous vehicles.
Both a bill passed by the House and one pending in the Senate "are insufficiently protective of safety," said Sarah Light, an assistant professor of legal studies and business ethics at the Wharton School, who studies the subject. "They would immediately preempt state safety rules before there are federal safety standards for autonomous driving systems. That would create a safety gap."
Smith thinks someone should be required to attest that the cars are safe, whether it's a company, a fleet owner or a manufacturer.
His thoughts on how to beef up federal regulations: Establish mandatory and substantial safety-evaluation reports and make them publicly available. Give the NHTSA more resources to assess manufacturers' claims and authority to act on those reports — pressing for more information, preventing companies from deploying if their claims aren't credible, and enabling the creation of tests and standards for autonomous vehicles. And don't stop states from crafting their own rules, at least for now.
"If states want to exercise more authority, let them," he said. "If one state doesn't want autonomy, let them. That may be worse for their citizens, but people don't want to feel this is being forced on them."
Simpson from Consumer Watchdog has some simple advice for regulators.
"You have to pass an eye exam and a driving test before you can get a driver's license," he said. "We should have analogous requirements for autonomous vehicles that they can sense and differentiate objects. Can they distinguish between overhanging tree branches and a human waving their arms, for instance?"
California, the epicenter of self-driving car testing thanks to intense interest among Silicon Valley companies, has crafted more detailed rules than any other state. These include requirements that all autonomous cars register with the state, and that companies report on crashes and on how often humans must take over from the machine.
The Golden State has recently added rules that allow carmakers to apply to test cars with no drivers — such testing could start later this spring — and eventually to carry paying passengers. Companies must certify that their cars are ready, and promise that remote operators are available to steer the vehicles around obstacles such as construction.
Other states, such as Arizona and Florida, already allow no-driver cars and have few, if any, rules for them.
"California's rules are more comprehensive and more protective of safety than the federal rules," Light said. "It should be allowed to say how it wants to protect its citizens."
The U.S. has long trusted automakers to guarantee their cars' safety, and it hasn't always ended well.
"Transportation safety regulations and the rule-making process in the U.S. is very reactive, slow and usually, unfortunately, written in blood of past victims," said Najmedin Meshkati, a professor of civil engineering at USC.