With a new state permit in hand, the autonomous car company Waymo will soon be kicking off its training wheels — i.e. its human drivers. The company’s self-driving vehicles are now cleared to travel with no humans behind the wheel on the streets of Palo Alto, Mountain View and nearby cities in north Santa Clara County.

Waymo’s new permit from the California Department of Motor Vehicles, which the Voice obtained through a public records request, indicates that the company is largely being allowed to set its own guidelines and protocols for driverless testing. The permit document, which appears to be authored entirely by Waymo, provides little insight into how DMV regulators interpreted and scrutinized the company’s driverless testing program.

Who authored the permit?

The permit granted by the DMV matches Waymo’s application filed six months earlier in April almost letter for letter, with one major exception. Originally, Waymo requested permission to test drive up to 59 Chrysler Pacifica minivans, but its final certified permit reduced that number to 39. Exactly why that reduction was made remains a mystery — DMV officials said it was Waymo’s decision, and the company, a division of Google’s parent company Alphabet, did not respond to questions submitted by the Voice.

DMV officials acknowledged that Waymo designed many facets of its driverless testing protocol, but they gave assurances that the company’s proposal satisfied all the regulatory requirements. DMV regulators rigorously analyzed Waymo’s application and pressed the company to clarify elements of its vehicles’ operational design and interaction plan for law enforcement, said Martin Greenstein, DMV spokesman.

“A lot of this is going to be authored by Waymo,” he said. “This is a big step obviously for driverless testing, so we took time to thoroughly review the application. We’re not going to issue a permit until they’ve met our requirements.”

Consumer advocate concerns

But while this marks a new milestone, consumer advocates are raising alarms that there are plenty of unanswered questions about the technology and the regulations underpinning self-driving cars. Many details regarding Waymo’s system for human safety monitors and vehicle certification are glossed over in the DMV permit, they note.

In some ways, human drivers face tougher scrutiny of their driving safety than Waymo’s software-piloted vehicles, said John Simpson, an advocate with the nonprofit Consumer Watchdog. For years, Simpson’s organization has expressed skepticism about the promises of self-driving technology, warning that the industry and regulators were racing ahead without adequate safeguards in place. A person seeking a driver’s license has to undergo a written exam and a test behind the wheel. But in Waymo’s case, Simpson alleges the DMV is largely trusting the company to vouch for its own safety.

“There’s a fundamental problem with the DMV’s approach here,” Simpson said. “The industry is rushing way too fast beyond the safety of where we are.”

In particular, Simpson singles out Waymo’s proposed certification system, approved by the DMV, for testing the safety and driving capability of its vehicles. The permit does not explain what this testing will entail. Under the permit, Waymo is allowed to handle this certification internally without outside review — a level of discretion the DMV wouldn’t give human drivers, he points out.

Each Waymo vehicle is equipped with sensors and software technology that the company has been developing for nearly a decade. Under the DMV rules, Waymo is also required to have human monitors who keep an eye on the self-driving vehicles remotely from computer terminals. The company’s permit calls for two separate teams that will continuously monitor the fleet and check on each vehicle’s diagnostics and driving.

Simpson notes that Waymo’s permit does not disclose how many vehicles each human monitor will be tracking simultaneously. The DMV regulations do not set a limit on this.

In its original application filed in April, Waymo said it would insure each of its vehicles for up to $10 million in personal injury claims. It is unclear if this coverage remains intact — the final permit disclosed by the DMV has most information on insurance coverage redacted. At a minimum, self-driving car companies are required to cover each vehicle for $5 million in potential liability.

While this is a higher standard than human drivers face, Simpson warns that the coverage is still insufficient, since a major multi-vehicle crash with injuries could quickly deplete it.

Reporting crashes

Waymo is required to report any vehicle crashes involving self-driving cars to the DMV within 10 days. If safety problems with the technology emerge, the DMV has the authority to suspend or revoke Waymo’s permit, Greenstein said. He said that would be unlikely — Waymo has already test-driven its vehicles over millions of miles with humans on board, and safety hazards haven’t surfaced so far. In the vast majority of cases, crashes involving Waymo vehicles have been the fault of human drivers in other vehicles, he said.

According to California Secretary of State filings, Google and Waymo paid $62,000 to the Sacramento-based firm KP Public Affairs to lobby the DMV and other agencies while its application for driverless testing was under review. The exact nature of this lobbying activity is not clear from the public filings.

Asked about this, Greenstein gave assurances that the DMV autonomous-vehicle team was never in contact with any lobbyists.

Dealing with emergency conditions

Under the DMV rules, Waymo is required to establish an interaction plan with law enforcement, laying out how police or firefighters can stop or disable a self-driving vehicle in emergency situations. If such an incident occurs, law enforcement officers are asked to call a 24-hour hotline (which was dead when the Voice called) and then provide Waymo’s support team with the self-driving car’s company ID number and license plate number.

Waymo vehicles are reportedly able to detect police and emergency vehicles automatically, especially if they have their sirens or flashing lights turned on. If that happens, Waymo cars are programmed to pull over and stop at the first available spot.

If a self-driving car gets in an accident, it is designed to immediately stop and contact Waymo’s response team. The company’s responders may call 911 or send out its own support team, depending on the circumstances.

If for some reason a car’s self-driving system goes haywire, emergency officials can disable it by opening one of its doors, putting the vehicle in park or setting the emergency brake.

The law enforcement interaction plan also contains instructions to completely shut down a vehicle by severing its electric power, but it is not clear when that would be necessary.

Waymo officials met briefly with the Mountain View Police Department last week, but the company is still expected to hold a training session to teach police officers how to interact with self-driving vehicles. Once that is done, Waymo will be allowed to begin deploying its driverless cars.

Mountain View police officials are not planning any special restrictions or enforcement measures for Waymo vehicles at this time, said spokeswoman Katie Nelson.

“I can say that in our conversation with them, it is very clear that any testing conducted will be done with a very measured approach with plenty of training time so that (police officers) are ready and comfortable with the technology,” Nelson said.

Waymo has announced public forums to discuss its self-driving cars and its planned rollout. One is scheduled in Palo Alto from 6 to 8 p.m. on Tuesday, Nov. 27, at the Cubberley Community Center’s Room H-1, 4000 Middlefield Road. Another forum will be held in Mountain View from 6 to 8 p.m. on Thursday, Dec. 13, at the Historic Adobe Building at 157 Moffett Blvd.

Mark Noack writes for the Mountain View Voice, a sister publication of The Almanac.
