Waymo recently announced that its fully driverless vehicles in California and Arizona had traveled 1 million miles as of January 2023. To mark the milestone, the Alphabet-owned company pulled back the curtain on some interesting statistics, including the number of crashes and collisions involving its robot cars.
Waymo operates a fleet of driverless cars in Phoenix and the San Francisco Bay Area, some of which carry paying customers. The company also recently started testing its driverless vehicles in Los Angeles.
Over that 1 million miles, Waymo’s vehicles were involved in only two crashes that met the criteria for inclusion in the National Highway Traffic Safety Administration’s crash database, the Crash Investigation Sampling System (CISS). In general, these are crashes that were reported to the police and involved at least one vehicle being towed away. In one of those crashes, Waymo says its vehicle was rear-ended by another vehicle whose driver was looking at their phone while approaching a red light.
Waymo’s vehicles have also been involved in 18 “minor contact events” that did not meet NHTSA’s CISS criteria. These include incidents such as a car backing out of a parking spot and colliding with a stationary Waymo vehicle, or a portable plastic sign stand being blown by the wind into one of the company’s driverless cars.
Waymo says 55 percent of these minor contact events involved another driver colliding with a stationary Waymo vehicle, and 10 percent occurred at night. None of the events took place at intersections, where most vehicle crashes occur, nor did any involve pedestrians, cyclists, or other vulnerable road users.
In fact, Waymo is quick to place the blame on error-prone human drivers. “Every vehicle-to-vehicle event involved one or more road rule violation and/or dangerous behaviors on the part of the human drivers in the other vehicle,” the company says in a blog post. Waymo says it is publicizing these events in the interest of “greater transparency.”
“Far too many people still die or are injured on our roads every year in communities across the country,” Waymo’s chief safety officer Mauricio Peña said in a statement. “This data suggests our fully autonomous driving system, the Waymo Driver, is reducing the likelihood of serious crashes, helping us make progress towards our mission for safer, more accessible mobility for all.”
Improved safety has been one of the main promises of the autonomous vehicle (AV) industry. With over 1 million people dying in auto crashes globally every year, AV operators are increasingly leaning on this safety case to spur lawmakers to pass legislation allowing more fully autonomous vehicles on the road. But while the argument seems convincing on the surface — AVs don’t get drunk or distracted like humans, nor do they speed or break the law — there is scant data proving that fully autonomous vehicles are safer than human drivers.
Waymo frequently discloses certain stats about its driverless vehicles to bolster its message that robot drivers are safer than humans. Previously, the company sought to measure the safety of its AVs by simulating dozens of real-world fatal crashes that took place in Arizona over nearly a decade. The Google spinoff found that replacing either vehicle in a two-car collision with its robot-guided vehicles would have nearly eliminated all deaths. Waymo has also submitted scientific papers comparing autonomous vehicle performance to human driving for peer review and publication.
There’s no standard approach for evaluating AV safety. A recent study by RAND concluded that, in the absence of a standard framework, customers are most likely to trust the government — even though US regulators appear content to let the private sector dictate what’s safe. In this vacuum, Waymo hopes that, by publicizing this data, policymakers, researchers, and even other companies may begin to take on the task of developing a universal framework.