It is half past nine at night in San Francisco. A pedestrian crosses a zebra crossing against a red light in one of the busiest areas of the city and is struck by a hit-and-run driver. The impact throws her into the adjacent lane, where a vehicle from Cruise, one of the two driverless taxi services operating in San Francisco, is traveling. The autonomous car brakes hard but cannot avoid her and runs over her. According to the authorities’ report, the robotaxi then attempts a “turning maneuver while the pedestrian was under the vehicle”, dragging the woman beneath it for about six meters.
The accident and the subsequent behavior of Cruise’s vehicle have led to the immediate suspension of the 400 licenses the company held to operate driverless taxis in San Francisco. California transportation authorities accuse the company (owned by General Motors, with additional investment from Honda and Microsoft) of having “falsified information related to the safety of the autonomous technology of its vehicles” and of hiding the video of the accident.
The decision forces Cruise, indefinitely, to go back to having a driver in its vehicles who can take over in an emergency. “They are not safe for the public,” the transportation authorities say in a statement. The accident occurred on October 2 and was not fatal, although no further details about the victim’s condition have been released.
The veto comes just a couple of months after both Cruise and Waymo (owned by Google) were granted permission to operate around the clock, increase their fleets and cover a larger area of San Francisco. The approval was seen as an institutional boost to the service and drew complaints from the city’s Police and Fire Departments, which argued that the robotaxis “are not ready”, as well as from several neighborhood associations opposed to the tests these vehicles carry out in the city and to the traffic problems they sometimes cause.
😬 @Cruise self-driving operations had a complete meltdown earlier in North Beach. We overheard on the scanner that all Cruise vehicle agents were tied up at the time (not literally) and so North Beach was going to get a delayed response. But wow, WTF! pic.twitter.com/D89xrSxAdu
— FriscoLive415 (@friscolive415) August 12, 2023
Although both Cruise and Waymo have presented figures showing that their cars travel twice as many kilometers without an accident as human drivers (though it is the former that has accumulated most of the controversies), the spotlight remains on their behavior on the road. The inhabitants of the mecca of digital technology are used to seeing situations of chaos caused by these vehicles go viral on social networks.
Thus, just a few days after the expansion of operations, a dozen Cruise robotaxis stopped on a San Francisco street, blocking it for no apparent reason. Another got stuck after driving into fresh cement (the vehicles have shown recurring problems detecting construction sites and police perimeters), and another collided with a fire truck (the presence of emergency vehicles is another situation they struggle to handle).
A questioned maneuver
Cruise has published a statement in which it expresses regret over the accident but stresses that it stemmed from “an ongoing hit-and-run crime by a human driver against another road user,” not from its vehicles. The company has also revealed that the maneuver in which the car dragged the victim was a consequence of the safety protocol agreed upon with the authorities.
“After a collision, Cruise’s automated vehicles are designed to perform a maneuver that minimizes safety risks to the extent possible within the driving context. This is called ‘achieving a minimum risk condition’ and is required by California regulations and encouraged by federal autonomous vehicle guidance. The specific maneuver, such as stopping immediately, pulling out of the traffic lane, or clearing an intersection, depends largely on the driving context and the capabilities of Cruise’s vehicle at that moment,” the company says of the robotaxi’s movement after the accident.
Cruise affirms that if both cars involved in the incident had been its robotaxis, the accident would not have occurred. “The autonomous vehicle would have detected and avoided the pedestrian, and the pedestrian could have continued on their way. We wish this had been the case,” the company says. It has published a video recreating the situation in those terms.
“We also found that in the actual scenario, the autonomous vehicle responded by steering its trajectory away from the person 460 milliseconds faster than most human drivers, and braked hard to minimize the impact,” Cruise adds, denying that it withheld information from road safety authorities during the investigation of the events.
Zero margin for error
The suspension of Cruise’s licenses comes at a time when unrest among San Francisco residents over accidents and chaos caused by these vehicles was reaching its peak, even though such incidents occur less often than with human drivers. An official page of the California government records them all.
“Waymo and Cruise have driven a combined total of nearly 13 million km without a driver, including more than 6.4 million km in San Francisco since the beginning of 2023. And because California law requires autonomous vehicle companies to report every major accident, we know a lot about their performance,” explains Julián Estévez, professor of Robotics and Artificial Intelligence at the University of the Basque Country. “In total, the two companies have reported just over 100 accidents involving driverless vehicles. It may seem like a lot, but they occurred over about 10 million km of driving. That equates to one accident every 100,000 km, which is equivalent to about five years of driving for a typical human motorist,” he adds.
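As a rough sanity check of those figures (taking as an assumption, not a figure given in the article, that a typical motorist covers on the order of 20,000 km per year):

\[
\frac{10{,}000{,}000\ \text{km}}{100\ \text{accidents}} = 100{,}000\ \text{km per accident}, \qquad \frac{100{,}000\ \text{km}}{20{,}000\ \text{km/year}} = 5\ \text{years of typical driving}.
\]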
The professor stresses that in many of these accidents, responsibility has also fallen on human drivers. “We are demanding a technology with zero accidents. I think accidents and failures are part of progress. The same thing happened in the history of the conquest of aviation, or in the mastery of electricity,” he explains.
Whether these accidents will prevent the technology from spreading around the world remains an open question. For Estévez, it will depend on the regulation and infrastructure in place to absorb autonomous cars on the roads. In any case, he warns that the Silicon Valley experience may not translate as smoothly to other places. “It is easier for autonomous vehicles to succeed in San Francisco than in another city, since experiments have been conducted there for almost two decades and it is a well-known environment. For it to work as effectively in other environments may not be easy,” he concludes.