Crash Leads to New Self-Driving Software Recall

By B.N. Frank

Over the years, there have been numerous safety issues and a few deadly accidents associated with autonomous vehicles (AVs) (see 1, 2, 3), particularly Tesla models (see 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13). A report published in May 2022 suggests that AVs may always need a human operator, at least AVs offering public transportation. In June, the National Highway Traffic Safety Administration (NHTSA) released more data confirming safety issues associated with Tesla's Autopilot software. Two fatal accidents last month, in which motorcyclists were struck by Tesla drivers using Autopilot, led to another NHTSA investigation. Not long after, another motorcyclist was killed by a Tesla driver.

Of course, Tesla isn’t the only company that has installed self-driving software in its vehicles.  Recently a GM subsidiary recalled the self-driving software in its vehicles as well.

From Wired:


GM’s Cruise Recalls Self-Driving Software Involved in June Crash

After two people were injured in the incident, Cruise blocked its robot vehicles from making left turns for several weeks before issuing a software update.

Autonomous driving company Cruise and US regulators said today that the General Motors subsidiary had recalled software deployed on 80 vehicles after two people were injured in a June crash involving a Cruise car operating autonomously in San Francisco. The incident occurred one day after the state of California granted Cruise a permit to start a commercial driverless ride-hail service in the state. The flawed software was updated by early July, Cruise said in a filing with the US National Highway Traffic Safety Administration.

The crash occurred when a Cruise vehicle attempting to make an unprotected left turn across a two-lane street was struck by a car that was traveling in the opposite direction and speeding in a turn lane. Cruise said in its NHTSA filing that its software had predicted that the other car would turn right and determined that it was necessary to brake hard in the midst of its own vehicle’s left turn to avoid a front-end collision. But the other vehicle continued straight through the intersection, T-boning the now stationary Cruise car.
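The behavior described in that filing, predicting another vehicle's path and hard-braking when a conflict appears, can be illustrated with a toy sketch. The snippet below is purely hypothetical and is not Cruise's code or any real planner: it uses a constant-velocity prediction for the oncoming car and a simple clearance check against the ego vehicle's planned turn path. The names `Track`, `predict_positions`, and `paths_conflict` are invented for illustration.

```python
# Purely illustrative sketch (not Cruise's software): an AV planning an
# unprotected left turn predicts the oncoming car's path and brakes hard
# if the predicted paths conflict within a short time horizon.
from dataclasses import dataclass

@dataclass
class Track:
    x: float    # position along the road, metres
    y: float    # lateral position, metres
    vx: float   # velocity components, m/s
    vy: float

def predict_positions(track: Track, horizon_s: float, dt: float = 0.1):
    """Constant-velocity prediction: the simplest possible motion model."""
    t = 0.0
    while t <= horizon_s:
        yield (track.x + track.vx * t, track.y + track.vy * t)
        t += dt

def paths_conflict(ego_path, other: Track, horizon_s: float = 3.0,
                   clearance_m: float = 2.0) -> bool:
    """True if any predicted position of the other car comes within
    clearance_m of any waypoint on the ego vehicle's planned turn path."""
    other_pts = list(predict_positions(other, horizon_s))
    for ex, ey in ego_path:
        for ox, oy in other_pts:
            if (ex - ox) ** 2 + (ey - oy) ** 2 < clearance_m ** 2:
                return True
    return False

# Ego vehicle's planned left-turn path across the opposing lane (waypoints).
ego_turn_path = [(0.0, 0.0), (2.0, 1.0), (4.0, 3.0), (5.0, 6.0)]

# Oncoming car: the planner expected it to turn right, but here we model it
# continuing straight at speed, which is what happened in the June crash.
oncoming = Track(x=30.0, y=2.0, vx=-15.0, vy=0.0)

if paths_conflict(ego_turn_path, oncoming):
    print("Conflict predicted: brake hard and yield before completing the turn")
else:
    print("No conflict predicted: proceed with the left turn")
```

In this toy model the difficulty Cruise described is visible: if the prediction wrongly assumes the other car will turn, the planner may stop mid-intersection, leaving the ego vehicle stationary in the oncoming car's path.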

At least one person in the speeding vehicle and one Cruise employee riding in the autonomous vehicle were treated for injuries, according to a report that Cruise submitted to the California Department of Motor Vehicles in June. Cruise responded to the incident by putting its robot cars on a tighter leash until their software was updated. The company reduced the area of San Francisco the vehicles operated in and barred them from making left turns altogether.

Cruise said in its NHTSA filing that the software update improves its self-driving software’s predictions, especially in situations like the one that led to the crash. The company said it has determined that if the vehicle involved in the June 3 incident had been running the current software, no crash would have occurred.

The recall is just the NHTSA’s second to involve fully self-driving software. In March, the self-driving developer Pony.ai recalled three self-driving vehicles after it found that a software error caused the system to shut down unexpectedly while its vehicles were in motion. The company said all affected vehicles were repaired. The increasing amount of software in vehicles means that more vehicle recalls—even among human-driven cars—can be accomplished through over-the-air updates.

In a written statement on the Cruise recall, NHTSA head Steven Cliff said the agency continues to investigate crashes involving self-driving vehicles and will “ensure that vehicle manufacturers and developers prioritize the safety of pedestrians, bicyclists, and other vulnerable road users.” Cruise met with NHTSA officials multiple times to discuss the crash, according to the recall filing.

Cruise spokesperson Hannah Lindow said in a written statement that the software issue has been resolved. “Cruise AVs are even better equipped to prevent this singular, exceptional event,” Lindow wrote. Right now, Cruise’s service operates in 70 percent of the city between 10 pm and 6 am, except during rain or fog. Interested riders must apply to use the service. The robots can make left turns again.

Two other companies, Alphabet subsidiary Waymo and the robotic delivery company Nuro, have also received permits to deploy commercial self-driving services in California. In San Francisco, Waymo offers select members of the public paid rides with an employee in the car to monitor its technology.

One person who used Cruise’s autonomous ride-hail service in San Francisco told WIRED this summer that his taxi trip took less-than-direct routes around the city, presumably to avoid the busiest streets. In a recent interview with WIRED, General Motors president Mark Reuss said, “I’ll take a complaint like, ‘It took me a little longer to get there than it may have under other circumstances, but it took the safest route, and it managed the interfaces with human drivers better than what I thought.’”

Cruise’s San Francisco operations have had a troublesome few months. According to internal messages previously reported on by WIRED, the self-driving vehicle developer suffered several incidents in which the company lost touch with its driverless vehicles on San Francisco roads, requiring Cruise employees to fetch and sometimes tow them back to the company’s garages.

In a few instances, the stopped vehicles jammed up city streets. In May, a Cruise employee sent an anonymous letter to California’s Public Utilities Commission outlining what they alleged to be unsafe practices within the company. The regulator is investigating the allegations.

After the June crash, Jeff Bleich, chief legal officer at Cruise, encouraged employees to stay focused on their jobs and warned that crashes would likely increase in frequency as the company scaled up its self-driving services, according to a recording reviewed by WIRED. “We just have to understand that at some point this is now going to be a part of the work that we do, and that means staying focused on the work ahead,” he said. Cruise says its safety record is monitored by several government regulators, and that its record “speaks for itself.”


Experts have also issued other warnings about AVs:

  • Deployment could increase pollution rather than decrease it
  • AV technology emits biologically and environmentally harmful electromagnetic radiation (see 1, 2, 3, 4)

AVs sound less appealing every day.

Activist Post reports regularly about autonomous vehicles (AVs), electric vehicles (EVs), and unsafe technology. For more information, visit our archives.