Waymo Self-Driving Errors Raise Safety Flags, Prompting Software Recalls

Waymo’s recent self-driving errors have pushed the autonomous-vehicle industry back under the microscope: a pair of high-profile crashes and two separate software recalls have prompted renewed questions about the readiness of robotaxis on public roads. The tension is unmistakable. Waymo cites millions of safe autonomous miles and insists its fleet is statistically safer than human drivers, yet edge-case failures continue to fuel regulatory scrutiny and public hesitation.

This article breaks down what happened, why regulators intervened, and what these recalls signal for the future of autonomous driving—especially as major cities weigh expansion or restriction of AV services.

What Triggered the Recall: The Key Incidents

The December 2023 Tow-Truck Collisions in Phoenix

In December 2023, two driverless Waymo vehicles collided, minutes apart, with the same tow truck, which was hauling a pickup truck angled across multiple lanes. Both incidents were low-speed and caused no injuries, but they exposed a blind spot in how the Automated Driving System (ADS) perceived and predicted the movement of “unusual” towed vehicles.

Coverage from Reuters and CNN confirmed that the unusual geometry of the towed pickup truck confused the system’s path-planning model, leading to nearly identical collisions.

The May 2024 Utility-Pole Crash

In May 2024, Waymo issued another voluntary recall after a fully driverless robotaxi struck a wooden utility pole in Phoenix. According to incident reporting covered by Reuters, the crash pointed to limitations in detecting narrow, fixed roadside objects—objects that can be difficult for an AV’s perception stack, but trivial for human drivers.

These incidents demonstrated the core challenge of autonomous driving: AVs excel in routine environments but struggle with rare, irregular situations that require high-level reasoning and unconventional pattern recognition.

The Official Recalls and NHTSA’s Involvement

The First Recall: NHTSA Campaign 24E-013

Following the tow-truck collisions, Waymo filed an official safety recall with the National Highway Traffic Safety Administration. The recall report—available on NHTSA’s public portal (Campaign 24E-013)—covered 444 vehicles running its 5th-generation ADS software.

The defect description stated that the ADS could mis-predict the movement of certain towed vehicles, creating a risk of improper lane selection.

The Second Recall: Barrier- and Pole-Detection Issues

After the utility-pole crash, Waymo issued another voluntary recall affecting 672 vehicles. This second recall focused on how the system detected narrow roadside objects, including poles, gates, and similar fixed structures.

Reuters and local news coverage confirmed that this recall was part of a larger set of more than 1,200 vehicles Waymo updated to address barrier- and gate-detection issues—something later acknowledged in NHTSA investigation summaries.

Why These Errors Raise Significant Safety Flags

Edge-Case Failures Are AV Technology’s Greatest Vulnerability

Waymo’s robotaxis perform exceptionally well in routine driving. However, irregular scenarios—angled loads, odd vehicle orientations, narrow roadside poles—are where AV systems show their weaknesses.

These are precisely the kinds of mistakes human drivers rarely make.

Fleet-Wide Software Risks: The Core Regulatory Concern

The fact that two vehicles crashed into the same object minutes apart illustrates a risk unique to AV fleets: a single software flaw can instantly scale across hundreds of vehicles.

That possibility is why agencies and safety advocates call for more transparency in:

  • Software versioning

  • Edge-case testing

  • Disengagement and remote-intervention logs

  • Operational design domain (ODD) limitations

Public Trust Focuses on Outliers, Not Averages

Despite strong safety statistics, public perception remains skeptical. Surveys from the Pew Research Center show most Americans still wouldn’t ride in a driverless car.

The reasoning is straightforward: isolated failures feel more memorable, and more alarming, than aggregate safety improvements.

[Image: A white Jaguar I-PACE equipped with rooftop sensors parked on a busy city street as a woman approaches its open rear door.]

Waymo’s Response: “Safer Than Human Drivers”

Waymo consistently cites internal safety analyses and its Safety Impact Reports to argue its fleet performs better than human drivers on:

  • Police-reported crashes

  • Injury-causing collisions

  • Predictable, routine driving routes

These claims are backed by data from millions of autonomous miles logged in Phoenix, San Francisco, and Los Angeles.

But the disconnect between statistical safety and perceived safety remains a decisive barrier. And each recall widens that perception gap, even if systemwide performance remains strong.

How a Self-Driving Recall Works

Fleet-Owned AVs = Software Recall, Not Traditional Recall

Because Waymo operates its own vehicles rather than selling them to consumers:

  • There is no dealership appointment.

  • Updates are pushed over the air or applied at fleet depots.

  • The recall identifies a software version, not physical components.

NHTSA’s Compliance Framework

Under federal law, once a defect is identified, companies must:

  • Notify NHTSA

  • Identify the affected fleet population

  • Deploy a fix

  • Report completion percentages

In the AV space, this can also mean modifying the vehicles’ ODD—reducing speeds, geofencing areas, or restricting certain maneuvers until validation is complete.

What This Means for Riders and Everyday Road Users

Short-Term Service Adjustments

Following high-profile incidents, riders may notice:

  • Reduced service zones

  • Temporary pauses

  • Re-routed trips

  • Increased presence of test or safety-driver vehicles

Policy and Permit Implications

Cities hosting robotaxi operations—Phoenix, San Francisco, Los Angeles—are evaluating:

  • More detailed incident reporting

  • Requirements for remote-operator disclosures

  • Fleet caps

  • Mandatory cool-down periods after incidents

  • Construction-zone training standards

AV recalls are accelerating these policy discussions in California, Arizona, Texas, and Nevada.

Final Analysis: Why These Recalls Matter Beyond Waymo

Waymo’s recall illustrates both the strengths and the vulnerabilities of autonomous fleets:

  • Strength: Software fixes can be deployed across hundreds of vehicles instantly.

  • Vulnerability: A single defect can replicate across an entire fleet just as quickly.

Autonomous driving will continue advancing—but these incidents highlight the importance of transparent reporting, public trust, and rigorous edge-case testing before true nationwide adoption.

Where MRS Fits In: Supporting Fleets, Dealerships, and Multi-State Vehicle Operations

While companies like Waymo handle cutting-edge automation, traditional fleets, dealerships, and vehicle operators still rely on human-driven compliance. That’s where Montana Registration Services (MRS) supports the industry.

MRS works with:

  • Fleet operators managing vehicles across multiple jurisdictions

  • Dealerships processing large volumes of titles and registrations

  • Brokers and registration agents coordinating complex multi-state operations

With same-day processing, error-free documentation, and deep regulatory expertise, MRS ensures businesses stay compliant—even as technology, transportation laws, and safety expectations continue to evolve.

Learn more at Montana Registration Services.