Tesla’s own Robotaxi data confirms crash rate 3x worse than humans even with monitor


Tesla’s nascent robotaxi program is off to a rough start. New NHTSA crash data, combined with Tesla’s new disclosure of robotaxi mileage, reveals that Tesla’s autonomous vehicles are crashing at a rate much higher than human drivers, and that’s with a safety monitor in every car.

The data

According to NHTSA’s Standing General Order crash reports, Tesla has reported 9 crashes involving its robotaxi fleet in Austin, Texas between July and November 2025:

  • November 2025: Right turn collision
  • October 2025: Incident at 18 mph
  • September 2025: Hit an animal at 27 mph
  • September 2025: Collision with cyclist
  • September 2025: Rear collision while backing (6 mph)
  • September 2025: Hit a fixed object in parking lot
  • July 2025: Collision with SUV in construction zone
  • July 2025: Hit fixed object, causing minor injury (8 mph)
  • July 2025: Right turn collision with SUV

According to a chart in Tesla’s Q4 2025 earnings report showing cumulative robotaxi miles, the fleet has traveled approximately 500,000 miles as of November 2025. That works out to roughly one crash every 55,000 miles.

For comparison, human drivers in the United States average approximately one police-reported crash every 500,000 miles, according to NHTSA data.

That means Tesla’s robotaxis are crashing at a rate 9 times higher than the average human driver.

However, that figure doesn’t include crashes that go unreported to police. Factoring in an estimate of those, human drivers average closer to one crash every 200,000 miles, which is still far better than Tesla’s robotaxi record in Austin.
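The arithmetic behind these comparisons can be checked directly. This is a back-of-the-envelope sketch using only the figures cited in this article (9 crashes, ~500,000 fleet miles, and NHTSA’s human baselines); the exact mileage and the unreported-crash estimate are approximations.

```python
# Crash-rate comparison using the figures cited in the article.
tesla_crashes = 9        # NHTSA Standing General Order reports, Austin, Jul-Nov 2025
tesla_miles = 500_000    # approximate cumulative robotaxi miles (Tesla Q4 2025 chart)

# Human baselines cited in the article (both approximate)
human_miles_per_police_reported_crash = 500_000
human_miles_per_crash_incl_unreported = 200_000  # estimate including unreported crashes

tesla_miles_per_crash = tesla_miles / tesla_crashes  # roughly 55,600 miles

ratio_vs_police_reported = human_miles_per_police_reported_crash / tesla_miles_per_crash
ratio_vs_all_crashes = human_miles_per_crash_incl_unreported / tesla_miles_per_crash

print(f"Tesla robotaxi: one crash every {tesla_miles_per_crash:,.0f} miles")
print(f"{ratio_vs_police_reported:.1f}x worse than the police-reported human rate")
print(f"{ratio_vs_all_crashes:.1f}x worse than the all-crash human estimate")
```

Run as written, this yields roughly a 9x gap against the police-reported human rate and about 3.6x against the all-crash estimate, matching the figures in the text.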

The safety monitor problem

Here’s what makes this data particularly damning: every Tesla robotaxi in the reported mileage had a safety monitor in the vehicle who can intervene at any moment.

These aren’t fully autonomous vehicles operating without backup. There’s a human sitting in the car whose entire job is to prevent crashes. And yet Tesla’s crash rate is still nearly an order of magnitude worse than regular human drivers operating alone.

Waymo, by comparison, operates a fully driverless fleet with no safety monitor and no human backup, and reports significantly better safety numbers. Waymo has logged over 125 million autonomous miles and maintains a crash rate well below human averages.

The transparency gap

Perhaps more troubling than the crash rate is Tesla’s complete lack of transparency about what happened.

Every single Tesla crash narrative in the NHTSA database is redacted with the same phrase: “[REDACTED, MAY CONTAIN CONFIDENTIAL BUSINESS INFORMATION]”

We know a Tesla robotaxi hit a cyclist. We don’t know what happened.
We know one caused a minor injury. We don’t know what happened.
We know one hit an animal at 27 mph. We don’t know what happened.

Meanwhile, Waymo, Zoox, and other AV operators provide full narrative descriptions of every incident. Here’s a typical Waymo report from the same dataset:

“The Waymo AV was traveling northbound on N. 16th Street in the left lane when it slowed to a stop to yield to a pedestrian that had begun crossing the roadway. While the pedestrian continued to cross and the Waymo AV remained stopped, a passenger car approaching from behind made contact with the rear of the stationary Waymo AV.”

That’s accountability. That’s transparency. Tesla provides none of it.

Tesla is almost certainly not at fault in some of these crashes, but the fact that we can’t tell which ones is entirely due to Tesla’s own secrecy.

A great example is an incident that happened last week in Santa Monica, California, where a Waymo hit a child in a school zone. That sounds awful, doesn’t it? Potentially a company-ending incident. But Waymo released all the details, which confirmed that the child ran into the street while hidden behind an SUV. The Waymo vehicle immediately detected the child, and while it didn’t have time to prevent the impact, it was able to apply the brakes and reduce its speed from 17 mph to under 6 mph before contact was made.

As a result, the child was OK. Waymo even claims that its models show that a human driver would have likely reacted more slowly and hit the kid at twice the speed.

It’s better to disclose every incident than to keep everything secret in order to hide the ones you are responsible for.

Electrek’s Take

There’s good and bad in this. With only one crash in October and one in November, there appears to be improvement.

But the overall data is sobering.

A crash every 55,000 miles, with a safety monitor in the car, is not robotaxi-ready. It’s not even close. And the complete lack of transparency about what’s causing these crashes makes it impossible to have confidence that Tesla is learning from them.

Waymo operates fully driverless vehicles in multiple cities and publishes detailed information about every incident. Tesla operates supervised vehicles in one geofenced area and redacts everything.

If Tesla wants to be taken seriously as a robotaxi operator, it needs to do two things: dramatically improve its safety record, and start being honest about what’s happening on the roads of Austin.

Right now, it’s failing at both.


FTC: We use income earning auto affiliate links.