Elon Musk has pledged that the work of his so-called Department of Government Efficiency, or DOGE, would be “maximally transparent.” DOGE’s website is proof of that, the Tesla and SpaceX CEO, and now White House adviser, has repeatedly said. There, the group maintains a list of slashed grants and budgets, a running tally of its work.
But in recent weeks, The New York Times reported that DOGE has not only posted major mistakes to the website—crediting DOGE, for example, with saving $8 billion when the canceled contract was worth $8 million and had already paid out $2.5 million—but also worked to obfuscate those mistakes after the fact, deleting from the website, and later even from its code, the identifying details that made DOGE’s cuts easy for the public to verify and track.
For road-safety researchers who have been following Musk for years, the modus operandi feels familiar. DOGE “put out some numbers, they didn’t smell good, they switched things around,” alleges Noah Goodall, an independent transportation researcher. “That screamed Tesla. You get the feeling they’re not really interested in the truth.”
For nearly a decade, Goodall and others have been tracking Tesla’s public releases on its Autopilot and Full Self-Driving features, advanced driver-assistance systems designed to make driving less stressful and safer. Over the years, researchers claim, Tesla has released safety statistics without proper context; promoted numbers that are impossible for outside experts to verify; touted favorable safety statistics that were later proved misleading; and even changed already-released safety statistics retroactively. The numbers have been so inconsistent that Tesla Full Self-Driving fans have taken to crowdsourcing performance data themselves.
Instead of public data releases, “what we have is these little snippets that, when researchers look into them in context, seem really suspicious,” alleges Bryant Walker Smith, a law professor and engineer who studies autonomous vehicles at the University of South Carolina.
Government-Aided Whoopsie
Tesla’s first and most public number mix-up came in 2018, when it released its first Autopilot safety figures after the first known death of a driver using Autopilot. Immediately, researchers noted that while the numbers seemed to show that drivers using Autopilot were much less likely to crash than other Americans on the road, the figures lacked critical context.
At the time, Autopilot combined adaptive cruise control, which maintains a set distance between the Tesla and the vehicle in front of it, and steering assistance, which keeps the car centered between lane markings. But the comparison didn’t control for type of car (luxury vehicles, the only kind Tesla made at the time, are less likely to crash than others), the person driving the car (Tesla owners were more likely to be affluent and older, and thus less likely to crash), or the types of roads where Teslas were driving (Autopilot operated only on divided highways, but crashes are more likely to occur on rural roads, and especially connector and local ones).
The confusion didn’t stop there. In response to the fatal Autopilot crash, Tesla did hand over some safety numbers to the National Highway Traffic Safety Administration, the nation’s road safety regulator. Using those figures, the NHTSA published a report indicating that Autopilot led to a 40 percent reduction in crashes. Tesla promoted the favorable statistic, even citing it when, in 2018, another person died while using Autopilot.
But by spring of 2018, the NHTSA had copped to the number being off. The agency had not fully evaluated the technology’s effectiveness compared with Teslas not using the feature, relying, for example, on airbag deployment as an inexact proxy for crash rates. (The airbags did not deploy in the 2018 Autopilot death.)
Because Tesla does not release Autopilot or Full Self-Driving safety data to independent, third-party researchers, it’s difficult to tell exactly how safe the features are. (Independent crash tests by the NHTSA and other auto regulators have found that Tesla cars are very safe, but these don’t evaluate driver assistance tech.) Researchers contrast this approach with the self-driving vehicle developer Waymo, which often publishes peer-reviewed papers on its technology’s performance.
Still, the unknown safety numbers did not prevent Musk from criticizing anyone who questioned Autopilot’s safety record. “It’s really incredibly irresponsible of any journalists with integrity to write an article that would lead people to believe that autonomy is less safe,” he said in 2018, around the time the NHTSA figure publicly fell apart. “Because people might actually turn it off, and then die.”
Number Questions
More recently, Tesla has continued to shift its Autopilot safety figures, leading to further questions about its methods. Without explanation, the automaker stopped putting out quarterly Autopilot safety reports in the fall of 2022. Then, in January 2023, it revised all of its safety numbers.
Tesla said it had belatedly discovered that it had erroneously included in its crash numbers events in which neither airbags nor active restraints were deployed, and that it had found that some events were counted more than once. Now, instead of dividing its crash rates into three categories, “Autopilot engaged,” “without Autopilot but with our active safety features,” and “without Autopilot and without our active safety features,” it would report just two: with and without Autopilot. It applied those new categories, retroactively, to its old safety numbers and said it would use them going forward.
That discrepancy allowed Goodall, the researcher, to peer more closely into the specifics of Tesla’s crash reporting. He noticed something in the data. He expected the new “without Autopilot” number to be roughly a weighted average of the two old “without Autopilot” categories. It wasn’t. Instead, the new figure looked much more like the old “without Autopilot and without our active safety features” number. That’s weird, he thought. It’s not easy—or, according to studies that also include other car makes, common—for drivers to turn off all their active safety features, which include lane departure and forward collision warnings and automatic emergency braking.
Goodall calculated that even if Tesla drivers were going through the burdensome and complicated steps of turning off their EV’s safety features, they’d need to drive far more miles than other Tesla drivers for the combined figure to make sense as a baseline. The upshot: Goodall suspects Tesla may be making its non-Autopilot crash rate look higher than it really is—which, in turn, makes the Autopilot crash rate look much better by comparison.
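To make the arithmetic behind that concern concrete, here is a minimal sketch in Python. All of the rates and mileage shares below are hypothetical, invented for illustration rather than drawn from Tesla’s reports; the point is only that a combined “without Autopilot” crash rate should be the mileage-weighted average of the two old subcategories, so a combined figure that matches the “no active safety features” bucket would imply that an implausibly large share of non-Autopilot miles were driven with every safety feature switched off.

```python
# Hypothetical illustration of the weighted-average check described above.
# These numbers are invented for the example and are not Tesla's actual figures.

# Old reporting: crashes per million miles for the two non-Autopilot buckets
rate_active_safety = 0.7   # "without Autopilot but with our active safety features"
rate_no_safety = 1.5       # "without Autopilot and without our active safety features"

# Shares of non-Autopilot miles driven in each bucket (must sum to 1.0)
share_active_safety = 0.9  # most drivers are assumed to leave safety features on
share_no_safety = 0.1

# The combined "without Autopilot" rate should be the mileage-weighted average:
combined_rate = (rate_active_safety * share_active_safety
                 + rate_no_safety * share_no_safety)
print(combined_rate)  # 0.78 -- sits close to the "with active safety" bucket

# If the published combined figure instead matched rate_no_safety (1.5),
# solving the weighted average for the mileage shares would require nearly all
# non-Autopilot miles to have been driven with every safety feature switched
# off, which is the implausibility Goodall points to.
```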
The discrepancy is still puzzling to the researcher, who published a peer-reviewed note on the topic last summer. Tesla “put out this data that looks questionable on first glance—and then you look at it, and it is questionable,” he claims. “Instead of taking it down and acknowledging it, they change the numbers to something that is even weirder and flawed in a more complicated way. I feel like I’m doing their homework at this point.” The researcher calls for more transparency. So far, Tesla has not put out more specific safety figures.
Tesla, which disbanded its public relations team in 2021, did not reply to WIRED’s questions about the study or its other public safety data.
Direct Reports
Tesla is not a total outlier in the auto industry when it comes to clamming up about the performance of its advanced technology. Automakers are not required to make public many of their safety numbers. But where tech developers are required to submit public accounting of their crashes, Tesla is still less transparent than most. One prominent national reporting requirement, first instituted by the NHTSA in 2021, compels makers of both advanced driver-assistance and automated driving tech to submit public data about their crashes. Tesla redacts nearly every detail about its Autopilot-related crashes in its public submissions.
“The specifics of all 2,819 crash reports have been redacted from publicly available data at Tesla’s request,” says Philip Koopman, an engineering professor at Carnegie Mellon University whose research includes self-driving-car safety. “No other company is so blatantly opaque about their crash data.”
The federal government likely has access to details on these crashes, but the public doesn’t. Even that arrangement is at risk: late last year, Reuters reported that the crash-reporting requirement appeared to be a focus of the Trump transition team.
In many ways, Tesla—and perhaps DOGE—is distinctive. “Tesla also uniquely engages with the public and is such a cause célèbre that they don’t have to do their own marketing. I think that also entails some special responsibility. Lots of claims are made on behalf of Tesla,” says Walker Smith, the law professor. “I think it engages selectively and opportunistically and does not correct sufficiently.”
Proponents of DOGE, like those of Tesla, engage enthusiastically on Musk’s platform, X, and are applauded by Musk himself. The two entities have at least one other thing in common: ProPublica recently reported that a new employee at the US Department of Transportation is a former Tesla senior counsel.