Since Tesla launched its Full Self-Driving (FSD) feature in beta in 2020, the company’s owner’s manual has been clear: Contrary to the name, cars using the feature can’t drive themselves.
Tesla’s driver assistance system is built to handle plenty of road situations—stopping at stop lights, changing lanes, steering, braking, turning. Still, “Full Self-Driving (Supervised) requires you to pay attention to the road and be ready to take over at all times,” the manual states. “Failure to follow these instructions could cause damage, serious injury or death.”
Now, however, new in-car messaging urges drivers who are drifting between lanes or feeling drowsy to turn on FSD, a prompt that experts say could confuse drivers and encourage them to use the feature in an unsafe way. “Lane drift detected. Let FSD assist so you can stay focused,” reads the first message, which was included in a software update and spotted earlier this month by a hacker who tracks Tesla development.
“Drowsiness detected. Stay focused with FSD,” reads the other message. Online, drivers have since posted that they’ve seen a similar message on their in-car screens. Tesla did not respond to a request for comment about this message, and WIRED has not been able to find this message appearing on a Tesla in-car screen.
The problem, researchers say, is that moments of driver inattention are exactly when safety-minded driver assistance features should demand drivers get ultra-focused on the road—not suggest they depend on a developing system to compensate for their distraction or fatigue. At worst, such a prompt could lead to a crash.
“This messaging puts the drivers in a very difficult situation,” says Alexandra Mueller, a senior research scientist at the Insurance Institute for Highway Safety who studies driver assistance technologies. She believes that “Tesla is basically giving a series of conflicting instructions.”
A large body of research examines how humans interact with computer systems built to help them accomplish tasks. It generally finds the same thing: People are terrible passive supervisors of systems that are pretty good most of the time but not perfect. Humans need something to keep them engaged.
In aviation research, this is known as the “out-of-the-loop performance problem”: pilots who rely on highly automated systems can grow complacent after extended periods of operation and fail to adequately monitor for malfunctions. That lack of active engagement, also known as vigilance decrement, can erode a pilot’s ability to understand and regain control of a malfunctioning automated system.
“When you suspect the driver is becoming drowsy, to remove even more of their physical engagement—that seems extremely counterproductive,” Mueller says.
“As humans, as we get tired or we get fatigued, taking away more things that we need to do could actually backfire,” says Charlie Klauer, a research scientist and engineer who studies drivers and driving performance at the Virginia Tech Transportation Institute. “It’s tricky.”
Over the years, Tesla has made changes to its technology to make it more difficult for inattentive drivers to use FSD. In 2021, the automaker began using in-car driver monitoring cameras to determine whether drivers are paying sufficient attention while using FSD; a series of alerts warns drivers if they’re not looking at the road. Tesla also uses a “strike system” that can lock a driver out of the driver assistance feature for a week if they repeatedly fail to respond to its prompts.
Tesla has “evolved its overall driver support features through [software updates] to help mitigate inattention and distraction-related issues,” Bryan Reimer, a research scientist who studies driver assistance technology at MIT’s AgeLab, writes in an email. But this latest in-car messaging around FSD feels like a lapse, he says. “The prompt seems highly contrary to research.”
Tesla did not respond to WIRED’s request for comment on this article.
The new FSD messages come at a delicate time for Tesla, which for years has faced allegations that its products are defective. In August, a Florida jury found the company partly liable for a 2019 crash that killed a 22-year-old woman; the crash occurred while a Tesla Model S driver was using an older version of the company’s driver assistance software, called Autopilot. The jury found Tesla liable for $243 million in overall damages; the company has appealed.
Tesla is also awaiting the result of an administrative court hearing conducted over the summer in California after the state’s Department of Motor Vehicles accused the automaker of misleading customers over what it advertised as its self-driving ability. If the judge sides against Tesla, the automaker could lose its right to sell and manufacture its EVs in the state—home to its biggest market and most productive US assembly plant—for 30 days.
At the same time, CEO Elon Musk and the company’s board of directors have put FSD at the center of the automaker’s strategy, which relies more and more on dominating in robotics and, eventually, autonomous vehicles. A proposed trillion-dollar pay package for Musk—shareholders will vote on it later this year—will depend, among other factors, on Tesla selling an average of 10 million Full Self-Driving subscriptions over three consecutive months. Currently, FSD (Supervised) sells for $8,000, or for a monthly $99 fee.
Musk has promised that the feature will transform into Full Self-Driving (Unsupervised)—a truly autonomous system—by the end of this year. He said on an April earnings call that the feature would likely become available to subscribers in “several cities” by 2026, so that drivers can “go to sleep in your car and wake up in your destination.” However, Musk has a habit of not keeping such promises.
Months later, Tesla launched a small, invite-only robotaxi service in Austin, Texas, with safety monitors in the front passenger seats. The service does not seem to have scaled as quickly as Musk promised.
In April, Musk said Tesla would be “cautious with the roll-out.” “At Tesla, we’re absolutely hardcore about safety,” he said.
Navigating the space between true autonomy and driver-supervised automated features will continue to be a complicated task, says Greg Brannon, who directs automotive engineering and industry relations for AAA. And not only for Tesla: Most automakers currently offer some form of “Level 2” driver assistance, which can control a vehicle’s steering and speed but must be carefully supervised by a driver. “The challenge automakers are facing is that, as the Level 2 systems get better and better, the more likely drivers are to engage in secondary tasks or become distracted,” says Brannon.
It’s a balance, and humans, unfortunately, seem to have something close to a death wish. “People will engage in riskier and riskier behaviors, assuming that the vehicle will bail them out,” Brannon says. In reality, it will be the automakers that have to do the bailing out.