Highway surveillance footage from Thanksgiving Day shows a Tesla Model S vehicle changing lanes and then abruptly braking in the far-left lane of the San Francisco Bay Bridge, resulting in an eight-vehicle crash. The crash injured nine people, including a 2-year-old child, and blocked traffic on the bridge for over an hour.
The video and new photographs of the crash, which were obtained by The Intercept via a California Public Records Act request, provide the first direct look at what happened on November 24, confirming witness accounts at the time. The driver told police that he had been using Tesla’s new “Full Self-Driving” feature, the report notes, before the Tesla’s “left signal activated” and its “brakes activated,” and it moved into the left lane, “slowing to a stop directly in [the second vehicle’s] path of travel.”
Just hours before the crash, Tesla CEO Elon Musk had triumphantly announced that Tesla’s “Full Self-Driving” capability was available in North America, congratulating Tesla employees on a “major milestone.” By the end of last year, Tesla had rolled out the feature to over 285,000 people in North America, according to the company.
Tesla Full Self-Driving Beta is now available to anyone in North America who requests it from the car screen, assuming you have bought this option.
Congrats to Tesla Autopilot/AI team on achieving a major milestone!
— Elon Musk (@elonmusk) November 24, 2022
The National Highway Traffic Safety Administration, or NHTSA, has said that it is launching an investigation into the incident. Tesla vehicles using its “Autopilot” driver assistance system — “Full Self-Driving” mode has an expanded set of features atop “Autopilot” — were involved in 273 known crashes from July 2021 to June of last year, according to NHTSA data. Teslas accounted for almost 70 percent of 329 crashes in which advanced driver assistance systems were involved, as well as a majority of fatalities and serious injuries associated with them, the data shows. Since 2016, the federal agency has investigated a total of 35 crashes in which Tesla’s “Full Self-Driving” or “Autopilot” systems were likely in use. Together, these accidents have killed 19 people.
In recent months, a surge of reports has emerged in which Tesla drivers complained of sudden “phantom braking,” causing the vehicle to slam on its brakes at high speeds. More than 100 such complaints were filed with NHTSA in a three-month period, according to the Washington Post.
The child injured in the crash was a 2-year-old who suffered an abrasion to the rear left side of his head as well as a bruise, according to the incident detail report obtained by The Intercept. In one photograph of the crash, a stroller is parked in front of the car in which the child was injured.
As traditional car manufacturers enter the electric vehicle market, Tesla is increasingly under pressure to differentiate itself. Last year, Musk said that “Full Self-Driving” was an “essential” feature for Tesla to develop, going as far as saying, “It’s really the difference between Tesla being worth a lot of money or worth basically zero.”
The term “Full Self-Driving” has been criticized by other manufacturers and industry groups as misleading and even dangerous. Last year, the autonomous driving technology company Waymo, owned by Google’s parent company, announced that it would no longer be using the term.
“Unfortunately, we see that some automakers use the term ‘self-driving’ in an inaccurate way, giving consumers and the general public a false impression of the capabilities of driver assist (not fully autonomous) technology,” Waymo wrote in a blog post. “That false impression can lead someone to unknowingly take risks (like taking their hands off the steering wheel) that could jeopardize not only their own safety but the safety of people around them.”
Though Waymo doesn’t name any names, the statement was “clearly motivated by Musk’s controversial decision to use the term ‘Full Self Driving,’” according to The Verge.
Along the same lines, the premier lobbying group for self-driving cars recently rebranded from the “Self-Driving Coalition for Safer Streets” to the “Autonomous Vehicle Industry Association.” The change, the industry group said, reflected its “commitment to precision and consistency in how the industry, policymakers, journalists and the public talk about autonomous driving technology.”
Secretary of Transportation Pete Buttigieg has also been critical of the emerging driver assistance technologies, which he stresses have not replaced the need for an alert human driver. “I keep saying this until I’m blue in the face: Anything on the market today that you can buy is a driver assistance technology, not a driver replacement technology,” Buttigieg said. “I don’t care what it’s called. We need to make sure that we’re crystal clear about that — even if companies are not.”
Though the language may be evolving, there are still no federal restrictions on the testing of autonomous vehicles on public roads, though states have imposed limits in certain cases. Tesla has not announced any changes to the program or its branding, and the Bay Bridge crash was one of multiple crashes that month. Several days earlier, on November 18 in Ohio, a Tesla Model 3 crashed into a stopped Ohio State Highway Patrol SUV that had its hazard lights flashing. That Tesla is likewise suspected of having been in self-driving mode, and the crash is also being investigated by NHTSA.
NHTSA is also investigating a tweet by Musk in which he said that “Full Self-Driving” users would soon be given the option to turn off reminder notifications for drivers to keep their hands on the steering wheel. “Users with more than 10,000 miles on FSD Beta should be given the option to turn off the steering nag,” a Twitter user posted on New Year’s Eve, tagging Musk.
“Agreed, update coming in Jan,” Musk replied.
Additional reporting by Beth Bourdon.