Tesla released a new version of its controversial “Full Self-Driving Beta” software last month. Among the updates in version 11.4 are new algorithms governing the car’s behavior around pedestrians. But alarmingly, a video posted to Twitter over the weekend shows that although the system can detect pedestrians crossing the road, the car may choose not to stop or even slow down as it drives past them.
The video was posted by Whole Mars Catalog, a high-profile pro-Tesla account with more than 300,000 followers. The tweet, which has been viewed 1.7 million times, featured a five-second video clip with the accompanying text:
One of the most bullish / exciting things I’ve seen on Tesla Full Self-Driving Beta 11.4.1.
It detected the pedestrian, but rather than slamming on the brakes it just proceeded through like a human would knowing there was enough time to do so.
The person posting the video then clarified that it was filmed in San Francisco and that anyone not OK with this driving behavior must be unfamiliar with city life. (As someone who has lived in big cities all my life, I am definitely not OK with cars not stopping for pedestrians at a crosswalk.)
Most partially automated driving systems, like General Motors’ Super Cruise or Ford’s BlueCruise, are geofenced to a controlled operational design domain, usually restricted-access divided highways. Tesla has taken a different approach, allowing users to unleash its FSD Beta software on surface streets.
Not everyone is as comfortable with Tesla drivers road-testing unfinished software around other road users. In February, the National Highway Traffic Safety Administration told Tesla to issue a recall for nearly 363,000 vehicles with the software installed.
The agency had four principal complaints, including that the “FSD Beta system may allow the vehicle to act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution.”
The version 11.4 update in April was supposed to improve how the cars behaved, but there’s now more evidence that the FSD Beta still leads to Teslas breaking traffic laws. Section 7 of California’s Driver’s Handbook, which deals with laws and rules of the road, says that pedestrians are considered vulnerable road users and that “pedestrians have the right-of-way in marked or unmarked crosswalks. If there is a limit line before the crosswalk, stop at the limit line and allow pedestrians to cross the street.”
This is not the first time Tesla’s software has been programmed to break traffic laws, either.
FSD is “make or break” for Tesla
Tesla CEO Elon Musk has repeatedly talked about the importance of FSD to his company, saying that it is “make or break” for Tesla and that it’s the difference between Tesla being “worth a lot of money or worth basically zero.”
FSD Beta has been implicated in a number of crashes and is the subject of several of the open federal investigations into Tesla’s electric vehicles. The option now costs $15,000, and each time the automaker declares another feature “complete,” it can recognize some of the deferred revenue it has been collecting as payments for the software.
Despite Musk’s bold public stance, Tesla has been far more circumspect when dealing with authorities. In 2020, it told the California Department of Motor Vehicles that it did not expect FSD to become significantly more capable and that it would never progress beyond so-called SAE Level 2, which requires an alert human in the driver’s seat who remains liable for the car’s actions.
Or, as author Ed Niedermeyer more concisely put it: “Full Self-Driving” is not, and never will be, actually self-driving.
Tesla is holding its annual shareholder meeting later today in Texas.