Tesla FSD drives like an old man with a hat, and the Safety Score finds out who can handle that
Tesla owners who have signed up for Full Self Driving (FSD) Beta have been buzzing about the Tesla Safety Score in recent weeks. Zach Shahan himself has published at least three articles, YouTube is full of videos, Tesla forums are full of comments, and generally everyone is having a field day trying to game the system to achieve a perfect score. Consumer Reports is getting headlines with increasingly confused claims that people will run red lights rather than risk their Safety Score. (Seriously, is Consumer Reports owned by GM or the Koch Brothers? What gives?)
Tesla’s own post on the Safety Score goes into chapter and verse about what it does and how, but not why. Elon Musk has tweeted that owners need really high safety scores to get into FSD Beta, but again, not why — at least not that I’ve seen (though I’m not an obsessive reader of every word published about Tesla either).
The first days probably 100/100, then 99, 98 etc.
– Elon Musk (@elonmusk) September 28, 2021
The articles and comments seem to settle on two main goals as Tesla’s motivation behind this. I mean, after all, if FSD is so incredibly safe, why do the drivers need to be so safe?
The first is that Tesla is seeking out drivers whose daily drives happen in really safe surroundings. The theory here is that people who regularly drive through Mexico City traffic or the like are confronted with machismo give-no-ground brinksmanship 47 times before breakfast, which is quite difficult to navigate even for people who are used to it. (Really, the most aggressive mass driving I’ve ever seen is in Mexico City, and luckily I wasn’t the one driving.)
I support this theory as far as it goes. FSD Beta is going to be under a lens. Any accident will be headline news and the usual suspects will bay for blood. Having worked with people who have done extensive work in reinforcement learning, I know you always start in as simple an environment as possible and add complexity. Starting in downtown Boston during the Big Dig — something I tried to drive through once — is not a good choice. Places with lots of deer crossings aren’t good either, because no one in the company wants to see the headline “Tesla Killed Bambi!”
An overlapping group, including Zach, argues that Tesla’s obsession with safety includes training drivers to be safer through gamification. In other words, Tesla provides drivers with feedback that resembles a driving instructor’s constant and thorough assessment of your driving. Or any back-seat driver in any car ever. It will definitely work for some people. I’m reminded of the hypermilers who started working toward maximum range in early modern electric cars, beginning with the Nissan Leaf in 2011. Apparently they’re still a thing, which doesn’t surprise me at all. Give some people a metric and they will automatically become obsessed with maximizing that metric at the expense of everything else, including common sense, getting to work on time, sleeping, washing, paying bills, etc. Personally, I have never done that. Ever. Really.
But the flaw in this theory is that almost everyone hates back-seat drivers, even if they are electronic and on the dashboard. For the drivers out there, ask yourself when was the last time you thanked your favorite back-seat driver for their input. Ask yourself how much you love your Apple Watch or Fitbit telling you to stand or walk. Gamification works on the people it works on, and that’s not everyone. This theory doesn’t hold much water with me, in other words.
As someone who has regularly published on autonomous vehicles, done professional work in machine learning, published a machine learning and cleantech report through CleanTechnica, spent a lot of time on cognitive science, and as a result been asked to advise Canadian provincial and British academic efforts on road safety and autonomous vehicles, I may have a different take than most people writing articles about Tesla.
Tesla is filtering out the drivers who won’t like FSD.
This is in addition to finding drivers who have, shall we say, less intense daily drives in their Teslas, not instead of it. But it’s exactly what I would do.
I published on the cognitive expectations of passengers in autonomous vehicles and their impact on urban congestion in 2016. Although many people assume that autonomous vehicles will make traffic faster, the reality is the opposite. Zipping through intersections at high speed inches from other cars and taking turns at hyper-speed just isn’t going to happen, at least not until Star Trek is a reality. Passengers in autonomous vehicles expect to be able to drink a cup of coffee and feel like they are on a bus or a train, not strapped into a transparent table tennis ball in the middle of a bunch of other transparent table tennis balls.
As a result, modeling of the behavior of autonomous vehicles through city intersections consistently shows that, on average, they will be much slower. This is still the case with Waymo’s cars as they ply the streets of the limited geographies that have been mapped in incredible detail so they can work at all, as opposed to Tesla’s consumer model. But both Tesla and Waymo vehicles have to cross city intersections. They have to deal with the same environment and the same human passengers.
And drivers who constantly push into the narrow gap, change lanes for the slightest perceived advantage, take turns aggressively, and demonstrate similar behavior are deeply unlikely to be willing to put up with sitting behind the wheel of a car being driven as if the driver were an old man wearing a hat. One commenter noted that he loved his race days in his Model 3 Performance and that he should be allowed to have these not count against his score, because he really was much safer on the road after getting the speed out of his system on the track. Right. I have driven with race drivers, and even when they were “safe,” it was because everyone around them was going much slower than they were trained to handle.
When I owned cars and motorcycles, they were typically ones I could get a lot of speed out of and maneuver around obstacles with. Lots of BMWs in my past, back when they actually were ultimate driving machines. Lots of narrowly avoided tickets, and some tickets that weren’t avoided. I would have been a terrible candidate for FSD Beta. I would have turned it off after five minutes and gone back to seeking any advantage on my way to work or play, where the drive itself was part of the fun.
Now, on the rare occasions I drive, I’m pretty sure my Safety Score would indicate that I too drive like an old man with a hat, partly because I’ve aged out of it, and partly because I’ve done enough extreme things — paragliding off the southern cliffs of Bali, windsurfing in blizzards, snowboarding black diamond bowls at Whistler, playing Hold’em in several cities, etc. — that I have mostly burned out my adrenaline receptors.
And that’s a big part of my theory about the Safety Score. It’s a filter to find people who are willing to drive like an old man in a hat, and who will give FSD the latitude to take forever at corners and not jump into the next lane. It’s quite possible that many of them will become back-seat drivers themselves and yell at FSD about the choices it makes, but they’ll otherwise sit there paying enough attention to take over if required.
But yes, it’s a filter.