I’m Finally Out Of Tesla Full Self Driving (FSD) Purgatory And Using It Again

My wife and I have had access to Tesla Full Self Driving (Beta) and used it obsessively for at least 6 months, but then we racked up 5 strikes and lost access to FSD Beta for the second time. We were told that access would be restored with a future software update. Tesla has never specifically defined what counts as a failure (a strike), and sometimes it seems like we received two or three strikes at the same time. According to Tesla, we are among the ~285,000 who have paid from $5,000 (April 2019) to $15,000 (since September 2022) for Full Self Driving (FSD). We are reportedly among the ~160,000 who have passed the safety test and are actively using the system.

Tesla is very proud of the data showing that drivers using FSD have many fewer accidents per mile than drivers who do not. If this is true, it seems highly counterproductive – even immoral – to unnecessarily suspend access to the system for anyone who has paid for it. [Editor’s note: On the flip side, it could be the hypersensitivity about who is allowed to use FSD that has led to the good stats, presuming they do still show fewer crashes per mile driven. —Zach Shahan]

The “five strikes, you’re out” policy, followed by 60+ days in purgatory, strikes me as a very poor system. Any Tesla driver who uses the standard Automatic Lane Assist knows that if you torque the steering wheel too little, too much, or mainly too infrequently, you lose access to it for the rest of the drive. If you are driving on a limited-access road, you must wait until the next exit, pull over, stop, put the car in park, and then you have access again. It seems to me that this would also be an excellent system for FSD. It causes you considerable inconvenience, and the punishment fits the crime. Why would Tesla remove access to its wonderful safety system any longer than this?

Tesla has used “torque the wheel” since the earliest days of Automatic Lane Assist. It determines whether your hands are on the wheel, but it does not determine whether you are paying attention to the road. For at least 4 years now, Tesla has installed a camera above the rearview mirror in its cars that can determine whether the driver’s face, perhaps even eyes, are pointed at the road while the FSD software is in use. Since the camera method is much better than the torque-the-wheel method, I don’t understand why Tesla has to put the driver in double jeopardy by using both methods at once. It has even been reported that Tesla may eliminate the wheel-torque requirement for drivers who complete 1,000 miles safely on FSD. Really? Another test of questionable benefit? Just be done with steering torque for all Tesla cars that have the cabin camera.

Last night, after waiting ~60 days, we got our third software update (2022.44.30.10) since we lost access. This one included FSD version V10.69.25.2. Not only was our access to FSD restored, but our strike count was reset, giving us 5 fresh strikes.

Note: Purgatory aside, my wife and I have for over 300 days obsessively used and observed Tesla FSD in our Model 3 through 9 software versions: V10.5, V10.8, V10.10, V10.11.2.1, V10.12.2, V10.69, V10.69.2.4, V10.69.3.1, and now V10.69.25.2.

My wife and I will try very hard to maintain access longer this time! In the past, we’ve used FSD obsessively whenever possible. This time we will turn FSD off except when we can give it our total concentration, to maintain access to the software. No more eating, navigating, adjusting controls, distractions from grandchildren, cross-country skiing, etc., etc., while using FSD. There’s not much reason to use FSD Beta on a cross-country trip anyway, because regular FSD already gives you automatic lane keeping on limited-access roads and will take the exit to Superchargers. Cross-country driving gives you many hours in which to lose concentration and lose one of your 5 allotted strikes. Ironically, when an experienced driver is behind the wheel, most driving is done reflexively and stress levels are low. Unfortunately, your concentration level has to be higher with FSD, because you can’t always be quite sure what it’s going to do.

With FSD, the camera above the rearview mirror watches your face and knows if you’ve been looking down at your phone, looking at the screen to your right, or perhaps closing your eyes for too long. Most people agree that the driver should concentrate on the road ahead. But the first warning is a flashing blue band at the top of the screen to your right. If you are concentrating on the road ahead, you may miss a warning that sits at the edge of your peripheral vision. The secondary warning is an audible signal, but by the time you hear it, you are well on your way to a forced disengagement (i.e., a strike).

It has been reported that Tesla is being sued over deaths that occurred while FSD was active. I’ve driven my Model 3 safely now for over three years and almost 90,000 miles, using FSD obsessively for the 6 months I’ve had it and Automatic Lane Assist much of the rest of the time. I follow the warnings in the FSD and ALA instructions, and I am prepared to intervene immediately when the automated software fails.

Tesla Automatic Lane Assist is a very powerful system! It will keep your car in the center of the lane better than all but the most experienced drivers can. It is far superior to the lane assist I had on my 2018 Nissan Leaf. It brakes automatically and makes precise turns as sharp as those marked 25 mph. Tesla FSD extends this to hairpin turns marked 15 mph and will also navigate precisely through a roundabout (rotary).

Tesla FSD is very predictable, except when it isn’t. Faced with the repeated situation of moving into a turn lane when following the navigation route requires staying in the center through lane, it makes the same mistake every time. After exiting I-15 in Orem, Utah, at 1600 North, we have observed this behavior numerous times through many software updates, and it continues with the latest version. Of course, you would not expect this behavior the first time you encounter it. Generally I don’t trust FSD in heavy traffic except at traffic lights. It will often react too slowly when merging onto a busy street or highway – at which point I’ll give it a nudge with the accelerator. Even in light traffic at a stop sign, FSD will often stop prematurely and then creep forward slowly until it determines it is safe to continue. Meanwhile, unless you’re patient, you’ll get annoyed, and a driver following you will definitely get annoyed.

One of the most disturbing behaviors of FSD is that it sometimes chooses the wrong lane – a turn lane, say, or a wide bike lane. This, too, continues with the latest version.

I have used FSD Beta now through nine versions of the software (see versions above). I enjoy being part of an artificial intelligence experiment. I really like being able to set an address in the navigation system, pull into the street in front of my house and have my car drive to that address – without intervention in some cases. Sometimes I like to patiently test the performance in light traffic. Other times I will only use FSD (especially in heavy traffic) when I am sure it will work.



