A Tesla owner reported that he crashed his Cybertruck into a pole after hitting a curb while using Full Self-Driving, Tesla's advanced driver assist system that Elon Musk claims will become unsupervised this year.
The post is going viral.
Jonathan Challinger, a Florida-based software developer working for Kraus Hamdani Aerospace, reported in a viral post on X that he crashed his Cybertruck into a pole.
He reported that he was driving using Tesla's Full Self-Driving system, a suite of advanced driver assist (ADAS) features that require driver supervision at all times. However, Tesla claims that it will soon work without driver supervision, hence the name.
Challinger said that he was driving with FSD v13.2.4 in the right lane, which was ending and merging into the left lane, but the car failed to merge and hit the curb.
He said that he failed to react in time to take control of the Cybertruck:
It failed to merge out of a lane that was ending (there was no one on my left) and made no attempt to slow down or turn until it had already hit the curb.
The Cybertruck then crashed into a light post. He was lucky to walk away without a scratch.
To be fair, it was a strange location for a post, but there's no reason why Tesla FSD shouldn't have changed lanes, and even if it wasn't going to change lanes, it should have braked before hitting the curb or the post (pictures via TroyTeslike):


Challinger said that he shared the story as a "public service announcement" to tell people to remain attentive when using Tesla's Full Self-Driving system and not become complacent:
Big fail on my part, obviously. Don't make the same mistake I did. Pay attention. It can happen. I follow Tesla and FSD pretty closely and haven't heard of any accident on V13 at all before this happened. It is easy to get complacent now. Don't.
It might be the first crash on Tesla's latest FSD v13, which CEO Elon Musk has presented as "mind-blowing" and an important step toward achieving "unsupervised self-driving" by the end of the year.
Musk and Tesla influencers often share FSD videos claiming that the technology is "on the verge of becoming truly self-driving," despite data pointing to Tesla still being years away from achieving this.
Electrek's Take
This guy is lucky to be alive, and he is right: there's a problem with people becoming complacent with FSD, and Tesla, and especially Elon Musk, are not doing enough to prevent it.
On the contrary, Musk continues to hype every update as if Tesla is on the verge of solving self-driving, and he claims that its quarterly safety report proves FSD is safer than human driving, which is misleading.
If Tesla were developing FSD in a vacuum, without Elon claiming every year for the last five years that it would soon be solved, and without Tesla selling the software package to customers with no clear idea of when, or on what hardware, it can be achieved, it would be a celebrated product.
Instead, it's a product that is costing Tesla credibility and is potentially dangerous, as we saw today.
I have had the exact same problem Challinger described, where a lane ends but FSD doesn't detect it. It's tricky because the system works most of the time, so you become complacent and give it a chance to correct itself. In this case, that evidently went too far.
Be careful out there and stop believing Elon Musk when he talks about self-driving.