PSA To New Full Self-Driving Testers: Teslas Are Not Autonomous
Elon Musk has recently ramped up the promotion of Tesla's prototype "Full Self-Driving" feature. He has directed staff to showcase the capability to customers picking up their new Teslas and announced that all current Tesla owners with cars capable of running the software will receive a free 30-day trial.
However, it is important to remember that Teslas cannot drive themselves and are not autonomous. Drivers must stay vigilant when using FSD, which has been known to exhibit erratic, illegal, and potentially dangerous behavior.
Tesla emphasizes this in a pop-up window that drivers must click through before activating the software for the first time, cautioning that FSD "may do the wrong thing at the worst time." That need for constant supervision is also why the feature has been renamed Full Self-Driving (Supervised), dropping its previous (Beta) label.
Despite the name, FSD is categorized as a Level 2 driver-assistance system under industry standards, which means the driver must supervise the car at all times, just as they would with adaptive cruise control or lane-keeping features. Tesla still has a long way to go before reaching full autonomy (Level 5), where no driver attention is required.
While FSD aims to navigate complex environments like city streets using cameras and AI, it is far from flawless. Videos posted online often show Teslas running the software completing drives with minimal driver input, but there have also been instances where the software made critical errors, such as running red lights and failing to recognize road closures.
Many Tesla drivers testing the new trial software have reported issues such as cutting too close to curbs, driving hesitantly through intersections, choosing the wrong lane, setting speeds below the posted limit, and braking harshly for no apparent reason.
With Tesla looking to boost sales by pushing FSD, it is crucial that drivers use the feature responsibly and remain ready to take control at all times.