A Twitter user has been told to “stay at home” after asking YouTube influencer Tesla Joy how to opt out of Tesla’s Full Self-Driving Beta software.
The electric car maker had issued a recall of nearly 12,000 cars running its Full Self-Driving Beta software version 10.3, which was released on Saturday, October 23. Two days later, the firm disclosed an update because the software could produce incorrect collision warnings and trigger automatic emergency braking unnecessarily. As of October 29, more than 99.8% of affected vehicles had been updated, according to the company. The episode, however, has deepened a rift with people who do not want to be involved in this public testing.
Beta users have access to an “autosteer on city streets” function, which is still unfinished and lets drivers semi-autonomously manoeuvre around cities alongside other cars, pedestrians, cyclists, and pets without steering themselves.
Even so, Tesla recommends keeping both hands on the wheel and being prepared to take control at any moment. It must be noted that Tesla’s driver assistance systems, including its basic Autopilot package, premium Full Self-Driving option, and FSD Beta, are not autonomous.
The influencer Tesla Joy documents her journey with her Model 3 for her 16K subscribers on YouTube and 14K Twitter followers, and is clearly a huge Tesla fan. She has backed the FSD beta testing program but, in a recent tweet, reminded people that they can opt out of it.
Philip Koopman took to Twitter to make the point that pedestrians cannot simply “opt-out” of this public test. Tesla Joy’s next response sparked outrage when she told the user to “stay at home”.
Tesla Beta Tester and YouTube influencer shares a strategy for pedestrians to opt out of being involuntary human test subjects. pic.twitter.com/YQT4xw6F8f
— Philip Koopman (@PhilKoopman) November 1, 2021
This subject has dominated social media over the past few weeks, since the beta program was released to selected drivers. The question is this: accidents under software such as FSD are rare, but they do happen, so how does the public opt out? That may be something Tesla needs to address, but the answer is not to stay at home.