Tesla asks owners to share fewer clips of ‘Full Self-Driving’ beta mistakes


Tesla is making owners who opt in to the controversial beta version of its “Full Self-Driving” software sign non-disclosure agreements and is also discouraging them from sharing video clips that show the driver assistance system making mistakes.

According to a copy of the NDA obtained by Vice, Tesla tells those who sign the document that “there are a lot of people that want Tesla to fail; Don’t let them mischaracterize your feedback and media posts.” The company also says owners in the beta should “share on social media responsibly and selectively” and “consider sharing fewer videos, and only the ones that you think are interesting or worthy of being shared.”

Vice’s report comes as Tesla is now working on expanding access to the “Full Self-Driving” software, all while the National Highway Traffic Safety Administration investigates the company’s less-advanced Autopilot driver assistance system currently available on its cars.

Tesla has allowed a small group of die-hard owners to test the beta version of the “Full Self-Driving” software for about a year now. Some of them take their roles as “beta testers” quite seriously and try to find flaws in the system in an effort to help Tesla make the software better. Many also film themselves traveling around with the software running. Some compress their longer drives into supercuts, speeding up the footage to emphasize just how far the software can take them without human intervention. Others post the raw footage, warts and all.

(As always, to be clear: this software does not make Tesla’s cars fully autonomous. Tesla CEO Elon Musk has himself even said that he believes the “feature complete” version of the software his company calls “Full Self-Driving” will, at best, only be “likely” to drive someone from their home to work without human intervention and will still require supervision. That does not describe a fully autonomous car.)

This whole process — the years of unfulfilled claims about making fully autonomous cars, the practice of beta testing in-development driver assistance software on public streets with untrained owners behind the wheel — has drawn Musk and Tesla a lot of scrutiny. Recently, though, a clip from a video originally shot by Tesla owner and investor Galileo Russell went viral and charged the conversation even more.

In it, Russell’s car should be merging left, but it suddenly takes a dive to the right, eventually pointing straight at pedestrians in a crosswalk. A hedge fund owner shared this clip on Twitter, where many people (rightfully) were aghast at how close the car came to running over pedestrians.

In a follow-up video, Russell mentioned in passing that Tesla “doesn’t want” people in the beta sharing clips that look bad, while making a larger point about why he posted the video in the first place. But it wasn’t until Vice reported on the NDA this week that it became clear what he meant.

Tesla is using this language to try to control the public perception of its “Full Self-Driving” software as the company starts to open up access to a much wider group — despite the software still being in development. Tesla added a button to the user interface of its cars this past weekend that lets owners request to join the beta. It also launched a “safety score” system, which monitors drivers who apply and evaluates them on a number of metrics, like hard braking or aggressive acceleration.

At the moment, Musk says drivers with a perfect safety score of 100 will be accepted into the beta, though he has tweeted that Tesla will lower that bar. He also said Tesla will soon start adding up to 1,000 new owners per day to the beta — a dramatic expansion of who will be able to test the driver assistance software on public roads.

Expanding access is sure to bring even more attention to Tesla and Musk’s patchwork approach to rolling out the “Full Self-Driving” software. In fact, it already has. Last week, National Transportation Safety Board chair Jennifer Homendy told the Wall Street Journal that she wished the company would address “basic safety issues” before allowing new owners into the program. Homendy was one of the more outspoken board members during a 2020 hearing that found Autopilot to be partially at fault for the 2018 death of a driver in Mountain View, California.

Tesla doesn’t appear to be changing course, though. Over the weekend, in response to a blog post about her comments, Musk tweeted a link to Homendy’s Wikipedia page — which eventually had to be locked after a sudden rush of edits.




