You can make people believe anything — even that they should let their Tesla drive itself
Sometimes you have to separate the personality from the product. Sometimes you have to separate the cult-like followers from the technology. Of course I’m talking about Tesla.
Last week, The Daily Mail — that British bastion of endless stories about Kardashian abs and football players’ feuding wives and whether the royal brothers are talking to one another — published a story that went viral. “Deeply disturbing video appears to show a Tesla in full self-driving mode (FSD) repeatedly running over a child-sized mannequin in ‘controlled test conditions,’” screams the headline, because The Daily Mail screams everything.
By now you’re familiar with Tesla’s Autopilot, software that “assists with steering, accelerating and braking for other vehicles and pedestrians in its lane.” ‘Full Self-Driving’ (FSD) Beta extends the Autopilot functions for eligible owners willing to pay $12,000 up front or $199 a month, adding automatic lane changes, Autosteer for some, and the chance to test things out on crowded city streets: “FSD Beta enables Tesla vehicles to drive autonomously to a destination entered in the car’s navigation system, but the driver needs to remain vigilant and ready to take control at all times.” Drivers are supposed to keep their hands on the wheel. Many don’t, and have been killed. Sometimes they don’t, and kill other people. With a reported 273 crashes involving Autopilot in the past year, it seems fair to wonder whether the reported 100,000 owners who have accessed FSD Beta will remember what “beta” means.
“FSD Beta can best be summarized as a host of new features that are not yet debugged. Chief among them is ‘autosteer on city streets,’ which lets the car navigate around complex urban environments automatically,” according to CNBC, which went on to investigate how often it fails.
It appears the breathless claims of The Daily Mail were not quite as advertised (though that certainly applies to Tesla in other ways): a Tesla-invested publication published a supposed frame-by-frame dive into the accompanying video and announced that FSD was not actually engaged. That finding was swatted away as many others started replicating the experiment, with mixed results. A Tesla fan on Twitter then offered to attempt to run over a real kid to prove it does work (the FSD, that is, not the running over a kid part).
When Tesla defenders peeled back the covers, it emerged that the original “test” had been conducted by The Dawn Project, an enterprise run by Dan O’Dowd. “O’Dowd had launched a Senate campaign in his home state of California, but the tech executive made it quite clear that he is making it a single-issue campaign, and that issue is Tesla’s Full Self-Driving program.” O’Dowd is a self-proclaimed billionaire and the owner of a private software company. At least when The Guardian reported on the same ‘deeply disturbing’ video, it included these salient facts.
More outlets are still examining the Dawn Project test. It’s not exactly the Zapruder tape, but the truth will eventually come out. My colleague David Booth points out another valid issue: “Experience suggests that the more Tesla’s FSD is condemned as inferior, the more Musk cultists might try to self-drive their Model Xes to even more extreme trials.”
The National Highway Traffic Safety Administration (NHTSA) finally opened investigations last year into Tesla’s headline-making collisions; those involving first responders are particularly troubling. “NHTSA’s look into Autopilot began in August 2021 after an increasing number of crashes with Autopilot engaged were reported, including collisions with parked and emergency vehicles,” reported Consumer Reports. In June of this year, the investigation was widened from “a preliminary evaluation to an engineering analysis.” It covers Tesla’s four models from 2014 to 2022, about 830,000 vehicles.
There are two tent poles propping up the adversarial sides of this debate. The first is that Tesla is pushing out unripened technology that isn’t ready for public roads, and it is killing people. The other is that the people who kill or get killed are idiots misusing the vehicle’s systems, so it’s not Tesla’s fault. With a narcissistic CEO rich enough to outlast and outrun issues that would sink mere mortal car companies, not to mention his deafening cheerleading section, the answer is both.
In 2018, Musk went on 60 Minutes and freaked out the host with a “look Ma, no hands” demonstration, even though Tesla’s own instructions are for the driver’s hands to be on the wheel.
Musk has played similar games repeatedly, whether through traditional media or through his tweets. At the time, Wired quoted Missy Cummings, who researches the interaction between humans and autonomous cars at Duke University, as saying that “[h]is board of directors needs to slap him upside the head.” Many agree with Cummings, and no doubt some of them are within Tesla itself.
It doesn’t matter what you think of Elon Musk. What matters is whether the products he is selling are safe. NHTSA can’t govern a CEO’s idiotic recommendations to impressionable followers, but it does need to determine whether the vehicles are a danger to anyone (including the clowns who misuse them), just as it does with other manufacturers.
Do Tesla’s Autopilot and FSD need to be investigated? Hell yes, but not because some billionaire wannabe politician creates his own experiment and pronounces the results; Autopilot and FSD need to be investigated because the public deserves a full accounting of what they’re sharing the road with. By the time you’ve pissed off Ralph Nader (last week he “blasted Tesla for the software and [made] the argument that it should have never been put in its cars in the first place”), the king of automotive safety has spoken. Nader must feel like he’s once again staring at the arse-end of a Pinto, or the front of a Corvair.
To summarize: a test was run with a Tesla to see if the Full Self-Driving Beta mode would stop in time to not run over a kid. The report says it smucked the kid three times out of three. Was the FSD Beta engaged? Discussions are raging. The person paying for the original test has a vested, singular interest in running Elon Musk out of town. Still, a Tesla supporter put out a call on Twitter asking for a kid to run over so that they could prove to their Twitter followers that FSD works and the kid would be in no danger. People took it seriously and wanted to call Child Services. Someone eventually offered up their kid, and that test turned out fine. And that brings us full circle on why clickbait headlines are garbage, and how you can make anyone believe anything — even that they should let their Tesla drive itself.