Emails Reveal Elon Musk Oversaw Deceptive 2016 Tesla Self-Driving Video; Could Lead to Criminal Charges

By B.N. Frank

Experts have warned for years that all autonomous (aka self-driving) software applications are problematic (see 1, 2, 3, 4, 5) – not just Tesla’s (see 1, 2). Nevertheless, perhaps because of the growing number of accidents, investigations, and lawsuits filed against the company, issues reported with Tesla’s self-driving applications seem to be getting the most attention.

From Ars Technica:


Musk oversaw staged Tesla self-driving video, emails show

Emails show Musk wanted an aspirational—not actual—demo of Full Self-Driving.

Jonathan M. Gitlin

If there was any doubt that Tesla CEO Elon Musk knew the company’s much-watched 2016 self-driving demo was staged, emails obtained by Bloomberg should lay that to rest. “Just want to be absolutely clear that everyone’s top priority is achieving an amazing Autopilot demo drive,” Musk wrote in an email. “Since this is a demo, it is fine to hardcode some of it, since we will backfill with production code later in an OTA update.”

Musk saw little wrong with this strategy. “I will be telling the world that this is what the car *will* be able to do, not that it can do this upon receipt,” he wrote. But instead of making this clear, the video, released to the world via Musk’s Twitter account, opens with white text on a black background telling the viewer that “the person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.”

Musk took to Twitter on the day of the video’s release to tell his followers that the car could read parking signs, and it knew not to park in a disabled spot. He also claimed that someone could use the “Summon” function on a car parked on the other side of the country.

But Summon was only released to Tesla drivers three years later. And the result was rather underwhelming: the system struggled to navigate low-speed parking lots, which makes the suggestion that it could drive 3,000 miles on public roads unaided rather ludicrous.

As we now know from Tesla’s head of Autopilot software, Ashok Elluswamy, the parking demo actually saw the Model X SUV crash into a fence. A 2021 New York Times article—now mostly confirmed by Elluswamy’s testimony in a lawsuit over the death of Walter Huang—also alleged that the car drove over a curb and through some bushes before finding the fence.

This is not the first time Tesla has shown difficulty in working with facts. In 2019, we discovered that the company’s repeated claims that Autopilot reduced crashes by 40 percent were bogus, and in fact, the system may have increased crashes by 59 percent.

That same year, the National Highway Traffic Safety Administration had to tell Tesla it was misleading customers by claiming that NHTSA had labeled the Tesla Model 3 the safest car it had ever tested.

Once more, with feeling

According to Bloomberg, the video that Tesla released on October 20, 2016, was the subject of a lot of revision. Musk’s chaotic management style—laid bare to the world following his recent purchase of Twitter—was on display back then.

On October 11, 2016, Musk told staff that everyone would be required to write a daily log detailing their contributions to the demo; at Twitter, Musk demanded that staff print out their most recent lines of code for review, an order that was quietly rescinded sometime later (presumably once reality set in). Days after Musk issued his daily log demand, a fourth draft was shared with Musk. This time, the CEO thought there were too many cuts and that the demo should appear “like one continuous take.”

In real-world conditions, the performance of Autopilot and the newer, even more controversial “Full Self-Driving” system remains poor. NHTSA has multiple open investigations into whether Tesla’s driver assistance systems are safe, including one following hundreds of reports of phantom braking behavior, another to determine whether Tesla cars can detect the presence of motorcyclists after at least two riders were killed when they were hit by Teslas, and a third into the propensity of Teslas to crash into emergency vehicles.

Criminal charges are a possibility, too. Intentionally deceiving one’s investors or customers remains a crime in the United States, and federal prosecutors have been looking into whether Tesla’s and Musk’s claims about its driver assistance systems meet that bar. Elluswamy’s testimony surely isn’t helping Tesla’s case.

Jonathan received his BSc in Pharmacology from King’s College London, and his PhD in Pharmacology from Imperial College London, and followed up with postdoctoral work at The Scripps Research Institute in La Jolla, CA, and the University of Kentucky in Lexington, KY, where he also taught International Science and Technology Policy at the Patterson School of Diplomacy and International Relations. It was during his postdoc years that he started writing for Ars Technica, covering the sciences with the occasional foray into racing games.


Activist Post reports regularly about autonomous vehicles (AVs) and other unsafe technologies. For more information, visit our archives.
