Judge Rules Against Elon Musk and Tesla in Fatal Autopilot Trial

By B.N. Frank

Significant incidents and issues – some fatal – have been reported involving autonomous vehicles (AVs), most notably Cruise robotaxis and Teslas. In regard to Teslas, it’s hard to keep track of all of the problems associated with them; however, a Florida judge recently ruled that there is reasonable evidence the company and its CEO, Elon Musk, ignored known defects and limitations.

From Ars Technica:


Elon Musk and Tesla ignored Autopilot’s fatal flaws, judge says evidence shows

Tesla may owe punitive damages to victim as key trial over Autopilot proceeds.

Ashley Belanger – 11/22/2023, 1:11 PM

A Florida judge, Reid Scott, has ruled that there’s “reasonable evidence” to conclude that Tesla and its CEO Elon Musk knew of defects in Autopilot systems and failed to fix them. Testimony from Tesla engineers and internal documents showed that Musk was “intimately involved” in Tesla’s Autopilot program and “acutely aware” of a sometimes-fatal defect—where Autopilot repeatedly fails to detect cross traffic, Scott wrote.

“Knowing that the Autopilot system had previously failed, had limitations” and, according to one Tesla Autopilot systems engineer, “had not been modified, Tesla still permitted the ‘Autopilot’ system to be engaged on roads that encountered areas of cross traffic,” Scott wrote.

Because a jury could perhaps consider that a “conscious disregard or indifference to the life” of Tesla drivers, Scott granted a motion by Kim Banner to seek punitive damages. Banner’s husband Jeremy was killed in 2019 when his “Model 3 drove under the trailer of an 18-wheeler big rig truck that had turned onto the road, shearing off the Tesla’s roof,” Reuters reported. Autopilot allegedly failed to warn Jeremy or respond in any way that could have avoided the collision, such as braking or steering the vehicle out of danger, Banner’s complaint said.

Seemingly most worrying to the judge, Tesla engineers told Scott that following Banner’s death in 2019 and the “eerily similar” death of another Florida driver, Joshua Brown, in 2016, the automaker did nothing to intervene or update Autopilot’s cross-traffic detection system. Tesla continues to market the Autopilot feature as safe for drivers today, Scott wrote.

“It would [be] reasonable to conclude that [Tesla] dismissed the information it had available in favor of its marketing campaign for the purpose of selling vehicles under the label of being autonomous,” Scott wrote.

Scott noted that Tesla’s marketing of Autopilot is “important” in this case, pointing to a 2016 video that’s still on Tesla’s website, where Tesla claimed “the car is driving itself.” The video, Scott wrote, shows the car navigating scenarios “not dissimilar” to the cross-traffic encounter with a truck that killed Banner’s husband.

“Absent from this video is any indication that the video is aspirational or that this technology doesn’t currently exist in the market,” Scott wrote.

Scott’s ruling is considered a big blow to Tesla, which had denied liability for Banner’s death, arguing that human error was at fault because “its manual and the ‘clickwrap’ agreement sufficiently warned” Tesla owners of Autopilot’s limitations.

The judge wrote that Tesla will be able to make this argument at trial—which has been delayed—but at this stage, a jury presented with the available evidence could reasonably find Tesla liable for both intentional misconduct and gross negligence. Such a verdict could put Tesla on the hook for huge damages, which experts have said could also enormously damage Tesla’s reputation. At trial, Banner will likely argue that Tesla’s warnings were inadequate.

Banner’s attorney, Lake “Trey” Lytal III, told Reuters that they are “extremely proud of this result based in the evidence of punitive conduct.”

Musk the “de facto leader” of Autopilot team

Despite being called the “de facto leader” of the Autopilot team in internal Tesla emails, Musk did not provide a deposition in Banner’s case, which was one of the first times Tesla had to defend itself against claims that Autopilot was fatally flawed, Reuters reported. In August, Tesla also seemingly attempted to do damage control by seeking to “keep deposition transcripts of its employees and other documents secret,” Reuters noted.

Scott’s order has seemingly now made it clear why Tesla may have considered those depositions and documents so damning.

Tesla’s Autopilot feature has been under investigation for years. In 2021, as Tesla began experimentally testing the feature with drivers it determined were safe, the US launched a major probe after 11 Teslas crashed into emergency vehicles. That same year, Consumer Reports found that Autopilot could easily be rigged to work without a driver in the seat, raising more safety concerns.

By summer of 2022, the National Highway Traffic Safety Administration (NHTSA) had documented 273 crashes involving Teslas using Autopilot systems, and when the season turned to fall, the feds promptly opened a criminal investigation into Tesla Autopilot claims, “examining whether Tesla misled consumers, investors and regulators by making unsupported claims about its driver assistance technology’s capabilities.”

Tesla’s troubles have continued throughout 2023. The year started with Tesla announcing to its investors that it was being investigated by the Justice Department, which had requested documents related to Tesla’s Autopilot and Full Self Driving features. Around the same time, it came out that Tesla had staged the 2016 self-driving demo—a fact that Scott directly called out in his November order—and that Musk had overseen the staging.

“I will be telling the world that this is what the car *will* be able to do, not that it can do this upon receipt,” Musk said in an email to employees where the CEO made it “absolutely clear that everyone’s top priority is achieving an amazing Autopilot demo drive.”

Fatal crashes have continued to be a top concern for officials attempting to understand how safe Tesla’s Autopilot and Full Self Driving features are. In February, Tesla hit a new low and recalled 362,758 cars because its Full Self Driving Beta was considered too dangerous.

But Tesla has not stopped defending its cars amid the constant scrutiny. This year, Tesla had two big wins when the National Transportation Safety Board found that Autopilot had no involvement in a fatal Texas Tesla crash, and a California jury found that Tesla Autopilot was not responsible for a 2019 fatal crash.

With Tesla cars still using potentially defective Autopilot systems on the road today, Scott’s order granting Banner leave to amend her complaint to seek punitive damages could result in discovery that further exposes how Tesla makes decisions regarding driver safety. That could be meaningful to the NHTSA, which earlier this year demanded answers from Tesla on its Autopilot systems, but Tesla reportedly failed to respond.

According to Mary “Missy” Cummings, an expert in autonomous vehicle systems and driver monitoring who testified as one of Banner’s witnesses, Tesla’s Autopilot is substandard in many ways. Cummings’ opinion was that Tesla overrepresented the Autopilot technology “as far more capable than it actually was” and “used insufficient avoidance detection technology.” Cummings also suggested that “the steering wheel torque driver inattention attenuation technology was substandard to driver facing cameras used by other auto manufacturers.” These and other alleged major defects, Cummings said, “led to the foreseeability of the resulting crash in this case,” Scott wrote.

Scott based part of his opinion specifically on Cummings’ testimony and testimony from Adam Gustafsson, the systems engineer who investigated both Banner’s and Brown’s Tesla crashes, concluding that, “There is reasonable evidence from which the finder of fact could conclude that Tesla through its officers, employees and agents knew the vehicle at issue had a defective Autopilot system and allowed the vehicle with the system to be driven in areas not safe for that technology.”


Activist Post reports regularly about AVs and other unsafe technologies. For more information, visit our archives.
