Insiders tear into Tesla’s stance on safety in hard-hitting Elon Musk documentary


If you own or love a Tesla, or are considering buying one, or share public roads with Tesla cars, you might want to watch the new documentary “Elon Musk’s Crash Course.”

The 75-minute horror show, which premiered on FX and Hulu on Friday, highlights the enduring dangers of Tesla’s self-driving technologies, the company’s lax safety culture, Musk’s P.T. Barnum-style marketing hype, and the weak-kneed safety regulators who don’t seem to care.

Well reported and factually solid (I’ve been covering the company since 2016 and can attest to it), the project, part of the ongoing New York Times Presents series, could well become a historic document.



At the film’s center is the story of Joshua Brown, a rabid Tesla fan who was decapitated in 2016 when his Autopilot-driven Tesla drove at full speed under the trailer of a semi-truck on a Florida highway.

The lessons of that crash did not prevent a nearly identical fatal crash in Florida three years later. Since then, an unknown number of Autopilot-related accidents have occurred (unknown to anyone other than Tesla, which can track its cars via their wireless connections) because the government’s decades-old process for collecting crash statistics is not fit for the digital age. The company is currently under investigation by federal safety regulators for its cars’ tendency to crash into emergency vehicles parked along highways.

Here are four key takeaways from “Elon Musk’s Crash Course.”

“The New York Times Presents” debuts “Elon Musk’s Crash Course” on Friday.


1. Tesla’s Autopilot feature hasn’t received enough testing, former employees claim

According to several former members of the Autopilot development team featured in the documentary, the pressure to deliver Autopilot features to customers quickly, ready or not, was relentless. “There was no deep research phase,” as there is at other self-driving car companies, says one engineering program manager; instead, customers essentially served as the professional test drivers.

The testimony of these developers is a standout feature of “Crash Course.” It’s rare to hear from anyone inside Tesla, because self-described “free speech absolutist” Musk has his employees sign strict nondisclosure agreements and enforces them with plenty of well-paid lawyers.

The company said that when Brown’s car went under the truck, the system mistook the side of the trailer for the bright sky, and it blamed the camera supplier. But the Autopilot team inside Tesla was still struggling with how its software could distinguish a truck crossing the highway from an overhead bridge, says software engineer Raven Jiang: “The learning rate wasn’t very good. It was difficult for me personally to believe that the promise would be kept.”

Media reports on the Theranos scam prompted “soul-searching” for Jiang, who left for another job around that time. Autopilot engineering program manager Akshat Patel echoes Jiang’s concerns. Anyone who views Tesla as “an example of scientific integrity, public responsibility, and logical and methodical engineering,” he says, should know: “It is not.”

2. Fully autonomous Teslas are more science fiction than reality

Tesla currently sells a feature package called Full Self-Driving for $12,000, even though it does not make the car self-driving. No fully autonomous cars are available to individual buyers from any manufacturer.

But that hasn’t stopped Musk from claiming every year that autonomous Teslas are just around the corner.

Clip after clip from “Crash Course” showcases his false claims.

2014: Taking his hands off a Tesla’s steering wheel: “No hands, no feet, nothing. The car can do almost anything.”

2015: Musk told a crowd he was “quite confident” that full autonomy would be achieved within three years, to the point where “you could be asleep the whole time.”

2016: “I think we are less than two years away from full autonomy,” Musk tells journalist Kara Swisher in a conference appearance.

2017: “We are on track to be able to go from Los Angeles to New York fully autonomously by the end of the year.”

2018: “By the end of next year, self-driving will be at least 100% to 200% safer than a human.”

2019: Buying a car without Full Self-Driving is “like buying a horse.”

2022: In a black cowboy hat and sunglasses: “The car will take you wherever you want, 10 times safer than driving by yourself. It will just completely revolutionize the world.”

The documentary also explains how Tesla staged a widely shared video of one of its cars driving itself around the streets of Palo Alto, released a few months after Brown’s death.

It should be noted that revenue from Autopilot and Full Self-Driving is largely responsible for meeting the compensation goals that have made Musk so rich.


Engineer Raven Jiang in “Elon Musk’s Crash Course.”


3. Musk’s fans don’t hold back, even on camera

Among his 94 million Twitter followers, Musk has attracted an especially rabid fan base, which “Crash Course” occasionally nods to. (One tweet declares “Elon is Lord.”) But the documentary doesn’t dig deeply into why so many people seem so enthralled by him; that remains the province of speculation, or perhaps psychology.

Still, there are some select quotes from Tesla fans and Musk supporters sitting down for interviews, apparently unaware of the irony:

“I think Elon wants to leave a mark on the world.”

“Any company would kill to have this level of fandom and commitment.”

“He has the resources that allow him to do things that might be irresponsible or crazy for others.”

4. Regulatory errors are part of the problem

Halfway through “Crash Course,” viewers may begin to wonder: Where are the safety regulators?

Great question. The National Highway Traffic Safety Administration investigated the Brown crash, determined that Autopilot had somehow missed the broad side of a truck crossing in front of the car, and yet found no defect, giving Tesla a pass.

“I was a little stunned,” New York Times reporter Neal Boudette tells the camera. “The system couldn’t see the tractor trailer? And isn’t that a flaw?”

A communications officer for NHTSA at the time tries to explain: “It’s a little complicated, and it sounds almost illogical, right? Autopilot didn’t even engage to stop the crash. But the thing is, Autopilot was not designed to prevent every accident in every situation.”

The reality is that several senior NHTSA officials from both the Obama and Trump administrations have gone on to take jobs in the self-driving car industry.

Under President Biden and Transportation Secretary Pete Buttigieg, NHTSA is taking a harder line with Tesla on access to data, and its investigations continue.

Meanwhile, on May 12, a speeding Tesla crashed into a highway construction site in Newport Beach. Three people died. Was Autopilot or Full Self-Driving engaged? Police are investigating the incident, and NHTSA has launched its own inquiry.

‘The New York Times Presents: Elon Musk’s Crash Course’

Where: FX

When: Saturday, 10 pm

Streaming: On Hulu anytime, starting Saturday

Rated: TV-MA (may be unsuitable for children under 17)
