
  • On the morning of May 5th, 2021, Steven Hendrickson was driving his Tesla Model 3 to work.

  • His car was on Autopilot as he drove through Fontana, California.

  • At about 2:30 a.m., an overturned semi-truck appeared in front of him.

  • Moments later, he was killed.

  • The 35-year-old left behind his wife and two kids.

  • This crash is one of more than a thousand that Tesla has submitted to federal regulators since 2021, the details of which have been hidden from the public.

  • Video and data gathered from these crashes by The Wall Street Journal show that Tesla's heavy reliance on cameras for its Autopilot technology, which differs from the rest of the industry, is putting the public at risk. Teslas operating on Autopilot have been involved in hundreds of crashes across U.S. roads and highways since 2016.

  • Over the years, Tesla CEO Elon Musk has maintained the technology is safe.

  • I do think that long-term, it can reduce accidents by a factor of 10.

  • We're solving just a very important part of AI and one that can ultimately save millions of lives and prevent tens of millions of serious injuries.

  • The safety per mile is better than that of human driving.

  • Missy Cummings, a leading expert on autonomous driving who has been critical of Tesla, warned of the risks of semi-autonomous vehicles in 2016.

  • There is no question that someone is going to die in this technology.

  • The question is when.

  • Cummings has also worked as a safety advisor for the federal regulator, the National Highway Traffic Safety Administration, or NHTSA.

  • I am besieged with requests from families of people who have been killed in Tesla crashes.

  • It's really tough to explain to them that this is the way the tech was designed.

  • Since 2021, NHTSA has launched a series of investigations into Autopilot.

  • They've released little information about them and the cars remain on the road.

  • Also in 2021, NHTSA ordered carmakers to report all serious crashes involving semi-autonomous driving systems like Autopilot.

  • But much of the data Tesla submitted has been hidden from public view because the company considers it proprietary.

  • To unlock it, the Journal gathered reports from individual states and matched them with the crash data Tesla submitted to NHTSA, and found that longstanding concerns about the Autopilot technology are showing up on America's roads.

  • Of the 1,000-plus crashes Tesla has submitted so far, the Journal pieced together 222 and found that 44 of them occurred when Teslas in Autopilot veered suddenly. Thirty-one occurred when a Tesla in Autopilot failed to stop or yield for an obstacle in front of it.

  • Orlando, Florida.

  • A Model 3 drove into the back of a stopped police car that was attending to a disabled vehicle.

  • Guadalupe County, Texas.

  • A Model 3 plowed through an intersection and off the road.

  • These failure-to-stop crashes, like the one that killed Steven Hendrickson, account for the most serious injuries and deaths in the cases the Journal unlocked.

  • I had to hang up the phone and look at both my kids as they're crying, and they're telling me to tell them it's not true, that their dad was in a car accident, and he won't be coming home.

  • Tesla has said drivers operating in Autopilot need to be ready to take control at all times.

  • They say Steven Hendrickson was warned to keep his hands on the wheel 19 times before he crashed and that his car initiated braking before impact.

  • But whether or not a driver is alert, it's important to know how Autopilot performed, and that can only be seen through the large amounts of internal video and data Teslas record.

  • The Journal obtained video and partial data from the Hendrickson crash and asked experts to analyze it.

  • The kind of things that tend to go wrong with these systems are things like it was not trained on the pictures of an overturned double trailer.

  • It just didn't know what it was.

  • There were some lights there, but the lights were in unusual positions.

  • A person would have clearly said something big is in the middle of the road, but the way machine learning works is it trains it on a bunch of examples, and if it encounters something it doesn't have a bunch of examples for, it may have no idea what's going on.
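
In machine-learning terms, the expert is describing an out-of-distribution input: an object the detector was never trained on gets only a weak, ambiguous score. The snippet below is a minimal sketch of that failure mode; it is purely illustrative, and the class names, confidence values, and threshold are invented rather than taken from Tesla's software.

```python
# Illustrative only: a toy post-processing step of the kind many vision
# pipelines use. Detections below a confidence threshold are discarded,
# so an object the network was never trained on can vanish entirely.

CONFIDENCE_THRESHOLD = 0.6  # invented value for the example

# Hypothetical raw detector output for one camera frame.
raw_detections = [
    {"label": "car",         "confidence": 0.94},
    {"label": "lane_line",   "confidence": 0.88},
    # The overturned trailer doesn't match any trained class well,
    # so the best guess is weak and ambiguous.
    {"label": "road_debris", "confidence": 0.31},
]

obstacles = [d for d in raw_detections if d["confidence"] >= CONFIDENCE_THRESHOLD]
print(obstacles)
# [{'label': 'car', 'confidence': 0.94}, {'label': 'lane_line', 'confidence': 0.88}]
# The low-confidence guess for the unfamiliar object is filtered out, so
# downstream planning never "sees" anything to brake for.
```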

  • Experts say this illustrates a fundamental flaw in the Autopilot technology.

  • It relies mainly on cameras or computer vision, with radar as a backup in some models.

  • Other driving assist cars, they have radar, computer vision, and LIDAR, laser ranging, that kind of helps you detect obstacles.

  • But the LIDAR is very expensive.

  • Expensive sensors that are unnecessary.

  • It's like having a whole bunch of expensive appendices.

  • Instead, Tesla's camera-based system relies on humans to train it on obstacles it may encounter.

  • Even though people think these cars are learning while they're driving, they're not.

  • They're only learning at the rate that the companies are deciding to retrain the algorithms.

  • So I'm gonna turn Autopilot on right now.

  • I just pull my finger down like that one time.

  • As you can see on the screen, the visualizations have started.

  • John Bernal is a former Tesla employee who was fired in 2022 after posting videos of Autopilot failures.

  • Tesla has not commented publicly on his dismissal.

  • So as we go through this downtown, you'll see a lot of cars being visualized, just kind of on the side of the road being parked.

  • You'll see pedestrians walking the sidewalk.

  • I originally started at Tesla in August 2020 as a data annotation specialist.

  • That role primarily relied on me taking image data and labeling it to train the technology on what a car was, what a pedestrian was, maybe an animal, a lane line, or what a red light was.

  • That way, with this image data training, the vehicle would then know how to operate in a real-world situation when that data came through the cameras and through the car's computer.
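
As a rough sketch of what that labeling work produces, here is one hypothetical annotation record of the general kind used to train image detectors. The field names, classes, and coordinates are invented for illustration and are not Tesla's internal format.

```python
# Hypothetical example of a single human-labeled training frame.
# Field names and numbers are made up for illustration.
annotation = {
    "frame_id": "front_main_000123",
    "camera": "front_main",
    "objects": [
        # Each object gets a class label and a bounding box
        # (x, y, width, height) in pixels, drawn by an annotator.
        {"label": "car",        "bbox": [412, 230, 96, 54]},
        {"label": "pedestrian", "bbox": [120, 210, 28, 70]},
        {"label": "red_light",  "bbox": [640, 80, 18, 40]},
    ],
}

# Many records like this are fed to the training pipeline so the model
# learns to recognize the labeled classes in new camera images.
for obj in annotation["objects"]:
    print(obj["label"], obj["bbox"])
```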

  • The car has eight cameras, specifically three right here in the front.

  • And so as that truck just went across us, it's actually going through the plane of multiple different cameras.

  • And one camera, it may have looked perfectly on the ground and centered, but another camera, it could have been six feet ahead or six feet behind or maybe five feet floating in the air.

  • And those are things I've noticed while labeling these image clips, is that these cameras are not calibrated properly.

  • And so what looks true in one camera will not be true in another camera.
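
A toy pinhole-camera calculation shows why that matters. The numbers are made up and this is not Tesla's calibration code: if one camera's assumed mounting angle is slightly off, the same obstacle appears in a noticeably different place in its image, so two views can disagree about where the object is.

```python
import math

def project(point_xyz, yaw_deg, focal_px=1000.0, cx=640.0):
    """Project a 3D point (x forward, y left, z up, in meters) onto a
    pinhole camera rotated by yaw_deg about the vertical axis.
    Returns the horizontal pixel coordinate of the point."""
    yaw = math.radians(yaw_deg)
    x, y, _ = point_xyz
    # Rotate the point into the camera frame (yaw only, for simplicity).
    cam_x = math.cos(yaw) * x + math.sin(yaw) * y   # forward
    cam_y = -math.sin(yaw) * x + math.cos(yaw) * y  # left
    return cx - focal_px * cam_y / cam_x

obstacle = (30.0, 0.0, 0.0)  # an object 30 m straight ahead (invented)

u_good = project(obstacle, yaw_deg=0.0)  # correctly calibrated camera
u_off = project(obstacle, yaw_deg=2.0)   # camera whose yaw is off by 2 degrees

print(f"calibrated camera:    x = {u_good:.0f} px")
print(f"miscalibrated camera: x = {u_off:.0f} px (shifted {u_off - u_good:.0f} px)")
# A 2-degree calibration error shifts the obstacle by roughly 35 pixels in
# this toy setup, so two cameras can report inconsistent positions for it.
```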

  • If the cameras aren't seeing the same thing, the system can have significant trouble identifying obstacles.

  • We can see this in an Autopilot crash similar to Steven Hendrickson's near Los Angeles in 2021.

  • The Journal obtained a rare set of raw data for the Tesla in this crash from an anonymous hacker.

  • When a crashed pickup truck becomes visible on the road, one of the car's cameras sees it, but the other doesn't.

  • Then, as the obstacle gets closer, the Tesla doesn't recognize it, and it crashes at full speed.

  • Getting this degree of information off a Tesla after it crashes is nearly impossible for the average driver.

  • Tesla says they provide drivers with hundreds of data points with granularity down to the millisecond in a user-friendly format.

  • But that information doesn't show the exact decisions Autopilot made.

  • Those details need to be extracted from the car's internal computer.

  • A Tesla's pretty much a computer on wheels.

  • I work on salvage Teslas, usually in accidents, wrecks.

  • I've seen a lot of videos where it's captured a lot of the drivers being reckless, and I've seen videos where it gives you that question mark of, like, how did that happen?

  • And it makes you curious, like, was it the driver or was it the Autopilot?

  • The video only tells part of the story.

  • The data on the car's computer gives a fuller picture.

  • The computers, they all report data and link to Tesla mothership.

  • And we have the computer off.

  • Diaz sends Autopilot computers he doesn't need across the country to a hacker who then extracts the information.

  • Janelle Hendrickson has not been able to get data from her husband's car.

  • Her lawyers requested it over a year ago.

  • Tesla says they're in the process of gathering it.

  • It's actually been really difficult to get any information about this.

  • We know that Tesla themselves has all the video and all the information, but they will not share it.

  • One thing that's not reflected in the car's data is the driver's confidence in the technology.

  • He trusted that Autopilot, obviously, with his life.

  • He trusted it with my kids' lives.

  • He said that it let him be a little bit less worried when he was driving so much.

  • He'd leave about 3 a.m. to get to work by 6 all over SoCal.

  • If the cars do a pretty good job, not perfect, but pretty good, we develop a false sense of trust in them.

  • And this overconfidence is, it's both coming from the lovers of the tech, but it's also coming from the top down.

  • A Model S and Model X, at this point, can drive autonomously with greater safety than a person.

  • We've gone over this multiple times, like, are we sure we have the right sensor suite?

  • Should we add anything more?

  • No.

  • The car currently drives me around Austin most of the time with no interventions.

  • The Department of Justice is investigating Tesla for its marketing of Autopilot.

  • Tesla disputes claims they misled the public about the car's capabilities.

  • Janelle Hendrickson's case against Tesla is set to go to trial in 2025.

  • People are still gonna buy Tesla.

  • They're still gonna support Elon Musk.

  • Doesn't matter how many accidents there are.

  • But at least understand your car and understand your car's capability before you just put your entire life behind that.

  • Computer vision is such a deeply flawed technology.

  • And we're not doing enough to determine how to fix its gaps and how to make it recognize objects.

  • I look into my crystal ball.

  • You know, what do I see coming?

  • I see the car doing most of the driving for you while requiring you to pay attention to make sure nothing bad happens.

  • I don't think this is going to be a long-term technology that we're gonna keep in the cars.
