Inside the final seconds of a deadly Tesla Autopilot crash - Washington Post
www.washingtonpost.com/...tesla-autopilot-crash-analysis
tesla self-driving automation autopilot culture policy
shared by Javier E on 07 Oct 23
- In a Riverside, Calif., courtroom last month, in a lawsuit involving another fatal crash in which Autopilot was allegedly involved, a Tesla attorney held a mock steering wheel before the jury and emphasized that the driver must always be in control. Autopilot “is basically just fancy cruise control,” he said.
- Tesla CEO Elon Musk has painted a different reality, arguing that his technology is making the roads safer: “It’s probably better than a person right now,” Musk said of Autopilot during a 2016 conference call with reporters.
- In a different case involving another fatal Autopilot crash, a Tesla engineer testified that a team specifically mapped the route the car would take in the video. At one point during testing for the video, a test car crashed into a fence, according to Reuters. The engineer said in a deposition that the video was meant to show what the technology could eventually be capable of — not what cars on the road could do at the time.
- NHTSA said it has an “active investigation” of Autopilot. “NHTSA generally does not comment on matters related to open investigations,” NHTSA spokeswoman Veronica Morales said in a statement. In 2021, the agency adopted a rule requiring carmakers such as Tesla to report crashes involving their driver-assistance systems. Beyond the data collection, though, there are few clear legal limitations on how this type of advanced driver-assistance technology should operate and what capabilities it should have.
- “Tesla has decided to take these much greater risks with the technology because they have this sense that it’s like, ‘Well, you can figure it out. You can determine for yourself what’s safe’ — without recognizing that other road users don’t have that same choice,” former NHTSA administrator Steven Cliff said in an interview. “If you’re a pedestrian, [if] you’re another vehicle on the road,” he added, “do you know that you’re unwittingly an object of an experiment that’s happening?”
- Banner researched Tesla for years before buying a Model 3 in 2018, his wife, Kim, told federal investigators. Around the time of his purchase, Tesla’s website featured a video showing a Tesla navigating the curvy roads and intersections of California while a driver sits in the front seat, hands hovering beneath the wheel. The video, recorded in 2016, is still on the site today. “The person in the driver’s seat is only there for legal reasons,” the video says. “He is not doing anything. The car is driving itself.”
- Musk made a similar assertion about a more sophisticated form of Autopilot called Full Self-Driving on an earnings call in July. “Now, I know I’m the boy who cried FSD,” he said. “But man, I think we’ll be better than human by the end of this year.”
- While the video concerned Full Self-Driving, which operates on surface streets, the plaintiffs in the Banner case argue Tesla’s “marketing does not always distinguish between these systems.”
- Not only is the marketing misleading, plaintiffs in several cases argue; the company also gives drivers a long leash in deciding when and how to use the technology. Though Autopilot is supposed to be enabled only in limited situations, it sometimes works on roads it’s not designed for. It also allows drivers to go for short periods without touching the wheel and to set cruising speeds well above posted speed limits.
- Identifying semi-trucks is a particular deficiency that engineers have struggled to solve since Banner’s death, according to a former Autopilot employee who spoke on the condition of anonymity for fear of retribution.
- Tesla complicated the matter in 2021 when it eliminated radar sensors from its cars, The Post previously reported, making vehicles such as semi-trucks appear two-dimensional and harder to parse.
- “If a system turns on, then at least some users will conclude it must be intended to work there,” Koopman said. “Because they think if it wasn’t intended to work there, it wouldn’t turn on.” Andrew Maynard, a professor of advanced technology transitions at Arizona State University, said customers probably just trust the technology. “Most people just don’t have the time or ability to fully understand the intricacies of it, so at the end they trust the company to protect them,” he said.