Videos of Tesla's Full Self-Driving beta software reveal flaws in system - The Washington Post
-
Each of these moments — captured on video by a Tesla owner and posted online — reveals a fundamental weakness in Tesla’s “Full Self-Driving” technology, according to a panel of experts assembled by The Washington Post and asked to examine the videos. These are problems with no easy fix, the experts said, where patching one issue might introduce new complications, or where the nearly infinite array of possible real-life scenarios is simply too much for Tesla’s algorithms to master.
-
The Post selected six videos from a large array posted on YouTube and contacted the people who shot them to confirm their authenticity. The Post then recruited a half-dozen experts to conduct a frame-by-frame analysis.
-
The experts include academics who study self-driving vehicles; industry executives and technical staff who work in autonomous-vehicle safety analysis; and self-driving vehicle developers. None work in capacities that put them in competition with Tesla, and several said they did not fault Tesla for its approach. Two spoke on condition of anonymity to avoid angering Tesla, its fans or future clients.
-
Several of the experts concluded from their analysis that, as currently designed, “Full Self-Driving” (FSD) could be dangerous on public roadways.
-
That the Tesla keeps going after seeing a pedestrian near a crosswalk offers insight into the type of software Tesla uses, known as “machine learning.” This type of software is capable of deciphering large sets of data and forming correlations that allow it, in essence, to learn on its own.
-
Tesla’s software uses a combination of machine-learning software and simpler software “rules,” such as “always stop at stop signs and red lights.” But as one researcher pointed out, machine-learning algorithms invariably learn lessons they shouldn’t. It’s possible that if the software were told to “never hit pedestrians,” it could take away the wrong lesson: that pedestrians will move out of the way if they are about to be hit, one expert said.
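The hybrid design described above can be illustrated with a minimal sketch. This is not Tesla's actual code; the function names and scene representation are hypothetical, and the "learned" model is a trivial stand-in, but it shows the architecture the experts describe: a learned policy proposes an action, and hand-written rules get the final say.

```python
# Illustrative sketch only -- not Tesla's implementation. All names
# (learned_policy, apply_rules, the scene dict) are hypothetical.

def learned_policy(scene):
    # Stand-in for a machine-learned model; in reality this would be a
    # neural network trained on large amounts of driving data.
    return "proceed"

def apply_rules(scene, proposed_action):
    # Simple hand-coded rules, such as "always stop at stop signs and
    # red lights," override whatever the learned model proposes.
    if scene.get("traffic_light") == "red" or scene.get("stop_sign"):
        return "stop"
    return proposed_action

def decide(scene):
    return apply_rules(scene, learned_policy(scene))

print(decide({"traffic_light": "red"}))    # -> stop (rule overrides)
print(decide({"traffic_light": "green"}))  # -> proceed (model's choice)
```

The design choice the article hints at is visible here: behaviors covered by explicit rules are predictable, while everything else depends on what the model happened to learn.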
-
Software developers could create a “rule” that the car must slow down or stop for pedestrians. But that fix could paralyze the software in urban environments, where pedestrians are everywhere.
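The trade-off in that last point can be made concrete with a toy simulation (hypothetical, not Tesla code): a hard rule that halts for any nearby pedestrian works on a quiet street but leaves the car frozen wherever pedestrians are continuously present.

```python
# Illustrative sketch only -- a naive "stop for any nearby pedestrian"
# rule, showing why it paralyzes a car in dense urban environments.

def step(position, pedestrians_within_range):
    # Advance one unit of distance unless the pedestrian rule fires.
    if pedestrians_within_range > 0:
        return position       # rule fires: car stays put
    return position + 1       # clear: car moves forward

# Quiet suburban street: an occasional pedestrian, car makes progress.
pos = 0
for peds in [0, 0, 1, 0, 0]:
    pos = step(pos, peds)
print(pos)  # 4 -- moved on 4 of 5 time steps

# Busy downtown block: someone is always within range, car never moves.
pos = 0
for peds in [3, 5, 2, 4, 6]:
    pos = step(pos, peds)
print(pos)  # 0 -- the car is paralyzed
```

Relaxing the rule (say, stopping only for pedestrians on a collision course) restores progress but reopens exactly the safety gap the rule was meant to close, which is the dilemma the experts describe.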