No matter how smart or visionary someone may appear on the surface, it’s important to remember that in the end they are human and flawed. Just because they manage to succeed in some areas doesn’t mean they can do the same in everything they try. That’s why it’s crucial to avoid putting people on too high a pedestal, because it will hurt when they come crashing down. A new documentary film from the New York Times illustrates the point.
At a running time of 74 minutes, “Elon Musk’s Crash Course” tries to cover a lot of ground, but it ultimately spends too much time on some topics and not enough on the most important ones. Director Emma Schwartz opens the film with a look at the origins of Tesla and at efforts to develop automated vehicles going back to a GM promotional film from 1956 featuring the Firebird II concept and the DARPA challenges of the 2000s.
However, where the film goes wrong is in spending too much time on Joshua Brown. Brown was the former Navy bomb disposal expert who was the first known fatality while using Tesla’s AutoPilot driver assist system. Significant portions of the film are given over to Brown’s friends explaining why he was so enamored with AutoPilot that he put more than 45,000 miles on his Model S in the nine months between the release of AutoPilot and his May 2016 death.
While New York Times reporters Neal Boudette and Cade Metz do an able job of explaining some of the limitations of AutoPilot, most of the people who should be watching a film like this probably have little or no technical knowledge of how automated driving works. Those members of the general public would greatly benefit from a solid primer that cuts through the many misperceptions around automated driving, many of which have been promulgated over the years by Tesla CEO Musk himself.
Viewers could have learned some important lessons about what it actually takes to create, test and validate an automated vehicle if the film had included interviews with the likes of Carnegie Mellon University engineering professor Philip Koopman or Ann Arbor-based attorney Jennifer Dukarski. Instead we get a lot of words from friends of Brown explaining how interested he was in technology and why he was so adamant about testing the limits of his car.
To the credit of the filmmakers, they did include interviews with a pair of former engineers on the AutoPilot team, members of the National Transportation Safety Board (NTSB) including former chair Robert Sumwalt, and Bryan Thomas, former communications director of the National Highway Traffic Safety Administration (NHTSA).
Raven Jiang and JT Stukes both acknowledged the limitations of AutoPilot. “We did want to try to make AutoPilot safe,” said Jiang.
“At the time of that crash, I was aware that people were trusting the system to do things that it was not designed or capable of doing,” added Stukes. “The fact that that sort of accident happened is obviously tragic, but it was something that was going to happen.”
One reason it was likely to happen is that Tesla decided to go with a camera-first, and now camera-only, system. “There was no deep research phase where various vehicles were outfitted with a range of sensors; many team members would have liked that. Instead, the conclusion was made first and then the test and development activities began to prove that conclusion correct.”
Over the past 15 years, Musk has pushed forward many great developments, advancing rocketry and popularizing EVs and software-defined vehicles. But when it comes to safety-critical systems, there are no shortcuts. When lives are at stake, it is the responsibility of those creating technologies to take due care. When they fail to do so, as Musk has repeatedly done with AutoPilot and Full Self-Driving, it is the responsibility of safety regulators to rein those people in.
This is where I see the biggest failing of this film. The NTSB has responsibility for investigating transportation accidents of all sorts, including aviation, rail, marine and ground vehicles. In the wake of the Brown crash, the NTSB made many excellent recommendations, including requiring more robust driver monitoring for systems that still require human supervision, such as Tesla AutoPilot/FSD, GM’s Super Cruise and every other system on the road today. The NTSB also recommended that such systems be geofenced to the roads where they can operate safely.
NHTSA is the agency with regulatory and enforcement authority over the auto industry. In the six years since Brown’s death, there have been numerous other fatal crashes involving AutoPilot misuse, but NHTSA has done nothing to implement any of the NTSB’s recommendations. Only in the last 12 months, since the change of administration in Washington, has NHTSA even started to do any serious data gathering on partially automated systems, and it’s anyone’s guess as to when something concrete will be done.
The film completely drops the ball on looking into why NHTSA has done nothing to ensure that partial automation systems are actually verified to be safe and effective. If anything, the agency has enabled Tesla to keep selling a technology that doesn’t actually work, and to promote it more than ever. To be fair, other government agencies such as the Federal Trade Commission could also be trying to address at least the marketing side of this problem, but the film doesn’t look into the failings of any government officials.
In a capitalist system, it’s not at all unusual for companies to try to get away with as much as they possibly can in the pursuit of profit. It’s the job of governments to erect the guardrails needed to protect the public from unscrupulous companies. It’s the job of journalists to shine a light on all of the failings in the process so we as a populace can be aware and hold everyone to account. “Elon Musk’s Crash Course” had an opportunity to do much more with this story and squandered it. More needs to be done to tell the truth about what is happening with all of the parties involved in this tragic tale.
Source: https://www.forbes.com/sites/samabuelsamid/2022/05/23/elon-musks-crash-coursetakes-a-cursory-look-at-engineering-and-regulatory-failure/