I literally just passed a variable message sign intermittently reading “Texting while driving is … (pause) … 23x more dangerous,” placed just before one of the most hazardous local maneuvers in the area, known as the Rochester Curve. Between that sign and the age-old cliché of “eyes on the road, hands on the wheel, mind on the drive,” it would seem society has gotten smarter about distracted driving.
Or so we thought …
A new study by the Insurance Institute for Highway Safety (IIHS) of people who owned vehicles with advanced driver assist features found that “…large percentages of users (53% of [GM’s] Super Cruise, 42% of [Tesla’s] Autopilot and 12% of [Nissan’s] ProPILOT Assist) indicated they were comfortable treating their systems as self-driving.” Self-driving cars are not presently available to consumers, despite misleading marketing from some manufacturers. The three aforementioned systems offer what’s called “partial automation”: the human driver must still handle many routine driving tasks, because these systems are not ready for unsupervised, widespread use.
Along those lines, the study reports that Super Cruise and Autopilot users were more likely to engage in activities that took their hands off the wheel and their eyes off the road. In fact, approximately 50% of Super Cruise users and 42% of Autopilot users “… reported triggering a ‘lockout’ of the technology at some point, which occurs when a driver fails to respond to attention warnings.” To date, all mainstream systems require the driver’s active supervision.
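The “lockout” the study describes is essentially an escalation policy: the system warns an inattentive driver and, if the warnings are ignored, disables the assist feature for the remainder of the drive. Here is a minimal sketch of such a policy in Python. The alert stages and timing thresholds are illustrative assumptions, not the actual Super Cruise or Autopilot logic, which is tuned per speed, road type, and sensor confidence.

```python
from dataclasses import dataclass
from enum import Enum, auto


class AlertLevel(Enum):
    """Escalating attention alerts, loosely modeled on published
    descriptions of driver monitoring. Stages are illustrative."""
    NONE = auto()
    VISUAL_WARNING = auto()
    AUDIBLE_WARNING = auto()
    LOCKOUT = auto()  # assist feature disabled for the rest of the drive


@dataclass
class AttentionMonitor:
    # Hypothetical thresholds (seconds of sustained inattention).
    visual_after: float = 4.0
    audible_after: float = 8.0
    lockout_after: float = 12.0
    inattentive_time: float = 0.0
    locked_out: bool = False

    def update(self, dt: float, eyes_on_road: bool, hands_on_wheel: bool) -> AlertLevel:
        """Advance the monitor by dt seconds; return the current alert level."""
        if self.locked_out:
            return AlertLevel.LOCKOUT
        if eyes_on_road and hands_on_wheel:
            self.inattentive_time = 0.0  # attentive driving resets the escalation
            return AlertLevel.NONE
        self.inattentive_time += dt
        if self.inattentive_time >= self.lockout_after:
            self.locked_out = True  # driver ignored all warnings: disable assist
            return AlertLevel.LOCKOUT
        if self.inattentive_time >= self.audible_after:
            return AlertLevel.AUDIBLE_WARNING
        if self.inattentive_time >= self.visual_after:
            return AlertLevel.VISUAL_WARNING
        return AlertLevel.NONE


if __name__ == "__main__":
    monitor = AttentionMonitor()
    # Simulate a driver who looks away and keeps hands off the wheel for 13 s.
    for second in range(13):
        level = monitor.update(dt=1.0, eyes_on_road=False, hands_on_wheel=False)
        print(f"t={second + 1:>2}s  {level.name}")
```

Note the one-way design: once lockout is reached, no amount of renewed attention re-enables the feature during that drive, which is the behavior the roughly half of Super Cruise users in the study had triggered at some point.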
A possible reason: some manufacturers were very liberal with their marketing and their executives’ public statements, which essentially encouraged drivers to treat these systems as autonomous. That led to some car owners, like Param Sharma of San Francisco, being recorded on the highway riding as a backseat passenger with no human in the front seat. Raj Mathai, a KNTV (NBC) news anchor in San Francisco, rightly described such behavior as “… very illegal.”
The study’s discoveries call into question whether basic engineering rigor (e.g., examining the Safety of the Intended Functionality, a.k.a. SOTIF) was appropriately applied to these designs, and whether the public understands the difference between Advanced Driver Assistance Systems (ADAS) and truly autonomous vehicles. “Partial automation systems may make long drives seem like less of a burden, but there is no evidence that they make driving safer,” says IIHS President David Harkey. “In fact, the opposite may be the case if systems lack adequate safeguards.”
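That ADAS-versus-autonomous distinction maps onto the SAE J3016 automation levels: through Level 2 (“partial automation,” where Super Cruise, Autopilot, and ProPILOT Assist sit), the human is always driving and supervising; only at Level 3 and above does the system itself take over the driving task under defined conditions. A rough summary in code form:

```python
# SAE J3016 driving automation levels, summarized. Super Cruise, Autopilot,
# and ProPILOT Assist are Level 2: the human must supervise at all times.
# Anything reasonably called "self-driving" begins at Level 3 and above.
SAE_LEVELS = {
    0: ("No automation", "human drives and supervises everything"),
    1: ("Driver assistance", "human drives; system assists steering OR speed"),
    2: ("Partial automation", "system steers AND controls speed; human supervises"),
    3: ("Conditional automation", "system drives in limited conditions; human must take over on request"),
    4: ("High automation", "system drives in limited conditions; no human fallback needed"),
    5: ("Full automation", "system drives anywhere, in all conditions"),
}

for level, (name, duty) in SAE_LEVELS.items():
    print(f"Level {level} ({name}): {duty}")
```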
As reported by The New York Times in June, the National Highway Traffic Safety Administration (NHTSA) upgraded “its preliminary evaluation of Autopilot to an engineering analysis [that] … will look at whether Autopilot fails to prevent drivers from diverting their attention from the road and engaging in other predictable and risky behavior while using the system.”
Meanwhile, eight short weeks later, Tesla released another Beta version of its software, tested with only 1,000 (lucky?) users due to “many major code changes.”
Maybe the variable message sign should be warning drivers about more than just texting.
Source: https://www.forbes.com/sites/stevetengler/2022/10/25/new-study-points-out-scary-driver-behavior-in-supposedly-safer-cars/