By B.N. Frank
Dangerous and deadly incidents involving Tesla vehicles have been and continue to be reported (see 1, 2, 3, 4, 5, 6, 7, 8, 9, 10). In fact, last month a lawsuit was filed by police officers injured by a Tesla being operated on Autopilot. In August, the U.S. government announced it had begun investigating problems associated with that feature. Earlier this week, there was more bad news for owners hoping to use it.
From Ars Technica:
Tesla pulls Full Self-Driving update after sudden braking spooks drivers
Automaker released another new version of its controversial software today.
Tesla’s Full Self-Driving software lived up to its “beta” label this weekend.
On Saturday morning, Tesla CEO Elon Musk announced a delay for the 10.3 update after internal quality-assurance testers discovered that the new version performed worse at left turns at traffic lights than previous versions. Then, on Sunday afternoon, Musk said that Tesla would be “rolling back to 10.2 temporarily” after reports of sudden braking, false warnings, and other issues.
Several owners reported that their vehicles braked suddenly when the software mistakenly reported an imminent collision. The feature, known as automatic emergency braking (AEB), is not unique to Tesla, and neither are its bugs: Mazda recalled some of its cars in 2019 for similar problems.
When working properly, forward-collision warning and automatic emergency braking work in concert to prevent a vehicle from crashing into objects in its path. A combination of sensors and software first alerts drivers to an impending collision and, if no action is taken, forcefully applies the brakes.
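To make that escalation concrete, here is a minimal, hypothetical sketch in Python of how a forward-collision warning might precede automatic braking. The time-to-collision thresholds and function names are invented for illustration and do not represent Tesla's (or any automaker's) actual system.

```python
# Hypothetical sketch of the forward-collision-warning / automatic-emergency-braking
# escalation described above. Thresholds and names are illustrative only.

WARNING_TTC_S = 2.5   # time-to-collision below which the driver is warned
BRAKING_TTC_S = 1.0   # time-to-collision below which the brakes are applied


def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact at the current closing speed (infinite if not closing)."""
    if closing_speed_mps <= 0:
        return float("inf")
    return distance_m / closing_speed_mps


def collision_response(distance_m: float, closing_speed_mps: float) -> str:
    """Escalate from no action, to a warning, to emergency braking."""
    ttc = time_to_collision(distance_m, closing_speed_mps)
    if ttc <= BRAKING_TTC_S:
        return "apply_emergency_braking"
    if ttc <= WARNING_TTC_S:
        return "forward_collision_warning"
    return "no_action"


if __name__ == "__main__":
    # A phantom detection (a wrong distance or speed estimate) follows the same path,
    # which is how a false alert can end in sudden braking.
    print(collision_response(distance_m=8.0, closing_speed_mps=10.0))   # braking
    print(collision_response(distance_m=20.0, closing_speed_mps=10.0))  # warning
```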
Cars that brake hard in response to phantom alerts risk being rear-ended, particularly in dense traffic or at highway speeds. “Having AEB activate in the middle of the highway was terrifying,” one Tesla driver reported on Reddit.
While false automatic emergency braking events and false forward-collision warnings appear to be the most widespread issues, other drivers reported false lane-departure warnings and cases in which both forward-collision warning and automatic emergency braking were disabled without notice.
Tesla has offered Full Self-Driving as an expensive option on its vehicles, with people paying up to $10,000 for the prospect of gaining access to the software when it’s released. But years have elapsed since the option was first introduced, and only recently has a select group of owners been able to access a beta version of the feature.
Tesla distributed Full Self-Driving to an initial wave of 2,000 drivers, and it has been expanding access to limited groups based on a driver’s “safety score,” which the company calculates from telemetry data gathered by its vehicles. The first expansion gave access to drivers with scores of 100 out of 100, while the next group had scores of 99.
Today, the automaker released another update, 10.3.1, and began pushing it to eligible vehicles.
One Reddit user suggested that the most widespread bug might stem from a conflict between the automatic emergency braking system and Tesla’s Sentry Mode, a security system that monitors the vehicle when parked. If true, that would be a concerning development, because it would mean a bug in a noncritical feature interfered with a safety system. Automakers typically keep safety systems separate from noncritical features, in part because safety-rated software is significantly more expensive to develop.
NTSB letter
Tesla is also under pressure to address two safety recommendations made by the National Transportation Safety Board more than four years ago. The agency made them after investigating the crash that killed Joshua Brown, whose Model S broadsided a semi trailer in his path. The agency determined that Brown drove on Autopilot for extended periods without his hands on the wheel and that he was using the feature on roads for which it wasn’t designed. The NTSB has asked Tesla to limit when and where Autopilot can be used and to implement stricter monitoring for driver attentiveness.
“Our crash investigations involving your company’s vehicles have clearly shown that the potential for misuse requires a system design change to ensure safety,” NTSB Chairwoman Jennifer Homendy wrote. “If you are serious about putting safety front and center in Tesla vehicle design, I invite you to complete action on the safety recommendations we issued to you four years ago.”
Autopilot is a Level 2 advanced driver-assistance system, meaning that drivers must monitor the vehicle at all times. Currently, most automakers, including Tesla, test for driver engagement by periodically checking whether torque is being applied to the steering wheel. Other systems, like GM’s Super Cruise and Ford’s BlueCruise, use eye tracking. No system, though, is foolproof.
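As a rough illustration of the difference between those two monitoring approaches, here is a minimal, hypothetical sketch in Python. The torque threshold, field names, and functions are invented for this example and do not describe any automaker’s actual implementation.

```python
# Hypothetical illustration of the two driver-engagement checks mentioned above.
# Thresholds and field names are invented; production systems are far more involved.

from dataclasses import dataclass


@dataclass
class DriverState:
    steering_torque_nm: float  # torque the driver is applying to the wheel
    eyes_on_road: bool         # gaze estimate from a driver-facing camera, if present


def torque_based_check(state: DriverState, threshold_nm: float = 0.3) -> bool:
    """Wheel-torque approach: treat any nudge above a small threshold as engagement."""
    return abs(state.steering_torque_nm) >= threshold_nm


def eye_tracking_check(state: DriverState) -> bool:
    """Camera-based approach: require the driver's gaze to be on the road."""
    return state.eyes_on_road


if __name__ == "__main__":
    # A driver holding the wheel but looking away passes the torque check while
    # failing the eye-tracking check, one reason no single method is foolproof.
    distracted = DriverState(steering_torque_nm=0.5, eyes_on_road=False)
    print(torque_based_check(distracted), eye_tracking_check(distracted))  # True False
```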
The NTSB’s letter does not seem to have affected Tesla’s stock price, which soared to a record high and sent its market cap over $1 trillion for the first time, perhaps on the news that rental car company Hertz placed an order for 100,000 Model 3s.
Activist Post reports regularly about Teslas and other unsafe technology. For more information, visit our archives.