A Tesla in full self-driving mode makes a left turn out of the middle lane on a busy San Francisco street. It jumps into a bus lane where it’s not meant to be. It turns a corner and nearly plows into parked vehicles, causing the driver to lurch for the wheel. Car reviewer AI Addict captured these scenes, and other videos like them are cropping up on YouTube. One might say that these are all mistakes any human on a cell phone might have made. But we expect more from our AI overlords.
Earlier this month, Tesla began sending out over-the-air software updates for its Full Self-Driving (FSD) beta version 9 software. This advanced driver-assist system relies only on cameras, rather than cameras and radar like Tesla’s previous ADAS systems.
In reaction to videos displaying unsafe driving behavior, like unprotected left turns, and other reports from Tesla owners, Consumer Reports issued a statement on Tuesday saying the software upgrade does not appear to be safe enough for public roads and that it would independently test the software update on its Model Y SUV once it receives the necessary software updates.
The consumer organization said it’s concerned that Tesla is using its existing owners and their vehicles as guinea pigs to test new features. Making that point for them, Tesla CEO Elon Musk urged drivers not to be complacent while driving because “there will be unknown issues, so please be paranoid.” Many Tesla owners know what they’re getting into because they signed up for Tesla’s Early Access Program, which delivers beta software in exchange for feedback. But other road users have not consented to such trials.
Tesla’s updates are shipped out to drivers all over the country. The electric vehicle company did not respond to a request for more information about whether it considers self-driving regulations in specific states — 29 states have enacted laws related to autonomous driving, but they differ widely from state to state. Other self-driving technology companies like Cruise, Waymo, and Argo AI told CR they test their software on private tracks or use trained safety drivers as monitors.
“Car technology is advancing quickly, and automation has a lot of potential, but policymakers need to step up to get strong, sensible safety rules in place,” said William Wallace, manager of safety policy at CR, in a statement. “Otherwise, some companies will treat our public roads as if they were private proving grounds, with little holding them accountable for safety.”