Feds: Tesla Autopilot Sped up Prior to Crash
Date: 06/11/2018
Tags: @Tesla @NTSB #autopilot @Apple @Waymo #Waymo #psd

The fatal Tesla crash just got more bizarre – it turns out the semi-autonomous SUV accelerated before the accident, making no effort to brake or change course. That's according to the National Transportation Safety Board's preliminary report. The March 23 crash killed an Apple engineer, Walter Huang, 38, and Tesla's semi-autonomous "Autopilot" was engaged at the time of the accident – though for the last 6 seconds prior to the crash, the driver's hands weren't detected on the wheel (Autopilot urges the driver to maintain contact with the wheel and issues several audio and visual warnings if contact is broken for too long). And the semi-autonomous system either malfunctioned or made a bizarre attempt to avoid the crash. According to the NTSB's report, "At 3 seconds prior to the crash and up to the time of impact with the crash attenuator, the Tesla's speed increased from 62 to 70.8 mph, with no precrash braking or evasive steering movement detected."

Of course, this runs completely counter to the normal human reaction, which is to slam on the brakes – not necessarily the right maneuver, but we're highly emotional creatures. Humans are prone to panic – the exact reason autonomous vehicles should, in theory, be much safer than human drivers. PC World discussed a fascinating case in which a Waymo self-driving vehicle may have scared the bejeezus out of its riders even though its maneuver was perfectly safe. In that example, the autonomous vehicle detected a car on the right side of an intersection about to run a red light, and rather than slow down, the self-driving car sped up (it still had the right of way, after all). To the human occupants, this was probably terrifying, but from a rational perspective, the AI made the right move.

This could have been what the Tesla Autopilot was attempting, but we'll never know. To date, Tesla hasn't said whether its semi-autonomous system functioned properly, and the data alone doesn't settle the question – accelerating could simply have been the best option under the circumstances. The NTSB's further findings, and Tesla's own conclusions, could have huge ramifications for self-driving vehicles. Missteps like this were inevitable, and how they're handled could either delay the technology almost indefinitely or provide valuable insight into how to avoid them in the future.

Read more here: https://www.ntsb.gov/investigations/AccidentReports/Pages/HWY18FH011-preliminary.aspx
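
As a rough back-of-the-envelope illustration (this calculation is ours, not the NTSB's), the speed change in the quote above, 62 to 70.8 mph over roughly three seconds, works out to an average acceleration of about 1.3 m/s²:

```python
# Illustrative back-of-envelope math based on the NTSB's preliminary figures:
# the Tesla went from 62 mph to 70.8 mph over roughly the final 3 seconds.

MPH_TO_MS = 0.44704  # one mile per hour in metres per second

v_initial = 62.0 * MPH_TO_MS   # about 27.7 m/s
v_final = 70.8 * MPH_TO_MS     # about 31.7 m/s
duration = 3.0                 # seconds, per the preliminary report

avg_accel = (v_final - v_initial) / duration
print(f"Implied average acceleration: {avg_accel:.2f} m/s^2")  # roughly 1.31 m/s^2
```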