Adjusting Our Hopes for Self-Driving Technology
Many who read the news last week about the first fatality involving a self-driving Tesla reacted with pessimism about the technology’s future, but we remain optimistic. We’ve predicted in earlier blogs that we’ll see many truck and car accidents involving autopilot technology, and a great deal of legal wrangling over what should be allowed on our highways. But the National Highway Traffic Safety Administration is working to expedite the safe adoption of this technology because the evidence indicates that, even though there will be failures, autopilot technology should help reduce accidents and make our highways safer.
One key fact is that 94% of crashes are caused by human error.
Autopilot technologies are already helping reduce those errors through features such as automatic braking assist and protective warnings. If implemented correctly, these systems will become increasingly accurate and powerful aids to the driver.
The key is that the technology is not yet ready for a full handoff of responsibility. Tesla has apparently warned drivers that “the feature is not for all conditions and not sophisticated enough for the driver to check out.” When used in conjunction with driver oversight, the company estimates that self-driving mode reduces the risk of an accident by 50%.
The driver who was killed, an ex-Navy SEAL named Joshua Brown, was apparently a huge fan of his Tesla and its technology, but it appears that he may have trusted it too much. Perhaps it would be best if we changed the term from “self-driving” to “assisted-driving” to keep the balance of responsibilities in its proper place. For more information on “assisted-driving” cars, here’s a good Q&A from the Richmond Times-Dispatch.