What the Tesla Autopilot Accidents Tell Us About Trusting Our Cars

A few years ago I was backing an SUV into a parking spot in a Boston garage, listening for the telltale beeps of the ultrasonic parking sensors to tell me I was getting too close. Even though the parking-assist system was active, the chirps never came, so I got out to see what was the matter.

Good thing I did, because I was about six inches from slamming the top of the rear hatch into an overhead concrete beam about five feet off the ground. The wall at bumper height was still at least 10 feet back, hence the silence from the sensor-driven system. I was right not to trust it, because it wasn't designed to look up.


Last week we learned of the first fatal accident involving a Tesla operating under the brand’s lauded Autopilot system; a Model S struck the broadside of a tractor-trailer. (Autopilot was blamed for another accident this week, this one involving a Model X.) As the accident demonstrates, Tesla has to deal with the same kind of trust problem. Or, rather, an over-trust problem.

Our condolences for the tragic loss https://t.co/zI2100zEGL

— Elon Musk (@elonmusk) June 30, 2016


We're at a dangerous moment in vehicular autonomy. Our cars can't drive themselves in every scenario; they're filled with imperfect driver aids that still require human oversight. Often these systems are so good that we trust them too much: enough to induce the sort of dependence that seems to have factored into the Tesla crash.

The problem is the handoff. Your Tesla may indeed be able to drive 200 miles on its own, when the roads are good and the lines are clear. But what happens on the 201st mile, when a construction zone scrambles the lane markings and there's a hunk of blown-out tire carcass dead smack in your lane? This is the exact sort of situation that punishes complacency. It's hard not to let your attention wander when you've spent the past hour, or three, playing the role of passenger as your car tracks straight and true down the highway. How can you be truly ready the moment Autopilot needs you to take over?

When an autonomous car abruptly quits driving, or simply makes a mistake, there's a moment when you're worse off than if you had no driver assist technologies at all. That's why Volvo is planning a pilot program in Gothenburg, Sweden, designed to avoid this very moment altogether. One hundred XC90 drivers will test a system that handles all the driving, not just part of it. If, for whatever reason, the car has a sensor failure or can't understand its surroundings, it'll find a safe place to pull over. That scenario is vastly preferable to what we have now, where lane-keeping and active cruise control systems will essentially throw up their virtual arms and storm out of the room while the car continues down the road at 70 mph.

Today, our best systems operate as backstops against human error rather than the other way around. I can attest to that. Last year I was driving an Audi RS7 when I took a sidelong glance out the window at the exact moment the car in front of me slammed on its brakes. This prompted the Audi to slam on its brakes without my help, averting a certain rear-end collision. The system was running in the background until it sprang to life and saved the day. That’s where our current technology excels—adding a dab of steering if you wander out of your lane, or hitting the brakes if you’re late to notice a traffic slowdown. But the limitations are pretty obvious.

Tesla told regulators about Autopilot crash nine days after accident https://t.co/FygqG22Q70

— Reuters Top News (@Reuters) July 6, 2016

Once while driving in L.A., I intervened when Volvo’s Pilot Assist tried to follow a Scion xB up an off-ramp like a wayward puppy wandering after a scent. I’ve had Lincoln’s automated perpendicular parking system attempt to back into a spot that was already occupied. And I’ve lost count of the lane-keeping systems that canceled their assistance after deeming me insufficiently involved. That’s the paradox of the handoff. “Say, you haven’t touched the steering wheel for 15 seconds, so I’m going to quit helping you steer. Let’s hope you notice!”

Better hardware and programming will solve some of these problems, like the need to scan for obstacles at windshield height. More issues will be solved by experience, as the semi-autonomous fleet accrues more miles. The real problem, intractable as ever, is human behavior. The issue is not that we don’t trust our cars to take the wheel. It’s that even with autonomous technologies barely a few years old, we already trust them too much.
