AAA study says Americans have grown more afraid of self-driving cars. Good!

The AAA this morning released its annual survey of Americans’ attitudes toward vehicle automation. And while there’s strong consumer interest in advanced driver assistance systems, apprehension has grown regarding full autonomy — 68% of those surveyed are outright afraid of self-driving cars, up 13 percentage points from last year. Yes, “afraid” is the word AAA used.

“We were not expecting such a dramatic decline in trust from previous years,” said Greg Brannon, AAA’s head of automotive research. “Although with the number of high-profile crashes that have occurred from over-reliance on current vehicle technologies, this isn’t entirely surprising.”

He’s likely talking about Teslas. The brand dominates autonomy-related crashes reported to NHTSA, for the simple reason that there are more of them so-equipped on the road. And while Tesla crashes may not be numerous in the context of U.S. car crashes as a whole, their technology’s seeming affinity for, among other things, slamming into the back of parked emergency vehicles tends to leave an impression on the public.

 

AAA says the public has also been hoodwinked by the marketing terms used for driver-assist suites. The survey found that “nearly one in 10 drivers believe they can buy a vehicle that drives itself while they sleep.” Wake up, people, you can’t buy a vehicle that offers that. The big sleep, though, definitely.

AAA has long blamed names such as Autopilot, ProPILOT, Pilot Assist and the whopper of them all, “Full Self-Driving,” for the fact that 22% of Americans expect driver assists to chauffeur them without supervision. There’s deception here, sure, but there’s also a whole lot of willful ignorance.

The 68% who are afraid might honestly not be scared enough. Even with advanced driver-assist features like adaptive cruise — features that six in 10 survey respondents find desirable — you’re wise to be ever-vigilant.


That’s because the very nature of these systems makes constant vigilance hard, as Dr. Missy Cummings will tell you. Fresh off a New York Times profile, the George Mason University engineering professor, former NHTSA senior adviser for safety, and longtime autonomy and robotics researcher delivered a lecture last week at the University of Michigan’s Center for Connected and Automated Transportation. We’ve attached a video, above, and as college lectures go, it’ll hold your attention.

First, let’s just interject here that when Cummings was appointed to her role at NHTSA, Elon Musk called her “extremely biased against Tesla,” which had the effect of siccing his fans on her — she received death threats from Tesla-stans, and her family had to move out of their home for a time. Now, for a credible understanding of how human beings interact with high-performance technology, who are you going to believe, an auto company CEO, or an Annapolis grad who was one of the Navy’s first female fighter pilots? Only one of these human beings can land an F/A-18 on an aircraft carrier. 

When she went to NHTSA, Cummings said, she had “been complaining about NHTSA for years” and relished the chance to attempt fixing it. Musk is no fan of the agency either, so you’d think the enemy of Elon’s enemy is his friend. And the two of them presumably share a common goal: safer cars. Musk might even appreciate her sense of humor. When at Duke University, she named its Humans and Autonomy Lab — HAL.

Yet she had to have a security assessment done on the venue where she gave the U of M lecture. “Trust me, everyone hates me today,” she told the audience. “All the manufacturers, NHTSA, my 15-year-old daughter, everyone hates me.” (Her daughter has her learner’s permit; imagine being taught to drive by one of the foremost authorities on vehicle safety, your mom.)

But, Cummings says, “I’m good with that. I don’t mind being the bad cop because I do think we’re at the most important time in history since we figured out brake lights and headlights. I think that we are in the scariest time in transportation with autonomy and technology.


“I think we’re making some big mistakes.”

Her fat-chance goal is for the perpetually put-upon and glacially paced NHTSA to impose some order on the Wild West of these technologies. Musk put an embryonic FSD on the streets because he could. There was nothing to stop Tesla from beta-testing the system among an unwitting public.

And in an analysis of driver-assist systems from GM, Ford, Tesla and others that Cummings sent NHTSA last fall, she determined that in fatal crashes involving these systems, the car was traveling over the speed limit 50% of the time; in crashes with serious injuries, 42% of the time. In crashes that did not involve ADAS, those figures were 29% and 13%. So one simple solution would be for NHTSA to mandate speed limiters on these systems. “The technology is being abused by humans,” she told the Times. “We need to put in regulations that deal with this.”

The technology by definition lulls you. Cummings described an experiment at Duke in which she and other researchers placed 40 test subjects behind the wheel of a driving simulator for a four-hour “trip” using adaptive cruise. At the 2½-hour mark, a moose ambled slowly across the road. Only one test subject had the presence of mind to avoid the moose — the other 39 clobbered it.

Autopilot, Super Cruise, Blue Cruise … she’s grown to dislike systems that are, or can be tricked into being, hands-free. “They put it on, whatever version of ‘autopilot’ they have, and then they relax. They relax, because indeed, that’s what they’ve been told.” They might be paying attention “for the most part,” she said. For the most part is not enough.

As for full autonomy, Cummings laid out the learning curve that humans — and now technology — have to climb, from first acquiring a basic skill to ultimately reaching full expertise, the point at which we’ve mastered the expert reasoning that tells us when to break a rule to get out of an unusual situation safely. Handling that uncertainty is the hump that technology, which is inherently rules-based, might never get over.


She showed how an autonomous vehicle was stopped in its tracks during testing because it interpreted a movers’ truck as not just a truck, but as a collection of a truck, four poles, traffic signs, a fence, a building, a bus, and “a gigantic person who was about to attack.”

That’s an eight-year-old example, she admits, but “still very much a problem,” as illustrated by the now-infamous phantom-braking crash in the Bay Bridge tunnel in San Francisco last Thanksgiving, in which the driver of a Tesla blamed “Full Self-Driving” for mysteriously changing lanes and then slamming on the brakes, resulting in an eight-car pileup that injured a 2-year-old child.

“And for all you Tesla fanboys who are jumping on Twitter right now so you can attack me, I’m here to tell you that it’s not just a Tesla problem,” she said. “All manufacturers who are dealing in autonomy are dealing with this problem” — her next examples involving GM Cruise cars in San Francisco, including one that apparently attempted to drive through an active firefighting scene. “San Francisco, oh boy, they’re at their wits’ end with Cruise.”

But for all the problems, there’s still some upside. “Even though I just complained a lot about Cruise, I am in awe of Cruise and of Waymo and of all the other car companies out there that have not had any major crashes. They have not killed anybody since the Uber issue. So I am very amazed … I think the self-driving community has done a very good job of policing themselves.”

Those are just some highlights. It’s a fascinating lecture from someone who seems to be sincerely working to keep you and me safe. And it’s well worth an hour of your time.

Meanwhile, take a cue from AAA and Missy Cummings: Don’t trust, don’t let your guard down, don’t relax behind the wheel — that’s never been more true than it is now.