Who’s Driving? Autonomous Cars May Be Entering the Most Dangerous Phase

Inside a Tesla Model S equipped with Autopilot in Palo Alto, California. Photograph: Bloomberg via Getty Images

Autopilot controls are not yet fully capable of functioning without human intervention – but they’re good enough to lull us into a false sense of security


This article titled “Who’s driving? Autonomous cars may be entering the most dangerous phase” was written by Olivia Solon for theguardian.com on Wednesday 24 January 2018 08.01 UTC

When California police officers approached a Tesla stopped in the centre of a five-lane highway outside San Francisco last week, they found a man asleep at the wheel. The driver, who was arrested on suspicion of drunk driving, told them his car was in “Autopilot”, Tesla’s semi-autonomous driver-assist system.

In a separate incident this week, firefighters in Culver City reported that a Tesla rear-ended their fire truck while it was parked at the scene of a freeway accident. Again, the driver said the vehicle was in Autopilot.


The oft-repeated promise of driverless technology is that it will make the roads safer by reducing human error, the primary cause of accidents. However, automakers have a long way to go before they can eliminate the driver altogether.

What’s left is a messy interim period when cars are being augmented incrementally with automated technologies such as obstacle detection and lane centering. In theory, these can reduce the risk of crashes, but they are not fail-safe. As a Tesla spokeswoman put it: “Autopilot is intended for use only with a fully attentive driver.”

However, research has shown that drivers get lulled into a false sense of security to the point where their minds and gazes start to wander away from the road. People become distracted or preoccupied with their smartphones. So when the car encounters a situation where the human needs to intervene, the driver can be slow to react.

At a time when there is already a surge in collisions caused by drivers distracted by their smartphones, we could be entering a particularly dangerous period of growing pains with autonomous driving systems.

“People are already inclined to be distracted. We’re on our phones, eating burgers, driving with our knees,” said Nidhi Kalra, senior information scientist at the Rand Corporation. “Additional autonomy gives people a sense that something else is in control, and we have a tendency to overestimate the technology’s capabilities.”

Steven Shladover, of the University of California, Berkeley’s PATH programme, was more sharply critical of car manufacturers: “These companies are overselling the capabilities of the systems they have and the public is being misled.”

Waymo, Google’s self-driving car spin-off, discovered the handoff problem when it was testing a “level 3” automated driving system – one that can drive itself under certain conditions, but in which the human still needs to take over if the situation becomes tricky. The next level, four, is what most people consider “fully autonomous”.

Most of the advanced driver assist features introduced by Tesla, Mercedes, BMW and Cadillac are categorised as level 2 automation.

During testing, Waymo recorded what its CEO, John Krafcik, described as “sort of scary” video footage of drivers texting, applying makeup and even sleeping behind the wheel while their cars hurtled down the freeway. This led Waymo to decide to leapfrog level 3 automation altogether, and focus on full autonomy instead.

“We found that human drivers over-trusted the technology and were not monitoring the roadway carefully enough to be able to safely take control when needed,” said the company in its 2017 safety report.

Ian Reagan from the Insurance Institute for Highway Safety (IIHS) shares Waymo’s caution, although he acknowledges that the safety potential for automated systems is “huge”.

“There are lots of potential unintended consequences, particularly with level 2 and 3 systems,” he said, explaining how the IIHS had bought and tested several cars with level 2 automation including vehicles from Tesla, Mercedes and BMW. “Even the best ones do things you don’t expect,” he said.

During tests, the IIHS recorded a Mercedes having problems when a highway lane forked in two. “The radar system locked onto the right-hand exit lane when the driver was trying to go straight,” he said.

Tesla’s Autopilot suffered from a different, repeatable glitch that caused it to veer into the guardrail when approaching the crest of a hill. “If the driver had been distracted, that definitely would have caused a crash,” he said.

Concern over this new type of distracted driving is forcing automakers to introduce additional safety features to compensate. For example, GM has introduced eye-tracking technology to check that the driver’s eyes are on the road, while Tesla drivers can be locked out of Autopilot if they ignore warnings to keep their hands on the steering wheel.

That hasn’t stopped some enterprising owners from finding a way to trick the Autopilot warning system by wedging an orange or a water bottle into the steering wheel.

In spite of these problems, Tesla’s CEO, Elon Musk, remains bullish about his company’s autonomous technology, even suggesting that by 2019 drivers would be able to sleep in their cars – presumably without being arrested by highway patrol officers.
