What's in a name? When it comes to trusting self-driving technology, a lot - Action News


Drivers are far too trusting of self-driving technology, making them too comfortable with distractions behind the wheel, a new report says. And a major reason why seems to be the names that car companies give to their rudimentary autonomous systems.

Auto safety group polled drivers about what distractions they would be OK with in a 'self-driving' car

An employee drives a Tesla Model S hands-free on a highway in Amsterdam in 2015. The company is seen as a leader in self-driving technology, but does advise that drivers using its Autopilot system be ready to take over the wheel at all times. (Jasper Juinen/Bloomberg)

Drivers are far too trusting of self-driving technology, making them too comfortable with distractions behind the wheel, and a major reason why seems to be the names that car companies give to their rudimentary autonomous systems.

That's one of the main takeaways from a report released Thursday by the Insurance Institute for Highway Safety, a U.S. watchdog agency dedicated to making driving safer.

The report is based on data collected from a telephone poll commissioned by the group last fall, in which 2,005 American drivers were asked whether they thought certain behaviours would be acceptable behind the wheel, provided the driver was in a car equipped with different types of self-driving technologies.

Crucially, respondents weren't told what the systems did or what company produced them; rather, they were just told the following driver-assist systems were in place: Autopilot, Traffic Jam Assist, Super Cruise, Driving Assistant Plus and ProPilot Assist.

All of those systems are currently available on cars sold in the U.S., and all have similar capabilities: adjusting speed, maintaining safe following distances and providing some level of automatic steering.

They are also all considered to be Level 2 systems on the industry's six-level grading scale, where a zero would be a car with no assistance at all, a Level 1 would be something equipped with a basic feature like cruise control, and a Level 5 would be a fully autonomous vehicle.

Autopilot is Tesla's system, Traffic Jam Assist is installed on some Audi and Acura vehicles, Super Cruise is from Cadillac, Driving Assistant Plus is for BMWs and ProPilot Assist is a Nissan product.

Kyla Jackson, a member of the Waymo early rider program, a self-driving car service, demonstrates the power button in an autonomous vehicle in Arizona. While many companies are touting self-driving technology, very few vehicles on the road today are truly capable of completely driving themselves. (Caitlin O'Hara/Bloomberg)

All of the systems require drivers to stay fully alert and engaged while behind the wheel, yet according to survey respondents, each gives drivers a licence to misbehave.

None more so than Tesla's Autopilot. Almost half of the drivers polled by the IIHS said it would be OK for a driver to take their hands off the steering wheel if Autopilot is engaged.

More than one-third said they'd take their feet off the pedals, and about the same number of respondents would allow themselves to look out the window or talk on a cellphone with the system deployed. Almost one in 10 said it's OK to read a book or watch a movie.

More than one in 20 said they would go as far as having a nap while in the driver's seat of a Tesla on Autopilot.

All such behaviour is strictly forbidden while the system is in place, according to Tesla, as it is with all the systems tested in the survey. But Tesla's Autopilot elicited the most lax behaviour, with the IIHS suggesting a big reason for that is the name of the system.

The term "autopilot" comes from aviation, where autopilot systems reduce workloads for pilots on long-haul trips by automating as many routine tasks as possible. Federal Aviation Administration (FAA) rules mandate that a pilot must be able to take over from an autopilot system at any moment, but the term has come to mean something quite different for civilians.

"Even though autopilots in no way replace human pilots, that is exactly the connotation the term autopilottypically brings to mind," the IIHS report said.

"While a name alone cannot properly instruct drivers on how to use a system, it is a piece of information and must be considered so that drivers are not misled about the correct usage of these systems."

While the IIHS report noted that all of the self-driving systems mentioned led a disturbing number of respondents to say certain behaviours would be acceptable, Tesla's system scored the highest for every action, which is why the company features so prominently in the report.

Indeed, you don't have to look very hard to find a Tesla driver giving their Autopilot system ill-advised responsibility.

Social media is replete with videos of Tesla drivers who feel comfortable enough in their cars to take a nap, eat a meal or engage in many other dangerous behaviours.

In March, a Tesla driver in Florida was killed when his Model 3 crashed into the side of a tractor-trailer. A preliminary investigation from the National Transportation Safety Board found that the driver's hands were not on the wheel when his vehicle went completely under the truck, and the Autopilot system was deployed.

The same was true in a 2018 crash in California in which a Tesla SUV hit a concrete lane divider, and a 2016 crash in Florida in which an Ohio man was killed.

"Tesla's user manual says clearly that the Autopilot's steering function is a 'hands-on feature,' but that message clearly hasn't reached everybody," said IIHS president DavidHarkey.

"Manufacturers should consider what message the names of their systems send to people."