
Driven to distraction: how close are we to watching films in self-driving cars?


Self-driving cars could lead to drivers being allowed to watch films on the motorway, under changes to the Highway Code proposed by the Department for Transport.

The planned updates are intended to smooth the way for adoption of autonomous vehicles on British roads. But they have sparked concern from some, who fear that new regulations may be introduced before the technology is there to support them.

What is a self-driving car?

The actual definition of a “self-driving” car is hotly contested. At one end of the spectrum, simple driver-assist technologies such as cruise control are decades old, and have been largely incorporated into existing rules without difficulty. At the other end of the spectrum, the dream of a fully autonomous vehicle that can handle any driving task a human can remains the stuff of science fiction.

In between the poles is where disputes lie. Tesla’s “autopilot” technology, for instance, can follow the lanes of a motorway, and handle junctions without intervention, but even drivers who pay the fee to upgrade to what the company calls “full self-driving” must remain at the wheel and alert at all times, in case the car’s software is unable to cope with something unexpected.

The industry uses a six-point scale, from 0 to 5, to cover the differences, and considers anything at level 3 or above “automated” to some degree. A level 3 car, like a Tesla, can perform “most” driving tasks, but occasionally requires a human to take over. A level 4 car, like the robot taxis being tested in San Francisco and Phoenix, can perform all driving under specific circumstances – within a defined city area, for instance – but still preserves the option for human override. Only a level 5 car, which never needs a human to take over and could be built without a steering wheel entirely, is considered “full automation”.

Why can I watch a film but not use my phone?

The proposal would allow drivers to watch content “not related to driving on built-in display screens, while the self-driving vehicle is in control”. Mobile phones remain specifically banned, however, “given the greater risk they pose in distracting drivers as shown in research”. For a level 5 self-driving car, such a distinction would be moot, since drivers would never be expected to take control.

For less advanced automation, though, the distinction matters: a built-in screen can be closely linked with the car’s systems, making it easier to alert the driver that they need to pay attention to the road.

Can it really be safe to watch a film while a car is driving?

If the tech lives up to its promise, it should be. A good implementation of a level 3 or level 4 self-driving car – one that expects drivers to take over occasionally – will also take into account the fact that drivers are naturally poor at monitoring the operation of a machine they do not need to control. This is known as the “paradox of automation”: the more efficient an automated system becomes, the more crucial the human contribution is on the rare occasions it is required.

If you have a normal car, the vast majority of your driving is likely to be rote and routine. But if you have a self-driving car that can handle 99% of tasks, then you’re going to be put back in charge only in the most difficult 1% of situations.

Many of the setbacks of self-driving cars over the past decade have involved dealing with that problem: how do you ensure that a driver is ready to take over at a moment’s notice, when the promise of the technology involves setting them free to do other things?

But the latest generation of self-driving cars prioritise “safe disengagement”, pulling over to the side of the road and coming to a stop when there’s difficulty, rather than handing control back to the driver at 70mph. If those safety features are required, then it really can be safe to watch a film while driving.

Whose fault is it if there is a crash?

That’s one fight that is still being waged. The British proposals warn that “motorists must be ready to resume control in a timely way if they are prompted to” – the definition of level 3 automation. In most crashes involving self-driving cars, the motorist has technically been at fault, because they have failed to take control in the split second before tragedy occurred. Drivers have been charged in crashes involving Tesla cars, and an experimental Uber self-driving car.

But experts have referred to the human drivers in these situations as “moral crumple zones”, parts of the system designed to soak up legal and moral responsibility without having the power to actually improve safety. “While the crumple zone in a car is meant to protect the human driver, the moral crumple zone protects the integrity of the technological system, at the expense of the nearest human operator,” says Madeleine Clare Elish, who coined the term in 2019.

But will self-driving cars ever really come to the UK?

Level 3 automation is on British streets already, and level 4 is close behind. Companies in Oxford and Milton Keynes have been testing cars on the road for a couple of years, with increasingly positive results. A simpler version of a “driverless car” pairs level 4 software with wireless broadband, allowing remote safety drivers to supervise the vehicle without sitting behind the steering wheel.

But the industry has long struggled with the hardest part of driving a car: other people. Heavily pedestrianised areas, busy unsigned intersections and pulling out into dense traffic all pose significant problems that may prevent level 5 automation from ever becoming a reality.
