The start of the 21st century has so far delivered many revolutionary technological developments. With the rollout of Tesla’s Autopilot system and self-parking cars becoming commonplace on the road, it’s easy to assume that fully autonomous cars will soon be driving around a corner near you. In fact, you might be forgiven for wondering whether you will need a driving licence at all in the near future!

At How-2-Drive we’re fascinated by the many ways technology can be used to improve road safety. So we set about finding out whether the drivers of tomorrow will still need to pass their driving test. Read on to learn more about this fascinating topic.

Big money stakes in autonomous vehicles

The UK is one of several major economies investing heavily in autonomous vehicle technology. With the industry expected to be worth about £28bn to the UK economy by 2035, it’s easy to see why. Significant progress has already been made, with shuttle buses and delivery vehicles being trialled in London. Most recently, self-driving cars were tested on public roads in the capital at the end of 2018.

So, it’s clear that the Government feels that driverless technology has a place in a future Britain. But how close is it to becoming an everyday reality?

Driverless technology still needs a lot of testing

Anyone with an interest in science will know that there is a big difference between testing under controlled conditions in a lab and testing in the field. For example, Google’s driverless cars have been tested on public roads since 2010 and have now covered in excess of 700,000 miles. However, the RAND Corporation suggests that different testing methods will have to be found, claiming that, with the way current road tests are designed, it could take hundreds of years to establish the safety of self-driving cars beyond all reasonable doubt.
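To get a feel for where a figure like “hundreds of years” comes from, here is a rough back-of-envelope sketch. The mileage target and fleet assumptions below are illustrative numbers of our own, not RAND’s exact figures, but they show why road testing alone struggles to prove safety:

```python
# A rough, back-of-envelope sketch of the RAND-style argument.
# All figures below are illustrative assumptions, not RAND's exact numbers.

# Human drivers average roughly one fatality per 100 million miles, so
# statistically demonstrating that self-driving cars are meaningfully safer
# can require billions of incident-free test miles.
miles_required = 8_000_000_000  # assumed target: ~8 billion test miles

# Assume a generous test fleet of 100 cars driving around the clock,
# every day of the year, at an average of 25 mph.
fleet_size = 100
miles_per_car_per_year = 25 * 24 * 365  # ~219,000 miles per car per year

years_needed = miles_required / (fleet_size * miles_per_car_per_year)
print(f"Roughly {years_needed:.0f} years of continuous testing")  # ~365 years
```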

With all the real-world testing being done, it’s easy to assume that driverless cars are already here. But the reality is that Tesla’s Autopilot only works on motorways, and even then only in good weather.

It’s unlikely that fully autonomous cars (cars that don’t have any driver controls) will appear until the latter half of the next decade. The current generation of cars can only be considered semi-autonomous at best. For safety and legal purposes they still require a licensed and insured driver to be behind the wheel at all times.

Furthermore, it was revealed late last year that US self-driving cars tested in the UK were unable to identify London buses. It’s clear that the AI that powers these vehicles still needs time to learn and adapt. If your autonomous car can’t tell when you’re about to hit a bus, that’s an issue worthy of a little more testing!

What’s more, India has already banned driverless cars outright, citing the need to protect jobs. The BBC observes that the country’s chaotic and congested roads would likely make it a difficult place to implement driverless tech anyway. Other governments might be similarly prepared to stand in the way of universal adoption of driverless cars, for the foreseeable future at least.

Variation is the essence of life

Human behaviour is hard to predict and prone to change over time in response to the environment and shifting social norms. Any technology that is designed to protect human life must be capable of learning and adapting to these peculiarities.

Predictable AI is also open to exploitation. Criminals might abuse it to force a car to stop so that it can be stolen under duress. And if self-driving cars can be trusted to always stop in time, pedestrians might stop bothering with designated crossings, or even checking whether the road is clear at all.

So, not only will autonomous vehicles need to be able to behave safely in a variety of different circumstances, they’ll also need to be able to learn and adapt to new situations. That’s going to require some pretty sophisticated technology!

The ethics of self-driving cars

And that leads us on to ethics. When the AI powering a self-driving car senses an impending collision, who should it kill and who should it spare? Is it okay to kill an elderly gent to save a young mother and her baby? Is it a good business decision to sacrifice the paying customer who bought your car in the first place in order to save a pedestrian? Humans make these kinds of decisions instinctively. However, codifying these innate instincts into documented decision trees raises profound ethical issues.

It’s possible that governments might intervene so that we end up with different countries requiring cars to be programmed with different ethical codes. Or perhaps public outrage at the ethical decisions made by driverless cars could lead to the boycott or outright ban of driverless vehicles in some countries.

According to research conducted by MoneySuperMarket in the UK, 73% of respondents would not feel safe driving alongside fully autonomous cars on motorways. From this we could tentatively conclude that the industry has yet to win the hearts and minds of consumers.

As India has already demonstrated, just because self-driving cars are technically feasible doesn’t mean they will become universally adopted across the world. This is something worth keeping in mind, especially if you plan to live and work outside the UK.

Fine weather for a drive

We’re all too familiar with inclement weather in Britain. At the moment, even cutting-edge autonomous tech relies on fine weather and good visibility. If there’s one thing that’s more difficult than predicting human behaviour, it’s predicting the British weather! Best to hang onto that licence for now and not have to rely upon your car being able to drive you home.

Security, or lack thereof

Back in 2015, a group of hackers pulled off a successful proof-of-concept demonstration by hacking what was then the latest Jeep Cherokee model. They were able to obtain full remote control of the car and could have done the same thing to any of the thousands of other Cherokees on the road.

These Cherokees weren’t even designed to be driverless cars. But that didn’t stop a couple of plucky hackers who were able to disable all in-car controls, rendering the driver powerless.

In the age of the minimum viable product, we are witnessing first-hand the race to bring revolutionary tech to market at the earliest opportunity. Security considerations are often pushed onto the back burner in order to free up funding for the features that sell a product. This can result in disconcerting oversights in the way security measures are implemented.

When it comes to self-driving cars, UK law dictates that security measures must be implemented. However, 60% of respondents to the MoneySuperMarket survey indicated concerns about the potential for driverless cars to be hacked. It seems that overcoming the technical challenges of getting security right in the first place is only part of the battle that lies ahead for the industry.

Conclusion

Like the human brain itself, no technology is perfect. The artificial intelligence used to control autonomous vehicles will need time to learn to adapt to the unpredictable nature of humans. Even then, the greater challenge could be convincing the public that driverless cars are as safe as human drivers.

It’s likely that fully autonomous cars are still some way off, and even when they do become a reality, like any new technology they will be expensive at first. For the time being, UK law dictates that even cars with autonomous features will require a human with a driving licence to be sitting in the driving seat, ready to take control if need be.

In summary, a driving licence is still a good investment for the foreseeable future! The sooner you pass the sooner you’ll reap the benefits. Why not get started and book your first half-price lesson today?