My Lords, in following the noble Lord, Lord Naseby, I will start by reflecting on some of the safety issues he raised and pick up in particular a phrase the Minister used in his introductory speech—that the introduction of self-driving cars could mean that the lack of human error saves lives.
Automation does not remove human error. It simply changes the potential site of it from an individual vehicle to a programming system, an algorithm and the control system applied to many vehicles. There is the potential for one error to be multiplied many times, with disastrous impacts. It also allows individual actions, possibly malevolent ones, to produce mass effects. A number of noble Lords referred to what has happened in San Francisco, where people have disabled self-driving cars by placing traffic cones on their bonnets. We are speaking just a couple of weeks after the National Cyber Security Centre produced its seventh annual review, noting that the UK’s critical infrastructure is at grave risk. By relying on this single system, or multiple systems, we are potentially creating a much higher critical safety risk and a resilience risk.
This morning I was talking about climate adaptation and resilience with the National Trust. We need to look at all this in terms of our systems. If we rely on these systems and keep using them for years, what will happen to the skills of drivers should we suddenly need individual people to take to the road and control vehicles? What happens, as we have seen in San Francisco, if they all suddenly stop working or decide to assemble in one place? What does that do to the functioning of our society?
I was going to start with the Bonn climate talks, COP 23, in 2017. I recall a state of near panic among members of the climate community because it was thought that we could see self-driving HGVs all over the place any day soon. That could have massive climate impacts: HGVs now spend a large amount of their time parked up, because their drivers need rest periods and there is limited availability of drivers. Put self-driving into the equation and you potentially massively increase the mileage, and with it the climate impacts. That was 2017, and there is much less fear of that now.
I begin my contribution today by making a call for realism and an understanding of what this Bill is actually about and the environment in which it arrives. I have already challenged the noble Baroness, Lady Penn, when she was wearing the Treasury hat. A government spokesperson briefed the Telegraph that this Bill would mean that we could see self-driving vehicles on our roads by 2026,
“if they were proved safe”.
When I challenged her, the noble Baroness, Lady Penn, being the skilled operator that she is, agreed that yes, if they were proved safe, this would be possible. Well, I might run a two-hour marathon if I were 30 years younger and had entirely different genetics, but that is not the world we live in. I am asking for an acknowledgement of the realism of the situation as we take our debate on this Bill forward.
I start with a potential positive impact if we were to see self-driving vehicles, even operating in small areas in controlled circumstances—which I think is a far more likely possibility. One study I looked at noted that, for self-driving vehicles to operate effectively:
“Roads may need to be kept free of small debris”
and “uneven” surfaces smoothed. A number of noble Lords have already referred to the current state of our roads. Let us imagine that we are going to go ahead with self-driving autonomous vehicles. Just think about what our roads might then look like for the rest of us to enjoy. However, I ask for a little realism here. Do we actually have the capacity, whether financial, in human skills or in machinery, to deliver roads entirely free of debris and uneven surfaces? I rather doubt it.
This raises an important point, as mentioned by the noble Baroness, Lady Bowles, that I want to highlight and which we will come to in the detail of our later debate: we need to think about statistics and data. The road conditions in the US, France and Australia are very different from here. Can we extrapolate figures on safety from there and apply them here? If we cannot, how do we get figures at scale in the UK? That is a terribly important point.
I am not sure that anyone has directly referred to this, but it is worth noting the question of the safety standard. The Transport Select Committee has looked at this in some detail, and I think we will have large debates on it at a later stage. Is the careful and competent human driver the right test to be applying? Improved safety is not a given in the real circumstances of our roads. As the RAC Foundation has said:
“When we put our lives in the hands of automated machinery, we expect it to perform to the highest standards of safety”.
That is an expectation that people have. We know that human beings make mistakes, and we know that, as
pedestrians, cyclists and other drivers, we make allowances for other people’s mistakes. However, we are not necessarily going to apply the same criteria to autonomous vehicles.
This debate has moved as we have progressed through. A number of the early speeches were very much focused on the positive opportunities seen in autonomous vehicles. The noble Lord, Lord Bourne of Aberystwyth, was one of those people. However, I want to address some policy points about the environmental risks of self-driving or autonomous cars.
If as a result of such cars we see more vehicles on our roads and more and longer journeys, we could see increased emissions. I think most of us assume that these will be electric vehicles, but about half of the PM2.5—small particulate matter pollution—from vehicles comes from brakes and tyres. Autonomous vehicles still have brakes and tyres. There are the congestion issues; there is also the noise and the sheer disruption caused by vehicles moving around our roads.
There is some real data on this from partially autonomous vehicles. In 2019, a study in California found that the owners of partially autonomous vehicles were taking them on longer journeys, particularly at weekends. That makes sense when you think about it: you can put your feet up, play a computer game, read or have a sleep, and so you decide that you are going to take a long weekend trip to the other end of the UK. If lots of people do that, it has a real and significant environmental and social impact.
There is another risk. Let us imagine the situation at the theatre up the road once lots of people have an autonomous vehicle. It costs heaps to park in the city but, as you do not need to park an autonomous vehicle, people will get their cars to drive them to the theatre and then send them home again, calling the car back when they want to leave in the evening. Can noble Lords think about what Charing Cross Road might look like under those circumstances? What kind of chaos would it cause and what might it do to the buses?
I turn to an issue that no one has raised but which is really important, because we need to look at many areas beyond autonomous vehicles. There is a temptation to think of the cloud and algorithms as immaterial, and that things happening out there in the cloud have no real-world, physical consequences. Actually, we can thank researchers from MIT—Sudhakar et al, in an article published in the journal IEEE Micro—for crunching the data to calculate what the environmental cost could be. I hope that noble Lords will forgive me for putting in some large numbers.
Data centres now produce 0.3% of global climate change emissions; that is the same as Argentina. The MIT study shows that, if the world introduced a billion autonomous vehicles, the demand for energy for those data centres would double. It adds that
“if an autonomous vehicle has 10 deep neural networks processing images from 10 cameras, and … drives for one hour a day, it will make 21.6 million inferences each day. One billion vehicles would make 21.6 quadrillion inferences. To put that into perspective, all of Facebook’s data centers … make a few trillion inferences each day … 1 quadrillion is 1,000 trillion”.
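The arithmetic behind those quoted figures can be checked directly. A minimal sketch, assuming a rate of roughly 60 frames per second per camera, which is not stated in the quote but is what the quoted daily total implies:

```python
# Sanity check of the quoted MIT (Sudhakar et al.) inference figures.
# Assumption: 60 frames per second per camera is inferred from the
# quoted daily total, not stated in the article excerpt above.
networks = 10           # deep neural networks per vehicle
cameras = 10            # camera feeds processed by each network
fps = 60                # frames per second per camera (assumed, see above)
seconds_driven = 3600   # one hour of driving per day

inferences_per_day = networks * cameras * fps * seconds_driven
print(f"{inferences_per_day:,}")  # 21,600,000 inferences per vehicle per day

fleet = 1_000_000_000                     # one billion vehicles
total = inferences_per_day * fleet        # 21.6 quadrillion per day
print(f"{total / 1e15} quadrillion")      # 1 quadrillion = 1,000 trillion
```

This reproduces both the 21.6 million per-vehicle figure and the 21.6 quadrillion fleet-wide figure in the quotation.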
Take the numbers away, and what remains is a huge demand for energy, computing power and all the associated technology, computers and databases. Where will we find the capacity in the world to deliver that? We still have children in Africa who do not have a lightbulb by which to do their homework at night, and areas of India that need the most basic levels of infrastructure. We need to look at all this in that policy context.
I will bring up two more points. The noble Lord, Lord Cameron of Dillington, made a really disturbing suggestion: we might have to bring in anti-jaywalking laws to allow for autonomous vehicles. What are our economy and society for? Are we here to serve the needs of people or are we here to service the machines? That is a question that the noble Lord’s point raises.
The noble Lord, Lord Tunnicliffe, raised an important issue that might be seen as explicitly excluded from the scope of the Bill: delivery vehicles and drones. They are examples of autonomous systems that may not use the roads but that multinational companies see as replacing human beings in delivering goods while using lots of our public spaces, including the air and pavements. Can the Minister tell us now or by letter later—I understand that it might not be in his briefing—what the Government’s thinking is about ensuring the regulation of those?
Finally, that brings me to reinforce the point, made by both the noble Baroness, Lady Brinton, and the noble Lord, Lord Holmes, that all of this needs to be inclusive by design. We need to think about how our streets, pavements and airspace work for people, not for the benefit of multinational companies and their machines.
5.34 pm