My Lords, this amendment is also in the names of the noble Lord, Lord Clement-Jones, and the noble and gallant Lords, Lord Houghton of Richmond and Lord Craig of Radley. I am very grateful to them for joining me in this amendment, and I convey the apologies of the noble Lord, Lord Clement-Jones, who is unable to be present today because he had a prior, immovable commitment to be abroad representing your Lordships’ House in a meeting.
Amendment 59 focuses on the protection and guidance that Armed Forces personnel engaged in the deployment and use of new technologies will need, both to ensure that they comply with the law, including international humanitarian law, and to explain how international and domestic legal frameworks need to be updated. All of this arises from the predicted increased use of novel technologies that could emerge from or be deployed by the Ministry of Defence, UK allies or the private sector.
Today the private sector is often deployed with our Armed Forces on overseas operations as part of a multinational force. The amendment imposes an obligation
on the Secretary of State to commission a review of the relevant issues, sets out what that review must consider and obliges the Secretary of State to lay before Parliament a report of the review's findings and recommendations.
That is the focus of the amendment but underlying it is a much broader issue about the duties of the Government for our Armed Forces in respect of the development, deployment and use of these technologies, and another complementary obligation on the Government to ensure that they are parliamentarily accountable for these developments—to the extent, of course, that they can be.
Noble Lords will recall that the same amendment was tabled and debated during the passage of the overseas operations Bill but was not pressed to a vote. Separately, on behalf of those noble Lords who supported it, I told the Minister that it was our intention to bring it back in this context, which is perhaps a more appropriate and broader context for the amendment.
I thank the Minister and pay tribute to her and to the MoD officials who are wrestling with the complex legal challenges posed by the development and deployment of these weapons systems, both for their work on that and for their repeated engagement with me and other noble and noble and gallant Lords, including those who have put their names to this amendment. As a result of that engagement, I am very aware that the Ministry of Defence has continued over recent months, at pace, both domestically and internationally, to work hard on this, and is making progress with these complex challenges.
I do not want to take unnecessary time going over again all the arguments made in support of the measure in the overseas operations Bill context. I take them as read. There are still unanswered questions, but I hope that, over time, they may be answered. I shall refer to some of them, and more recent developments, for another purpose, which is to set the context, and reinforce the importance, of addressing these challenges—so I shall repeat a few points that I made in earlier debates.
First, the integrated review, published in March, was the third defence and security review since 2020, which alone is an indication of the pace at which these developments are taking place. It was described as forward-facing, recognising both current and future threats against the UK, and set out the capabilities that will need to be developed to deter and engage them. It does do that—imperfectly, I have to say, but it does do it.
When the Prime Minister made a Statement on the review in November last year, he said that
“now is the right time to press ahead”
with the modernisation of the Armed Forces because of
“emerging technologies, visible on the horizon”.—[Official Report, Commons, 19/11/20; col. 488.]
The Prime Minister said that these would “revolutionise warfare” and I think he was right. The CGS, General Sir Mark Carleton-Smith, said that he foresees the army of the future as
“the integration of boots and bots”.
The noble and gallant Lord, Lord Houghton of Richmond, who is with us today, has repeatedly warned your Lordships about the risks posed by the intersection of artificial intelligence and human judgment and has spoken wisely about the risks posed by technology interacting with human error.
These risks are with us now and they are very real. Last month retired General Stanley McChrystal, who led the coalition forces in Afghanistan for two years, said that artificial intelligence inevitably will come to make lethal decisions on the battlefield. However, he acknowledged the “frightening” risks of potential malfunction or mistake. He said:
“People say, ‘We’ll never give control over lethal strike to artificial intelligence.’ That’s wrong. We absolutely will. Because at a certain point, you can’t respond fast enough, unless you do that. A hypervelocity missile, hypersonic missile coming at the United States aircraft carrier, you don’t have time for individuals to do the tracking, you don’t have time for senior leaders to be in the decision loop, or you won’t be able to engage the missile.”
Now, at a less strategic level, military-grade autonomous drones can fly themselves to a specific location, pick their own targets and kill without the assistance of a remote human operator. A UN report about a March 2020 skirmish in the military conflict in Libya records that such a drone made its wartime debut. The report states that retreating forces
“were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles”,
but does not say explicitly that this lethal autonomous weapon system killed anyone. But it certainly tried to.
The very real fear is that autonomous weapons will undermine the international laws of war. These laws are premised on the idea that people can be held accountable for their actions even during wartime and that the right to kill during combat does not confer the right to murder civilians. But how can autonomous weapons be held accountable? Who is to blame for a robot that commits war crimes? Who would be put on trial: the weapon, the soldier, the soldier’s commanders, the corporation that made the weapon, or the person who wrote the code that gave the weapon the ability to do this?
In a world without regulations that compel meaningful human control of autonomous weapons, there will be war crimes with no war criminals to hold accountable, and the laws of war, along with their deterrent value, will be weakened significantly. I say “deterrent value” because I think, from my experience, that the laws of war and international humanitarian laws work because they are observed, not because they are enforced. It is important that we find some way of collectively reviewing these laws so that they can continue to be observed in this more complicated—and, in many ways, terrifying—new world that we are moving rapidly into.
On 21 October 2021, NATO Defence Ministers agreed to NATO’s first ever strategy for artificial intelligence—AI—which states:
“At the forefront of this Strategy lie the NATO Principles of Responsible Use for AI in Defence, which will help steer our transatlantic efforts in accordance with our values, norms, and international law. The NATO Principles of Responsible Use … are based on existing and widely accepted ethical, legal, and policy commitments under which NATO has historically operated
and will continue to operate under. These Principles do not affect or supersede existing obligations and commitments, both national and international.”
Our Government must have agreed these principles. When will the Minister make a Statement to Parliament on them, allow them to be debated and allow Ministers to be questioned on their sufficiency or their breadth and depth? The provisions of Article 36 of Protocol 1, additional to the 1949 Geneva conventions, commit states, including our own, to ensure the legality of all new weapons, means and methods of warfare by subjecting them to a rigorous and multidisciplinary review. I have no reason to believe that we have not complied with our legal obligations in that respect but, unfortunately, as we are not one of the eight nations in the world, including the United States of America, that publish a review of legal compatibility, I have no ministerial reassurance in that regard. When will we get that assurance or transparency?
The important purpose of this amendment is to protect our Armed Forces from accusations of breaching, or from inadvertently breaching, international humanitarian law and the laws of warfare while operating these weapons. It is also to encourage the Government to create the appropriate framework and transparency in these matters, which will be essential if we are to move forward, and to allow Parliament to be a part of that process. This technology creates awesome responsibilities and challenges for its operators, but it does the same for our Governments. Will the Minister take the long view on those challenges? I accept that today she will be able to respond only in a reassuring way: that the actions set out in the amendment are being undertaken and that Ministers will appropriately reveal details of progress and invite parliamentary scrutiny when they are able so to do.
In the meantime, can the Minister review everything that I have said in these debates, or ask her officials to, and answer my questions?