My Lords, I have added to my choreography before standing at the Dispatch Box: can I get a Polo mint in before the noble Lord, Lord Coaker, concludes? The answer is no. That is the first question I am able to answer.
I thank the noble Lord, Lord Browne, for tabling Amendment 59, which is supported by the noble Lord, Lord Clement-Jones, and the noble and gallant Lords, Lord Houghton and Lord Craig, and engages with the subject of novel technologies. It is a significant issue that merits discussion, and I am grateful to the noble Lord for his kind remarks.
There is no doubt that the increasing adoption of innovative technologies is changing how military operations are conducted. The noble Lords’ analysis—that we need to be particularly mindful of the legal ramifications—is hard to dispute. From the engagement that I and the department have had with the noble Lords, I know that they understand very well the broader complexities likely to be created by Defence use of AI and are anxious that we should address these issues both purposefully and systematically. This scrutiny and challenge is welcome, because we are grappling with questions and subjects that are indeed very complex.
I hope to reassure your Lordships that the department is alert to these issues and has worked extensively on them over the course of the last 18 months. Noble Lords will understand that I cannot set out details until these positions have been finalised, but work to set a clear direction of travel for defence AI, underpinned by proper policy and governance frameworks, has reached an advanced stage. Key to this is the defence AI strategy, which we hope to publish in the coming months, along with details of the approaches we will use when adopting and using AI. This commitment, which is included in the National AI Strategy, reflects the Government’s broader commitment that the public sector should set an example through how it governs its own use of the technology. Taken together, we intend that these various publications will give a much clearer picture than is currently available, because we recognise that these are important issues that attract a great deal of interest, and we need to be as transparent and engaged as possible.
Noble Lords asked pertinent questions. I think the noble and gallant Lord, Lord Craig, asked some of these: where in the chain of command does responsibility for AI-related outcomes reside? When might the Government have an obligation to use AI to protect service personnel from harm? What are the military and moral consequences of machine-speed warfare? These are vital questions, and we recognise that we do not yet have all the answers.
Nor can we hope to arrive at these answers on our own. We have to persist in our engagement with our international partners and allies, and with our own public and civil society. It is perfectly legitimate for parliamentarians to take an interest in this subject, to ask questions and to table debates. I hope that our forthcoming publications will provide a solid platform for an ongoing effort of public engagement and efforts to enhance public understanding, subject to the usual caveats that may apply to the release of Defence information.
To turn to the subject of the proposed amendment, we are committed to ensuring that our Armed Forces personnel have the best possible care and protection, including protection against spurious legal challenges. I assure noble Lords that, regardless of the technologies employed, all new military capabilities are subject to a rigorous review process for compliance with international humanitarian law. Furthermore, we adjust our operating procedures to ensure that we stay within the boundaries of the law that applies at the time.
International and domestic frameworks provide the same level of protection around the use of novel technologies as for conventional systems because their general principle is to focus on the action, rather than the tool. These frameworks therefore offer appropriate levels of protection for our personnel. Earlier this year, we acted to bolster this protection in historical cases, for example, through the Overseas Operations Act.
In respect of artificial intelligence, I have mentioned our forthcoming AI strategy and our plan to publish details of the approaches we will use when adopting and using AI. This is really where we come to the nub of the issue. The noble Lord, Lord Browne, put his finger on it, as did the noble and gallant Lord, Lord Houghton, and the noble Lord, Lord Coaker. I want to try to encapsulate what I hope will be a substantive and reassuring response to them all.
These approaches will not affect or supersede existing legal obligations, but they will ensure coherence across defence. They will also drive the creation of the policy frameworks and systems that, in practical terms, are needed to ensure that personnel researching, developing, delivering and operating AI-enabled systems have an appropriate understanding of those systems and can work with and alongside them in compliance with our various legal and policy frameworks.
The noble Lord, Lord Browne, specifically referred to the NATO AI principles. Essentially, NATO’s position is that alliance members can sign up to these NATO-wide standards or they can produce their own to a similar standard. We support NATO’s leadership in the responsible use of artificial intelligence and, as I have indicated, we intend to publish details of our own approach in early course.
In addition, we will continue to engage internationally, including through the United Nations Conference on Certain Conventional Weapons, to promote consensus on international norms and standards for the use of new and emerging technologies on the battlefield, while continuing to act as a responsible leader in this area.
I think it was the noble Baroness, Lady Smith, who asked about the phrasing I used in response to her noble friend Lord Clement-Jones’s question last week. From memory, I said two things: first, the UK has no systems that could unilaterally employ lethal force without human involvement at some stage in the process. I think that I went on to say that, sharing the concerns of government, civil society and AI experts around the world, the UK opposes the creation and use of systems that would operate without context-appropriate human involvement. I think that is the phrase the noble Baroness sought clarification on.
The phrase means that a person is exercising some form of control over the effect of the use of the weapon in a way that satisfies international humanitarian law. This could be some form of control over the operation in real time, or it could be setting clear operational parameters for a system. I hope that that has been helpful to the noble Baroness in explaining what was behind the use of that phrase.
I have endeavoured to provide reassurance to noble Lords that the Ministry of Defence takes these matters very seriously, is already doing all that needs to be done, and is planning to be proactive in communicating our approach appropriately to Parliament and the public. On this basis, I suggest that the amendment is not needed.
I also say, with the greatest respect to the noble Lord, Lord Browne, and no sense of impertinence, that I do question the utility of requiring a review and a report. Such a report would necessarily be only a snapshot; it would quickly become out of date, given that we are dealing with a rapidly evolving subject matter. Not to put too fine a point on it, the effort of staffing it risks reducing the capacity needed within the department for developing the extensive systems and frameworks that we need to ensure the proper handling of AI.
I must say that I have enjoyed this debate, as I always enjoy my engagement with the noble Lord, Lord Browne—but, for these reasons, I ask that he withdraw his amendment.