Armed Forces Bill

I thank the Minister for her response to this debate and, with the indulgence of the Committee, I will refer to parts of her response. I was greatly appreciative of it all, but some parts I welcomed more than others.

I will start with the last point. The criticisms the Minister made about the vehicle that I tabled in order to have this debate were correct. It is implicit in the way I debate these issues that they are moving so fast that probably there is no point in time at which we could publish a report that would not quickly go out of date.

I accept that. In fact, for that reason I wish that people, and sometimes senior military officers—but thankfully no British ones—would stop talking about a “race” for this technology. A race requires a line, and the development of this technology has no winning line that we know of.

In fact, the likelihood is that when we move to AGI, which is a hypothetical but likely development, whereby an intelligent agent can understand or learn any intellectual task that a human being can, it may well be that we think we are at the line, but the machine does not think we are at the line and runs on and looks back at us and laughs. So I accept all of that but, at some point, we need to find a framework in which we in Parliament can connect with these issues—a methodology for the Government to report to Parliament, to the extent that they can, and for all of us to take responsibility, as we should, for asking our young people to go into situations of conflict, with the possibility that these weapons will be used, with all the implications.

So that is what I am seeking to get. I want a 24 year-old who is asked to take some responsibility in an environment in which these weapons are deployed to know with confidence that he or she is acting within the law. That is my shared responsibility with the Government; we need to find a way of doing that. This may be an imperfect way, but we may always be in an imperfect situation with a moving target. So I thank all noble Lords for their contributions to this debate. None of these debates answers any questions fully, but they all add to our collective knowledge.

I thank the noble and gallant Lord, Lord Houghton, for his unqualified support. He took me slightly by surprise with the deployment of his eloquence to make the case for deploying the law as a weapon of war. I fear that I agree with him—I used to be a lawyer—but I will have to think long and carefully before I give him my unqualified support for that. However, I suspect that, as always, I will end up supporting what he said.

6 pm

I thank the noble and gallant Lord, Lord Craig of Radley, for his contribution highlighting an issue which is alive today because of the operation of the Overseas Operations Act. Anyone deployed in a conflict who operates technology remotely from the United Kingdom is not covered by the provisions of that Act, because they are not deployed into the environment. In my view, that is a breach of that person’s human rights, because those who are deployed with them have an advantage over them. We made that mistake, and we need to go back and correct it.

The noble Baroness, Lady Smith of Newnham, in the absence of the genuine expert in her party on the issues of artificial intelligence, raised an exceptionally good point. The Minister responded to it very cleverly but very accurately. We deploy weapons at the moment—I will not identify what they are, because it does not really matter—which act so independently that many would consider them to be autonomous. It is no answer to say that someone designed the framework within which they operate and therefore there is meaningful human control of them; there is not, in many people’s view, although I am not saying that that is my view. We need to find some method internationally, because this is another global problem and needs a global answer. Otherwise, we will have no stability in our strategic defence. That may be too much to ask us to resolve today, but we should, in the long view, consider that. I thank my noble friend Lord Coaker for reminding me of at least half a dozen things that I should have put in my speech.

The Minister has given me reassurance upon the reassurance that she and her officials are already giving me in their regular engagement with me. It is not happening all the time, but with sufficient regularity for me to be reassured. I spent part of this weekend with some people who are genuine experts on artificial intelligence. I am pretty frightened by what can happen with artificial intelligence, but when they started talking about AGI, I was terrified by what may be coming down the track at us. At the moment, we do not have it, and we will have to move very quickly to try to get it regulated before it is fully developed.

Although I will withdraw the amendment, there is always the possibility that this environment and its potential will develop sufficiently for me to need to retable it at Report to extend this debate a little further. I beg leave to withdraw the amendment.

Type: Proceeding contribution
Reference: 815 cc439-441GC
Session: 2021-22
Chamber / Committee: House of Lords Grand Committee