My Lords, I will speak to Amendment 57 in my name, Amendment 59 in the name of the noble Baroness, Lady Jones, and the Clause 14 stand part notice from the noble Lord, Lord Clement-Jones. In doing so, I register my support for Amendment 59A in the name of the noble Lord, Lord Holmes.
The Government assert that there is no diminution of rights in the Bill, yet Clause 14 removes the right not to be subject to an automated decision and replaces that right with inadequate safeguards, as the noble Lord, Lord Clement-Jones, said. On the previous day in Committee, the Minister made the argument that:
“These reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles”,—[Official Report, 25/3/24; col. GC 146.]
but I hope he will at least accept that safeguards do not constitute a right. The fact that the Secretary of State has delegated powers to change the safeguards at
will undermines his argument that UK citizens have lost nothing at all; they have lost the right not to be subject to an automated decision.
The fact that the Government have left some guard-rails for special category data is in itself an indication that they know they are downgrading UK data rights, because the safeguards in place are not adequate. If they were adequate, it would be unnecessary to separate out special category data in this way. I hammer the point home by asking the Minister to explain how the protections will work in practice in an era of AI, when risks can come from inference and data analytics that do not use special category data but will still have a profound impact on the working lives, health, finances and opportunities of data subjects. If data about your neighbourhood, shopping habits, search results, steps or entertainment choices is used to infer an important decision, how would a data subject activate their rights in that case?
As an illustration of this point, the daughter of a colleague of mine who, as it happens, has deep expertise in data law, this year undertook a video-based interview for a Russell Group university with no human contact. It was not yet an automated decision-making—ADM—system, but we are inching ever closer to it. Removing the right, as the Government propose, would place the onus on students to complain or intervene—in a non-vexatious manner, of course. Will the Minister set out how UK citizens will be protected from life-changing decisions after government changes to Article 22, particularly as, in conjunction with other changes such as those to subject access requests and data impact assessments, UK citizens are about to have fewer routes to justice and less transparency over what is happening to their data?
I would also be grateful if the Minister could speak to whether he believes that the granularity and precision of the profiling currently deployed by AI and machine learning is sufficiently guaranteed to justify taking this fundamental right away. Similarly, I hope that the known concerns about bias and fairness in ADM will be resolved over time, but we are not there yet. Why, then, do the Government have a wait-and-see policy on regulation but not offer the same "wait and see" in relation to data rights?
On Amendment 59 in the name of the noble Baroness, Lady Jones, the number of workers anticipated to be impacted by AI is simply eye-watering. In last Friday’s debate on AI, it was said to be 300 million worldwide, and one in four across Europe. But how workers work with AI is not simply a scary vision of the near future; it is here now.
I have a family member who last year left an otherwise well-paid and socially useful job when they introduced surveillance on to his computer during his working from home. At the time, he said that the way in which it impacted on both his self-esteem and autonomy was so devastating that he felt like
“a cog in a machine or an Amazon worker with no agency or creativity”.
He was an exemplary employee: top of the bonus list and in all measurable ways the right person in the right job. Efficiency in work has a vital role but it is not the whole picture. We know that, if able and skilled workers lose their will to work, it comes at a considerable cost
to the well-being of the nation and the public purse. Most jobs in future will involve working with or even collaborating with technology; ensuring that work is dignified and fair to the human components of this arrangement is not a drag on productivity but a necessity if society is to benefit from changes to technology.
1.15 pm
Amendment 57 in my name would prevent the Secretary of State making any amendments to new Articles 22A, 22B or 22C if such amendments reduce, minimise or undermine the existing standards and protections for children’s data. I hope I have made it clear that I am not setting myself against automated decision-making. Training a model using thousands of scans of people’s lungs can enhance a doctor’s ability to identify potential tumours accurately. Nor do I wish for children to miss out on the benefits of such technology, as the Minister appeared to suggest last week—merely that it should be deployed only when it is in their best interests, as discussed in our debate on the previous group.
Moreover, if the noble Lord, Lord Clement-Jones, is successful in his desire that Clause 14 should not stand part of the Bill, this amendment and Amendment 46 will be unnecessary, but noble Lords will recognise a steady drum beat of resistance against the Government’s plans to change data rights to benefit the commercial interests of tech companies at the expense of children.
In his answer relating to legitimate interests, the Minister pointed out that, when amending or adding to Annexe 1, the Secretary of State already has a duty to have regard to
“the need to provide children with special protection with regard to their personal data”.
Unless the Minister can tell me otherwise, I believe that is the only instance where she is required to do so when exercising her powers. So there is a place for some of the broader amendments from the second group that speak to the status of children throughout the Bill. I remind the Minister of my suggestion that recital 38 be put on the face of the Bill, as the Government have done with so many other recitals to give “legal certainty” or “clarity”.
Irrespective of that wider point, I trust that the Minister will at least agree with me that having regard to something is quite different from ensuring something. There is a difference between a vague notion that all is changed but nothing diminished and the certainty demanded by the children's amendments. I ask the Minister, when he replies, to address directly the question of whether "having regard" sets the same bar as "ensuring no diminution of standards".