
Data Protection and Digital Information Bill

My Lords, this group, in which we have Amendments 41, 44, 45, 49, 50, 98A and 104A and have co-signed Amendments 46 and 48, aims to further the protections that we discussed in the previous group. We are delighted that the noble Lord, Lord Clement-Jones, and others joined us in signing various of these amendments.

The first amendment, Amendment 41, is a straight prohibition of any data processing that would contravene the Equality Act 2010. All legislation should conform to the terms of the Equality Act, so I expect the Minister to confirm that he is happy to accept that amendment. If he is not, I think the Committee will want to understand better why that is the case.

Amendment 44 to new Article 22B of the UK GDPR is, as it says, designed,

“to prevent data subjects from becoming trapped in unfair agreements and being unable to exercise their data rights”,

because of the contract terms. One might envisage some sensitive areas where the exercise of these rights might come into play, but there is nothing that I could see, particularly in the Explanatory Notes, which seeks to argue that point. We have no knowledge of when this might occur, and I see no reason why the legislation should be changed to that effect. Special category data can be used for automated decision-making only if certain conditions are met. It involves high-risk processing and, in our view, requires explicit consent.

The amendments remove performance of a contract as one of the requirements that allows the processing of special category data for reaching significant decisions based on automated processing. It is difficult to envisage a situation where it would be acceptable to permit special category data to be processed in high-risk decisions on a purely automated basis, simply pursuant to a contract where there is no explicit consent.

Furthermore, relying on performance of a contract for processing special category data removes the possibility for data subjects to exercise their data rights, for example, the right to object and the ability to withdraw consent, and could trap individuals in unfair agreements. There is an implicit power imbalance between data subjects and data controllers when entering a contract, and people are often not given meaningful choices or options to negotiate the terms. It is usually a take-it-or-leave-it approach. Thus, removing the criteria for performance of a contract reduces the risks associated with ADM and creates a tighter framework for protection. This also aligns with the current wording of Article 9 of the UK GDPR.

Amendment 45 changes the second condition to include only decisions that are required or authorised by law, with appropriate safeguards, and that are necessary for reasons of substantial public interest. The safeguards are retained from Section 14 of the DPA 2018, with amendments to strengthen transparency provisions.

Amendment 49 seeks to ensure that the protections conferred by Article 22C of the UK GDPR would apply to decisions “solely or partly” based on ADM rather than just “solely”. This would help to maximise the protections that data subjects currently enjoy.

Amendment 50 is another strengthening measure, which would make sure that safeguards in the new Article 22C are alongside rather than instead of those contained in Articles 12 to 15.

Our Amendment 104A would insert a new Section into the 2018 Act, requiring data controllers who undertake high-risk processing in relation to work-related decisions or activities to carry out an additional algorithmic impact assessment and make reasonable mitigations in response to the outcome of that assessment.

I ought to have said earlier that Amendment 98A is a minor part of the consequential text.

An improved workplace-specific algorithmic impact assessment is the best way to remedy clear deficiencies in Clause 20 as drafted, and it signals Labour’s international leadership and alignment with international regulatory and AI ethics initiatives. These are moving towards the pre-emptive evaluation of significant social and workplace impacts by responsible actors, combined with a procedure for ongoing monitoring, which is not always possible. It also moves towards our commitment to algorithmic assurance and will help to ensure that UK businesses are not caught up in what is sometimes described as the “Brussels effect”.

7.15 pm

The impact assessment should cover known impacts on work and workers’ rights and the exercise of those, combining the best of audit technology and legal impact assessments. There would also be a duty to respond appropriately to the findings of that assessment. One of the simplest and most effective ways to boost transparency and consultation provisions is to attach them to these improved impact assessments by requiring disclosure of the assessment, at least in summary form, and permitting requests for additional information relevant to that assessment.

In our view, the definition of “high risk” in the Bill should be deemed to include significant impacts on work and workers. For clarity, this includes: any impact on equal opportunities or outcomes of work, access to employment, pay, contractual status, terms and conditions of employment, health and well-being, lawful association rights, and associated training. This could be done by a discrete deeming provision at several places in the Bill. These factors would also provide a threshold for the more rigorous workplace assessment.

In our view, the core components of that assessment are: a requirement to establish a process for undertaking impact assessments; a requirement to assess significant impacts on work and employees; a requirement to involve those affected, including employees, workers and official representatives; a requirement to take appropriate steps in response, or, in other words, to mitigate and impose safeguards; and a requirement to disclose metrics, methods and mitigation taken.

In many ways, Amendment 104A is a continuation of the debates on the DMCC Bill on changing uses of technology in workplaces and the potential for workers to be disadvantaged by the decisions produced by software. Given the risks, we feel that there should be more protections in data legislation rather than fewer, and transparency and consultation are key.

We support Amendment 46 because it offers a further measure of protection to children’s rights. We believe that, in this area, we should retain the existing legislative framework from the 2018 Act, and we cannot see any case for weakening those protections. Amendment 48 largely echoes our Amendment 49. The amendment of the noble Lord, Lord Holmes, is in this group, although he is not here. To our way of looking at things, it seems eminently sensible. I look forward to the opportunity to listen to him speak to it at a later stage of the Bill. I beg to move.

Type: Proceeding contribution
Reference: 837 cc150-2GC
Session: 2023-24
Chamber / Committee: House of Lords Grand Committee