
Data Protection and Digital Information Bill

That is a fair question. I must confess that I do not know the answer. There will be mechanisms in place, department by department, I imagine, but one would also need to report on it across government. Either it will magically appear in my answer or I will write to the Committee.

The CDDO has already published guidance on the procurement and use of generative AI for the Government. We will consult on introducing this as a mandatory requirement for public sector procurement, using purchasing power to drive responsible innovation in the broader economy.

I turn to the amendments in relation to meaningful involvement. I will first take together Amendments 36 and 37, which aim to clarify that the safeguards mentioned under Clause 14 are applicable to profiling operations. New Article 22A(2) already clearly sets out that, in cases where profiling activity has formed part of the decision-making process, controllers have to consider the extent to which a decision about an individual has been taken by means of profiling when establishing whether human involvement has been meaningful. Clause 14 makes clear that a solely automated significant decision is one without meaningful human involvement and that, in these cases, controllers are required to provide the safeguards in new Article 22C. As such, we do not believe that these amendments are necessary; I therefore ask the noble Baroness, Lady Jones, not to press them.

Turning to Amendment 38, the Government are confident that the existing reference to “data subject” already captures the intent of this amendment. The existing definition of “personal data” makes it clear that a data subject is a person who can be identified, directly or indirectly. As such, we do not believe that this amendment is necessary; I ask the noble Lord, Lord Clement-Jones, whether he would be willing not to press it.

Amendments 38A and 40 seek to clarify that, for human involvement to be considered meaningful, the review must be carried out by a competent person. We feel that these amendments are unnecessary as meaningful human involvement may vary depending on the use case and context. The reformed clause already introduces a power for the Secretary of State to provide legal clarity on what is or is not to be taken as meaningful human involvement. This power is subject to the affirmative procedure in Parliament and allows the provision to be future-proofed in the wake of technological advances. As such, I ask the noble Baronesses, Lady Jones and Lady Bennett, not to press their amendments.

7 pm

Amendments 39, 47, 51, 56, 60 and 64 to 68 appear to restrict public authorities’ use of automated decision-making by introducing a new definition of decisions that meaningfully involve automated processing. Our reforms clarify that a solely automated decision is one that is taken without any meaningful human involvement going beyond a cursory or rubber-stamping exercise. These amendments seek to bring in an entirely separate threshold for the use of automated decision-making by public authorities and controllers acting on their behalf. We consider this unnecessary as the Article 22 safeguards, as they apply to solely automated decisions, are robust and provide strong protections to all data subjects. These safeguards are applicable to all controllers, whether they are public authorities or are acting on behalf of one. As such, we believe that the reforms in the Bill will benefit society by allowing public authorities to use automated decision-making with appropriate safeguards in place.

Amendments 43 and 62 seek to extend the limitations on the use of special categories of data to all automated decision-making. Such restrictions would be unnecessary and would impede the use that controllers can make of this technology. We believe that the safeguards set out under Article 22C and Section 50C, which entitle data subjects to information about decisions taken about them, to make representations to contest decisions and to obtain human review, provide sufficient protection for personal, non-sensitive data. As such, we do not believe that these amendments are necessary; I ask the noble Lord, Lord Clement-Jones, not to press them.

Amendment 109 seeks to preserve and amend Section 14 of the Data Protection Act relating to automated decision-making authorised by law. The Government believe that the same uniform safeguards should be applicable across all processing conditions, including the processing of special category data, to simplify and clarify the obligations of controllers. Having different safeguards and obligations depending on the lawful ground of processing would lead to uncertainty among controllers as well as among data subjects. The Government aim to simplify and clarify the rules to ensure clear understanding of organisations’ obligations to protect data subjects’ rights. Furthermore, this amendment would also require data subjects to receive a personalised explanation of decisions reached following the automated processing of their data. Article 22C(2)(a) already requires controllers to provide data subjects with information about decisions taken about them. As such, we do not believe that this amendment is necessary; I ask the noble Baroness, Lady Jones, not to press it.

I shall return briefly to the rollout of the Algorithmic Transparency Reporting Standard. To date, we have taken a deliberately iterative and agile approach to ATRS development and rollout, with the intention of generating buy-in from departments, gathering feedback, informing the evidence base, and improving and adapting the standard.

Type: Proceeding contribution
Reference: 837 cc147-9GC
Session: 2023-24
Chamber / Committee: House of Lords Grand Committee