My Lords, the Central Digital and Data Office, or CDDO, and the Centre for Data Ethics and Innovation, as it was then called—it now has a new name as a unit of DSIT—launched the Algorithmic Transparency Recording Standard in November 2021. The idea for the ATRS arose from a recommendation by the CDEI that the UK Government should place a mandatory transparency obligation on public sector organisations using algorithms to support “significant decisions affecting individuals”. It is intended to help public sector organisations to provide clear information about the algorithmic tools that they use, how those tools operate and why they are using them.
The ATRS is a promising initiative that could go some way to addressing the current transparency deficit around the use of algorithmic and AI tools by public authorities. Organisations are encouraged to submit reports about each algorithmic tool that they are using that falls within the scope of the standard.
We welcome the recent commitments made in the Government’s response to the AI regulation White Paper consultation to make the ATRS a requirement for all government departments. However, we believe that this is an opportunity to deliver on this commitment through the DPDI Bill, by placing it on a statutory footing rather than it being limited to a requirement in guidance. That is what Amendment 74 is designed to do.
We also propose another new clause that should reflect the Government’s commitment to algorithmic transparency. It would require the Secretary of State to introduce a compulsory transparency reporting requirement, but only when she or he considers it appropriate to do so. It is a slight watering-down of Amendment 74, but it is designed to tempt the Minister into further indiscretions. In support of transparency, the new clause would, for as long as the Secretary of State considers it inappropriate to make the ATRS compulsory, also require the Secretary of State regularly to explain why and to keep that decision under continual review.
Amendment 76 on safe and responsible automated decision systems proposes a new clause that seeks to shift the burden back onto public sector actors. It puts the onus on them to ensure safety and prevent harm, rather than waiting for harm to occur and leaving individuals with the burden of challenging it. It imposes a proactive statutory duty, similar to the public sector equality duty under Section 149 of the Equality Act 2010, to have “due regard” to ensuring that
“automated decision systems … are responsible and minimise harm to individuals and society at large”.
The duty incorporates the key principles in the Government’s AI White Paper and therefore is consistent with its substantive approach. It also includes duties to be proportionate, to give effect to individuals’ human rights and freedoms and to safeguard democracy and the rule of law. It applies to all “automated decision systems”. These are
“any tool, model, software, system, process, function, program, method and/or formula designed with or using computation to automate, analyse, aid, augment, and/or replace human decisions that impact the welfare, rights and freedoms of individuals”.
This therefore applies to partly automated decisions, as well as those that are entirely automated, and systems in which multiple automated decision processes take place.
It applies to traditional public sector actors: public authorities, or those exercising public functions, including private actors outsourced by the Government to do so; those that may exercise control over automated decision systems, including regulators; as well as those using data collected or held by a public authority, which may be public or private actors. It then provides one mandatory mechanism through which compliance with the duty must be achieved—impact assessments. Earlier, we had a short debate about the ATRS and whether a compliance system is in place; I think the Minister disagreed with my characterisation that there is currently no compliance system, and it would be useful to hear whether he has any further comment on that.
This provision proposes impact assessments. The term used, “algorithmic impact assessment”, is adopted from Canada’s analogous directive on automated decision-making, which mandates the use of AIAs for all public sector automated decision systems. The obligation is on the Secretary of State, via regulations, to set out a framework for AIAs, which would help actors to uphold their duty to ensure that automated decision systems are responsible and safe; to understand and to reduce the risks in a proactive and ongoing way; to introduce the appropriate governance, oversight, reporting and auditing requirements; and to communicate in a transparent and accessible way to affected individuals and the wider public.
Amendment 252 would require a list of UK addresses to be made freely available for reuse. Addresses have been identified as a fundamental geospatial dataset by the UN and a high-value dataset by the EU. Address data is used by tens of thousands of UK businesses, including for delivery services and navigation software. Crucially, address data can join together different property-related data, such as energy performance certificates or Land Registry records, without using personal information. This increases the value of other high-value public data.
Addresses are no longer just about sending letters; they are crucial spatial data infrastructure. Reliable address data underpins navigation software, such as TomTom or Waze, and successful service provision, such as Amazon deliveries. Control of the UK’s addresses was sold to Royal Mail when it was privatised in 2013. There is now a complex system for generating and managing UK addresses, in which most of the work is done by local authorities but most of the benefits flow to Royal Mail.
If you are in the public sector, you can access address data for free because the Government pay Royal Mail and Ordnance Survey millions of pounds per year for public sector access. If you are a business, however, you have to pay for access, sign licensing agreements and potentially hire lawyers. This is especially burdensome for start-ups and SMEs. The previous Government created the Geospatial Commission to unlock the value of geospatial data, but it has not yet evaluated the arrangements with Royal Mail. The process of creating and managing addresses is slow and complex, with multiple bodies involved.
This all causes a number of problems: it holds back growth and innovation, it holds back emerging technology, it hinders public service delivery, it causes problems for citizens and, increasingly, it makes the UK an outlier among high-income countries. As a result, our address data is expensive, hard to access and unreliable. The Government need to act now to provide a dataset of all UK addresses that is accurate and can be freely used by anyone offering services to UK citizens. This could be implemented by establishing a new body to manage address data or by mandating that Royal Mail make the data openly available.
Unlike opening up more sensitive datasets, such as personal location, releasing address data—a list of the physical places recognised by the Government—carries few new legal or ethical risks. Many other countries have done this, including those with strong privacy regimes. The harms created by the lack of access to address data are more pressing than any such risks. I offer the Netherlands as a good example of somewhere that has already been through this process.
I am grateful for the support of the noble Baroness, Lady Bennett, for this particular amendment, alongside another noble Lord who no doubt will reveal themselves when I finally find my way through this list of amendments. In the meantime, I beg to move.