Online Safety Bill

My Lords, this is not just a content Bill. The Government have always been clear that the way in which a service is designed and operated, including its features and functionalities, can have a significant impact on the risk of harm to a user. That is why the Bill already explicitly requires providers to ensure their services are safe by design and to address the risks that arise from features and functionalities.

The Government have recognised the concerns which noble Lords have voiced throughout our scrutiny of the Bill, and those which predated the scrutiny of it. We have tabled a number of amendments to make it even more explicit that these elements are covered by the Bill. We have tabled the new introductory Clause 1, which makes it clear that duties on providers are aimed at ensuring that services are safe by design. It also highlights that obligations on services extend to the design and operation of the service. These obligations ensure that the consideration of risks associated with the business model of a service is a fundamental aspect of the Bill.

My noble friend Baroness Harding of Winscombe worried that we had made the Bill worse by adding this. The new clause was a collaborative one, which we have inserted while the Bill has been before your Lordships’ House. Let me reassure her and other noble Lords as we conclude Report that we have not made it worse by so doing. The Bill will require services to take a safety by design approach to the design and operation of their services. We have always been clear that this will be crucial to compliance with the legislation. The new introductory Clause 1 makes this explicit as an overarching objective of the Bill. The introductory clause does not introduce any new concepts; it is an accurate summary of the key provisions and objectives of the Bill and, to that end, the framework and introductory statement are entirely compatible.

We also tabled amendments—which we debated last Monday—to Clause 209. These make it clear that functionalities contribute to the risk of harm to users, and that combinations of functionality may cumulatively drive up the level of risk. Amendment 281BA would amend the meaning of “functionality” within the Bill, so that it includes any system or process which affects users. This presents a number of concerns. First, such a broad interpretation would mean that any service in scope of the Bill would need to consider the risk of any feature or functionality, including ones that are positive for users’ online experience. That could include, for example, processes designed for optimising the interface depending on the user’s device and language settings. The amendment would increase the burden on service providers under the existing illegal content and child safety duties and would dilute their focus on genuinely risky functionality and design.

Secondly, by duplicating the reference to systems, processes and algorithms elsewhere in the Bill, it implies that the existing references in the Bill to the design of a service or to algorithms must be intended to capture matters not covered by the proposed new definition of “functionality”. This would suggest that references to systems, processes and algorithms mentioned elsewhere in the Bill cover only systems, processes or algorithms which do not have an impact on users. That risks undermining the effectiveness of the existing duties and the protections for users, including children.

Amendment 268A introduces a further interpretation of features and functionality in the general interpretation clause. This duplicates the overarching interpretation of functionality in Clause 208 and, in so doing, introduces legal and regulatory uncertainty, which in turn risks weakening the existing duties. I hope that sets out for my noble friend Lady Harding and others our legal concerns here.

Amendment 281FA seeks to add to the interpretation of harm in Clause 209 by clarifying the scenarios in which harm may arise, specifically from services, systems and processes. This has a number of concerning effects. First, it states that harm can arise solely from a system and process, but a design choice does not in isolation harm a user. For example, the decision to use algorithms, or even the algorithm itself, is not what causes harm to a user; rather, it is the fact that harmful content may be pushed to a user, or that content may be pushed in a manner that is harmful, for example repeatedly and in volume. That is already addressed comprehensively in the Bill, including in the child safety risk assessment duties.

Secondly, noble Lords should be aware that the drafting of the amendment has the effect of saying that harm can arise from proposed new paragraphs (a), (b) and (c)—

Type: Proceeding contribution
Reference: 831 cc2418-9
Session: 2022-23
Chamber / Committee: House of Lords chamber