
Online Safety Bill

My Lords, I am very grateful for the discussion we have had today and the parallel discussions that have accompanied it, as well as the many conversations we have had, not just over the months we have been debating the Bill but over the past few days.

I will turn in a moment to the amendments which have been the focus of the debate, but let me first say a bit about the amendments in this group that stand in my name. As noble Lords have kindly noted, we have brought forward a number of changes, informed by the discussions we have had in Committee and directly with noble Lords who have taken an interest in the Bill for a long time.

Government Amendments 281C, 281D, 281E and 281G relate to the Bill’s interpretation of “harm”, which is set out in Clause 209. We touched on that briefly in our debate on Thursday. The amendments respond to concerns which I have discussed with many across your Lordships’ House that the Bill does not clearly acknowledge that harm and risk can be cumulative. The amendments change the Bill to make that point explicit. Government Amendment 281D makes it clear that harm may be compounded in instances where content is repeatedly encountered by an individual user. That includes, but is not limited to, instances where content is repeatedly encountered as a result of algorithms or functionalities on a service. Government Amendment 281E addresses instances in which the combination of multiple functionalities on a service cumulatively drives up the risk of harm.

Those amendments go hand in hand with other changes that the Government have made on Report to strengthen protections for children. Government Amendment 1, for instance, which we discussed at the beginning of Report, makes it clear that services must be safe by design and that providers must tackle harms which arise from the design and operation of their service. Government Amendments 171 and 172 set out on the face of the Bill the categories of “primary priority” and “priority” content which is harmful to children to allow the protections for children to be implemented as swiftly as possible following Royal Assent. As these amendments demonstrate, the Government have indeed listened to concerns which have been raised from all corners of your Lordships’ House and made significant changes to strengthen the Bill’s protections for children. I agree that it has been a model of the way in which your Lordships’ House operates, and the Bill has benefited from it.

Let me turn to the amendments in the name of the noble Baroness, Lady Kidron. I am very grateful for her many hours of discussion on these specific points, as well as her years of campaigning which led to them. We have come a long way and made a lot of progress on this issue since the discussion at the start of Committee. The distinction between online risk and online harm is one we have gone over extensively. I certainly accept the points that the noble Baroness makes; I know how heartfelt they are and how they are informed by her experience sitting in courtrooms and in coroners’ inquests and talking to people who have had to be there because of the harms they or their families have encountered online. The Government are firmly of the view that it is indisputable that a platform’s functionalities, features or wider design are often the single biggest factor in determining whether a child will suffer harm. The Bill makes it clear that functions, features and design play a key role in the risk of harm occurring to a child online; I draw noble Lords’ attention to Clause 11(5), which provides that the child safety duties apply across all areas of a service, including the way it is designed, operated and used, as well as content present on the service. That draws a distinction between the design, operation and use of a service on the one hand and the content on it on the other.

In addition, the Bill’s online safety objectives include that regulated services should be designed and operated so as to protect from harm people in the United Kingdom who are users of the service, including with regard to algorithms used by the service, functionalities of the service and other features relating to its operation. There is no reference to content in this section, again underlining that the Bill draws a distinction.

This ensures that the role of functionalities is properly accounted for in the obligations on providers and the regulator, but I accept that noble Lords want this to be set out more clearly. Our primary aim must be to ensure that the regulatory framework can operate as intended, so that it can protect children in the way that they deserve and which we all want to see. Therefore, we cannot accept solutions that, however well meaning, may inadvertently weaken the Bill’s framework or allow providers to exploit legal uncertainty to evade their duties. We have come back to that point repeatedly in our discussions.

6 pm

I will address the problems with the amendments as drafted; as the noble Baroness knows, if she presses them to a vote, we will not be able to accept them, although we are very happy to continue to discuss the concerns lying behind them. I am happy to reassure noble Lords that the Bill recognises and addresses that services can be risky by design and that features and functionalities can exacerbate the risk of harm to users, including children.

First, I have mentioned the new introductory clause that your Lordships have put into the Bill, which establishes safety by design as one of its key objectives. As such, features and functionalities are captured in the existing children’s risk assessment and safety duties. I am grateful to the noble Lord, Lord Stevenson, for his suggestion that, if there is interest from the noble Baroness, Lady Kidron, we could use the time between now and Third Reading, in addition to our continuing discussions, to look at that again and try to make it clearer. However, its inclusion in the Bill has already been of benefit.

Secondly, providers must comprehensively consider and assess the risk presented by the design and operation of their service, including the risks arising from their design choices, which, as many noble Lords have highlighted, are often motivated by commercial aims rather than safety. These assessments also require providers to assess the risk that a service’s features and functionalities pose. Once this mandatory risk assessment is completed, providers are required to mitigate and manage the risks to children that they have identified. For example, if a service has a direct messaging function, it will need to consider how this increases the risk of users encountering harms such as bullying, and to follow steps in codes of practice, or take equivalent measures, to mitigate this.

It is not right to say that functionalities are excluded from the child safety duties. Clause 11(5) clearly sets out that safety duties apply across all areas of a service, including the way it is designed, operated and used, and not only to content that is present on the service.

The noble Lord, Lord Russell, spoke to his Amendments 46 and 90. They seek to remove the provisions in Clauses 11(15) and 25(13), which limit corresponding duties elsewhere in the Bill to cases where the risk of harm is presented by the nature of the content rather than the fact of its dissemination. Clause 209 is clear that harm from content may arise from the fact or manner of its dissemination. As I have mentioned, the Government’s amendments to Clause 209 make it clear that this includes instances where algorithms bombard a user with content, such as in the scenario the noble Lord set out. As such, user-to-user and search service providers must take action to address this as part of their child safety duties.

The duties in Clauses 11(2) and 25(2) apply to content that is harmful due to the manner of its dissemination, requiring providers to design and operate their services so as to mitigate the risks of harm identified in their risk assessments. This includes risks such as an algorithm pushing content at high volume to a user. If Clauses 11(15) and 25(13) were removed, Clause 11(3) and (6) and Clause 25(3) would require children to be protected from inherently harmless content on the grounds that harm could be caused if that content were encountered repeatedly over time. I am sure that is not what the noble Lord, Lord Russell, has in mind with his amendments, but that is why, if he pushes them to a vote, we will not be able to accept them.

We have talked about this at great length. If we can use the time between now and Third Reading fruitfully to address the points I have raised on these amendments—the noble Baroness, Lady Kidron, has heard them repeatedly; I make them for the benefit of the rest of your Lordships’ House, because we have had much discussion—I am very willing to look at that and bring forward points to address this at Third Reading. However, I have set out our concerns about the approach taken in the amendments she has tabled. I am very grateful to her for her time and for discussing this. Procedurally, if she presses them to a vote now, the matter will have been dealt with on Report and we will not be able to look at this again at Third Reading. I hope she may yet have found comfort in what I have said and be willing to continue those discussions, but if she wishes to press her amendments to a Division now, the Government will not be able to accept them and I would recommend that noble Lords vote against.

Type: Proceeding contribution
Reference: 831 cc1559-1562
Session: 2022-23
Chamber / Committee: House of Lords chamber