
Online Safety Bill

My Lords, like everyone who spoke, I and the Government recognise the tragic consequences of suicide and self-harm, and how so many lives and families have been devastated by them. I am grateful to the noble Baroness and all noble Lords, as well as to the bereaved families who have campaigned so bravely and for so long to spare others that heartache and to create a safer online environment for everyone. I am grateful to the noble Baroness, Lady Finlay of Llandaff, who raised these issues in her Private Member’s Bill, on which we had exchanges. My noble friend Lady Morgan is right to raise the case of Frankie Thomas and her parents, and to call it to mind as we debate these issues.

Amendments 96 and 296, tabled by the noble Baroness, Lady Finlay, would, in effect, reintroduce the former adult safety duties whereby category 1 companies were required to assess the risk of harm associated with legal content accessed by adults, and to set and enforce terms of service in relation to it. As noble Lords will know, those duties were removed in another place after extensive consideration. Those provisions risked creating incentives for the excessive removal of legal content, which would unduly interfere with adults’ free expression.

However, the new transparency, accountability and freedom of expression duties in Part 4, combined with the illegal and child safety duties in Part 3, will provide a robust approach that will hold companies to account for the way they deal with this content. Under the Part 4 duties, category 1 services will need to have appropriate systems and processes in place to deal with content or activity that is banned or restricted by their terms of service.

Many platforms—such as Twitter, Facebook and TikTok, which the noble Baroness raised—say in their terms of service that they restrict suicide and self-harm content, but they do not always enforce these policies effectively. The Bill will require category 1 companies—the largest platforms—fully to enforce their terms of service for this content, which will be a significant improvement for users’ safety. Where companies allow this content, the user-empowerment duties will give adults tools to limit their exposure to it, if they wish to do so.

The noble Baroness is right to raise the issue of algorithms. As the noble Lord, Lord Stevenson, said, amplification lies at the heart of many cases. The Bill will require providers specifically to consider as part of their risk assessments how algorithms could affect children’s and adults’ exposure to illegal content, and content that is harmful to children, on their services. Providers will need to take steps to mitigate and effectively manage any risks, and to consider the design of functionalities, algorithms and other features to meet the illegal content and child safety duties in the Bill.

Type: Proceeding contribution
Reference: 830 cc184-5
Session: 2022-23
Chamber / Committee: House of Lords chamber