My hon. Friend absolutely nails it. He said earlier that businesses are already collecting this data. Since I was first involved with the Bill, it has primarily been about getting businesses to adhere to their own terms and conditions. The data they collect should be used in line with those terms.
The amendment to the definition of “freedom of expression” in part 12 would have no effect as these concepts are already covered by the existing definition. Changing the definition of “automated tool” would introduce untested terms and would have an unclear and confusing impact on the duties.
My hon. Friend the Member for Yeovil also asked for clarification of how Ofcom's power to view information remotely will be used, and whether the power is sufficiently safeguarded. I assure the House that this power is subject to strict safeguards that mean it cannot be used to undermine a provider's systems.
On Third Reading in the other place, the Government introduced amendments that defined the regulator's power to view information remotely, whereas previously the Bill spoke of access. As such, there are no risks to system security, as the power does not enable Ofcom to access the service. Ofcom also has a duty to act proportionately and must abide by its privacy obligations under the Human Rights Act. Ofcom is also subject to stringent restrictions on disclosing businesses' commercially sensitive and other information without consent.
My hon. Friend also asked for clarification on whether Ofcom will be able to view live user data when using this power. Generally, Ofcom would expect to require a service to use a test dataset. However, there may be circumstances where Ofcom asks a service to execute a test using data that it holds, for example, in testing how content moderation systems respond to certain types of content on a service as part of an assessment of the systems and processes. In that scenario, Ofcom may need to use a provider’s own test dataset containing content that has previously violated its own terms of service. However, that would be subject to Ofcom’s privacy obligations and data protection law.
Lords amendment 17 seeks to explicitly exempt low-risk functionality from aspects of user-to-user services’ children’s risk assessment duties. I am happy to reassure my hon. Friend that the current drafting of the Government’s amendment in lieu of Lords amendment 17 places proportionate requirements on providers. It explicitly excludes low-risk functionality from the more stringent duty to identify and assess the impact that higher-risk functionalities have on the level of risk of harm to children. Proportionality is further baked into this duty through Ofcom’s risk assessment guidance. Ofcom is bound by the principle of proportionality as part of its general duties under the Communications Act 2003, as updated by the Bill. As such, it would not be able to recommend that providers should identify and assess low-risk functionality.
The amendment to Lords amendment 217 tabled by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) would introduce a new safeguard that requires Ofcom to consider whether technology required under a clause 122 notice would circumvent end-to-end encryption. I wish to reassure him and others who have raised the question that the amendment is unnecessary because it is duplicative of existing measures that restrict Ofcom's use of its powers. Under the Bill's safeguards, Ofcom cannot require platforms to weaken or remove encryption, and must already consider the risk that specified technology could result in a breach of any statutory provision or rule of law concerning privacy. We have intentionally designed the Bill so that it is technology neutral and futureproofed, so we cannot accept amendments that risk the legislation quickly becoming out of date. That is why we have focused on safeguards that uphold user rights and ensure that measures are proportionate to the specific risks, rather than focusing on specific features such as encryption. For the reasons I have set out, I cannot accept the amendment and hope it will not be pressed to a vote.
The amendment tabled by my hon. Friend the Member for Stroud (Siobhan Baillie) would create an additional reporting requirement on Ofcom to review, as part of its report on the use of age assurance, whether the visibility of a user's verification status improves the effectiveness of age assurance, but that duplicates existing review requirements in the Bill. The Bill already provides for a review of user verification; under clause 179, the Secretary of State will be required to review the operation of the online safety regulatory framework as a whole. This review must assess how effective the regulatory framework is at minimising the risk of harm that in-scope services pose to users in the UK. That may include a review of the effectiveness of the current user verification and non-verified users duty. I thank my hon. Friend also for raising the issue of user verification and the visibility of verification status. I am pleased to confirm that Ofcom will have the power to set out guidance on user verification status being visible to all users. With regard to online fraud or other illegal activity, mandatory user verification and visibility of verification status is something Ofcom could recommend and require under the illegal content safety duties.
Let me quickly cover some of the other points raised in the debate. I thank my hon. Friend the Member for Gosport (Dame Caroline Dinenage), a former Minister, for all her work. She talked about young people, and the Bill contains many measures, for example on self-harm and suicide content, that reflect her concerns and will help to protect them. On the comments made by the hon. Member for Aberdeen North (Kirsty Blackman) and indeed the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), whom I am glad to see back in her place, there are a number of review points. Clause 179 requires the Secretary of State to review how the Bill is working in practice, and there will be a report resulting from that, which will be laid before Parliament. We also have the annual Ofcom report that I talked about, and most statutory instruments in the Bill will be subject to the affirmative procedure. The Bill refers to a review after two to five years—Ministers can dictate when it takes place within that period—but that is based on allowing a long enough time for the Bill to bed in and be implemented. It is important that we have the ability to look at that in Parliament. The principles of the UN convention on the rights of the child are already in the Bill. Although the Bill does not cite the convention by name, its principles are all covered in the Bill.
My hon. Friend the Member for Folkestone and Hythe (Damian Collins) did an amazing job in his time in my role, and before and afterwards as Chair of the Joint Committee responsible for the pre-legislative scrutiny of the Online Safety Bill. When he talked about scrutiny, I had the advantage of seeing the wry smiles of the officials in the Box behind him. That scrutiny has been going on since 2021. Sarah Connolly, one of our amazing team of officials, has been involved with the Bill since it was just a concept.