My hon. Friend makes an important point, and I thank him for the amazing work he has done in getting the Bill to this point and for his ongoing help and support in making sure that we get it absolutely right. This is not about bashing technology companies; it is about not only holding them to account, but bringing them closer, to make sure that we can work together on these issues to protect the children I was talking about.
Despite the breadth of existing safeguards, we recognise the concerns expressed about privacy and technical feasibility in relation to Ofcom’s power to issue CSEA or terrorism notices. That is why we introduced additional safeguards in the Lords. First, Ofcom will be required to obtain a skilled person’s report before issuing any warning notice and exercising its powers under clause 122. Ofcom must also provide a summary of the report to the relevant provider when issuing a warning notice. We are confident that, in addition to Ofcom’s existing routes of evidence gathering, this measure will help to provide the regulator with the necessary information to determine whether to issue a notice and the requirements that may be put in place.
We also brought forward amendments requiring Ofcom to consider the impact that the use of technology would have on the availability of journalistic content and the confidentiality of journalistic sources when considering whether to issue a notice. That builds on the existing safeguards in clause 133 regarding freedom of expression and privacy.
We recognise the disproportionate levels of harm that women and girls continue to face online, and that is why the Government have made a number of changes to the Bill to strengthen protections for women and girls. First, the Bill will require Ofcom to produce guidance on online harms that disproportionately affect women and girls and to provide examples of best practice to providers, and it will require providers to bring
together in one clear place all the measures that they take to tackle online abuse against women and girls on their platforms. The Bill will also require Ofcom to consult the Victims’ Commissioner and the Domestic Abuse Commissioner, in addition to the Children’s Commissioner, while preparing codes of practice. That change to the Bill will ensure that the voices of victims of abuse are brought into the consultation period.
2.30 pm
The offence of controlling or coercive behaviour has been added as a priority offence and will require companies to proactively tackle such content and activity that disproportionately affects women and girls. The Bill also introduces new offences relating to intimate image abuse, including criminalising deepfakes for the first time in England and Wales. Those new offences to protect women and girls sit alongside other changes that we have made to the criminal law to ensure that it is fit for purpose in the modern age. For example, we have also introduced a new communications offence of intentionally encouraging or assisting serious self-harm. Our amendments will also require platforms to remove the most harmful self-harm content for all users. The offence has been designed to avoid criminalising or removing recovery and support content.
The Government are committed to empowering adults online and made changes to the Bill to strengthen the user empowerment content duties. First, we have introduced a new content assessment duty in relation to the main user empowerment duties. That will require big tech platforms to carry out comprehensive assessments of the prevalence on their services of content that falls within the scope of the user empowerment duties, such as legal content that encourages suicide or an act of self-harm. They will need to keep a record of that assessment and publish a summary of it for their users in their terms of service. The new duty will underpin the main duties to offer user empowerment tools, ensuring that platforms and users have a comprehensive understanding of the relevant types of content on their services.
Secondly, where category 1 providers offer the user empowerment tools, we have further strengthened the duties on them by requiring them to proactively ask their registered adult users whether they wish to use the user empowerment content features. That will help to make the tools easier for users to opt into or out of. This approach continues to balance the empowerment of users and the protection of freedom of expression by avoiding the “default on” approach.
Baroness Fraser of Craigmaddie made amendments in the other place that aligned the definition of the term “freedom of expression” in the Bill with that in the European convention on human rights. That also reflects the approach of other UK legislation, including the Higher Education (Freedom of Speech) Act 2023. Those amendments will increase clarity about freedom of expression in the Bill.
The Government recognise the difficulties that coroners and bereaved families face when seeking to understand the circumstances surrounding a child’s death, and have introduced a number of amendments to address those issues; I have already outlined some of them. First, we expanded
Ofcom’s information gathering powers so that the regulator can require information from regulated services about a deceased child’s online activity following a request from a coroner. That is backed up by Ofcom’s existing enforcement powers. We have also given Ofcom the power to produce a bespoke report for the coroner and enabled the regulator to share information with a coroner without requiring the business’s prior consent to disclose it. That will ensure that Ofcom can collect such information and share it with the coroner where appropriate, so that coroners have access to the expertise and information they need to conduct their investigations.
Finally, we have introduced amendments to ensure that the process for accessing data regarding the online activities of a deceased child is more straightforward and humane. The largest companies must set out policies on disclosure of such data in a clear, accessible and sufficiently detailed format in their terms of service. They must also respond in writing in a timely manner, provide a dedicated means for parents to communicate with the company and put in place a mechanism for parents to complain if they consider that a service is not meeting its obligations.
We recognise the valuable work of researchers in improving our collective understanding of online safety issues, which is why we have made amendments to the Bill that require Ofcom to publish its report into researcher access to information within 18 months rather than two years. Ofcom will then be required to publish guidance on the issue, setting out best practice for platforms to share information in a way that supports their research functions while protecting user privacy and commercially sensitive material. While we will not be making additional changes to the Bill during the remainder of its passage, we understand the call for further actions in this area. That is why we have made a commitment to explore this issue further and report back to the House in due course on whether further measures to support researcher access to data are required and, if so, whether they could also be implemented through other legislation such as the Data Protection and Digital Information Bill.
The Government heard the House’s concerns about the risks posed by algorithms and their impact on our interactions online. Given the influence they can have, the regulator must be able to scrutinise algorithms’ functionalities and the other systems and processes that providers use. We have therefore made changes to provide Ofcom with the power to authorise a person to view specific types of information remotely: information demonstrating the operation of a provider’s systems, processes or features, including algorithms, and tests or demonstrations. There are substantial safeguards around the use of that power, which include: Ofcom’s legal duty to exercise it proportionately; a seven-day notification period; and the legal requirement to comply with data protection rules and regulations.
The Government are grateful to Baroness Morgan of Cotes and my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), who like many in the House have steadfastly campaigned on the issue of small but risky platforms. We have accepted an amendment to the Bill that changes the rules for establishing the conditions that determine which services will be designated as category 1 or category 2B services and thus have additional duties. In making the regulations
used to determine which services are category 1 or category 2B, the Secretary of State will now have the discretion to decide whether to set a threshold based on the number of users or the functionalities offered, or both factors. Previously, the Secretary of State was required to set the threshold based on a combination of both factors.
It is still the expectation that only the highest-risk user-to-user services will be designated as category 1 services. However, the change will ensure that the framework is as flexible as possible in responding to the risk landscape. I say to my hon. Friend the Member for Yeovil (Mr Fysh), who I know will speak later, that it is not meant to capture user-to-user systems as such; it is very much about content, and not about stifling innovation in areas such as distributed ledgers and so on.