It is a pleasure to follow the hon. Member for Reading East (Matt Rodda), who I think illustrated clearly why this Bill matters. I want to join the chorus of warm congratulations to my hon. Friend the Member for Folkestone and Hythe (Damian Collins) and his entire Committee on their remarkable work in producing such an impressive report on what is a very complex Bill. They have done the House a huge favour by suggesting ways in which the Bill could be made more straightforward and more focused on its overall objectives.
The Bill covers unlawful content as well as legal but harmful content, and it is in the latter category that the definitional challenges apply. Of course, we see in that challenge a conflict between specificity and flexibility. The legislation and the regulation that we create need to be specific enough that those subject to it know what they have to do, but flexible enough to keep up with what is a changing online world. The overarching duty of care set out in the initial White Paper was designed to give that adaptability and to encourage proactivity on the part of platforms in identifying and responding to emerging harms, and there is no doubt that that needs refinement.
I think that the Committee’s recommendation that there should be a requirement to have in place proportionate systems and processes for identifying and mitigating reasonably foreseeable risks of harm arising from regulated activity defined under the Bill is largely an elegant way to square that circle and keep some sense of control in this place over what harms we are content for the regulator to act against. However, I do not think that the Committee would claim that this is the last word on the subject, and nor should it be, because there are inherent risks of inflexibility—legislating to change the list of harms is cumbersome and time-consuming.
There is also a risk of inconsistency, even with the Committee’s approach elsewhere. I am thinking of the Committee’s approach to defining content that is harmful to children, which it defines as content or activity that is either specified on the face of the Bill or in regulation, or—this is the crucial bit—where there is a “reasonably foreseeable risk” that it would be likely to cause harm. In other words, there needs to be some flexibility to oblige platforms to deal with harms that are not defined in regulations or in the Bill as they emerge in a fast-changing landscape, and I think that needs to be reflected more broadly too.