It is incredibly important to get the framework that operates in that sort of space right, as is the case for terrorist material and child protection online. The system that we have in place is essentially non-statutory, although it is underpinned by online and offline offences, and it is working well. Collaboration between social media organisations, the police and others is vital, and I urge those organisations to collaborate whenever they are asked to do so. We have taken the view that effective and rigorous enforcement of the rules on age verification is an important step in getting that system up and running. The existing system has delivered 220,000 take-downs since 2010, so we want to leave it in place. There may be difficult individual cases, but the system is, on the whole, working effectively. That is why we have taken different approaches to the two different areas.
New clause 10 would introduce very specific requirements on online education. I maintain that the measure is not necessary, because e-safety is already covered at all stages of the new computing curriculum introduced in September 2014. From primary school, children are taught how to use technology safely, respectfully and responsibly, how to keep personal information private, how to recognise acceptable and unacceptable behaviour, and how to report a range of concerns. As hon. Members will see, we care deeply about protecting children online, both through direct rules for the internet and through education. The new clause is therefore unnecessary, and I worry that putting a more static system in place would risk making the task at hand harder.
When it comes to broader protection, we expect social media and interactive services to have robust processes in place that can quickly address inappropriate content and abusive behaviour on their sites. It would be difficult to make the sort of statutory code of practice proposed in new clause 13 work, because there is no one-size-fits-all solution: the right way to deal with inappropriate content and abuse will vary by service and by incident, and technological considerations may differ by platform as innovation changes the way the internet operates. Legislating in this area is difficult because of the pace of change, and users will benefit most if companies develop bespoke reporting tools and in-house processes. Existing arrangements, and the action taken by social media companies, provide the best approach to tackling this problem.