I am grateful for that because I believe we are on the same page. I agree with the right hon. Gentleman that the police forces in this country will need to be radically reconfigured. The time when a police constable might turn up to a burglary and advise how to target-harden the home should be about to go, because the cyber-security of the property and the family in question will often be much more important. At the moment, however, in Birmingham we cannot get police to investigate even violent abuses because there are no police—they have been cut in the west midlands to the smallest number since the force was created in 1974. That is a debate for another day.
Four significant changes need to happen to the online regulatory and policing environment. I think the Government have accepted the first: there needs to be a duty of care on social media companies. The concept of duty of care is quite well established in law. Its legal tradition goes back to the early 1970s and it is tried and tested. If I went out and built a stadium here in London and filled it full of people, there would be all kinds of rules and regulations that would ensure that I kept those people safe. If I went out and built a similar online stadium and filled it with all kinds of nonsense, no such regulations would bite on me. That has to change. We have to ask these firms to identify the harms their services and products might cause and to do something about them, and we have to hold them to account for that.
The second idea is much tighter regulation of hate speech, which the Government have not yet accepted the need to look into. We have raised a number of times in debates like this the approach taken by the Ministry of Justice in Germany. Its Network Enforcement Act—or NetzDG law for short—has created a much more effective policing environment for tackling online hate speech, and it has done so in a way that keeps Germany well within its Council of Europe obligations on protecting free speech. It is time we looked at that because, as the report that has come through from the German Ministry of Justice shows, it is beginning to work.
I am told that something like one in seven Facebook moderators now works in Germany. Google, Twitter, Facebook and YouTube have had to take down a significant amount of hateful material. Looking across the Council of Europe space at the countries that are signatories to the European convention on human rights, which includes the protection of free speech, it appears that Germany is leading the way in creating an effective policing environment to tackle hate speech. Surely, it is time for the Government to look at that a little harder.
The third thing we need is a different kind of regulator. Again, I think the Government have accepted that. There are something like nine different regulators with some kind of regulatory, policing or overwatch powers in the internet space. That is too many. We are not saying they need to be boiled down to one, but that number needs to be closer to one than to nine. That means we have to overhaul the regulators, so we are looking forward to seeing a new Bill whenever we see the Queen’s Speech and a new legislative programme for the next Session.
The final change we need, which is more long term, is a bill of digital rights for the 21st century. The reality is that the online world is going to be regulated, re-regulated and re-regulated again over the course of this century. It is therefore important that we set down some first principles that provide something of a north star to guide us and give companies a bit more predictability as we navigate the changes ahead. At the core of that bill of digital rights should be the right to universal digital literacy. Ultimately, as a country, we are all going to have to become more digitally literate so we can start putting back in place some of the norms and boundaries of the civilised discourse that once was the hallmark of democracy in this country.
5.42 pm