My Lords, I beg to move Amendment 25YR and support Amendment 33A in this group.
Our amendment requires the Secretary of State to publish, within six months of the Bill being passed, a code of practice for all social media sites obliging them to put in place mechanisms to prevent children from being abused and bullied online. In the context of the rest of Part 3, we have specifically focused the amendment on the protection of children and young people, although we would expect such a code to have a wider benefit for adults suffering abuse. The amendment would require both Houses to approve the code and, once in place, it would be a statutory requirement on social media sites to comply. Although the full detail of the code is not spelled out, it would include requirements to inform the police if advised of illegal posts, and to take them down with immediate effect. In addition, it would require social media sites to have terms of use to prevent cyberbullying and abuse, including clearly spelled-out mechanisms for taking down the offending material.
We believe that these measures will ensure, finally, that the social media companies begin to take their responsibilities seriously. Action is overdue, which is why we have inserted a relatively tight but achievable timetable—and we make no apologies for that.
We have rehearsed in Committee many of the arguments why this intervention is crucial. I will not repeat them all, but we believe that the case for action to rein in the social media sites is now compelling. The charity Childnet has reported that one in four teenagers suffered hate incidents online last year, and that figure continues to rise. The NSPCC has reported that two-thirds of young people want social media sites to do more to protect them, with exposure to hate messages, pro-anorexia sites, self-harm sites and cyberbullying all on the increase.
Girlguiding revealed in a survey last year that 20% of girls were sent unwanted pornographic films or images without their consent. When I met a delegation of Girl Guides last week, they described how the bombardment of sexualised images was creating huge body-confidence issues and normalising sexist behaviour in schools.
I could go on, but the point is that all these statistics are going in the wrong direction. There is no culture of safeguarding children’s safety and well-being online. As a result, children are being frightened, intimidated, bullied and coerced on social media sites.
Since our last debate in Committee, we have received further evidence of the failure of the social media sites to act when illegal material is brought to their attention. If anyone is in any doubt about the need for our amendment, they have only to recall the example of Facebook, which, on being informed by the BBC that obscene images of children were being posted on its site, failed to remove the vast majority of those posts and then had the audacity to report the BBC to the police when it was sent further examples to follow up. Similarly, at the Home Affairs Select Committee, Google’s vice-president admitted that it had allowed a video entitled “Jews admit to organising white genocide” to remain on its site despite acknowledging that it was anti-Semitic, deeply offensive and shocking. This latest evidence underlines why we feel that action is needed now.
When we debated this issue in Committee, the Minister gave what I felt to be a rather complacent response. He argued that a statutory code was unnecessary and that the onus should be on companies to develop their own in-house processes to deal with the issue. Of course, shortly after that, the Secretary of State decided that leaving it to the companies to sort out on their own was not really good enough after all, and that a new internet safety strategy would be launched, including round tables with the media companies and, as we have heard, a Green Paper in the summer. That is okay as far as it goes, but it does not go far enough. We believe that we have left it to the social media companies to change their behaviours on a voluntary basis for far too long. That is why our amendment has a timetable and a requirement for the eventual code to be placed on a statutory basis.
7 pm
Finally, for all those who are worried that these are global companies and therefore difficult to regulate, I ask noble Lords to look at the Australian system. Australia has already passed the Enhancing Online Safety for Children Act 2015, which requires all social media sites to have terms of use that prohibit cyberbullying and abuse. It also establishes a children’s e-safety commissioner to deal with complaints and ensure that material is taken down. Of course, the social media sites being regulated are precisely the same ones that operate in the UK. At the same time, the Sunday papers over the last weekend reported that Germany is considering introducing a similar statutory scheme. So let us not say that this cannot be done.
In conclusion, we believe that whatever discussions are now taking place with social media sites—and of course any discussions are welcome—they would be more fruitful if they concentrated on a draft code of practice that would ultimately be binding on those sites. Surely we owe our children and young people the reassurance that we will finally act on this issue.