
Digital Economy Bill

Proceeding contribution from Lord Ashton of Hyde (Conservative) in the House of Lords on Monday, 20 March 2017. It occurred during Debate on bills on Digital Economy Bill.

My Lords, I am grateful to all contributors on this important subject. We take the harm caused by online abuse and harassment very seriously. The measures that we have introduced in this Bill show that the Government are taking this seriously. I hope that I can offer some comfort in this area since we last discussed these two amendments in Committee.

Amendment 25YR seeks to require Ministers to issue a mandatory code of practice to ensure that commercial social media platform providers show a duty of care to ensure the safety of a child or young person using their service; to report and remove illegal posts on social media; to prohibit and remove cyberbullying; and to undertake to work with the education profession and charities to provide children with digital safety skills. Amendment 33A seeks to impose a duty on “social media services” to respond to reports posted on their site of material which passes the “criminal test”, being that the content would, if published by other means or communicated in person, cause a criminal offence to be committed. I have two responses to these amendments—first, an explanation of the work that this Government have started to address these issues through our internet safety strategy; and, secondly, some fundamental concerns about their drafting.

The UK is leading the way in online safety, and will continue to do so, with the support of industry, parents, charities, academics, and other experts, and this is a firm priority for this Government. We have been absolutely clear that abusive and threatening behaviour is totally unacceptable in any form, online or offline. On 27 February, my department announced that it is leading cross-government work on an internet safety strategy which aims to make the UK the safest place in the world to go online for children and young people. This work will also address the abuse that women suffer online, as we look at trolling and other aggressive behaviour, including rape threats. We will ask experts, social media companies, tech firms, charities and young people themselves about online safety during a series of round tables later this month, and we will use these discussions to understand more about the scope of the problem and identify where there are gaps in our current approach to tackling online harms.

We will continue to consult closely with interested parties throughout the spring, including Members of this House with expertise in this area. Indeed, we have already invited several noble Lords to take part. A key part of this work will be to clearly set out the responsibilities of social media in respect of online safety as part of a Green Paper which will be published in June. Other priorities will include: how to help young people to avoid risks online; helping parents to face up to the dangers and discuss them with their children; and how technology can help provide solutions.

We have not ruled anything out at this stage, including a code of practice, but this is a complex field and to find the right solution we need to take the time to have a proper conversation with all the leading stakeholders. We would not want anything to prejudge the outcome of these discussions. We believe that this will result in a properly considered, comprehensive approach to online safety which stakeholders are fully signed up to, and one that will deliver the long-lasting protections that these amendments are seeking to secure.

7.15 pm

I turn to the amendments. We have some fundamental concerns about how they are drafted. We have three main concerns about the amendment that would require a code of conduct for social media companies. First, while we fully agree that social media companies should be socially responsible to their users, to require them to have an “overarching duty of care” for “any activity or interaction” of young people on their platforms goes too far. It is unclear how this would be measured or what the parameters of such a duty would be.

Secondly, the amendment would require social media companies to inform the police about posts that contravene existing legislation. This would require social media companies to take a judgment role about whether content is legal or not, effectively handing them the power to police the internet. We would be extremely concerned about giving these companies this degree of authority.

Finally, the definition of social media companies is unclear and goes wider than the sorts of sites we think that the amendment seeks to cover. It would include any website or forum where users can interact, including through comments, live chats or reviews, from major retail websites to newspaper sites. That clearly goes far beyond the remit of child protection and would be unworkable, unwelcome and disproportionate.

In relation to the “criminal test” amendment, we have similar concerns about the definitions of “social media service”. More fundamentally, the law is very clear that what is illegal offline is illegal online, and we have processes in place to establish this. It should not be left to social media companies, or their users, to take a judgment on whether in their view content is criminal or not.

It is clearly right that we take the most effective action possible to remove vile material from the internet. We strongly believe that the internet safety strategy is the best mechanism to consider what more social media companies can do in this area. In the meantime, government is already working with social media and interactive services to have robust processes in place quickly to address inappropriate content and abusive behaviour on their sites.

We have all read the recent news stories about vile content hosted by social media companies. This Government believe that those companies have a responsibility to make sure their platforms are not used as a place to peddle hate or celebrate horrendous acts of violence. We are already talking to those companies and they are responding to those concerns. In particular, the loss of advertising revenue has proved an effective and salutary lesson for them.

We also expect online service providers to play a key role in protecting their users and to ensure they have relevant safeguards and reporting processes in place, including access restrictions for children and young people who use their services. Social media companies already take down content that is violent or incites violence, if it breaches their terms and conditions. However, it is extremely difficult to identify where the threat has come from and whether the threat is serious.

We have referred already this evening to the Internet Watch Foundation, and its data confirm that good progress is being made. It works with companies to identify and remove illegal child sexual abuse material. In 2015, it processed 112,975 reports, and 68,543 were confirmed as child sexual abuse material. Yet only 1% of URLs were on social media sites.

I apologise if I gave the wrong impression in Committee—we are not complacent at all. We know that there is more to do and I give a firm commitment to the House that we will consider all available options through our internet safety strategy. For example, the noble Baroness, Lady Jones, mentioned the Australian system. We are carefully considering those international best practices, including Australia’s approach, as part of the strategy.

The noble Baroness, Lady Janke, mentioned the problem of the plethora of relevant laws on, for example, cyberbullying. There are laws in place to protect people when bullying behaviour constitutes a criminal offence, for example under the Protection from Harassment Act 1997, the Malicious Communications Act 1988, the Communications Act 2003 and the Public Order Act 1986. I think that proves her point. We will take those things into account in the internet safety strategy.

The noble Lord, Lord Alton, mentioned suicide sites. That is, of course, something that needs to be looked at and we will include those in our strategy.

We are working on this now. As I said, it will be published in June. We will bring forward the implementation of proposals as quickly as possible thereafter. I hope that noble Lords, especially the noble Baronesses, Lady Jones and Lady Janke, are reassured that we are taking the necessary strides to keep children and young people safe online. I therefore ask the noble Baroness to withdraw the amendment.

Type
Proceeding contribution
Reference
782 cc76-8 
Session
2016-17
Chamber / Committee
House of Lords chamber