Digital Economy Bill

Proceeding contribution from Lord Ashton of Hyde (Conservative) in the House of Lords on Monday, 20 March 2017. It occurred during the Debate on Bills on the Digital Economy Bill.

My Lords, we now move in this wide-ranging Bill from the esoteric delights of the universal service obligation, dynamic spectrum access services and the Electronic Communications Code to a crucial area: namely, seeking to protect children online.

As we have said before, the introduction of a new law requiring appropriate age verification measures for online pornography is a bold new step, with many challenges. It ensures that commercial providers of pornographic material are rightly held responsible for what they provide and profit from. Commercial providers of online pornographic content provide an incredibly large amount of easily accessible content to UK users, with little or no protections to ensure that those accessing it are of an appropriate age. It is imperative that we retain controls on such material and, in this terribly sensitive area, aim to strike a balance between freedom of expression and protection of the young.

Perhaps the most sensitive challenge is how we approach material that would not be classified by the BBFC in the offline world. We have heard concerns from some quarters that the current definition of “prohibited material” may be going too far in the type of material that the regulator is able to sanction above and beyond the age verification requirements. We heard in Committee that this,

“would give the regulator extended powers of censorship beyond that originally envisaged in the Bill”,

that it would potentially set,

“new limits on consenting adults accessing pornography that is not harmful to themselves or others”,

and that,

“this is not the place to resolve these wider debates on adult consensual pornography”.—[Official Report, 2/2/17; cols. 1355-56.]

We agree. Our policy intent is child protection, not censorship. Our amendment redefines the scope, taking an approach based on the definition of an “extreme pornographic image” in the Criminal Justice and Immigration Act 2008. This captures grotesque sexual violence, including rape. We have thought long and hard about where we should draw the line. We have adopted two principles. First, as this measure is about protecting children, we do not want to create a new threshold for what adults can or cannot see. This is not the place for that debate. Secondly, we want to ensure that we do not allow the regulator to step on the toes of others involved in policing this territory.

The definition of an “extreme pornographic image” in the Criminal Justice and Immigration Act provides a good marker. I know that there are also concerns about sexual violence against women and other acts that do not meet the “extreme pornography” definition. We absolutely do not intend to create a regime that unintentionally legitimises all types of sexually explicit content as long as age verification controls are in place. We are most definitely not saying that material not allowed under other legislation is allowed if age verification is in place.

That is why government Amendment 25YV makes it absolutely clear that content behind age verification controls can still be subject to criminal sanctions provided by existing legislation: for example, the Obscene Publications Act. But we concede that there is unfinished business here. Having protected children, we still need to examine other online safety issues. As we will come to later in the debate, my department is leading cross-government work on an internet safety strategy that aims to make the UK the safest place in the world to go online. We want to understand more about the scope of the problem and identify where there are gaps in our current approach to tackling online harms.

We have heard the calls to provide the age verification regulator with powers to block criminal images involving children, as defined by the Coroners and Justice Act 2009. However, at the forefront of cross-government thinking on this was the need not to cut across the excellent work of the Internet Watch Foundation on child sexual abuse content, complicating the landscape and making it harder to effectively and efficiently protect children. It has never been the case that this regime would seek to regulate that child sexual abuse material. Fundamentally, we are dealing with different harms, with different responses, and it is right that they are treated separately.

With child sexual abuse material, the Government seek to ensure that it is eradicated at source: that content is not just blocked but actively taken down from the internet. Providing for the age verification regulator simply to block this material in the course of its work would put that at risk. The BBFC and the IWF are in agreement that they do not wish the BBFC to take on the role of policing child sexual abuse material, or content likely to fall within this classification. The Internet Watch Foundation does a vital and difficult job and we should not seek to complicate that by conflating it with age verification.

This is a sensitive subject and we know that we will never satisfy everyone. But I hope that I have convinced noble Lords that the position we have settled on is neither arbitrary nor a sop to one interest group or another. I commend the government amendments in this group.

3.15 pm

Type: Proceeding contribution
Reference: 782 cc12-3
Session: 2016-17
Chamber / Committee: House of Lords chamber