
Online Safety Bill

Proceeding contribution from Lord Allan of Hallam (Liberal Democrat) in the House of Lords on Wednesday, 19 July 2023. It occurred during the debate on bills on the Online Safety Bill.

My Lords, the powers in Clause 111 are perhaps the most controversial outstanding issue in the Bill. I certainly agree with the noble Lord, Lord Moylan, that they deserve continued scrutiny. I suspect that Members of another place are being lobbied on this extensively right now. Again, it is one of the few issues where people may not have heard of the Online Safety Bill but will have heard of this particular measure.

We debated the rights and wrongs of encryption at some length in Committee, and I will not repeat those points today, not least because the noble Lord, Lord Moylan, has made some of the arguments as to why encryption is important. I will instead focus today on the future process, assuming that the Clause 111 powers will be available to Ofcom as drafted and that we are not going to accept the amendment from the noble Lord, Lord Moylan.

Amendments 258 and 258ZA, in my name and that of my noble friend Lord Clement-Jones, both aim to improve the process of issuing a Clause 111 order by adding in some necessary checks and balances.

As we debate this group, we should remember that the Clause 111 powers are not specific to encrypted services—I think the Minister made this point—and we should have the broader context in mind. I often try to bring some concrete scenarios to our discussions, and it may be helpful to consider three different scenarios in which Ofcom might reach for a Clause 111 notice.

The first is where a provider has no particular objection to using technology to identify and remove child sexual exploitation and abuse material or terrorist material but is just being slow to do this. There are mature systems out there. PhotoDNA is very well known in the industry; it effectively provides a database of digital signatures of known child sexual exploitation material. All the services we use on a daily basis, such as Facebook, Instagram and others, will check uploaded photos against that database and, where a photo matches known child sexual exploitation material, they will make sure that it is not shown and that the people who uploaded it are reported to the authorities.
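For readers less familiar with how such signature matching works, the sketch below illustrates the idea in Python. It is a minimal sketch under stated assumptions, not PhotoDNA itself: PhotoDNA is a proprietary perceptual-hashing system from Microsoft whose signatures survive resizing and re-encoding, whereas the exact SHA-256 hash used here only matches byte-identical files, and the blocking and reporting functions are hypothetical placeholders.

```python
import hashlib

# Hypothetical stand-in for a curated database of signatures of known
# illegal images. A real deployment would load perceptual hashes supplied
# by a trusted body; an exact cryptographic hash is used here only to
# keep the sketch self-contained.
KNOWN_SIGNATURES: set[str] = set()


def signature(image_bytes: bytes) -> str:
    """Compute a signature for an uploaded image (simplified to an exact hash)."""
    return hashlib.sha256(image_bytes).hexdigest()


def block_image() -> None:
    """Hypothetical placeholder: suppress the upload so it is never shown."""
    print("upload blocked")


def report_to_authorities() -> None:
    """Hypothetical placeholder: report the uploader to the relevant body."""
    print("report filed")


def screen_upload(image_bytes: bytes) -> bool:
    """Check an upload against the signature database.

    Returns True if the upload may be shown; on a match it is blocked,
    the uploader is reported, and False is returned.
    """
    if signature(image_bytes) in KNOWN_SIGNATURES:
        block_image()
        report_to_authorities()
        return False
    return True
```

The design point the speech relies on is simply that this check runs on material the service can read; that is why, as the later scenarios discuss, encrypted services raise a harder question.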

I can imagine scenarios where Ofcom is dealing with a service that has not yet implemented the technology but has no problem doing so, and the material is unencrypted, so there is no technical barrier; the service is just being a bit slow. In those scenarios, Ofcom will tell the service to get on with it or it will get a Clause 111 notice. In most cases the service will just get on with it, so Ofcom will be using the threat of the notice as a way to encourage the slow coaches. That is pretty unexceptional; it will work in a pretty straightforward way. I think the most common use of these notices may be to bring outliers into the pack of those who are following best practice. Ofcom may not even need to get past the warning notice stage: waving a warning notice in front of a provider may be sufficient to get it to move.

The second scenario is one where the provider equally does not object to the use of the technology but would prefer to have a notice before it implements it. Outside the world of tech companies, it may seem a little strange that a provider would want to be ordered to do something rather than doing the right thing voluntarily, but we have to remember that the use of this kind of technology is legally fraught in many jurisdictions. There have been court cases in a number of places, not least the European Union, in which people have challenged whether this technology should be used on unencrypted services, never mind encrypted ones. You can imagine that some providers, particularly those established outside the United Kingdom, may say, “Look, we are fine implementing this technology, but, Ofcom, please can you give us a notice? Then, when someone challenges it in court, we can say that the UK regulator made us do it”. That would be helpful to them. This second group will want a notice, and here we will get to the point of the notice being issued. They are not going to contest it; they want the notice because it gives them some legal protection.

I think those two groups are relatively straightforward: we are dealing with companies that are being slow or are looking for legal cover but do not fundamentally object. The third scenario, though, is the most challenging, and it is where I think the Government could get into real trouble. My amendments seek to help the Government in situations where a provider fundamentally objects to being ordered to deploy a particular technology because it believes that the technology will create real privacy threats and risks to the service it offers. I do not think the provider is being awkward in these circumstances; it has genuine concerns about the implications of the technology it is being instructed to develop or deploy.

In these circumstances, Ofcom may have all the reasons in the world to argue that what it is asking for is reasonable. However, the affected provider may not accept those reasons, may take quite a strong counterview and may have all sorts of other arguments as to why what it is being asked to do is unacceptable and too high-risk. This debate is swirling around at the moment as we think about current models of end-to-end encryption and client-side scanning technology, but we need to recognise that this Bill is going to be around for a while, and providers may be ordered to deploy all sorts of other technologies that have not yet been developed and that we do not even know about. At any point, we may hit this impasse, where Ofcom says it is perfectly reasonable to order a company to do it and the service provider says, “No, as we look at this, our experts and our lawyers are telling us that this is fundamentally problematic from a privacy point of view”.

6.15 pm

The amendments I have tabled do not stop Ofcom from issuing any kind of order for any kind of technology. In Amendment 258, we are saying that, where there is a disagreement, there should be a point where the public can join the debate. However, I really want to focus on Amendment 258ZA, where we are saying that in those circumstances the provider should have a statutory right to refer the order to the Information Commissioner’s Office. We intend to press this to a vote, so I hope people are listening to the argument, because it is well intended. The right to refer to the Information Commissioner’s Office is not only the sensible thing to do but will help the Government if they are trying to get a company to deploy a technology. It will help and strengthen their case if they have made it clear that this important check and balance is in place.

As we have discussed throughout this debate, these are safety/privacy trade-offs. In some cases, it is a real trade-off: more privacy can sometimes compromise safety, because people are able to do problematic things in private. At the same time, more privacy can sometimes benefit safety, if it protects you from fraudsters. But there certainly are occasions where there are genuine safety/privacy trade-offs. In this Bill, we are charging Ofcom with creating the highest level of safety for people in the UK when they are online. Ofcom will be our safety regulator, and that is its overriding duty. I am sure the Minister will argue that it also has to think about privacy and other things but, if you look at the Bill in total, Ofcom’s primary responsibility is clear: it is safety, particularly child safety.

We have created the Information Commissioner’s Office as the guardian of our privacy rights and tasked it with enforcing a body of data protection law, so Ofcom is our primary safety regulator and the ICO is our primary privacy regulator. It seems to me entirely sensible and rational to say that, if our safety regulator is ordering a provider to do something on the grounds of safety and the provider thinks it is problematic, the provider should be able to go to our privacy regulator and say, “You two regulators both have a look at this. Both come to a view and, on the basis of that, we can decide what to do”.

That is really what Amendment 258ZA is intended to do. It does not intend to handcuff Ofcom in any way or stop it doing anything it thinks is important. It does not intend to frustrate child safety; it simply intends to make sure that there is a proper balance in place so that, where a provider has genuine concerns, it can go to the regulator that we have charged with being responsible for privacy regulation.

The reason this is important, and why we are spending a little more time on it today, is that there is a genuine risk that services used by millions of people in the United Kingdom could leave the market. We often focus on some of the more familiar brands, such as WhatsApp, but we need to remember services such as iMessage. If you use an Apple phone and use iMessage, that is end-to-end encrypted. That is the kind of service that could find this really problematic. Apple has said, “Privacy is all”, and will be thinking about the global market. If it were ordered to deploy a technology that it thought was unsafe, Apple would have to think very carefully about staying in the UK market.

To say that Apple has a right to go to the ICO and ask for a review is perfectly reasonable and sensible. I suspect that the Minister may argue that the skilled person’s concession the Government have made, which is helpful and material, could involve a review by data protection officials, but that is not the same as getting an authoritative decision from the privacy regulator, the Information Commissioner’s Office. That concession is helpful and those amendments are welcome, but I would say that the skilled person’s report is necessary but far from sufficient in circumstances where there is that fundamental disagreement.

If I may try to sell it to the Government: if they accept this amendment, Ofcom says that it needs this technology to be deployed for safety reasons, and the ICO, having looked at it as the privacy regulator, has no objections, then the onus is on the company. A provider that then chooses to leave the United Kingdom market looks to be doing so voluntarily, because there are no fundamental objections.

If, on the other hand, the ICO says it is problematic, we know that we need to carry on discussing and debating whether that technology is appropriate and whether the safety/privacy balance has been got right. So, whether you support more scanning of content or are concerned about it, I do not think it is an excessive ask that the providers of the services we all use every day, in circumstances where they think there is a fundamental threat, should be able to go to our privacy regulator, which we set up precisely to guard our privacy rights, and ask it for an opinion. I hope that the Government will either accept the amendment or commit to bringing in something comparable at Third Reading. Absent that, we feel that this is important enough that we should test the opinion of the House in due course.

Type: Proceeding contribution
Reference: 831 cc2365-9
Session: 2022-23
Chamber / Committee: House of Lords chamber