Online Safety Bill

My Lords, we are reaching the end of our Committee debates, but I am pleased that we have some time to explore these important questions raised by the noble Lord, Lord Knight of Weymouth.

I have an academic friend who studies the internet. When asked to produce definitive answers about how the internet is impacting on politics, he politely suggests that it may be a little too soon to say, as the community is still trying to understand the full impact of television on politics. We are rightly impatient for more immediate answers to questions around how the services regulated by this Bill affect people. For that to happen, we need research to be carried out.

A significant amount of research is already being done within the companies themselves—both more formal research, often done in partnership with academics, and more quick-fix commercial analyses where the companies do their own studies of the data. These studies sometimes see the light of day through publication or, quite often, through leaks; as the noble Lord, Lord Knight, has referred to, it is not uncommon for employees to decide to put research into the public domain. However, I suggest that this is a very uneven and suboptimal way for us to get to grips with the impact of these services. The public interest lies in there being a much more rigorous and independent body of research work, which, rightly, these amendments collectively seek to promote.

The key issues that we need to address head-on, if we are actively to promote more research, lie within the data protection area. That has motivated my Amendment 233A—I will explain the logic of it shortly—and is the reason why I strongly support Amendment 234.

A certain amount of research can be done without any access to personal data, bringing together aggregated statistics of what is happening on platforms, but the reality is that many of the most interesting research questions inevitably bring us into areas where data protection must be considered. For example, looking at how certain forms of content might radicalise people will involve looking at what individual users are producing and consuming and the relationships between them. For most of the interesting questions around the harms we are looking at, there is no way of doing without that data. If you want to know whether exposure to content A or content B led to a harm, there is no way to do that research without looking at the individual and the specifics.

There is a broad literature on how anonymisation and pseudonymisation techniques can be used to try to make those datasets a little safer. However, even if the data can be made safe from a technical point of view, that still leaves us with significant ethical questions about carrying out research on people who would not necessarily consent to it and may well disagree with the motivation behind the sorts of questions we may ask. We may want to see how misinformation affects people and steers them in a bad direction; that is our judgment, but the judgment of the people who use those services and consume that information may well be that they are entirely happy and there is no way on earth that they would consent to be studied by us for something that they perceive to be against their interests.

Those are real ethical questions that have to be asked by any researcher looking at this area. That is what we are trying to get at in these amendments—whether we can create an environment with a balance of equity between the individual, whose consent would normally be required for any use of their data, and the public interest. We may determine that, for example, understanding vaccine misinformation is sufficiently important that we will override that individual’s normal right to choose whether to participate in the research programme.

My Amendment 233A is to Amendment 233, which rightly says that Ofcom may be in a position to determine that, for example, research into vaccine misinformation is in the overriding public interest. If it decides that, and the platforms transfer data to those independent researchers because we have said in the amendment that they must, the last thing we want is for the platforms to feel that, if there is any problem further down the track, there will be comeback on them. That would be against the principle of natural justice, given that they have been instructed to hand the data over, and the fear of it could also act as a barrier to the data being shared at all.

3.15 pm

I am raising these concerns because this is not far-fetched; however well-intentioned somebody is, and however well they think they are managing data security, the reality of today’s world is that there are data breaches. Once the data has been handed over, at some point some independent researcher is going to have a dataset compromised, and Ofcom itself may be in possession of data that is going to be compromised. Amendment 233A seeks to clarify that, in those circumstances, we are not going to go after the company.

People may be aware of a case involving my former employer and a company called Cambridge Analytica. If you look at the fallout from that case, some of the decisions that were made pointed to the notion that the first party which originally collected the data can almost never say that it is no longer liable; any transfer to a third party carries that liability with it. That is reasonable in most cases: if you choose, for commercial reasons, to pass data on to somebody else, it is fair that you retain liability. However, in circumstances where we have said that the regulator is going to insist that they provide the data for a valid public purpose, I do not think we should hold them liable if something goes wrong downstream—that is the rationale for Amendment 233A.

That brings me on to Amendment 234, which is a good way of trying to address the problem more generally. Sometimes there is an assumption that research is good and companies are bad: “Hand over the data and good stuff will happen”. In reality, companies vary, and so do researchers, in terms of the confidence we can have in them to maintain data security and privacy. Having some kind of formal mechanism for approving researchers, and for researchers to sign up to, is extraordinarily helpful.

I refer noble Lords to the work done by the European Digital Media Observatory—this is one of those declarations of interest that is really a confession of expertise. I was on the board of the European Digital Media Observatory, for which I received no remuneration; it was done as a bit of penance, having worked in the sector, and I felt I should be helping bodies that try to deal with the misinformation issue. The European Digital Media Observatory is a European Commission-sponsored body dealing with these exact questions, asking how we enable more research into misinformation to happen in the context of the EU. It did some excellent work led by Dr Rebekah Tromble, an academic at George Washington University, who convened a working group that has come up with a code of practice now going through the EU process. As long as we do not diverge from the General Data Protection Regulation, it would have significant applicability here.

The real benefit of such an approach is that everyone knows what they are supposed to do, and we can objectively test whether or not they are doing it: the party that collected the data and handed it over, and the party that receives the data and does the research, each have very clear roles and responsibilities. By doing that, we unlock the flows, which is what we want to do collectively in these amendments: we want the data to flow from the regulated services to the independent researchers.

I am not arguing that this will necessarily solve all the problems, but it will certainly flush out whether, when services say they cannot provide data for research, that is a “cannot” or “will not”. Today, they can say they cannot for data protection legal reasons—I think with some justification. If we have the code of conduct in place as proposed in Amendment 234, and the researchers are approved researchers who have signed up to it and committed to doing all the right things, then it is much more reasonable for us to say “Platform, meet researcher; researcher, meet platform—you all know your responsibilities, and there are no legal barriers”, and to expect the data to move in a way that will meet those public interest obligations.

This is an important set of amendments, which we are coming to quite late in the day. They touch on issues that are being dealt with elsewhere, and I hope this is one example where we will feel comfortable learning from the EU, which is a little ahead in trying to deal with some of these questions, and which works within a framework that is still, from a data protection law point of view at least, pretty consistent between us and them.

Type: Proceeding contribution
Reference: 831 cc386-9
Session: 2022-23
Chamber / Committee: House of Lords chamber