
Data Protection and Digital Information Bill

My Lords, I speak to Amendments 2, 3, 9 and 290 in my name. I thank the noble Baronesses, Lady Jones and Lady Harding, and the noble Lord, Lord Clement-Jones, for their support.

This group seeks to secure the principle that children should enjoy the same protections in UK law after this Bill passes into law as they do now. In 2018, this House played a critical role in codifying the principle that children merit special, specific protection in relation to data privacy by introducing the age-appropriate design code into the DPA. Its introduction created a wave of design changes to tech products: Google introduced safe search as its default; Instagram made it harder for adults to contact children via private messaging; Play Store stopped making adult apps available to under-18s; and TikTok stopped sending notifications through the night and hundreds of thousands of underage children were denied access to age-inappropriate services. These are just a handful of the hundreds of changes that have been made, many of them rolled out globally. The AADC served as a blueprint for children’s data privacy, and its provisions have been mirrored around the globe. Many noble Lords will have noticed that, only two weeks ago, Australia announced that it is going to follow the many others who have incorporated or are currently incorporating it into their domestic legislation, saying in the press release that it would align as closely as possible with the UK’s AADC.

As constructed in the Data Protection Act 2018, the AADC sets out the requirements of the UK GDPR as they relate to children. The code is indirectly enforceable; that is to say that the action the ICO can take against those failing to comply is based on the underlying provisions of UK GDPR, which means that any watering down, softening of provisions, unstable definitions—my new favourite—or legal uncertainty created by the Bill automatically waters down, softens and creates legal uncertainty and unstable definitions for children and therefore for child protection. I use the phrase “child protection” deliberately because the most important contribution that the AADC has made at the global level was the understanding that online privacy and safety are interwoven.

Clause 1(2) creates an obligation on the controller or processor to know, or reasonably to know, that an individual is an identifiable living individual. Amendments 2 and 3 would add a further requirement to consider whether that living individual is a child. This would ensure that providers cannot wilfully ignore the presence of children, something that tech companies have a long track record of doing. I want to quote the UK Information Commissioner, who fined TikTok £12.7 million for failing to prevent under-13s accessing that service; he said:

“There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws … TikTok should have known better. TikTok should have done better … They did not do enough to check who was using their platform”.

I underline very clearly that these amendments would not introduce any requirement for age assurance. The ICO’s guidance on age assurance in the AADC and the provisions in the Online Safety Act already detail those requirements. The amendments simply confirm the need to offer a child a high bar of data privacy or, if you do not know which of your users are children, offer all users that same high bar of data privacy.

As we have just heard, it is His Majesty’s Government’s stated position that nothing in the Bill lessens children’s data privacy because nothing in the Bill lessens UK GDPR, and that the Bill is merely an exercise to reduce unnecessary bureaucracy. The noble Lords who spoke on the first group have perhaps put paid to that and I imagine that this position will be sorely tested during Committee. In the light of the alternative view that the protections afforded to children’s personal data will decline as a result of the Bill, Amendment 9 proposes that the status of children’s personal data be elevated to that of “sensitive personal data”, or special category data. The threshold for processing special category data is higher than for general personal data and the specific conditions include, for example, processing with the express consent of the data subject, processing to pursue a vital interest, processing by not-for-profits or processing for legal claims or matters of substantial public interest. Bringing children’s personal data within that definition would elevate the protections by creating an additional threshold for processing.

Finally, Amendment 290 enshrines the principle that nothing in the Bill should lead to a diminution in existing levels of privacy protections that children currently enjoy. It is essentially a codification of the commitment made by the Minister in the other place:

“The Bill maintains the high standards of data protection that our citizens expect and organisations will still have to abide by our age-appropriate design code”.—[Official Report, Commons, 17/4/23; col. 101.]

Before I sit down, I just want to highlight the Harvard Gazette, which looked at ad revenue from the perspective of children. On Instagram, children account for 16% of ad revenue; on YouTube, 27%; on TikTok, 35%; and on Snap, an extraordinary 41.4%. Collectively, YouTube, Instagram and Facebook made nearly $2 billion from children aged nought to 12, and it will not escape many noble Lords that children aged nought to 12 are not supposed to be on those platforms. Instagram, YouTube and TikTok together made more than $7 billion from 13 to 17 year-olds. The amendments in this group give a modicum of protection to a demographic who have no electoral capital, who are not developmentally adult and whose lack of care is not an unfortunate by-product of the business model, but who have their data routinely extracted, sold, shared and scraped as a significant part of the ad market. It is this that determines the features that deliberately spread, polarise and keep children compulsively online, and it is this that the AADC—born in your Lordships’ House—started a global movement to contain.

This House came together on an extraordinary cross-party basis to ensure that the Online Safety Bill delivered for children, so I say to the Minister: I am not wedded to my drafting, nor to the approach that I have taken to maintain, clause by clause, the bar for children, even when that bar is changed for adults, but I am wedded to holding the tech sector accountable for children’s privacy, safety and well-being. It is my hope and—if I dare—expectation that noble Lords will join me in making sure that the DPDI Bill does not leave this House with a single diminution of data protection for children. To do so is, in effect, to give with one hand and take away with the other.

I hope that during Committee the Minister will come to accept that children’s privacy will be undermined by the Bill, and that he will work with me and others to resolve these issues so that the UK maintains its place as a global leader in children’s privacy and safety. I beg to move.

5.15 pm

Type: Proceeding contribution
Reference: 837 cc67-9GC
Session: 2023-24
Chamber / Committee: House of Lords Grand Committee