My Lords, Amendment 146 is in my name and those of the noble Lord, Lord Clement-Jones, and the noble Baronesses, Lady Harding and Lady Jones; I thank them all for their support. Before
I set out the amendment that would provide a code of practice for edtech and why it is so urgently required, I thank the noble Baroness, Lady Barran, and officials in the Department for Education for their engagement on this issue. I hope the Minister can approach this issue with the same desire they have shown to fill the gap that the amendment seeks to address.
A child does not have a choice about whether they go to school. Apart from the minority who are home-schooled or who, for reasons of health or development, fall outside the education system, attendance is compulsory. I make this point at the outset because, if school is compulsory, it must follow that a child should enjoy the same level of privacy and safety at school as in any other environment. Yet we have allowed a gap in our data legislation, meaning that a child’s data is unprotected at school, while at the same time investing in an unregulated and uncertified edtech market whose promises of learning outcomes range from the unsubstantiated to the false.
Schools are keen to adopt new technologies and say that they feel pressure to do so. In both cases, they lack the knowledge and time to assess the privacy and safety risks of the technology products that they are being sold. Amendment 146 would enable children and schools to benefit from emerging technologies. It would reduce the burden on schools in ensuring compliance so that they can get on with the job of teaching our children in a safe, developmentally appropriate and rights-respecting environment, and it would deal with companies that fail to provide evidence for their products and routinely exploit the complexity of data protection law to children’s detriment. In sum, the amendment brings forward a code of practice for edtech.
Subsections (1) and (2) would require the ICO to bring forward a data code for edtech and tech used in education settings. In doing so, the commissioner would be required to consider children’s fundamental rights, as set out in the Convention on the Rights of the Child, and their relevance to the digital world, as adopted by the Committee on the Rights of the Child in general comment 25 in 2021. The commissioner would have to consider the fact that children are legally entitled to a higher standard of protection in respect of their personal data than adults. In keeping with other data codes, the amendment also sets out whom the ICO must consult when preparing the code, including children, parents and teachers, as well as edtech companies.
Subsection (3) would require edtech companies to provide schools with transparent information about their data-processing practices and their impact on children. This is of particular importance because the department’s own consultation showed that schools are struggling to understand the implications of being a data controller and most often accept the default settings of products and services. Having a code of practice would allow the Information Commissioner not only to set the standards in subsections (1) and (2) but to insist on the way that information is given, in order to support schools in making the right choices for their pupils.
Subsection (4) would allow schools to use edtech providers’ adherence to the code as proof of fulfilling their own data protection duties. Once again, this would alleviate the burden on teachers and school leaders.
Subsection (5) would simply give the commissioner a role in supporting a certification scheme to enable the industry to demonstrate both the compliance of edtech services and products with the UK GDPR and their conformity with the age-appropriate design code of practice and the edtech code of practice. The IEEE Standards Association and For Humanity have published certification standards for the AADC, but these have not yet been approved by the ICO or UKAS. Subsection (5) would act as a catalyst, ensuring that the ICO and the certification partners work together efficiently. Ultimately, schools will respond better to certification than to pure data law.
If the edtech sector were formally in scope of the AADC and the code were robustly applied, that would do some, though not all, of what the amendment seeks to do. But in 2018, Her Majesty’s Government, as they were then, made the decision that schools are responsible for children and that the AADC would be confusing. I am not sure that the Government of the day understood the AADC: it requires companies to offer children privacy by design and by default. Nothing in the code would have infringed—or will infringe—on a school’s safeguarding duties, but leaving schools out of scope leaves teachers or school data protection officers with vast responsibilities for wilfully leaky products that simply should not fall to them. Many in this House thought that the Government were wrong, and since then we have seen grand abuse of the gap that was created. This is an opportunity to put that error right.
I remind the Committee of my interest as chair of the Digital Futures for Children research centre. A piece of work done by the DFC in August 2022 set out the consequences of expecting schools and teachers to navigate—indeed, negotiate—the data policies of commercial edtech providers. The report revealed the level of leakage of children’s personal data from two very popular edtech products: Google Classroom and, to a lesser extent, ClassDojo. The authors found that it was nearly impossible to identify what personal data the edtech providers collect, what happens to it, which policies govern its use and why it is collected and shared, with the result that data obtained through education can be used for myriad purposes not limited to educating a child. They found evidence that the data spread far and wide—so wide that they were unable to find the edges of how far it had spread—and that it was being used in all sorts of commercial environments, including targeted advertising. I am not pointing at these two products because they are egregious, although I hope that noble Lords will feel that this lack of privacy is egregious, but because they represent industry norms. The conclusions that the authors came to are supported by regulatory action taken elsewhere.
In the Netherlands, a comprehensive and highly technical data protection impact assessment commissioned by the Government and conducted by Privacy Company
across 2019 and 2020 identified so many high risks in data processing that the Dutch Data Protection Authority proposed to ban all schools from using Chromebooks and Google Workspace for Education. This was averted following lengthy negotiations between the Government and Google and significant contractual changes—notably, the number of purposes for which children’s data could be used was reduced from over 50 to just three. Similarly, in Germany in 2021 and in France in 2022, data protection authorities warned schools against using services that failed to protect children’s data or that transferred it overseas unlawfully, leading many school districts to change their relationships with providers radically or to ban use altogether.
If children’s education data seems a procedural rather than a serious matter, we should learn from the experience of the Department for Education, which granted a company providing employee training access to the Learning Records Service. That company was then bought, and the new owner used its access to the database to sell information about children to gambling companies for age verification purposes. That is completely unacceptable. Once data is out in the commercial arena, there is no way of retrieving it or of influencing how it is passed on or used.
In 2020, the Government announced an EPAS that was far more wide-ranging than what this amendment proposes. It was to be made up of four components: privacy accreditation for apps and technology solutions; privacy accreditation for researchers; codes of practice specific to the education sector; and support for schools, including training, templates and guidance. It was proposed that the Department for Education, with the support of the ICO, would apply to UKAS for formal accreditation as a certification body, as defined under the UK GDPR.
By 2023, the Government had rolled back their ambition: in response to a freedom of information request on the scope of EPAS, the Department for Education confirmed that its objectives were now limited to developing guidance and advice for schools on how to apply the legislation.
In its recent edtech report, UNICEF concluded that
“the responsibility for data protection, privacy and platform safety should not be on parents and teachers”,
and that Governments should develop policies and procedures, with minimum standards
“to evaluate the safety of EdTech prior to adoption”.
But where we are now is that, rather than removing the burden from teachers by creating the conditions for data protection by default and supporting an accreditation scheme, the Government are offering teachers a complex document dump, allowing the tech sector to outsource its poor practice on to schools and teachers. Ensuring that edtech services accessed by children offer the same privacy and data protections as other online services is urgent and critical work.
Finally, while I hope that the Minister will answer favourably on the amendment and offer to work together to bring a code forward, I must make the following
point. There is absolutely nothing preventing the Department for Education from using its procurement platform to set standards on data protection and quality control of edtech. This would be an enormous leap forward, and my expert colleagues at the DFC and I would be delighted to support the Government in doing so.
There is more than one way to protect children at school, and the amendment in front of us is the simplest and most comprehensive. But whatever the route, the outcome must be that we act. We are seeing school districts in the US take tech providers to court; the UK can do better than that. UK tax-funded schools and teachers must not be left with responsibility for the regulatory compliance of multibillion-dollar private companies. Not only is it patently unfair, but it illustrates and exacerbates a power asymmetry for which children at school pay the price. To unlock edtech’s full potential, regulatory action and government leadership are not just beneficial but essential. I beg to move.