My Lords, I have added my name to Amendment 146 in the name of the noble Baroness, Lady Kidron, and I thank all noble Lords who have spoken.
These days, most children learn to swipe an iPad long before they learn to ride a bike. They are accessing the internet at ever younger ages on a multitude of devices. Children are choosing to spend more time online, browsing social media, playing games and using apps. However, we also force children to spend an increasing amount of time online for their education. That trend has been growing over the last decade or more, and it escalated during the pandemic. Screen time at home became lesson time; it was a vital educational lifeline for many in lockdown.
Like other noble Lords, I am not against edtech, but the reality is that the necessary speed of the transition meant that insufficient regard was paid to children’s rights and the data practices of edtech. The noble Baroness, Lady Kidron, as ever, has given us a catalogue of abuses of children’s data which have already taken place in schools, so there is a degree of urgency about this, and Amendment 146 seeks to rectify the situation.
One in five UK internet users are children. Schools are assessing children’s work online; teachers are using online resources and recording enormous amounts of sensitive data about every pupil. Edtech companies have identified that such a large and captive population is potentially profitable. This amendment reinforces that children are also a vulnerable population and that we must safeguard their data and personal information on that basis. Their rights should not be traded away as edtech companies chase profits.
The code of practice proposed in this amendment establishes standards for companies to follow, in line with the fundamental rights and freedoms set out in the UN Convention on the Rights of the Child. It asserts that children are entitled to a higher degree of protection than adults in the digital realm, and it would oblige the commissioner to prepare a code of practice which ensures this. It underlines that consultation with individuals and organisations who have the best interests of children at heart is vital, so that the enormous edtech companies cannot bamboozle already overstretched teachers and school leaders.
In education, data about children has always been processed in schools. It is necessary for the school’s functioning and for monitoring the educational development of individual children. Edtech is now becoming a permanent fixture in children’s schooling and education, but it is largely untested, unregulated and unaccountable. Currently, it is impossible to know what data edtech providers collect and how they are using it. This blurs the boundaries between the privacy-preserving and commercial parts of services profiting from children’s data.
Why is this important? First, education data can reveal particularly sensitive and protected characteristics about children: their ethnicity, religion, disability or health status. Such data can also be used to create algorithms that profile children and predict or assess their academic ability and performance; it could reinforce prejudice, create siloed populations or entrench low expectations. Secondly, there is a risk that profiling children’s data can lead to deterministic outcomes, defining too early what subjects a child is good at, how creative they are and what they are interested in. Safeguards must be put in place in relation to the processing of children’s personal data in schools to protect those fundamental rights. Thirdly, of course, there is money. Data is appreciating in value, resulting in market pressure for data to be collected, processed, shared and reused. Increasingly, such processing of children’s data in schools is facilitated by edtech, an already major and expanding sector with a projected value of £3.4 billion.
The growth of edtech’s use in schools is promoted by the Department for Education’s edtech strategy, which sets out a vision for edtech to be an
“inseparable thread woven throughout the processes of teaching and learning”.
Yet the strategy gives little weight to data protection beyond noting the importance of preventing data breaches. Tech giants have become the biggest companies in the world because they hold data on us. Schoolchildren have little choice as to their involvement with these companies in the classroom, so we have a moral duty to ensure that they are protected, not commodified or exploited, when learning. It must be a priority for the Government to keep emerging technologies in education under regular review.
Equally important is that the ICO should invest in expertise specific to the domain of education. By regularly reviewing emerging technologies in education, both those already in use and those proposed for use, together with their potential risks and impacts, such experts could provide clear and timely guidance for schools to protect individual children and entire cohorts. Amendment 146 would introduce a new code of practice on the processing and use of children’s data by edtech providers. It would also ensure that edtech providers met their obligations under the law, protected children’s data and empowered schools.
I was pleased to hear that the noble Baroness, Lady Kidron, has had constructive discussions with the Education Minister, the noble Baroness, Lady Barran. The way forward on this matter is some sort of joint work between the two departments. The noble Baroness, Lady Kidron, said that she hopes the Minister today will respond with equal positivity; he could start by supporting the principles of this amendment. Beyond that, I hope that he will agree to liaise with the Department for Education and embrace the noble Baroness’s request for more meetings to discuss this issue on a joint basis.