
Data Protection and Digital Information Bill

My Lords, I rise somewhat reluctantly to speak to Amendment 291 in my name. It could hardly be more important or necessary, but I am reluctant because I really think that the Minister, alongside his colleagues in DSIT and the Home Office, should have taken this issue up. I am quite taken aback that, despite my repeated efforts with both of those departments, they have not done so.

The purpose of the amendment is simple. It is already illegal in the UK to possess or distribute child sexual abuse material, including AI-generated or computer-generated child sexual abuse material. However, while the content is clearly covered by existing law, the mechanism that enables its creation—the model files trained on, or trained to create, child sexual abuse material—is not. This amendment closes that gap.

Some time ago, I hosted an event at which members of OCCIT—the online child sexual exploitation and abuse covert intelligence team—gave a presentation to parliamentarians. For context, OCCIT is a law enforcement unit of the National Police Chiefs’ Council that uses covert police tactics to track down offender behaviour, with a view to identifying emerging risks in the form of new technologies, behaviours and environments. The presentation its officers gave concerned AI-generated abuse scenarios in virtual reality, and it was absolutely shattering for almost everyone who was present.

A few weeks later, the team contacted me and said that what it had shown then was already out of date. What it was now seeing was being supercharged by the ease with which criminals can train models that, when combined with general-purpose image-creation software, enable those with a sexual interest in children to generate CSAM images and videos at volume and—importantly—to order. Those building and distributing this software were operating with impunity, because current laws are insufficient to enable the police to take action against them.

In the scenarios that the team is now facing, a picture of any child can be blended with existing child sexual abuse imagery, pornography or violent sexual scenarios. Images of several children can be honed into a fictitious child and used similarly or, as I will return to in a moment, a picture of an adult can be made to look younger and then used to create child sexual abuse material. Among this catalogue of horrors are the made-to-order models trained using images of a child known to the perpetrator—a neighbour’s child or a family member—to create bespoke CSAM content. In short, the police were finding that the scale, sophistication and horror of violent child sexual abuse had hit a new level.

The laws that the police use to enforce against CSAM are Section 1 of the Protection of Children Act 1978 and Section 160 of the Criminal Justice Act 1988, both of which create offences in respect of indecent photographs or pseudo-photographs of a child. AI content depicting child sexual abuse in the scenarios that I have just described is also illegal under those provisions, but creating and distributing the software models needed to generate it is not.

There are many services that allow anyone to take any public image and put it in a false situation. Although I have argued elsewhere that AI images should carry a mark of provenance, these services are not the subject of this amendment. This amendment is laser focused on criminalising AI models that are trained on or trained to create child sexual abuse material. They are specific, specialist and being traded with impunity. These models blend images of children—known children, stock photos, images scraped from social media or synthetic, fabricated AI depictions of children—with existing CSAM or pornography, and they allow paedophiles to generate bespoke CSAM scenarios.

7.30 pm

Most of these generation models are distributed for free, but more specialist models are provided on subscription for less than £50 per month. This payment gives any child sexual offender the ability to generate limitless—and I do mean “limitless”—child sexual abuse images. Yet, while the police can take action against those who possess those images, they are unable to take action against those who make their creation possible: the means of production.

A surprising number of people think that AI abuse is a victimless crime. It is not. It is worth everyone present, or reading this, considering whether they would be comfortable with the image of their child or grandchild, their neighbour’s child or indeed any other child of their acquaintance being used in this way.

Then there is the additional fact that anyone, adult or child, can appear in AI-generated CSAM. I am not going to say how it can be done, because I do not want my words to be a set of instructions on the public record, but I have in my possession a series of images generated by the covert police in OCCIT in which a child is shown meeting celebrities, among them President Obama, and then that same child is seen in a series of sexual abuse scenarios in images and videos. I say for the record that they have been redacted and do not meet the criminal bar. That child was generated from publicly available images of me from IMDb and the parliamentary website. It took a matter of hours. It was done by the police, with my permission, but the images are graphic and distressing. I had them made to show the Government the ease with which such material is being generated, but the Minister knows he was instructed not to look at them.

Failing to adopt this amendment is tantamount to leaving every woman in public life—and any child with their photograph on a website, on a social media feed, in an advert or captured covertly in their own garden—vulnerable to the same abuse. We have acknowledged the distress caused to public figures, such as Cathy Newman of Channel 4 News and Taylor Swift, by their appearance in AI-generated pornography, but the material generated by the software that is the subject of this amendment is of a higher order still. It is child sexual abuse material, and it should be prevented. An enforcement officer said that

“we believe that this material is desensitising offenders and shortening the offending pipeline. What might have taken several years to go from consumption to real-world child sexual abuse may now take a matter of months”.

While noble Lords have that in their minds, I also say that this material is getting in the way of the police identifying real victims, because they are chasing thousands of images of AI-generated children who do not exist.

As I said at the outset, it was my determined wish that the Government deal with this issue quickly, seamlessly and relatively privately, but they have not. Although I will listen very carefully to the Minister when he replies, I make it utterly clear that this is an issue that urgently needs resolving. If we cannot do so in Committee, I intend to draw the importance of the issue to the attention of noble Lords who are not following our proceedings and ask them to support its inclusion in the Bill. I beg to move.

Type: Proceeding contribution
Reference: 837 cc588-590GC
Session: 2023-24
Chamber / Committee: House of Lords Grand Committee