
Data Protection and Digital Information Bill

My Lords, I speak to Amendments 293 and 294 from the noble Lord, Lord Clement-Jones, Amendment 295 proposed by my noble friend Lady Jones and Amendments 295A to 295F, also in the name of the noble Lord, Lord Clement-Jones.

Those noble Lords who are avid followers of my social media feeds will know that I am an advocate of technology. Advanced computing power and artificial intelligence offer enormous opportunities, which are not all bad. However, the intentions of those who use them can be malign or criminal, and the speed of technological development is outpacing legislators around the world. We are constantly in danger of creating laws that close the stable door long after the virtual horse has bolted.

The remarkable progress of visual and audio technology has its roots in the entertainment industry. It has been used to complete or reshoot scenes in films in the event of actors being unavailable, or in some cases, when actors died before filming was completed. It has also enabled filmmakers to introduce characters, or younger versions of iconic heroes for sequels or prequels in movie franchises. This enabled us to see a resurrected Sir Alec Guinness and a younger version of Luke Skywalker, or a de-aged Indiana Jones, on our screens.

The technology that can do this is only around 15 years old, and until about five years ago it required extremely powerful computers, expensive resources and advanced technical expertise. The first malicious use of deepfakes occurred when famous actors and celebrities, mainly women, had their faces superimposed on to the bodies of participants in pornographic videos. These were then marketed online as Hollywood stars’ sex tapes or similar, making money for the producers while causing enormous distress to the women targeted. More powerful computer processors inevitably mean that what was once very expensive rapidly becomes much cheaper. An additional factor has turbo-boosted this issue: generative AI. Computers can now learn to create images, sound and video movement almost independently of software specialists. It is no longer just famous women who are the targets of sexually explicit deepfakes; it could be anyone.

Amendment 293 directly addresses this horrendous practice, and I hope that there will be widespread support for it. In an increasingly digital world, we spend more time in front of our screens, getting information and entertainment on our phones, laptops, iPads and smart TVs. What was once an expensive technology, used to titillate, entertain or for comedic purposes, has developed an altogether darker presence, well beyond the reach of most legislation.

In addition to explicit sexual images, deepfakes are known to have been used to embarrass individuals, misrepresent public figures, enable fraud, manipulate public opinion and influence democratic political elections and referendums. This damages people individually: those whose images or voices are faked, and those who are taken in by the deepfakes. Trusted public figures, celebrities or spokespeople face reputational and financial damage when their voices or images are used to endorse fake products or for harvesting data. Those who are encouraged to click through are at risk of losing money to fraudsters, being targeted for scams, or having their personal and financial data leaked or sold on. There is growing evidence that information gathered under false pretences can be used for profiling in co-ordinated misinformation campaigns, for darker financial purposes or for political exploitation.

In passing, it is worth remembering that deepfakes are not always images of people. Last year, a crudely generated fake image of an explosion, purported to be at the Pentagon, caused the Dow Jones industrial average to drop 85 points within four minutes of the image being published, and triggered emergency response procedures from local law enforcement before it was debunked 20 minutes later. The power of a single image, carefully placed and virally spreading, shows the enormous and rapid economic damage that deepfakes can create.

Amendment 294 would make it an offence for a person to generate a deepfake for the purpose of committing fraud, and Amendment 295 would make it an offence to create deepfakes of political figures, particularly when they risk undermining electoral integrity. We support all the additional provisions in this group of amendments; Amendments 295A to 295F outline the requirements, duties and definitions necessary to ensure that those creating deepfakes can be prosecuted.

I bring to your Lordships’ attention the wording of Amendment 295, which, as well as making it an offence to create a deepfake, goes a little further. It also makes it an offence to send a communication which has been created by artificial intelligence and which is intended to create the impression that a political figure has said or done something that is not based in fact. This touches on what I believe to be a much more alarming aspect of deepfakes: the manner in which false information is distributed.

8 pm

We are seeing an endless cat and mouse game of systems being used to create and distribute these images, learning from those designed to detect and block them. Currently, we are largely unprotected from the broader societal threats from deepfakes, the risks to which we have already been exposed. They have already had a malign influence in polarising political debate.

There have been and continue to be co-ordinated efforts by organisations and foreign states to exert influence over democratic elections and referendums in the world’s largest and most technologically advanced democracies. This year will see elections in India, the USA, the EU and, almost certainly, the United Kingdom. Almost half the world’s population will have a vote this year, the most in human history. However, in this brave new world, security and intelligence services are fighting an almost invisible hydra: a multi-headed enemy endlessly growing new appendages to replace those that have been cut off when discovered. Can the Minister say what assessments have been made so far of such deepfakes and what steps the Government are taking to stop our elections being rigged?

I feel that we need to focus far more on how deepfakes are used and distributed. Networks have been developed that are co-ordinated and extremely effective, involving many bots and humans, sometimes malicious, sometimes misguided and sometimes well-meaning but misinformed. Stemming the flood of deepfakes by prosecuting those who create them may not be enough if the networks which distribute them transform the misinformation into a tsunami. They could sweep across democracies, overwhelm legislation and wash away all the safeguards of the political and economic systems upon which we rely to keep us safe.

We must take the issue of deepfakes seriously. If we sleepwalk and take our eyes off the ball, deepfakes will scramble our sense of true and false. I look forward to the Minister’s response.

Type: Proceeding contribution
Reference: 837 cc595-7GC
Session: 2023-24
Chamber / Committee: House of Lords Grand Committee