My Lords, Amendment 251 is also in the names of the noble Lords, Lord Arbuthnot and Lord Clement-Jones, and the noble Baroness, Lady Jones. I commend the noble Lord, Lord Arbuthnot, for his staunch support of the sub-postmasters over many years. I am grateful to him for adding his name to this amendment.
This amendment overturns a previous intervention in the law that has had and will continue to have far-reaching consequences if left in place: the notion
that computer evidence should in law be presumed to be reliable. This error, made by the Government and the Law Commission at the turn of the century and reinforced by the courts over decades, has, as we now know, cost innocent people their reputations, their livelihoods and, in some cases, their lives.
Previously, Section 69 of the Police and Criminal Evidence Act 1984 required prosecutors in criminal cases relying on information from computers to confirm that the computer had been operating correctly and had not been improperly used before evidence from it could be submitted. As the volume of evidence from computers increased, this requirement came to be viewed as burdensome.
In 1997, the Law Commission published a paper, Evidence in Criminal Proceedings: Hearsay and Related Topics, in which it concluded that Section 69
“fails to serve any useful purpose”.
As a result, it was repealed. The effect of this repeal was to create a common law presumption, in both criminal and civil proceedings, of the proper functioning of machines—that is to say, the computer is always right. In principle, there is a low threshold for rebutting this presumption but, in practice, as the Post Office prosecutions all too tragically show, a person challenging evidence derived from a computer will typically have no visibility of the system in question or the ways in which it could or did fail. As a result, they will not know what records of failures exist, what should be disclosed to them, or what they might ask for.
This situation was illustrated in the Post Office prosecution of sub-postmaster Mrs Seema Misra. Paul Marshall, Mrs Misra’s defence lawyer, describes how she was
“taunted by the prosecution for being unable to point to any … identifiable … problem”,
while they hid behind the presumption that the Horizon system was “reliable” under the law. On four occasions during her prosecution, Mrs Misra asked the court to order disclosure by the Post Office of Horizon error records. Three different judges dismissed her applications. Mrs Misra went to prison. She was eight weeks pregnant, and it was her son’s 10th birthday. On being sentenced, she collapsed.
The repeal of Section 69 of PACE 1984 reflects the Law Commission’s flawed belief that most computer errors were “down to the operator” or “apparent to the operator”, and that you could
“take as read that computer evidence is reliable unless a person can say otherwise”.
In the words of a colleague of mine from the University of Oxford, a professor of computing with a side consultancy specialising in finding bugs for global tech firms ahead of rollout, this assumption is “eye-wateringly mistaken”. He recently wrote to me and said:
“I have been asking fellow computer scientists for evidence that computers make mistakes, and have found that they are bewildered at the question since it is self-evident”.
There is an injustice in being told that a machine will always work as expected, and a further injustice in being told that the only way you can prove that it does not work is to ask by name for something that you do not know exists. That is to say, Mrs Misra did not have the magic word.
In discussions, the Government assert that the harm caused by Horizon was due to the egregious failures of corporate governance at the Post Office. That there has been a historic miscarriage of justice is beyond question, and the outcome is urgently awaited. But the actions of the Post Office were made possible in part because of a flaw in our legal and judicial processes. What happened at the Post Office is not an isolated incident but potentially the tip of an iceberg, where the safety of an unknown number of criminal convictions and civil judgments is called into question.
For example, the Educational Testing Service, which administered online English language tests on behalf of the Home Office, wrongly determined that 97% of the students taking them were cheating, a determination that cost the students their right to stay in the UK and/or their ability to graduate, forfeiting thousands of pounds in student fees. The Guardian conducted interviews with dozens of the students, who described the painful consequences. One man was held in UK immigration detention centres for 11 months. Others described being forced into destitution, becoming homeless and reliant on food banks as they attempted to challenge the accusation. Others became depressed and suicidal when confronted with the wasted tuition fees and the difficulty of shaking off an allegation of dishonesty.
The widespread coverage of the Horizon scandal has prompted many victims of the Home Office scandal to renew their efforts to clear their names and seek redress. In another case, at the Princess of Wales Hospital in 2012, nurses were wrongly accused of falsifying patient records because of discrepancies found with computer records. Some of the nurses were subjected to criminal prosecution, suffering years of legal action before the trial collapsed, when it emerged that a visit by an engineer to fix a bug had eradicated all the data that the nurses were accused of failing to gather. That vital piece of information could easily have been discovered and disclosed if computer evidence were not automatically deemed to be reliable.
6.30 pm
I do not seek to come to a judgment on any of these cases. I simply make the point that to assume that evidence from computer software is reliable is nonsense. This is backed up by a number of high-profile tech failures: the 999 emergency call system failed on 25 June 2023; air traffic control failed on 28 August 2023, with 700,000 passengers disrupted after planes were grounded because of a simple bug; there was evidence at the Grenfell inquiry that the fire brigade IT system played a part in the controllers not understanding the full extent of what was happening; and there have been dozens of occasions when banking system failures have meant that people could not transfer funds, including to complete time-sensitive house purchases or contractual obligations. Indeed, it is not unusual but entirely expected that these things happen.
Roger Bickerstaff is a partner at law firm Bird & Bird who specialises in technology. He wrote earlier this year that
“for the last 20 years at least, it has generally been recognised by IT lawyers in software contracts, as opposed to criminal law and civil litigation, that software is inherently prone to errors”.
Amendment 251 reinstates Section 69 of the 1984 Act with the addition of the word “material”. The effect of this is to shift the burden of establishing that evidence produced from computers is reliable back, once again, on to the person relying on such evidence, so that there are systems and processes in place to monitor, address and log issues. I added “material” because bugs and security issues are so frequent and inevitable, and not all of them undermine the reliable operation of software systems. The wording therefore avoids the risk of overcorrection.
There have been previous efforts to tackle this issue, including by Alex Chalk, then Parliamentary Under-Secretary of State, now Lord Chancellor, who commissioned a report to improve the existing approach to proof in court proceedings on computer-derived evidence. I have read a published version of the report and am surprised that the Government did not accept its practical approach, but rather determined that they have
“no plans to review the presumption”.
They instead cite Mr Justice Fraser’s finding that the Post Office demonstrated a simple institutional obstinacy or refusal to consider any possible alternatives to its view of Horizon, which was maintained regardless of the weight of factual evidence to the contrary. Yes, the Post Office showed an institutional obstinacy—that is a generous interpretation—but the law provided cover and the law remains in place.
I met with the Lord Chancellor and I was grateful for his time. He indicated a willingness to acknowledge that there is an issue. I understand that he may not wish to revert to language from 1984, as in the amendment in front of us. The amendment is probing and intended to draw noble Lords’ attention to the urgent problem, but either it must stand or we need another route to the same ends, because to enshrine in law the idea that computer evidence is reliable makes the law an ass and is a recipe for future injustice.
There is a desperate need to clarify and add detail to the court rules on disclosure for computer evidence. In the 21st century, it is necessary for court proceedings to have full sight of relevant material, for example security and maintenance records or bug logs. The yawning gap between swearing under oath that the evidence given is true and the lack of responsibility for the accuracy of computer evidence in court proceedings is simply mind-boggling. We need a legal duty on those proffering computer evidence to confirm that they know of no reason why the information put in evidence should not be accepted as being reliable or true, as well as some responsibility for that. As we have said so many times in Committee, the Government should reconsider their position on removing the balancing test for automated decision-making on the understanding that automating errors reproduces them at scale.
I look forward to the speeches of my fellow signatories and hope that, when he responds, the Minister will be able to reflect previous indications from the Ministry of Justice that the Government are willing to find a path through this—rather than being yet another politician who turned a blind eye to injustice in plain sight and chose not to be part of the journey to justice. I beg to move.