We on these Benches very much agree with this group of amendments, as is obvious from the fact that the name of my noble friend Lord Dholakia joins mine on Amendment No. 110B. Liberty suggested Amendment No. 110B to us, as it may have done to the noble Lord, Lord Henley. Liberty makes the extremely valid point that it would amend the definition of data matching so that it is restricted to the comparison of sets of data to determine how far they match. It would no longer be defined as including “the identification of any patterns and trends”.
One of Liberty’s greatest concerns about the privacy implications of the Bill relates to the data-matching provisions in Schedule 6. These would give the Audit Commission the power to conduct data-matching exercises, or to make contracts with other bodies, public or private, to do so on its behalf. It would require bodies subject to audit by the commission to provide information for the purposes of these exercises. Schedule 6 would also empower bodies whose accounts the commission does not audit to provide information for the purposes of data matching. That could include central government departments which, under these provisions, could theoretically—I repeat, theoretically—provide access to the children’s index or the national identity register. Private bodies such as banks, insurance companies and building societies will also be able to provide client details under Schedule 6. Information disclosed to the commission for the purposes of data matching and the results of those fishing expeditions could be disclosed to an unrestricted range of bodies for fraud detection or prevention purposes, or if there is another statutory duty to disclose the information.
This amendment seeks to explore what is meant by data matching in this context. We hope that the Government will explain what the Audit Commission is currently doing with the personal data of millions of people as part of the national fraud initiative, and describe what it might do in the future with the mass of personal data that has been collected and shared. Data matching could at one level involve little more than the comparison of two or more sets of data to see whether there are overlaps, which could identify someone who is claiming two benefits that are mutually exclusive, as the noble Lord, Lord Henley, has said.
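At its simplest, a comparison of that kind is nothing more than taking the overlap between two lists. The sketch below is purely illustrative, assuming nothing about the national fraud initiative itself: the dataset names and identifiers are invented for the purpose of illustration.

```python
# Purely illustrative sketch of data matching as described above: comparing
# two hypothetical claimant lists to find anyone who appears in both.
# The identifiers and dataset names are invented, not real data.

benefit_a_claimants = {"AB123456C", "CD234567D", "EF345678E"}  # hypothetical NI numbers
benefit_b_claimants = {"EF345678E", "GH456789F"}               # a mutually exclusive benefit

# The "match" is nothing more than the overlap between the two sets.
possible_double_claims = benefit_a_claimants & benefit_b_claimants

print(sorted(possible_double_claims))  # ['EF345678E'] -- flagged for further checking
```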
Data mining involves the use of specialist software to profile innocuous mass data in order to identify patterns or characteristics that might indicate some sort of unusual behaviour or impropriety. It is essentially a fishing expedition which is not based on any suspicion or intelligence that a particular person or company has done anything wrong. The way in which data mining works can be illustrated with this hypothetical example. The Government want to crack down on tax evasion and think that the following factors are strong indicators that the person is engaged in this: regularly paying with cash rather than credit cards or cheques, having erratic streams of income, and taking extravagant holidays. They set up a computer program to search all bank account statements, local authority and central government records, and travel operator databases to identify these types of behaviour. The computer produces a list of every person who satisfies all three indicators and they are then subject to investigations by HM Customs and Excise.
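Staying with that hypothetical example, and assuming for illustration some invented thresholds and data sources (none of which appear in the Bill), the profiling step of such a data-mining exercise can be sketched as follows:

```python
# Illustrative sketch only: "data mining" as described in the hypothetical
# example above - scoring records drawn from several invented data sources
# against three indicators and flagging anyone who satisfies all of them,
# with no prior suspicion attaching to any individual.

from dataclasses import dataclass

@dataclass
class Profile:
    name: str
    cash_payment_ratio: float   # share of spending made in cash
    income_variability: float   # variability of monthly income (0 = steady)
    holiday_spend: float        # annual spend with travel operators, in pounds

def indicators(p: Profile) -> list[bool]:
    # Hypothetical thresholds, chosen purely for illustration.
    return [
        p.cash_payment_ratio > 0.8,   # regularly pays in cash
        p.income_variability > 0.5,   # erratic streams of income
        p.holiday_spend > 10_000,     # extravagant holidays
    ]

profiles = [
    Profile("Person A", 0.9, 0.7, 15_000),
    Profile("Person B", 0.2, 0.1, 2_000),
]

# Everyone who happens to fit the profile goes on the list, whether or not
# there is any actual wrongdoing.
flagged = [p.name for p in profiles if all(indicators(p))]
print(flagged)  # ['Person A']
```

The significant point of the sketch is that every record is examined and anyone who happens to satisfy all three indicators is flagged, without any prior suspicion attaching to them.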
At Second Reading, the noble Baroness, Lady Anelay, commented that “the Bill could open the way for operations under which software was used to search several databases to identify suspicious patterns of activity that simply could not be spotted when the data were seen individually”.—[Official Report, 7/2/07; col. 736.]
The consultation preceding the Bill acknowledged that there would be concerns about the legality of data mining. We believe that this would raise difficulties over compliance with DPA principles, and that, in human rights terms, proportionality issues will arise from the fact that data mining by its very nature will not be targeted or intelligence sifted. In order to be effective, huge quantities of data will have to be analysed. Data mining may well help to identify some people involved in fraudulent activities. But can identifying a few criminals justify the state trawling through all our personal data? We do not see how this kind of random, computerised fishing expedition into personal data can be proportionate.
Data mining can also give rise to serious practical concerns. At Second Reading, my noble friend Lord Thomas of Gresford said: “It is the sort of thing that the supermarket card is designed to do to demonstrate to the management whether a customer buys tins of salmon or jars of Marmite. The patterns of behaviour thrown up by the data matching in Part 3 may or may not be meaningful; it is all a matter of chance. Depending on how they are interpreted, the Audit Commission will be able to point the finger at what is deemed to be a suspicious constellation of characteristics or behaviours in an individual. Instead of a system in which a person is suspected of a crime and is then investigated by the police, a trawl using the latest computer techniques will throw up names and those people will be investigated because of their characteristics or behaviours. Suddenly, we have grounds for a serious crime prevention order under Part 1”.—[Official Report, 7/2/07; col. 738.]
Many of us have experience of the inaccurate results thrown up by data-mining exercises conducted into the information held about our shopping practices on supermarket loyalty cards. Data mining is clearly not infallible. Where it leads to a person being sent vouchers for a brand they would never use, the data-mining error is merely an annoyance. If, however, it leads to an innocent person being subjected to a police investigation or to a preventive measure like a gangster ASBO, the personal cost will be much greater and the risk of error therefore unacceptable.
Given the principled and practical concerns, it is not surprising that other countries impose far more stringent safeguards on the ability of the state to mine personal data. I will not repeat what the noble Lord, Lord Henley, said about the German experience, but German law imposes even greater restrictions on the use of data mining to identify potential future behaviour. Liberty and Members on these Benches are concerned that no equivalent legal restrictions on data mining exist under UK law. We fear that parliamentary approval of data mining in the context of fraud prevention could be treated as a green light for the use of data-mining processes in many other contexts.
Amendment No. 112 would remove proposed new Section 32C of the Audit Commission Act 1998. New Section 32B requires “(a) a body subject to audit, (b) an English best value authority”, such as a county council or county borough council, “to provide the Commission or a person acting on its behalf with such data (and in such form) as the Commission or that person may reasonably require for the purpose of conducting data matching exercises”.
New Section 32C broadens that so that data held by or on behalf of a person who is neither subject to audit nor a best value authority may be disclosed to the commission. That, we believe, goes very wide indeed.
Serious Crime Bill [HL]
Proceeding contribution from Lord Burnett (Liberal Democrat) in the House of Lords on Monday, 26 March 2007. It occurred during Debate on bills on Serious Crime Bill [HL].
Type: Proceeding contribution
Reference: 690 c1533-5
Session: 2006-07
Chamber / Committee: House of Lords chamber
URI: http://data.parliament.uk/pimsdata/hansard/CONTRIBUTION_387965