UK Parliament / Open data

Department of Trade and Industry

I was not in any way trying to make a political point; I was merely musing that there had been a huge loss in capacity. During the hon. Gentleman’s time as Minister, the Government could go to such organisations and command that they delivered responses; that has been lost. The second proposal, which we felt was very important, is that there should be greater involvement of learned societies in peer review and the assessment of scientific evidence that comes to Government. That would have the added value of possibly reducing the Government’s dependence on external consultants. To that end, we recommended that the Government discuss with the learned societies whether aspects of the scientific advisory system in the United States could be adopted in the UK. The Government said that they would "reflect further", and I would be interested to learn what reflections there have been. I wish at this point to pay tribute to the work of the Royal Society, the Royal Society of Chemistry, the body representing physics and the other learned societies which regularly give evidence of the highest quality to our Committee. Such evidence could be made available to the Government in exactly the same way as it is to a Select Committee. Without their evidence, it must sometimes be difficult for Government to form conclusions of the kind that our Committee is capable of reaching. Scientific capacity is important, in terms of both the social sciences and the physical sciences. If Government policies are to be evidence based—which the Government claim is the case—they need good scientific capacity. Since 1997, the Government have increasingly emphasised the importance of evidence-based policy making. In 1999, the "Modernising Government" White Paper and the Cabinet Office report "Professional policy making for the 21st century" emphasised the importance of the use of evidence in policy making.
I also note that the National School of Government now runs analysis and use of evidence courses. Policies are thus increasingly promoted as evidence based—I am glad it appears that the Minister is nodding from a sedentary position. Unfortunately, however, few outside Government were prepared to accept the term "evidence based" as an accurate description of Government policy, and we did not have to look far to find examples of a stark disconnect between evidence and policy. The former Secretary of State for Education and Skills, the right hon. Member for Bolton, West (Ruth Kelly), had almost no evidence when she announced her ban on junk food. Sir John Krebs, former chairman of the Food Standards Agency, was scathing. He claimed that "the policy was developed with no evidence that it would work; no scientific definition of junk food; no cost benefit analysis; and no public engagement." The policy announcement broke every one of the Government chief scientific adviser’s rules of engagement. Yet despite the presence of a departmental chief scientific adviser within the then Department for Education and Skills, the policy was allowed to run unchallenged. If the Government cite evidence in support of a policy, we believe that it should be able to bear robust scrutiny and that it should be communicated convincingly to the public. If there are problems with the evidence base, the consequences for public confidence can be grave, as was the case with MMR—measles, mumps and rubella—or damaging to scientific progress, as was the case with genetically modified crops, or potentially disastrous, as with the weapons of mass destruction issue and Iraq. In our report, we highlighted four key issues on the use of evidence. First, the Government should take steps to strengthen the evidence base by establishing a cross-departmental fund to commission independent policy-related research. Will the Minister tell us whether he will set up such a central fund for independent research?
We also acknowledge that it is not just a question of money, especially for academic researchers who are struggling to put together publications in time for the next research assessment exercise—or RAE. Indeed, for that reason we recommended that the Government work to rectify the situation in which the RAE acts as a disincentive to engagement by the scientific community with policy. The Government responded that the new metrics will achieve that, and I would be interested to know how they will measure the success of the new metrics-based RAE in that regard in terms of public policy. Secondly, the Government should ensure that the evidence they use is of the highest quality. The use of evidence and its quality should be peer reviewed. We recommended that the Government commission pilot reviews of the extent to which policies are evidence based. The Government accepted that recommendation, and it will be interesting to hear from the Minister whether such pilot reviews have taken place. Thirdly, the Government should be gathering evidence to inform their future policy and undertaking horizon scanning. We recognised the excellent work of the foresight programme and the horizon scanning centre, but note that horizon scanning should be embedded within the policy-making process. It is no use if foresight produces excellent reports, but no one takes up the findings. Professor Paul Wiles, the Home Office departmental CSA, clearly stated to our inquiry that "doing horizon scanning is one thing, getting an organisation to actually lift its head from immediate problems and think ten or twenty years ahead and use that horizon scanning is sometimes a challenge". It is a challenge, but it is a necessary challenge. Finally, we were concerned that when the Government undertake pilots or trials or run consultations, the results should be published and their effect on the policy-making process made clear.
We have looked at the trials of biometric technology for identity cards and emphasised to the Government that if those trials raise any doubts, the policy should be amended accordingly. Similarly, the Government need to ensure that the purpose and remit of consultations are clear and that feedback is given to those who have contributed. Problems such as the recent consultation over civil nuclear power undermine public confidence, not just in the consultation process but in the use of evidence in policy making more broadly. I understand that the Cabinet Office has recently launched a consultation on consultations. I would be interested to hear from the Minister whether there is any early feedback from the process. As with scientific advice, it is important that the Government are open about the evidence underlying policies, and we believe that evidence should be published and reviewed. That will ensure that evidence is not misused or selectively published in order to prop up policies. It was deeply disturbing to hear the allegations from Professor Tim Hope of Keele university that the Home Office had actually interfered with the publication of research. He said: "It is with sadness and regret that I saw our work ill-used and our faith in government’s use of evidence traduced." It is essential that the policy-making process and the use of evidence are fully transparent, and that where policy is not based on evidence, that should be made clear. We acknowledge, and make the point strongly in our report, that not all policy needs to be evidence based. The Government have every right to promulgate policies that are not evidence based. Some policies have a mainly political or ideological basis. We accept that. The Government should acknowledge openly the many drivers of policy making, as well as any gaps in the relevant research base or where policy is made despite the evidence.
If the Government promote the idea that all policy is evidence based, they are undermining those policies on issues such as MMR, GM and climate change, where it is crucial that there is public confidence in the evidence. Furthermore, if the Government change their mind about a policy as a result of a poor pilot or on the basis of new evidence, the Opposition—I include my party and the Conservatives—should not use that as an opportunity for political point scoring. Changing policy on the basis of evidence, pilots or research should be seen as a political strength, not a failing. The third main plank of our inquiry into these cross-cutting areas was the management of risk. We did not attempt to deal with individual areas of risk, but focused instead on the communication of risk to the public—the issue that the hon. Member for Tiverton and Honiton (Angela Browning) raised. Successive Governments have attempted to deal with risk and it has risen up the agenda. We have seen green books, orange books, a Treasury guide on management of risk to the public, appraisal guidance and a risk management assessment framework; there have been lots of them. There is a long way to go, but I welcome the progress and urge the Government to continue to seek ways to sustain and improve risk assessment in policy making. During the inquiry, we looked at the ways in which risk is communicated to the public and considered good examples, such as nanotechnology, which was communicated well to the public, has been widely accepted and is being well used, and bad examples, such as GM. The way in which risk is communicated to the public is crucial, particularly given the weak scientific and numeracy culture in this country. We recommended that the Government develop a scale of risks that could be used by all Departments. Rather than saying that the risk was very low, or one in 100,000, one could say it was as likely as being murdered. People understand that. 
If one wanted to express a negligible risk, such as one in 10 million—that means nothing to most people—one could say it was as likely as being hit by lightning.
Type
Proceeding contribution
Reference
462 c1210-3 
Session
2006-07
Chamber / Committee
House of Commons chamber