WiredGov Newswire (news from other organisations)
Committee publishes report on scientific advice and evidence in emergencies
In a report published recently on the use of scientific advice and evidence in emergencies, the House of Commons Science and Technology Committee is critical of the Government’s preparedness for dealing with emergencies, saying it is simply not good enough that scientific advice is often only sought after events have struck.
The report says that while science is used effectively to aid responses to emergencies, and that to some extent the Government is learning the lessons of past experiences, the detachment of the Government Chief Scientific Adviser (GCSA) from the National Risk Assessment (NRA), the key process of risk evaluation carried out by the Cabinet Office, is a serious concern.
The Committee says that the Transport Secretary’s announcement in December 2010 that the GCSA would look into future weather planning assumptions, following a spell of severe winter weather, suggested that the GCSA had little or no input into the risk assessments that must have taken place on severe weather, which is an identified risk on the NRA.
Scientific evidence should inform all stages of risk assessment and the Committee recommends that the NRA should not be signed off until the GCSA is satisfied that all risks requiring scientific input and judgements have been properly considered.
The report calls for a new independent scientific advisory committee to be set up to advise the Cabinet on risk assessment and review the NRA, in order to improve public and parliamentary confidence in what is a necessarily unpublished document.
The Committee also repeats calls for the Government Office for Science to be located within the Cabinet Office to reflect its cross-departmental remit and help improve policy processes.
Andrew Miller MP, Committee Chair, said,
"The current approach smacks of closing the stable door after the horse has bolted. Science is not just something to reach for when a crisis happens, it must be integral to the whole planning process and unfortunately the Government still hasn’t got it quite right."
Volcanic ash disruption
The Icelandic volcanic eruption in April 2010 is a stark example of the lack of scientific input in risk assessment: the risk of disruption to aviation caused by a natural disaster was dropped from the assessment process in 2009, despite warnings from earth scientists.
Had the concerns of the scientific community been heard, the Government would undoubtedly have been better able to cope with a situation that ended up costing the UK economy hundreds of millions of pounds.
H1N1 influenza pandemic
Concerns over how risk was communicated to the public during the 2009-10 swine flu pandemic are raised in the report. It highlights the sensationalised media reporting about the projected deaths from swine flu and questions the use of the concept of ‘reasonable worst case scenario’.
Andrew Miller said,
"The Government should emphasise the range and likelihood of various possibilities to the public, with a concept of ‘most probable scenarios’ becoming familiar to the public’s understanding of risk. Reasonable worst case scenarios are potentially misleading as people think they describe something that is likely to happen."
Although useful, the Scientific Advisory Groups in Emergencies, set up to advise government during emergencies, were found by the Committee to work in an unnecessarily secretive way, making it difficult to access information about membership and operations.
The groups should not be given carte blanche to operate as they please simply because an emergency is occurring, and the Committee calls for clarification of what codes, principles or guidance govern their operation.
In addition to the volcanic ash cloud and swine flu, the Committee also examined space weather and cyber security as case studies for its inquiry.
It heard concerns that the UK is only a minimal subscriber to the European Space Agency’s Space Situational Awareness programme, which monitors conditions in space relevant to human activities, and says the Government should review the need for the UK to increase its participation following the 2011 National Risk Assessment.
Cyber attacks pose a national security risk and the Committee recommends that the Government actively ensures that requirements for security clearance do not deter academics from providing scientific advice to government.
It also says an understanding of human behaviour is essential in risk assessment, planning and response, citing debate around what is expected of the public in maintaining cyber defences, and says it is disappointed at the lack of focus on social and behavioural science in government to date. The Committee expects that the newly established Cabinet Office Behavioural Insights Team will provide input to risk assessment for emergencies.
In this inquiry, the Science and Technology Committee examined how scientific advice and evidence is used in national emergencies, when the Government and scientific advisory system are put under great pressure to deal with atypical situations.
The inquiry focused on four very different case studies in order to build up a richer picture of how science is used in emergencies. The case studies were: (i) the 2009-10 H1N1 influenza pandemic (swine flu); (ii) the April 2010 volcanic ash disruption; (iii) space weather; and (iv) cyber attacks.