Managing risk – where pragmatism meets philosophy
When Dr Simon Bennett, Lecturer in Disaster Management in the University of Leicester's Scarman Centre, talks about his research field, he does so with the dedication and passion of someone who understands the global implications of failing to educate people in risk management.
Many of his ideas can be found in a recent preliminary Government report, Risk Perception and Assessment in Design: Research Review and Priority Setting Exercise, produced in collaboration with engineers from the Universities of Bristol and Bath and published in April 2002.
“One key factor that emerged from the Government project,” Dr Bennett said, “is that we can make the subject of risk management much stronger if we amalgamate the technological and social science definitions of risk. We need to look for a synthesis of the natural and the social sciences. For the report we looked at research that could bring together the two models – qualitative and quantitative – to make risk management more effective.
“You need to understand human nature holistically. The way that risks are understood by humans as social beings with prejudices and misconceptions is also a very important part of managing risk.
“Formal engineering risk analysis (quantitative) does not take account of societal perceptions of risk. So engineers produce numbers to convince the public to do this or that, but they fail to engage with the public as social beings. Real people live in varied circumstances and have different cultural views of the world.”
The public does not always draw the right conclusions from quantitative estimates of risk. Many more people die on Britain's roads each year than in air crashes. Yet because road fatalities generally occur in ones and twos, while a single plane crash can kill two or three hundred people, air travel is perceived as the more dangerous.
Dr Bennett detects a change in the public attitude towards science and engineering. “There is still a big gap between the two risk cultures. At the turn of the 20th century there was enormous faith in science. It got rid of diseases, gave us a more comfortable lifestyle and was of great social benefit. Modernity left behind superstition and religion in favour of systematic knowledge derived from observation and experimentation.
“But since the 1960s people have become more sceptical when scientists claim that science is always beneficial. Think of atomic power – it can improve people's lives, but it also presents dangers for the planet.”
Deferring to public perceptions of risk, rather than trying to educate the public to judge risk with greater clarity, can prove costly. The Swedes, and more recently the Germans, have voted to close all their nuclear power stations because of such perceived risks as radiation and meltdown (the ‘China Syndrome’), even though the risks are minimal.
Dr Bennett believes that if the Swedish and German Governments had tried earlier in the debate to convey the reality of the technological risk, the vote might have gone the other way. As things stand, Sweden and Germany will have no nuclear energy, making them heavily dependent on oil from the Middle East and gas from Russia – arguably a far greater risk to the stability of their electricity supply.
Scientists are not the only group to be greeted with scepticism. The real issue in the debate over the MMR vaccine in the UK was not the statistics, but why the public had faith neither in the Government nor in medical science. This scepticism is proving costly financially, socially and legally.
The great cultural shift which has to come, Dr Bennett believes, is the acknowledgement that Science is not certain. “Science mustn’t deny uncertainty if it is to remain credible in the 21st century. Remember Thalidomide. It has to educate the public to understand that there are degrees of certitude, and then let the public make up its own mind. If Science doesn’t make this shift, then ultimately scientists won’t be able to do their jobs.
“The results of scientific experiments are affected by the assumptions a scientist makes when s/he starts out, and therefore are not certainties. I want people to understand that, because I believe that science is our least worst vehicle for progress in a chaotic world.”
As the interim report, Risk Perception and Assessment in Design, indicates, the Government is trying to bring together these two cultures of social and scientific risk assessment.
The heart of the report is that risk perception theory has to be built into the design process, making design more relevant and acceptable to society. This is already happening – for instance NASA research has resulted in quieter aircraft engines that are less disruptive to people living near airports. “We still need science, medical drugs, aircraft, freeways and so on,” concluded Simon Bennett, “but if you introduce risk perception theories into the design process and at every subsequent stage then what engineers and scientists create will be more acceptable and relevant to society.”
The Government Report, Risk Perception and Assessment in Design:
Research Review and Priority Setting Exercise, published in April 2002, was the culmination of a project funded by the ESRC and EPSRC.
Its authors were Dr Simon Bennett, Lecturer in Disaster Management at the University of Leicester's Scarman Centre; Chris McMahon, Reader in Mechanical Engineering, University of Bristol; Dr Jerry Busby, Lecturer in Mechanical Engineering, University of Bath; and Gordon Barr, Research Student, University of Bristol.
The project’s aims included:
· To review research into risk perception and assessment in design
· To produce a Government report on priorities for future research
· To explore the gulf between engineers and social scientists in dealing with the risks of technology