The precautionary principle is, in essence, scientific: it remains sceptical and vigilant towards unsubstantiated claims and places the burden of proof on the novel. However, when faced with a small but persistent risk of a huge disaster, how cautious is cautious enough? If the risk of an earthquake is broad and uncertain, how should that uncertainty be portrayed in the media? I will comment on the governance of “known unknowns”, as famously described by Donald Rumsfeld, using the case study of the 2009 earthquake in the city of L’Aquila, Italy.
The magnitude-6.3 earthquake struck on 6th April 2009, with thousands of foreshocks and aftershocks recorded. From December 2008, these foreshocks were used to estimate the probability of a main shock, but such forecasting is notoriously uncertain. The catastrophe was devastating in itself: 309 people died, many more were injured, and damage to infrastructure left residents homeless. What makes the case extraordinary, however, was the subsequent court case, which concluded that six scientists and a government official were guilty of manslaughter for miscommunicating the risk of an earthquake. The Economist reported that the seven jailed “had taken part in a meeting of the National Commission for the Forecast and Prevention of Major Risks, a government agency, on March 31st 2009”.
Not unlike the BSE scandal in the UK, the science was uncertain while the communication to the public was overly reassuring. Accounts of the traditional Italian approach to earthquakes describe a cautionary ‘culture’ of living in an earthquake zone, and official reassurance was accused of encouraging increased risk-taking. The conviction went ahead despite more than 5,000 scientists signing an open letter to the Italian president stating that no prediction of an earthquake is possible, and it has widely been seen as an attack on science. The case has substantially damaged public confidence in seismologists and has caused scientists to become understandably more cautious and conservative.
The BBC World Service produced a follow-up on the fourth anniversary, presented by Ruth Alexander. She reported that American scientists had calculated the risk of an earthquake of magnitude greater than 6 on that day as 1 in 1,000, giving a risk of death of 3 in a million (with a massive uncertainty range of between 1 and 32 in a million). This risk was equated with riding a motorcycle for 18 miles in the UK. So, would it have been responsible or reasonable to sound a call for evacuation? Since this incident, there has been a tendency to report a risk of 40% when the risk is thought to lie between 0% and 40%, even when it is closer to 0%. Furthermore, raw data on tremors in the area are now reported directly to the public, and apps such as “Did You Feel The Earthquake” are widespread and numerous, reporting all tremors, even those that cannot be felt by humans. Twitter has also been used by councils to advise residents to sleep away from home, producing a cascade of communications that can end in a text saying merely “evacuate”. Cue schools evacuated, hospitals cleared and residents sleeping in cars on days like January 31st, only for nothing to happen. Is an overreaction really better than an underreaction?
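The arithmetic behind those figures can be sketched quite simply. The following is a rough reconstruction, not the original study’s method: the conditional fatality probabilities used here are assumptions back-calculated from the reported numbers, chosen so that the products reproduce the central estimate of 3 in a million and the 1-to-32-in-a-million range.

```python
# Hypothetical reconstruction of the reported risk arithmetic:
# daily death risk = P(quake that day) * P(death | quake occurs).

p_quake = 1 / 1000  # reported daily probability of a magnitude > 6 earthquake

# Assumed conditional fatality probabilities (back-calculated, not sourced):
p_death_given_quake = 3 / 1000            # central assumption
p_low, p_high = 1 / 1000, 32 / 1000       # bounds matching the reported range

risk = p_quake * p_death_given_quake                  # 3 in a million
risk_low = p_quake * p_low                            # 1 in a million
risk_high = p_quake * p_high                          # 32 in a million

print(f"Central estimate: {risk * 1e6:.0f} in a million")
print(f"Uncertainty range: {risk_low * 1e6:.0f} to {risk_high * 1e6:.0f} in a million")
```

The point of laying the calculation out is how dominant the uncertainty is: the upper bound is over thirty times the lower, which is precisely the kind of spread that gets flattened into a single headline number.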
Should scientists stop giving advice? These events show the need to communicate with the public in a way that is transparent but also useful. Italians seem to feel that how conservatively to manage risk should be their own choice, and that the meaningful information is the probability of a serious earthquake compared with the baseline. This case seems to mark a step towards advising on “known unknown” risk beyond what the available evidence can support. Equally, however, it shows the harassment and panic that can ensue when unprocessed data are reported, creating a reactionary and insecure audience. A middle way in communicating risk is called for: delivering appropriately phrased probabilities and allowing the public to apply them to their own individual values and vulnerabilities. Sheila Jasanoff, in “Technologies of Humility”, calls for “acknowledging the limits of prediction and control” and for confronting ‘head-on’ the normative implications of our lack of perfect foresight. She also discusses vulnerability and the need to address individual needs rather than considering merely the population. This suggests that modesty is a key element in making claims about the future.
Furthermore, this case places a heavy emphasis on science and communication technology, making scapegoats of individuals who are left to bear all responsibility. This glosses over alternative criticisms, which would be expensive to rectify. More direct approaches could reduce the need to evacuate and lessen the damage in a disaster. For instance, new building regulations, properly enforced, could prove instrumental in protecting citizens long-term. Many of the 309 killed were in student accommodation, showing how individuals can be vulnerable for structural reasons that surely deserve further investigation. Another call has been made for education, so that people can better interpret what they read and hear from official and informal sources, and so that they are empowered in an emergency.
In conclusion, I would advise an approach that takes the public’s values into account; in this case, perhaps the precautionary principle. I would also advise against speaking in absolute terms without data to justify such claims. Long-term solutions, which would reassure in the absence of an earthquake and protect in its presence, seem under-addressed. Finally, it seems important to recognise the value of seismologists as a whole and to standardise reporting so that they can continue to assist this region.