Scientists’ Social Responsibility

“If you see something, say something.” Under this title, borrowed from the Department of Homeland Security campaign that asks citizens to report suspicious activity, Penn State professor and world-renowned climate scientist Michael Mann argued in a New York Times article that it is no longer acceptable for scientists to remain on the sidelines when they see a clear and present danger.

A call to scientists for social responsibility is pertinent in the presence of risks. Sometimes risk can be predicted and calculated by multiplying the probability of a dangerous event by the expected damage. This is how car insurance companies calculate your premium; they have big data at their disposal. But what about the large global risks that affect us all? Think about the spread of infectious diseases, extreme weather events, ecosystem collapse, terrorist attacks, cyberattacks, and more (the yearly Global Risk Reports of the World Economic Forum have the full list). We don’t have the data to assess such risks accurately (and luckily so!). The German sociologist Ulrich Beck argued that global risks are not incidental: as science and technology progressed, they became deeply interwoven in the fabric of our society, and they cannot be resolved by the technocracy and bureaucracy that created them in the first place (Beck 1992). Risks are omnipresent and either global by nature (the examples above) or the result of globalization (think of the 1984 Bhopal gas tragedy in India, caused by negligence at the company’s US headquarters and still affecting survivors today).
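As a minimal illustration of the insurer-style calculation mentioned above (not part of the original text, and with purely hypothetical numbers), the expected loss is simply the probability of the dangerous event multiplied by its damage:

```python
# Minimal sketch (illustrative only): risk as expected loss,
# i.e. probability of a dangerous event x expected damage.
# All numbers below are hypothetical.

def expected_loss(probability: float, damage: float) -> float:
    """Return the expected loss for an event with a given yearly probability and damage."""
    return probability * damage

# A driver with a 5% chance per year of an accident causing 10,000 in damage:
yearly_risk = expected_loss(probability=0.05, damage=10_000)
print(f"Expected yearly loss: {yearly_risk:.2f}")  # 500.00 -> a premium above this covers the risk
```

For global risks such as pandemics or ecosystem collapse, neither the probability nor the damage can be estimated reliably, which is exactly why this simple calculation breaks down.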

Risks confront scientists with a series of dilemmas (Bora 2007). When a new technology is introduced, its long-term negative social and environmental consequences are not known, and by the time they are detected, it is too late to control them (control dilemma). GMOs, for example, may turn out to be harmful, but by then it is virtually impossible to remove them from the market. For more acute risks, there is pressure to decide before all the necessary knowledge is available (risk dilemma). Imagine the panic in a crisis center when a tornado is approaching a coastal area; not making a decision is also a decision. Finally, if you issue a warning in a risky situation, you will only know whether it was justified if people fail to heed it; if they do follow the warning, you will never know whether it was well founded (warning paradox). Evacuate an area threatened by a hurricane? Far from easy!

Natural risks or disasters are typically accepted as a tragedy and can even bond communities. However, when science, technology, or policies induce a risk or disaster, victims will be quick to point fingers. You will now understand why officials try to frame human-induced risks and disasters as “natural.” But are there even 100% natural disasters? Maybe an asteroid collision (or will we be able to intercept these in the future?).

Risks pose a social problem. Those who benefit from a risk-bearing technology typically do not bear its risks. A city in need of energy will build its dams, coal plants, and nuclear facilities in far-away rural areas, exposing an often vulnerable and voiceless population to the risks of these technologies. Social movements often find their cause in protesting such social injustices. The campaign by the Indian author and activist Arundhati Roy against the Narmada dams, which are expected to displace one million people, is a telling example of such social mobilization (a YouTube movie is available).

How can decision-makers deal with risks? The good news is that people typically trust new technologies, whether or not they know much about them and regardless of what the media tell them. The problem lies in people’s distrust of formal agencies, especially when they perceive a considerable risk (Bell & Ashwood 2014, Chapter 10). And when you perceive a risk, it is real for you: if you live downstream of a dam, you may feel exposed, even if engineers tell you not to worry. So, priority number one for agencies is to build trust through proper risk communication. As we will see in the Science Communication module, such communication should be reciprocal, inclusive, and genuine.

Risk communication can go terribly wrong, as the L’Aquila case tragically illustrates. In 2009, a severe earthquake hit the Italian town of L’Aquila, killing 309 people, injuring 1,500, and leaving 65,000 temporarily displaced. Earthquake scientists, gathered in L’Aquila for a meeting around that time, failed to communicate appropriately with officials and residents. They were accused of manslaughter and sentenced to six years in prison (Kieffer 2013).

Scientists can assist officials in risk communication in several ways. As they have access to scientific information, the least they can do is make that information, including its uncertainties (!), publicly available on websites and through social media, and to do so in an accurate, timely, understandable, and respectful way. To build trust, however, scientists should do more. They should tell narratives that connect to the risk perceptions of officials and residents and motivate them to act. Finally, authentic dialogue and collaboration between scientists and communities will likely build trust and pay off. Authentic dialogue and collaboration accept the community’s risk perceptions, its indigenous knowledge, and any PTSD experiences as an integral part of risk assessment and risk prevention planning. Silencing the population breeds frustration and annoyance and can even spark a populist uprising against the “expertocracy,” with citizens informing themselves from flawed sources. This brings us back to the beginning of this module: if you see something, say something. And we can now add: do something, even if it’s not in your job description.
