
November 10, 2011

Conferences and sociotechnical systems


Flying is a constant, necessary (in)convenience in my life. While it's great being only a 4-hour flight away from Michigan when I'm in Arizona, the endeavor requires careful planning, packing, arranging, and management of every little detail, from my laptop's battery life to remembering to drink water. I'm doing a lot of flying this month, and just got back from the joint conference of the History of Science Society, the Society for the History of Technology, and the Society for Social Studies of Science. After all this talk of science and technology, I can't help but begin to see everything as a "sociotechnical system."

If you’ve seen the movie “The Matrix,” you have an idea what graduate school is like for me. There’s a Facebook page for one of my advisors, Dan Sarewitz, that jokingly asks,
- Are you unable to sit through a traditional biology/chemistry/physics/engineering/economics course without constantly contemplating how your professor managed to "drink the kool-aid?"
- Do you constantly remind yourself that your science professors are but tiny cogs in a global innovation machine?
- Are you unable to look at a tomato without thinking about science, politics, labor economics, sociology, anthropology, Michael Crow, agriculture, geopolitics, innovation systems, the University of California, and climate change?
- Does the mere mention of the "linear model" make you shudder?
- Are you unable to synthesize your views on climate change in less than 5,000 words? 
If so, you are probably a former student of Dan Sarewitz. You will never hold a mainstream academic position, and your peers (and the public) will never quite be sure what your "deal" is. That's what you get for taking the red pill.
Yep, that sounds about right.

A major project of science studies is to give social, historical, and political context to the technologies we use in our everyday lives. For example, I'm reading a book by Maria Kaika about urban water infrastructures. We don't really think about where our water comes from every day. We turn on the tap and expect water to be there (in the Western, developed world, at least). What we don't think about is what it takes for that water to get there: an assured, constant, and instant supply at our faucets. During the rare times when the tap goes out, we get a profound sense of the "uncanny," because our expectations are suddenly jolted as we realize water doesn't just appear from the walls. Kaika writes about the hidden infrastructure of urban water. For example, say you visit a dam someplace out west. We don't really connect it with our water supply, or with the enormous amount of energy needed to move water from source to tap. All of this is hidden from view and out of mind. Kaika argues that this is because of the artificial divide between "wild" nature and the sanitized urban home. So here we have not only a sociotechnical system, but a socio-technical-environmental system.

Back to airplanes, since I'm actually writing this on the plane! Airplanes, and the process of air transportation, are a more visible form of sociotechnical system. We stare in awe at the massive planes used for transcontinental flights. But from the second you walk into the airport, you become immediately aware that you are part of a finely tuned system of both humans and technologies. We are enrolled, inspected, standardized, and shuffled into our seats. Usually everything goes well, but today after our flight landed, the electricity went out as we were leaving the plane. This was another instance of the "uncanny," even though flight is a more visible system. We can see the nuts and bolts of the plane (and don't get me started on rivets… we read a painstaking paper last semester about the technological innovation behind airplane rivets), but we still expect everything to work.

Think about the complex and heavily embedded systems behind energy extraction and production, and the technological disasters they have caused. These aren't just technological disasters, though; they are most definitely sociotechnical disasters. It's crucially important to realize that humans design, maintain, and run these systems (to the extent that we have control). But inevitably, tightly coupled systems, such as energy, increase the severity of human error and technological failure. The take-home message is that we often don't notice sociotechnical systems until they fail.

UPDATE: Here's a great link via Arijit on the nation's water infrastructure being ignored.

September 9, 2011

Pika politics and climate change


Look at the cute little pika! So cute! So... controversial??? One of my professors at Arizona State University studies pikas, little critters found in both North America and Central Asia, and has become entrenched in an unusual debate between environmentalists and the government. I'm going to paraphrase a bit from a presentation he gave to our lab group and then discuss the science policy behind it.

North American pikas are a focal point of the climate change agenda among conservationists in the American west. This is because pikas live in the mountains, and with rising temperatures due to climate change, it is feared that they will soon run out of habitat at high enough altitudes to stay cool. Seems pretty straightforward, right? Not so, according to my professor. While the conservationists are lobbying for pikas to be listed as endangered, he believes they are using shaky science.
In the advocates' claim for [endangered species] listing, Andrew Smith of Arizona State University sees a case of going overboard, and extending implications from limited studies. 
In his own work in Bodie, Calif., begun in 1969, Smith said he found pika capable of adapting to temperature swings by haying at night, instead of during the day, if it is too warm. He also has found the animals at low elevations, where they were not documented previously, complicating the theory that pikas are being chased relentlessly upslope. 
"We really think pikas are at risk, and we should learn more about them, and be monitoring them at lower elevations," Smith said. "They should tell us an incredible amount about climate change. But they are not endangered." (Seattle Times, 2009)
He thinks that environmental groups have picked the pika as a poster child for climate change based on values (such as conservation ethics) over scientific fact, and that they repeatedly cherry-pick data that supports their cause rather than the broader scientific consensus. While the lobby groups claim that pikas are disappearing before our eyes, others note that the western mountains are literally crawling with pikas. Scientists are working to take censuses of pika populations, but this is arduous and can reflect changes other than climate. So the question is, what will happen if the pikas don't disappear? Will we give up on climate change mitigation policy? Will science lose credibility? These are familiar questions to anyone who studies scientific controversies.



Environmentalists have long held a tenuous relationship with science: they both distrust it and use it to their advantage in legal battles. Science has the power of legitimacy, and of making visible the invisible. The case of agricultural biotechnology (GMOs), and how environmental advocates use science, is strikingly similar to the pika controversy. Small degrees of scientific uncertainty become major points of contention, and unfortunately the environmentalists and scientists seem to be talking directly past each other. I will refer you to my past post to highlight this point. Roger Pielke Jr. would call this a politicized scientific debate. As he argues in The Honest Broker, we should use science to highlight a range of possible policy options, rather than a narrowly defined, predetermined political position. Implicit in the entire pika debate, as with the polar bears, is that in order to save the pikas, we must limit our carbon emissions.

Should scientists speak up and advocate against the environmental lobbyists? Or aim to provide a more robust understanding of the science and policy implications of climate change on animal populations? Can conservationists promote their own agenda without using dubious science?

Until next time, check out this new blog by some of my former MSU professors.

June 28, 2011

Risk, uncertainty, and value judgements in science policy

Yesterday my colleagues and I at Michigan State University and Kellogg Biological Station held a reading group to discuss Pielke's The Honest Broker. We read chapters 4-6, titled "Values," "Uncertainty," and "How Science Policy Shapes Science in Policy and Politics."


We talked about whether science is a good tool in making decisions. Certainly it can be good for informing decisions, such as if there's a tornado coming and you need to know whether you should evacuate. Unfortunately, as we saw in one of my previous posts, sometimes scientific assessments of risk and uncertainty do NOT translate well into action. Pielke agrees with this perspective. He thinks that science just adds smoke and mirrors to debates that are really about core values. So unless the situation under debate is one with low uncertainty and highly shared values (a tornado is coming, we should evacuate), we need more recognition of the underlying values of a debate (see: the climate change debate).

Pielke repeatedly refers to two works by Dan Sarewitz, who is one of my professors at Arizona State and regarded by many as a science policy guru. The first article is "How science makes environmental controversies worse" (2004). The second is "Science and Environmental Policy: An Excess of Objectivity" (2000). Both are worth a thorough reading: one thing I've discovered in grad school is that I sometimes read the same article months, or a year, apart, and find revelatory new nuggets of knowledge each time I read it.
The "Excess of Objectivity" book chapter is an insightful commentary on how science can actually impede the political process by focusing on always-disputable and uncertain facts while ignoring the underlying value conflicts in highly politicized environmental issues. The "excess of objectivity" refers to the multiplicity of scientific fields brought to bear on a controversy: each claims objectivity, yet their incompatible findings drive controversy and muddy the political waters.

"How science makes environmental controversies worse" makes the same core argument, using examples ranging from the 2000 election results to climate change to genetically modified food (another good case study is the debate over nuclear waste; see this editorial). This discussion reminded me of an article I read during my first weeks of grad school, "Value Judgments and Risk Comparisons: The Case of Genetically Engineered Crops" (2003) by Paul Thompson, an environmental and agricultural philosopher at MSU.


I wrote up an analysis of it that I think highlights the issues of value, risk, and uncertainty in environmental controversies pretty well: Thompson focuses on the inherent value judgments that scientists make about genetically engineered (GE) crops and environmental risk. He aims to identify the values behind the GE debate, rather than taking a philosophical or scientific position within it. He focuses on a relatively small aspect of the debate: claims for and against a comparative evaluation of the environmental risk of GE vs. traditional (non-GE) crops, which is the standard metric used by scientists and federal agencies to assess the risk of GE crops. Thompson's argument is that risk assessments are inherently based on value judgments; the science itself cannot settle a claim about environmental risks.

He shows that the current regulatory system ironically puts the burden of proof on anti-GE activists, who are "in the position of needing to justify special treatment for this class of plants" (emphasis added; Thompson, 2003, p. 11). This gap charges a largely non-scientific public with demonstrating the scientific credibility of its value system, against the grain of the values held by the scientific community, which of course causes further problems on multiple levels. Thompson identifies several other challenges in the regulation of GE crops under the current framework.

Risk assessments, especially environmental risk assessments, depend on value-based judgments of how much and what types of risk are "acceptable," despite attempts to quantify that risk scientifically. The scientists' and activists' definitions of risk are essentially incompatible, whether for comparing the risks of GE vs. non-GE crops or even for defining the concept of environmental risk. This highlights very clearly that science, rather than aiding the decision-making process, can complicate and add uncertainty to political debates.

On a related note, I'm headed to Boston today to attend the Science and Democracy Network conference! I'm really excited to talk to like-minded scholars about our work, and make some great connections.

June 13, 2011

Science policy communication failure costs lives



A recent issue of Science magazine features a news article about seven scientists in Italy who are facing manslaughter charges for not predicting the danger of an earthquake that killed 308 people. The scientists were part of a risk committee of earth scientists who testified that incipient tremors were not evidence of an oncoming earthquake in 2009. According to Science, “They agreed that no one can currently predict precisely when, where, and with what strength an earthquake will strike” (3 June 2011, p. 1135). These are all accurate statements, from a scientific point of view. But the problem lies in translating these statements for decision-makers and stakeholders, which includes people in the town of L’Aquila, Italy.

The lead scientist “maintained that he and his scientific colleagues had a responsibility to provide the ‘best scientific findings’ and that it is ‘up to politicians’ to translate the scientific findings into decisions” (Science, 3 June 2011, p. 1136). This is the linear model of science policy at its worst, literally costing lives because of the mismatch of science and policy risk management paradigms, or as Cash et al. (2006) describe, the “loading dock” model of simply delivering scientific results and hoping that the public sphere will pick them up and use them. To the scientists, risk and uncertainty are quantifiable metrics that are difficult to translate into social action. To decision-makers and the public, risk is a socially mediated, multidimensional value that depends on more than just probabilities. Uncertainty has been a traditional sticking point in earth science and policy topics such as climate change. However, Cash et al. (2006) demonstrate how bringing together scientists and decision-makers from the beginning helped improve the utility of climate models for end-users. They write, “Scientists began to understand that managers were comfortable making decisions under uncertainty, and managers began to understand the concerns scientists had about making scientific claims in the face of uncertainty” (Cash et al., 2006, p. 482). This was clearly not the case with the Italian scientists and decision-makers.

At first glance, this case provokes outcry from scientists afraid of losing the public's trust and being put on trial, literally. While it may be presumptuous to actually put scientists on trial for a failure to dialogue with decision-makers, the case calls into question the implicit "social contract of science" that has justified basic scientific research since the end of WWII. Sheila Jasanoff told a group of ASU graduate students last spring that "Scientists have become arrogant, and have not explained to the people why they deserve support... The Enlightenment was not a historical event. It is a process, a mission, a continuous duty to explain yourself" (personal communication, 11 February 2011; not an exact quote, but very close). Jasanoff lays out an alternative to the linear model of science policy that she calls "technologies of humility" (2003). In contrast to calls for "more science" to reduce uncertainty, Jasanoff writes that "what is lacking is not just knowledge to fill the gaps, but also processes and methods to elicit what the public wants, and to use what is already known" (2003, p. 240). The abstract of her paper states that "governments should reconsider existing relations among decision-makers, experts, and citizens in the management of technology. Policy-makers need a set of 'technologies of humility' for systematically assessing the unknown and the uncertain" (Jasanoff, 2003, p. 223). Jasanoff and other science and society scholars have been writing about the failures of the linear science policy model in predicting risk since the 1980s, when the risk-management paradigm began to crumble in the wake of seemingly "unpredictable" human-technology disasters like Chernobyl. Today we face critical policy issues, from climate change to toxic chemicals, that fundamentally depend upon an understanding of environmental science, but just understanding the science is not enough.
We need a new model of science policy that incorporates the needs of decision-makers and stakeholders from the start, not after it’s too late.
Sources:
Cartlidge, E. (3 June 2011). "Quake Experts to Be Tried For Manslaughter." Science, 332, pp. 1135-1136.
Cash, D.W., Borck, J.C., & Patt, A.G. (2006). "Countering the Loading-Dock Approach to Linking Science and Decision Making." Science, Technology, & Human Values, 31, pp. 465-494. http://sciencepolicy.colorado.edu/students/envs_5100/Cashetal2006.pdf
Jasanoff, S. (2003). "Technologies of Humility: Citizen Participation in Governing Science." Minerva, 41, pp. 223-244. http://sciencepolicy.colorado.edu/students/envs_5100/jasanoff2003.pdf
Further reading:
Sarewitz, D., Pielke, Jr., R.A., & Byerly, R. (editors) (2000). Prediction: Science, Decision Making and the Future of Nature. Washington, DC: Island Press. Available at: Google books, Amazon.com