Monday, November 29, 2010

Global Warming and 'Over-Resonance'

11.16.2010 - Dire messages about global warming can backfire, new study shows

A study from UC Berkeley appears to support the systems theory of Niklas Luhmann: that dire warnings about ecological disasters, such as global warming, can cause "over-resonance," i.e. a reactive state that backfires and undermines an adequate response to the crisis.

Luhmann's Ecological Communication (1989) proposes that social systems lack the capacity to accurately perceive environmental conditions. The usual response of society to the environment is "under-resonance": the social system does not recognize problems in the environment, or only barely does. However, when economic, political and scientific systems present information about ecological crisis in an alarming way, the result is social "over-resonance," an "effect-explosion" in which, paradoxically, social systems become paralyzed and unable to respond effectively to the problem.

The Berkeley study found that when facts about global warming were reported to subjects in an alarming way, subjects indicated that they doubted the veracity of the report, even among those whose ideals of a "just world" were fairly high. Furthermore, the overwhelmed subjects were less likely to take corrective action to address the problem.

Conversely, when subjects were presented with the same facts on global warming but also provided with possible solutions, they were more likely to indicate that they believed the report and would modify their behaviour to address the problem.

Geoffrey West on Scaling Rules

Physicists really like simplicity and elegance. And, as a result, they have a wonderful knack for coming up with interesting ways that show how things that appear to be different are really the same.

(Warning: obscure sociology reference impending.) Simmel would love them. (Explanation of obscure reference: Simmel, the progenitor of formal sociology and of what has subsequently become network analysis, was treated poorly by the German academic establishment -- in part because his lectures were so popular. Those lectures typically took the following structure: look at thing A, look at thing B, look at thing C. Surprise! A, B and C are really the same thing!)

That is the same general approach West takes in this Yahoo Labs Big Thinker presentation exploring the scaling rules of cities, businesses and other things you only discover when you watch!



Unfortunately, the viewer doesn't display the time, so I can't identify precisely where to find the interesting bits ... but somewhere between a fifth and a quarter of the way into the talk you find this slide, which accompanies an interesting observation:

a) there is a systematic relationship between the size of a biological organism and the amount of energy it requires, and b) that relationship remains the same over the entire 27 orders of magnitude of biological phenomena (i.e., from below the cellular level to the blue whale). An increase in the size of the system by 4 orders of magnitude requires only a 3 order-of-magnitude increase in energy input.
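To make the arithmetic concrete, here is a minimal sketch in Python, assuming the roughly 3/4-power metabolic scaling West describes (the prefactor is arbitrary; only the exponent matters for the orders-of-magnitude claim):

# Kleiber-style scaling: metabolic rate scales as mass to the 3/4 power.
def metabolic_rate(mass, c=1.0, exponent=0.75):
    return c * mass ** exponent

# Increase the mass by 4 orders of magnitude (x 10,000)...
ratio = metabolic_rate(1.0e4) / metabolic_rate(1.0)
print(ratio)  # 1000.0, i.e. only 3 orders of magnitude more energy is required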

At about the 1/4 point, after talking about a number of other scaling relationships, West suggests that it is networks that underpin these relationships and that the existence of the mathematical relations is a product of natural selection (which has optimized the design and selected for the most efficient) and the mathematics of network processes.

At about the 1/3 point, he notes that similar scaling processes occur not only when looking across species, but also when conceptualizing networks made up of the same object. Thus, the same principles that hold for a single tree also hold for the forest as a whole. For example, when looking at all the different trees in a forest, there is a consistent relationship between the diameter of the trunk of a particular type of tree and the number of trees of that type in the forest.

He then turns to a discussion of growth -- based on the idea that incoming metabolized energy serves two functions: a) the maintenance of existing cells and b) the growth of new cells (a rough numerical sketch of this bookkeeping appears below). The first half of the talk, dealing with biological systems, is summarized in the following slide:

The central point is that the relationships are found in biology because natural selection has operated on the systems and selected for those which are optimized. Thus, the question for the second half of the talk becomes whether or not social systems show the same patterns. If they do, then they are likely sustainable. If they don't, problems are likely.
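On the growth point, here is a rough numerical sketch of that energy bookkeeping (a minimal sketch only; the 3/4 exponent and the coefficients are illustrative assumptions, not values taken from the talk):

# Growth as the difference between energy supplied (sublinear in mass) and
# energy spent maintaining existing cells (linear in mass):
#   dm/dt = a * m**0.75 - b * m
# Growth stalls where supply equals maintenance, i.e. at m = (a/b)**4.
def simulate_growth(m0=1.0, a=1.0, b=0.1, dt=0.01, steps=200_000):
    m = m0
    for _ in range(steps):
        m += (a * m ** 0.75 - b * m) * dt
    return m

print(simulate_growth())   # approaches the asymptotic mass
print((1.0 / 0.1) ** 4)    # 10000.0, the predicted asymptote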



This slide, around the half way mark, summarizes the results of his empirical findings -- that social organization scales in terms of three different types of relationship rather than the single one characterizing biological systems.





And here, in contrast to biological processes that scale at a less than linear rate, we see the consequences of those aspects of social process that scale at a greater than linear rate. Thus, where large biological systems run slower than small ones, large social systems operate at a faster pace than small ones.

At this point, around 2/3 of the way through, it gets really interesting. In contrast to interpreting the exponential growth in terms of the standard Limits to Growth argument, West notes that you can 'reset' the initial conditions and, hence, modify the curve, through the process of social innovation. But, drum roll for the long-awaited punchline: to do this and keep the system operating requires it to be 'reset' on a more and more rapid basis. In other words, there is a need not just for new innovation, but for more rapid and more fundamental innovation, as the system grows. This structure, like trying to run on a treadmill that is continuously accelerating and from which you will ultimately fall off, is unsustainable.
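A toy sketch of that accelerating treadmill (the superlinear exponent, the constant, and the 'tenfold growth per innovation cycle' rule are illustrative assumptions, not West's fitted values):

# Superlinear growth dN/dt = c * N**beta with beta > 1 reaches a finite-time
# singularity at t* = N0**(1 - beta) / (c * (beta - 1)). Each 'innovation'
# resets the dynamics, but from a larger N, so the next reset is needed sooner.
def time_to_singularity(n0, c=0.01, beta=1.15):
    return n0 ** (1 - beta) / (c * (beta - 1))

n = 1.0e4
for cycle in range(5):
    print(f"cycle {cycle}: time until the next reset is needed = {time_to_singularity(n):.1f}")
    n *= 10  # assume each innovation cycle supports another tenfold growth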

West closes with a discussion of corporations and the tension between infrastructure (governed by biological-type scaling laws that are less than linear) and innovation (governed by scaling laws that are greater than linear) and why companies, in contrast to cities, tend to die out.

There are lots of interesting parallels between West's argument and those of Tad Homer-Dixon and his notion of the ingenuity gap. Homer-Dixon argues that the fundamental problem we face is that the requirement for ingenuity is rising faster than our ability to supply it and, hence, there exists an ingenuity gap. I've always conceptualized the main driver behind the increased requirement as the increasing complexity, scale and fundamental nature of the problems we face. Or, in panarchy terms, the co-creation of increasingly higher-level (slower, larger scale) adaptive cycles. West provides a slightly different explanation for the increased requirement for ingenuity -- tracing it to the increasingly rapid need to 'reset' the growth curve.

Saturday, November 27, 2010

Big Fight in Marine Science!

Scientific American has an interesting article describing an argument over whether or not a key indicator of biodiversity in fisheries is flawed.
A tenet of modern fisheries science may be unfounded, suggests a study of how catches are affecting marine ecosystems. The finding has sparked a heated debate about how best to measure humanity's impact on the ocean.

A landmark study in 1998 found that we are 'fishing down the food chain' worldwide -- in other words, exhausting stocks of top predators such as cod before switching attention to smaller marine animals. This has since become accepted wisdom. But a study published in Nature today suggests that the indicator on which this claim is based -- 'mean trophic level' or MTL -- is severely flawed.

The authors of the 1998 paper have hit back, with one of them branding the latest research "completely invalid". But Trevor Branch, lead author of the Nature study and a fisheries scientist at the University of Washington, Seattle, stands by the work. "At a global level we are not fishing down," he says. "The results are quite clear on that."

Branch says that fishing down may have occurred in some local areas, for example as happened with cod in the Atlantic. But in other places -- such as the Gulf of Thailand -- fisheries first targeted creatures low in the food chain, such as mussels or prawns, and are now 'fishing up'.

Individuals wanting the bloody details can read the article and/or consult the original paper and comment in Nature, The trophic fingerprint of marine fisheries. As for the policy implications of the work:
Timothy Essington, a fisheries researcher at the University of Washington, Seattle, whose work was cited by Branch, argues that for a detailed picture of what is happening in ecosystems, MTL must now be used in combination with other measures. Pauly's original paper took a "20,000-feet view", he says. "What we're seeing now is the view from that altitude isn't very clear."

Essington compares looking at MTL to a physician taking a patient's temperature. If the temperature changes dramatically it is probably an indication that something is wrong, he says. But if the doctor decided on a treatment based on temperature alone, "you would never go back to that doctor again".

Thursday, November 25, 2010

Spencer Weart on the Climate Wars

In tribute to the anniversary of Climategate, Spencer Weart (author of The Discovery of Global Warming) placed the events into a larger historical context. Here is what he said:

The controversy was one more step in the trends we have seen operating since the mid 20th century. First, the decline in the prestige of all authorities and would-be authoritative organizations. Second, the great expansion of the scientific community coupled with an increasing interdisciplinarity: strengths which brought a weakness in that there were no longer any universally respected spokespeople (like Millikan, Einstein, or even Sagan); it is characteristic that the spokesman for the I.P.C.C., Pachauri, was unknown before he took the position, was not even a scientist, and indeed was accepted for the post by the Bush administration precisely because of these deficiencies. Third, the decline of science journalism; where Walter Sullivan and his like had admired the scientific community and were respected in turn, many of the media people who now attempted to explain science, such as the “weather” reporters on television, scarcely understood what they were dealing with.

These trends had been exacerbated since the 1990s by the fragmentation of media (Internet, talk radio), which promoted counter-scientific beliefs such as fear of vaccines among even educated people, by providing facile elaborations of false arguments and a ceaseless repetition of allegations.

The scientific community — for it was not only the I.P.C.C. but the entire scientific community whose reliability was now called into question — was unprepared for the attacks they now faced. We can easily speculate about the personal and social characteristics that to this day make many scientists unfit for aggressive personal controversy. But it will suffice to point out that unlike, for example, any political organization or business corporation, the I.P.C.C. lacked a well-funded and expert public relations apparatus. Even the universities, notably East Anglia, showed a complete lack of understanding of the basic need to respond promptly with a coherent statement of the full factual history of their problems.

To make matters worse, some scientists, and still more people among environmental and other organizations, made statements not supported by what was reliably known. An example was implicit or explicit claims that hurricanes were increasing as a result of human interference with the climate. There was no way for the general public to know whether scientists actually made such claims, still less whether the claims were made honestly or disingenuously. Thus a single error, such as the obviously wrong claim that Himalayan glaciers would vanish within decades, could be suspected to be a deliberate falsehood.

That said, the media coverage represented a new low. There were plenty of earlier examples of media making an uproar without understanding the science (recall, for example, how the director of the Brookhaven National Laboratory was forced to resign because of a leakage of tritium with a total radioactivity less than that in a theater exit sign, see p. xxx). But this was the first time the media reported that an entire community of scientists had been accused of actual dishonesty. Such claims, if directed for example at a politician on a matter of minor importance, would normally require serious investigation. But even in leading newspapers like The New York Times, critics with a long public record for animosity and exaggeration were quoted as experts. As we know, the repetition of allegations is sufficient to make them stick in the public’s mind, regardless of whether they are later shown (or could easily be shown at the time) to be untrue. Thus one more step was taken toward the disintegration and disasters of the late 21st century. … um, just kidding… I hope…


While Weart does a good job of placing the events in a larger historical context, he still seems to render these processes (e.g., the decline in the authority of scientists, the rise of media promoting 'counter-scientific beliefs') as distinct elements rather than as multiple parts of a basic cultural transformation ... i.e., postmodernism.

Tuesday, November 23, 2010

Coal: Act Locally, Think Globally .... NOT!

Interesting article in the NYTimes on the global trade in coal. The basic point: while developed countries move to limit the use of coal as a fuel for electrical generation in order to reduce emissions, that very same coal is being sold to China. Moreover, where coal was traditionally burned close to where it was mined, it is now shipped thousands of miles -- with an additional cost in emissions. And, to make it even more problematic, as demand has grown so has the price and, hence, so have production and new mines. In short, local action to reduce emissions counts for nothing at the global level because those emissions are merely being relocated.



The graph above shows that the vast majority of countries are either importing less coal or exporting more of it. The one major exception is China, which went from a net exporter to a net importer in the two years between 2007 and 2009.



As the above map shows, there are a large number of countries involved in the trading of coal. The bulk of the shipments to China, however, come from Australia and Indonesia.

Sunday, November 21, 2010

American attitudes toward Climate Change

The Yale Project on Climate Change Communication recently released Americans' Knowledge of Climate Change. In a detailed study focusing on understanding of technical knowledge rather than expressions of opinion, they reached a rather grim assessment of the accuracy of public understanding.

Overall, we found that 63 percent of Americans believe that global warming is happening, but many do not understand why. In this assessment, only 8 percent of Americans have knowledge equivalent to an A or B, 40 percent would receive a C or D, and 52 percent would get an F. The study also found important gaps in knowledge and common misconceptions about climate change and the earth system. These misconceptions lead some people to doubt that global warming is happening or that human activities are a major contributor, to misunderstand the causes and therefore the solutions, and to be unaware of the risks. Thus many Americans lack some of the knowledge needed for informed decision-making in a democratic society. For example, only:

  • 57% know that the greenhouse effect refers to gases in the atmosphere that trap heat;
  • 50% of Americans understand that global warming is caused mostly by human activities;
  • 45% understand that carbon dioxide traps heat from the Earth’s surface;
  • 25% have ever heard of coral bleaching or ocean acidification.

Meanwhile, large majorities incorrectly think that the hole in the ozone layer and aerosol spray cans contribute to global warming, leading many to incorrectly conclude that banning aerosol spray cans or stopping rockets from punching holes in the ozone layer are viable solutions.


But is this something to really be concerned about? These findings aren't tremendously different from other 'tests' of public knowledge, which find, for example, that 48% of American youth can't locate Mississippi on a map of the US while 75% can't locate Israel on a map of the Middle East. Moreover, I don't think it is realistic to expect the general public to be informed about the technical details of every issue. The world is too complex, the mediascape too uncooperative, and there are too many competing pressures on individuals (like paying the mortgage) for an effective governance strategy to be based on a general public knowledgeable about the technical details of highly complex issues.

Stated another way, the public is diverse and only a small segment is interested in understanding the science. But this doesn't stop people from having opinions. And, as Mike Hulme notes in the Guardian, the events of the past year have had a major effect on the ecology of public opinion on climate change:
There has been a re-framing of climate change. The simple linear frame of "here's the consensus science, now let's make climate policy" has lost out to the more ambiguous frame: "What combination of contested political values, diverse human ideals and emergent scientific evidence can drive climate policy?" The events of the past year have finally buried the notion that scientific predictions about future climate change can be certain or precise enough to force global policy-making.

The meta-framing of climate change has therefore moved from being bi-polar – that either the scientific evidence is strong enough for action or else it is too weak for action – to being multi-polar – that narratives of climate change mobilise widely differing values which can't be homogenised through appeals to science. Those actors who have long favoured a linear connection between climate science and climate policy – spanning environmentalists, contrarians and some scientists and politicians – have been forced to rethink. It is clearer today that the battle lines around climate change have to be drawn using the language of politics, values and ethics rather than the one-dimensional language of scientific consensus or lack thereof.

This leads to the second, and to me more interesting, study conducted by the Yale group: The Six Americas Study. This study identifies six different responses among Americans to the politics of climate change, ranging from individuals who are intensely concerned about the issue and motivated to do something to those who are dismissive of the problem and unmotivated to do anything to address the issue. The following video (starting at about the one minute mark) describes the key characteristics of each group.



As shown below, they have tracked the size of the six groups through time. While 18 months isn't a lot of time, there are a couple of interesting and discernible trends. First, there is a general consistency in opinions through time. The relative size of the different groups remains about the same and, in particular, the "concerned" (those who think climate change is happening, but that its effects probably won't be felt for a generation) and the "cautious" (those who wonder whether climate change is real or whether humans are responsible) remain the two largest groups throughout.

Second, the January 2010 survey, taken immediately after the collapse of Copenhagen and in the midst of the 'climategate' scandal, is the profile that differs most from the other two. The media attention during that period seems to have engaged a significant proportion of the 'disengaged' (leading to a decline in their numbers) while at the same time driving up the number of 'dismissives' and reducing the number of 'alarmed' Americans.



Third, the 'doubtful' group appears to have been the least affected by the period of contention in December 2009, as their percentage remains essentially constant throughout. These are people who have an opinion but aren't really engaged ('I don't think it is real, but if it is, it's a natural phenomenon and I don't need to worry about it'). In that sense, they are more isolated and less affected by the dynamics of the debate than the 'disengaged.'

Finally, the net effect of the events of December 2009 (i.e., a comparison of the November 2008 and June 2010 data) shows little overall change in the number of 'doubtful' and 'disengaged'. What we see is a 5% decline in the number of 'alarmed' and 'concerned' and corresponding 5% rises in the number of 'cautious' and 'dismissives'.

Friday, November 19, 2010

Foucault as a complexity theorist

I recently discovered the Sociology and Complexity Science blog (now added to the links).

Among the rather interesting items there were references to two papers exploring the links between Foucault and complexity theory. Here are the references and abstracts.

1) Ken Baskin, "Foucault, Complexity, and Myth: Toward a Complexity-based Approach to Social Evolution (a.k.a. History)." (You can preview the paper by opening the cover in Amazon and going to it--it is the first chapter in the book).



2) Mark Olssen "Foucault as Complexity Theorist: Overcoming the problems of classical philosophical analysis." Educational Philosophy and Theory, Volume 40, Issue 1, pages 96–117, February 2008 DOI: 10.1111/j.1469-5812.2007.00406.x
This article explores the affinities and parallels between Foucault's Nietzschean view of history and models of complexity developed in the physical sciences in the twentieth century. It claims that Foucault's rejection of structuralism and Marxism can be explained as a consequence of his own approach which posits a radical ontology whereby the conception of the totality or whole is reconfigured as an always open, relatively borderless system of infinite interconnections, possibilities and developments. His rejection of Hegelianism, as well as of other enlightenment philosophies, can be understood at one level as a direct response to his rejection of the mechanical atomist, and organicist epistemological world views, based upon a Newtonian conception of a closed universe operating upon the basis of a small number of invariable and universal laws, by which all could be predicted and explained. The idea of a fully determined, closed universe is replaced; and in a way parallel to complexity theories, Foucault's own approach emphasises notions such as self-organisation and dissipative structures; time as an irreversible, existential dimension; a world of finite resources but with infinite possibilities for articulation, or re-investment; and characterised by the principles of openness, indeterminism, unpredictability, and uncertainty. The implications of Foucault's type of approach are then explored in relation to identity, creativity, and the uniqueness of the person. The article suggests that within a complexity theory approach many of the old conundrums concerning determinism and creativity, social constructionism and uniqueness, can be overcome.

Wednesday, November 17, 2010

Documentary: Secret Life of Chaos

Interesting documentary on chaotic systems, particularly the processes of self-organization and emergence. Some really nice visuals illustrating the processes as well.

Monday, November 15, 2010

Energy and Wealth Shifts from OECD to ChinIndia

There are further statements and implications that can be drawn from the IEA's just-published World Energy Outlook 2010. Not only did global oil production peak in 2006, but oil consumption in OECD countries, particularly Europe and Japan, is also in a steady and permanent decline. Oil consumption is rising quickly in developing nations, particularly China and India. China accounts for 57% of new oil demand, most of it for the transport sector. The IEA presents three future scenarios for oil consumption, but it doesn't matter which one you choose: all three predict a permanent decline in oil consumption for OECD countries.



Quote from IEA World Energy Outlook:

"If governments put in place the energy and climate policies to which they have committed themselves, then our analysis suggests that crude oil production has probably already peaked."

Quote from Fatih Birol, chief analyst for IEA, to Reuters News Agency:

"When we look at the OECD countries -- the U.S., Europe and Japan -- I think the level of demand that we have seen in 2006 and 2007, we will never see again."

Sunday, November 14, 2010

IEA: Peak Oil is Passed

The International Energy Agency has stepped into the peak oil debate. Their annual report, World Energy Outlook 2010, concludes that peak oil is not only here but that it actually passed in 2006.

The interesting thing, as shown in the graph below, is the manner in which they diverge from the classic interpretation of peak oil based on Hubbert's analysis. Rather than rendering the 'peak' as the maximum of a broadly symmetric production curve, the IEA forecasts that substantial reserves in fields yet to be developed or discovered will allow global crude oil production to plateau for another 25 years at close to current rates. In other words, they have replaced Hubbert's concept of 'peak' oil with their own analysis, more appropriately termed 'plateau' oil.

Chart from Energy Outlook 2010


This rhetorical shift, admitting that the peak has passed yet concluding that it has no significant implication for the medium-term future of conventional oil production, can best be understood in light of the background presented here. The release of last year's report, which included the graph below showing that production would continue to increase, was met with a storm of controversy when an internal whistleblower indicated that the US had pressured the agency to keep the figures artificially inflated. This year's report appears to be an attempt to square the circle by admitting the peak has passed while claiming it has no significant policy implications.

Corresponding Chart from the Energy Outlook 2009


Equally interesting, though not the focus of much comment, is the substantial reduction in the forecast of production from natural gas liquids. The 2009 report projected a near doubling of NGL production by 2030, while the 2010 report projects comparatively constant production through 2035.

Sunday, November 7, 2010

Intellectual Irony

In a wonderful illustration of intellectual irony, Stewart Brand (visionary creator of the Whole Earth Catalog and founder of the Long Now Foundation) seems to have been hoist by his own petard. The video below gives an example of how he has been rethinking his positions on cities, nuclear power, genetic modification and geo-engineering.



In a curmudgeonly take on the ideologically driven nature of some environmentalists, Brand has written that: "I would like to see an environmental movement that's comfortable noticing when it's wrong and announcing when it's wrong."

I'm fine with that and find some of Brand's views on the above issues compelling. But what's good for the goose is good for the gander, and it turns out Brand is now in a controversy with George Monbiot over the validity of the following passage from his book:
Environmentalists were right to be inspired by marine biologist Rachel Carson's book on pesticides, Silent Spring, but wrong to place DDT in the category of Absolute Evil (which she did not) … In an excess of zeal that Carson did not live to moderate, DDT was banned worldwide, and malaria took off in Africa. Quoted in a 2007 National Geographic article, Robert Gwadz of the National Institutes of Health said: 'The ban on DDT may have killed 20m children.'

It turns out that the 2001 Stockholm Convention, which regulates DDT use worldwide, a) doesn't ban DDT and b) explicitly allows its use to control disease vectors (read: kill the mosquitoes that carry malaria). Monbiot's blog traces a hilarious series of ineffective attempts to contact Brand and get him to address the issue ... sort of a text version of Michael Moore's Roger and Me. Brand even suggests that Monbiot's argument isn't with Brand but with Gwadz (whom Brand quotes)!

Personally, I want science journalism that holds itself to higher standards than the ideological hacks who dominate political blogs and unthinkingly repeat whatever quote they can find that justifies their position. Michele Bachmann can claim that Obama's trip is costing $200 million per day and Fox News can amplify that claim all they want. It doesn't make it true. And pointing to an Indian blogger (or to Gwadz) as the source doesn't free you from responsibility. By not assessing the claim for accuracy -- the explicit criterion Brand has laid down for the environmental movement -- both Brand and Bachmann need to recognize that they (as well as their sources) were wrong.

Saturday, November 6, 2010

Social Futures of London under Climate Change

A show at the Museum of London features photographs of "possible futures" for metro London transformed by climate change. These are very convincing photos and captions that reveal the social and political implications of climate change in a way that few scientists have. Also notice the not-so-subtle allusion to an influx of cultures from the global south as the climate changes, a nod to the UK's current tensions over immigration from the global south and Asia.

The show, called "Postcards from the Future", can be found at the huffingtonpost and at its own blog, London Futures.







Friday, November 5, 2010

No climate legislation? Let's go to court!

Love or hate the US political system, you can't claim it is boring. As anyone who has been watching is aware, the legislative process is currently stuck in perpetual gridlock. But, as in any good adaptive system, alternatives will emerge. And, as shown in the chart to the left, the famed 'balance of powers' appears to be working.

A recent report, Growth of U.S. Climate Change Litigation: Trends and Consequences, documents a fascinating trend in US court filings related to climate change. Following a dip in 2009, when there was an expectation of legislative action, filings have skyrocketed, with the year-end total for 2010 expected to be triple the number of filings in 2009. Significantly, the cases include filings both advancing and undercutting climate change regulations. As noted in the NYTimes coverage, the situation may play out in much the same way as happened with tobacco.
Bruce Kahn, senior investment analyst at DBCCA, says that in an extreme situation, the development of climate lawsuits could come to resemble the famous history of U.S. tobacco litigation. In that case, thousands of separate legal challenges eventually culminated in massive lawsuits, with the courts, and not Congress, eventually determining the character of U.S. tobacco policy.

The difference, says Kahn, is that climate litigation is growing at an alarmingly fast rate.

"The rate of change relative to tobacco is much faster," he said. "The run-up in tobacco cases took several decades before it became a real big class action suit ... whereas the ramp-up in climate change litigation has been much quicker."