A totally new concept from Mike Hulme

Mike Hulme, the British professor, has written a new paper that appeared in the 2011 issue of the journal Osiris. Hulme discusses what he has ferreted out as a new revelation in climate science and climate studies practice. He then claims that this method he sees scientists resorting to – which he calls ‘climate reductionism’ – has ‘deficiencies’.

In seeking to predict a climate-shaped future, the complexity of interactions between climates, environments and societies is reduced and a new variant of climate determinism emerges. I call this ‘climate reductionism’, a form of analysis and prediction in which climate is first extracted from the matrix of interdependencies which shape human life within the physical world. Once isolated, climate is then elevated to the role of dominant predictor variable. I argue in this paper that climate reductionism is a methodology that has become dominant in analyses of present and future environmental change – and that as a methodology it has deficiencies.

So Hulme has found that, in their analyses, climate scientists first extract climate “from the matrix of interdependencies” and then elevate “climate to the role of dominant predictor variable”.

Helpfully enough, Hulme provides examples of the final outcome of this method. These are claims he finds in the media, e.g., ‘every year climate change leaves over 300,000 people dead’ and ‘We predict, on the basis of mid-range climate-warming scenarios for 2050, that 15–37% of species … will be “committed to extinction”’, and so forth.

What does the method itself look like? Hulme again provides an example, an economics paper, as a perfect illustration of “[seeking] out simple chains of climatic cause-and-effect” between climate change and economic growth:

The authors recognise that whether or not climate change has a direct effect on economic development is contentious, but they claim nevertheless that their global analysis using data from over 180 nations reveals a “substantial contemporary causal effect of temperature on aggregate [economic] output … on average, a 1°C increase in average temperature predicts a fall in per-capita income by about 8 per cent”.

By doing this, Hulme says, the “complex relationships that exist between climate and economic performance are first reduced to a dependent relationship between temperature and GDP per capita” and then, “using projections of future climate warming, future economic performance is predicted for the twenty-first century”.
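To see how stark the reduction is, here is a minimal sketch of the chain Hulme describes, written out in code. The 8 per cent per degree figure is the one quoted above; the warming scenarios and the assumption that the effect compounds per degree are my own illustrative choices, not anything taken from the paper.

```python
# Toy illustration of the reductionist chain: one temperature coefficient,
# one warming projection, and a 'prediction' of future income falls out.
# The 8%-per-degree figure is the one quoted above; the compounding form
# and the warming scenarios below are illustrative assumptions.

INCOME_DROP_PER_DEGREE = 0.08  # "a 1 C increase ... predicts a fall ... by about 8 per cent"

def projected_income_factor(warming_c):
    """Fraction of baseline per-capita income remaining after warming_c degrees
    of warming, if the single-variable relationship holds and compounds per degree."""
    return (1.0 - INCOME_DROP_PER_DEGREE) ** warming_c

for warming in (1.0, 2.0, 3.0, 4.0):  # hypothetical twenty-first-century scenarios
    remaining = projected_income_factor(warming)
    print(f"{warming:.0f} C of warming -> per-capita income down {100 * (1 - remaining):.0f}%")
```

Every interdependency other than temperature has vanished from the calculation, which is precisely the reduction being objected to.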

But this is not a new observation at all. Climate sceptics are well familiar with the numerous instances of fear-mongering that Hulme acknowledges exist; they have traced such alarmism back to its original sources and found the same methodology adopted over and over again. ‘Take something good and connect it to climate, run a climate model and make it worse’: this is virtually the touchstone of the alarmist call. The method has myriad variants and has been successfully used in environmental regulation and public health policy formulation. It is the root source of several claims to emerge from the IPCC. Fortunately enough, the IPCC sourced such claims from environmental pressure-group literature, so that aspect could be pointed out to the world to draw attention to the inherent implausibility and fear-mongering present in such claims. But what otherwise?

In instances where critics can bring enormously inconvenient ‘gotchas’ to bear, the consensus establishment reluctantly concedes a few inches of territory; otherwise the use of this methodology is defended to the death (because it yields such precious results).

Consider the case of Amazongate, where a computer-modelling effort on forest fires by Daniel Nepstad and colleagues transmogrified into a statement on dramatic Amazonian vulnerability. The technique adopted in the originating papers is straightforward. A threshold of soil moisture is said to create ‘fire risk’, and climate change produces or contributes to a reduction in precipitation over the Amazon. Vast tracts of the forest then burn away in the models. In Nepstad et al 2004, 40% of the forest is at risk of catching fire. In Nepstad et al 2008, the computer model loses 55% of the entire Amazon in the next 20 years. Or take the more recent Le Page et al 2010, whose models leave only 24% of the Amazon protected from fire owing to the reduction in rainfall. As we know, such extrapolation methodology has been comprehensively defended by scientists studying the Amazon.

How can this method be explicitly characterized for Amazonian literature?

From my comment on this thread:

The approach is similar: a certain diffusely present environmental metric is selected (in this case precipitation) and, by a series of assumptions, linked to the risk of a local factor developing (in this case, fire). Models are then used to obtain projections of the environmental metric (precipitation) into the future. The local event is then projected to increase (usually) proportionally, hand in hand with the global environmental metric. Impressive area-estimate figures for the purported ‘risk’, involving the entire Amazon, are arrived at.
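The pattern can be written down almost mechanically. The sketch below is a deliberately crude caricature of the chain just described, not anyone's actual model: the threshold, the regional rainfall figures, and the uniform model-projected declines are all hypothetical numbers chosen only to exhibit the structure of metric, threshold, projection, and whole-Amazon percentage.

```python
# Schematic caricature of the pipeline described above. Every number is
# hypothetical; the point is the structure, not the values: a single
# environmental metric is projected downward by a model, a threshold converts
# it into a local 'risk', and the risk is scaled up to the whole Amazon.

TOTAL_AMAZON_KM2 = 5_500_000    # rough forest area, for scale only
FIRE_RISK_THRESHOLD_MM = 1500   # hypothetical annual rainfall below which 'fire risk' is assumed

def fraction_at_risk(rainfall_by_region_mm, projected_rainfall_decline):
    """Fraction of regions pushed below the fire-risk threshold once the
    model-projected rainfall decline is applied uniformly."""
    at_risk = [r * (1.0 - projected_rainfall_decline) < FIRE_RISK_THRESHOLD_MM
               for r in rainfall_by_region_mm]
    return sum(at_risk) / len(at_risk)

# Hypothetical present-day rainfall for ten equal-sized regions (mm/year)
rainfall = [2400, 2200, 2000, 1900, 1800, 1750, 1700, 1650, 1600, 1550]

for decline in (0.10, 0.20, 0.30):  # hypothetical model-projected rainfall declines
    frac = fraction_at_risk(rainfall, decline)
    print(f"{decline:.0%} rainfall decline -> "
          f"{frac:.0%} of the forest ({frac * TOTAL_AMAZON_KM2:,.0f} km2) 'at risk'")
```

The headline output has exactly the form described: a percentage, a model-projected change, and the entire Amazon.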

From my comment:

The way quantitative claims about the Amazon, and their links to global warming, are derived pretty much follows the same pattern as has occurred in this thread.

Initially, an area estimate is derived – either from modeling or from expert opinion – indicating some damage affecting the entire Amazon. Properly considered, this estimate is meaningful *only given the complete suite of assumptions* that go into deriving it.

Then, climate change is shown to be capable of potentially causing, usually by computer modeling, the worst/highest/most severe of the causal factor(s). Thirdly, the above two things are clubbed together to derive high area estimates of forest ‘vulnerability’.

The above method – derivation of results by daisy-chaining and stacking assumptions – naturally implies that conclusions so arrived at are heavily tied down by qualifiers, and are therefore weak.

However, such results are then transmitted onward in press releases, interviews, and blog posts. At each step, the crucial context – the qualifying assumptions and conditions that constrain such conclusions and make them meaningful – is stripped away. This effectively de-contextualizes the scientific claim. At this stage, the causal link between ‘climate change’ and the proposed ‘vulnerability’ becomes super-strong.

What is left is a percentage figure, the words ‘climate change’, and the Amazon.

I would say these formulations are very similar to what Hulme claims to have divined. The bulk of Working Group II's high-impact, high-profile modelling publications would fall squarely into this category, and Hulme identifies this himself. The insight provided by Hulme in his paper is not new, and climate sceptics recognize it in its various forms.

Why does it happen, though? Hulme blames the “hegemony exerted by the predictive natural sciences”, which allows such ‘climate reductionism’ to emerge. I quote:

I suggest that the hegemony exerted by the predictive natural sciences over human attempts to understand the unfolding future, opens up the spaces for climate reductionism to emerge. It is a hegemony manifest in the pivotal role held by climate (and related) modelling in shaping climate change discourses.

This is backward in several important respects. There are strong systemic forces that strengthen predictive activity of this kind once it takes hold. Specifically in the case of climate change, a persistent and expansile international technocrat-activist class fuels the political and funding process. Predictive activity (i.e., modelling) capable of producing useful results (as opposed to no-catastrophe or world-will-end-tomorrow results) is selected for survival in this milieu. Modelling that predicts undefined disaster safely into the near future, whose putative outcomes are potentially modifiable by legislative regulation of substances, is encouraged. In other words, the ‘hegemony exerted by predictive natural sciences’ coincides perfectly with spikes of political activity that find such predictive science an attractive handmaiden. Once these driving forces are in place, the mere presence of an enormous number of variables in large, complex systems such as the climate makes the emergence of such climate reductionism almost inevitable.

Climate scientists first extract climate “from the matrix of interdependencies” and then elevate “climate to the role of dominant predictor variable” not just because some ‘hegemony’ made them do it, but because people and entities with power benefit from their doing so.


One comment

  1. John Shade

    Good stuff. I like your capturing of a pattern used to generate alarm via computer models, and of the technocrat-activists benefiting ‘people and entities with power’.

    Your phrase ‘Take something good and connect it to climate, run a climate model and make it worse’ also works well, possibly better, as ‘Take something bad and connect it to climate, run a climate model and make it worse’.