Lewandowsky et al 2013: surveying Peter to report on Paul

In 2012, Stephan Lewandowsky and co-authors submitted a paper to the journal Psychological Science, generating widespread publicity. Here, I address a simple question that has hovered around the paper since it first appeared. The issue is at the heart of Lewandowsky’s first ‘Moon Hoax’ paper and of the in-limbo second paper in Frontiers in Psychology.

The ‘Moon Hoax’ paper (a.k.a. LOG12, LOG13, etc.) draws a number of conclusions about climate skeptics (whom it calls ‘deniers’). A major portion of its data and analysis is devoted to ‘rejection of climate science’, and the paper’s title advertises its findings about ‘deniers’.

So the question is: how did Lewandowsky and co-authors study climate skeptics?

The paper draft (pdf) stated simply that the authors ‘approached’ 5 skeptic blogs to post a survey, but ‘none did’. This led to a hunt to find out who exactly these bloggers were (Lewandowsky wouldn’t tell). Lewandowsky spread a significant amount of distraction and smoke on the matter, raising a hue and cry that he did email skeptical bloggers:

First out of the gate was the accusation that I might not have contacted the 5 “skeptic” bloggers, none of whom posted links to my survey. Astute readers might wonder why I would mention this in the Method section, if I hadn’t contacted anyone.

What matters, however, is not whether Lewandowsky contacted skeptics but what came of such contact. The whole point of contacting the bloggers was to get the survey posted on their websites to ensure skeptic participation. This never took place. Through the noise, the question of the non-sampling of skeptics remained unresolved‡.

By way of an answer, the paper itself appeared in final form about a month ago. On examination, the authors appear to have settled on a remarkable method of addressing the defect. In the supplementary information, Lewandowsky et al (LOG13) make a startling claim: the blogs that did carry their survey have a broad readership, ‘as evidenced by the comment streams’:

All of the blogs that carried the link to the survey broadly endorsed the scientific consensus on climate change. As evidenced by the comment streams, however, their readership was broad and encompassed a wide range of view on climate change.

The authors claim to have analysed reader comments at one venue to determine this. They state:

To illustrate, a content analysis of 1067 comments from unique visitors to http://www.skepticalscience.com, conducted by the proprietor of the blog, revealed that around 20% (N = 222) held clearly “skeptical” views, with the remainder (N = 845) endorsing the scientific consensus.

Extrapolating, the authors further infer that close to eighty thousand skeptics saw the survey on Skepticalscience alone (see below). Owing to such broad readership, enough skeptics are said to have been exposed to the survey.

Readers of climate blogs will at once see several things that are off. These, however, are the assertions on which Lewandowsky et al 2013 rests.

Analysis

To start, the authors’ premises are accepted: that comment streams can be analysed to determine whether a blog has a broad readership, or a more polarized one.

Comments on six blogs where Lewandowsky et al’s survey was posted were analysed. Commenter names and comment counts were obtained from the web pages using R scripts (a sketch of the approach is given below). Following the authors’ method, this was carried out for the entire month the survey was posted. For each blog, duplicate names were removed.
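The scraping step itself is simple. A minimal sketch of the kind of R script involved is below; the actual scripts were not published, so the XPath selector is an assumption for a typical WordPress comment stream and would need adjusting for each blog:

    # Sketch only: pull commenter names from one post's comment stream.
    # The XPath is an assumed pattern for WordPress themes; each blog
    # (Scienceblogs, wordpress.com, etc.) needs its own selector.
    library(XML)

    get_commenters <- function(post_url) {
      doc <- htmlParse(post_url)
      xpathSApply(doc,
                  "//*[contains(@class,'comment-author')]//*[contains(@class,'fn')]",
                  xmlValue)
    }

    # Tally comments per alias over all posts in the survey month,
    # then de-duplicate to get the commenter list for a blog.
    # 'post_urls' is an assumed vector of post URLs for that month.
    all_names  <- unlist(lapply(post_urls, get_commenters))
    counts     <- sort(table(all_names), decreasing = TRUE)  # comments per commenter
    commenters <- names(counts)                              # unique aliases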

Commenters were classified as (a) skeptic, (b) ‘warmist’, (c) ‘non-skeptic’, (d) lukewarmer, (e) neutral, or (f) indeterminate. Regulars whose orientations are familiar (e.g., dana1981 – ‘warmist’) were tagged first. Those with insufficient information to classify, and infrequent posters with singleton comments, were tagged ‘indeterminate’†.
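To illustrate the tagging step (the lookup table here is hypothetical apart from the dana1981 example mentioned above):

    # Known regulars are tagged from a hand-built lookup; everyone else
    # starts as 'indeterminate' pending a search of their comment history
    known <- c(dana1981 = "warmist")   # real example from the text; rest of table hand-built
    classify <- function(alias) {
      if (alias %in% names(known)) known[[alias]] else "indeterminate"
    }
    tags <- sapply(commenters, classify)
    table(tags)   # commenter counts per category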

The results are presented below. A total of 614 commenters contributed 4976 comments to the six blogs in the month the survey was posted (range: 2–2387 comments per blog). An estimated 111 commenters posted at more than one blog, with 504 unique commenter aliases across all blogs.

The results show a skewed commenter profile. In all, there are 59 skeptical commenters, amounting to about 9.6% of the total. Across individual blogs, skeptics range from 5–11% of commenters, with one venue (Hot Topic) showing 19% skeptics; closer examination shows this figure to be made up of just 10 commenters. Non-skeptics are close to 80%, i.e., 480 of 614. Neutral posters are 9%, and indeterminate ones 3%. Of the 59 skeptics, more than half come from comments posted at one blog (Deltoid).
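As a quick check, the headline percentages follow directly from the stated totals:

    # Headline proportions from the stated totals
    round(100 * 59  / 614, 1)   # skeptics: 9.6%
    round(100 * 480 / 614, 1)   # non-skeptics: 78.2%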

[Figure: commenter and comment counts by category]

The same pattern can be seen repeating blog by blog:

[Figure: commenter category breakdown by blog]

The marked differences in comment numbers between the blogs obscure underlying similarities. When the counts are converted to within-blog proportions, these similarities become plain (a short R illustration follows the figures):

[Figures: commenter category proportions by blog (percentages; spline plot)]
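For reference, the normalization is a one-line operation in R on a blogs × categories table of commenter counts (`tab` is a hypothetical name for that table):

    # Convert per-blog counts to within-blog percentages so blogs of
    # very different sizes can be compared on the same footing
    props <- prop.table(tab, margin = 1) * 100   # margin = 1: row-wise (per blog)
    round(props, 1)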

From the data above it is evident that these blogs are not places where readership is “broad” or encompasses a wide range of views on climate. On the contrary, these are highly polarized, partisan blogs serving their cliques. Half of the blogs hosted comments from a total of just 6 skeptical commenters (Scott Mandia’s blog, A Few Things Ill Considered, and Bickmore’s Climate Asylum).

The non-surveyed Skepticalscience.com

What about Skepticalscience’s comment stream? Lewandowsky et al state that John Cook analysed 1067 comments at his website, identified 222 skeptics, and they used this to buttress their claims of broad readership at the survey blogs. One wonders how Cook got these fantastic figures! When commenters for Sept 2010 are analysed, there are 36 skeptical voices out of a total of 286. Cook’s estimate is inflated six times over: in reality skeptics form 12.58% of commenters for that month, and a mere 0.03 fraction of John Cook’s 1067 unique commenters. These results are corroborated by an independent analysis performed by A.Scott.
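The arithmetic is easy to reproduce:

    # Checking the Skepticalscience figures against the paper's claim
    222 / 1067   # Cook's claimed skeptic share: ~0.208 (the "around 20%")
    36  / 286    # skeptic share found for Sept 2010: ~0.126 (12.58%)
    222 / 36     # inflation factor: ~6.2x
    36  / 1067   # skeptics as a fraction of Cook's 1067: ~0.03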

Furthermore, close to 90% of commenting readers are not skeptics. Contrary to Lewandowsky et al, Skepticalscience is not a place where readership is “broad” and encompasses a “wide range of view on climate”. In fact, Skepticalscience exactly matches Deltoid, a virulently anti-skeptic website, in commenter profile.

[Figure: commenter profiles compared: Skepticalscience and Deltoid]

Importantly, however, John Cook never posted the survey at Skepticalscience (see here and here). In the face of this false claim, the authors’ post-hoc exercise of computing skeptic exposure becomes counterfeit.

What would the picture have been had Lewandowsky et al actually obtained survey exposure with a skeptical audience? As a comparative exercise, I pulled comment counts from the widely read skeptical blogs Wattsupwiththat, Bishop Hill, Joanne Nova and Climate Audit for the same period. Traffic figures provided by Anthony Watts indicate close to 3 million visits in August 2010. The results ought to be eye-opening:

[Figure: comment counts at the skeptical blogs for the same period]

Conclusion

A number of things can now be confirmed. The authors of Lewandowsky et al 2013 did not survey skeptical blogs. The websites that carried the survey neither have a broad readership nor represent skeptical readers and commenters. The authors did not survey any readers at the website Skepticalscience, but present their data and findings as though they did. Lastly, the authors’ calculations of survey exposure, which they base on the same Skepticalscience, are shown to be wrong.

Given the above, the conclusions Lewandowsky et al draw about skeptics, from sampling a population of readers and commenters who are not skeptics, can be termed invalid. At best, the study’s skeptic-related analysis is meaningless, arising from non-representative sampling. At worst, there is the possibility of false conclusions owing to flawed survey exposure. The above data, combined with the Lewandowsky et al 2013 survey results, in fact show one possible outcome of displaying loaded questions about climate skeptics to a non-skeptical audience. Conclusions about non-skeptical ‘pro-science’ commenters and their psychology would probably be more appropriate.

Notes:

‡ The list of surveyed blogs (from Lewandowsky et al 2013 SI):

Skepticalscience – http://www.skepticalscience.com
Tamino’s Open Mind – http://tamino.wordpress.com
Climate Asylum – http://bbickmore.wordpress.com
Climate Change Task Force – http://www.trunity.net/uuuno/blogs/
A few things ill considered – http://scienceblogs.com/illconsidered/
Global Warming: Man or Myth? – http://profmandia.wordpress.com/
Deltoid – http://scienceblogs.com/deltoid/
Hot Topic – http://hot-topic.co.nz/

Note that (a) there is no record of Skepticalscience having posted the survey, and (b) the Climate Change Task Force entry is available on the Wayback Machine (e.g., here).

† Batch Google searches (e.g., http://google.siliconglobe.co.uk/) and keyword searches on scraped HTML blog posts were used to search for commenter output. Multiple searches were frequently required before a commenter could be satisfactorily classified. Wherever possible (which was the case in almost all instances), results from August and September 2010 were used. Comments supportive of the consensus, critical of ‘deniers’ and ‘skeptics’, and/or unequivocally appreciative of the article (e.g., “great post, now I can use this in my arguments with deniers”) were classified as coming from ‘warmists’. Comments approving of the main thrust of a ‘warmist’ blog post, but with no further information available, were classified as ‘ns’ – not skeptic. Commenters questioning the basic premises of a blog post, or being addressed as ‘denier’, ‘denial’, etc., whose stance could be verified by similar behaviour in other threads, were classified as ‘skeptics’; in most instances they were easily recognized. Those for whom no determination could be made, owing to various factors, were classified as ‘indeterminate’. Commenters explicitly professing acceptance of the consensus but posing relatively minor questions were classified as lukewarmers. Classification required reading at least two different comments for almost every commenter, except in instances where commenter orientation was known from prior experience. There will certainly be errors to a degree, and subjectivity is involved. It is unavoidable that infrequent (and singleton) commenters, and those with non-unique names (‘tom’, ‘john’), are resistant to classification. Validation of the method was provided when blogger A.Scott arrived at similar results working independently on portions of the data.
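A minimal sketch of the keyword-search step over scraped pages (the directory layout and the alias searched for are hypothetical):

    # Find which scraped pages contain a given commenter alias
    pages <- list.files("scraped_html", pattern = "\\.html$", full.names = TRUE)
    found <- sapply(pages, function(f) {
      any(grepl("dana1981", readLines(f, warn = FALSE), fixed = TRUE))
    })
    names(found)[found]   # pages where the alias appears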

This article was published at WUWT.
