Understanding Climate Risk

Science, policy and decision-making

Cutting, pasting and throwing grad students under the bus

Most of the so-called scholarship purporting to disprove human-induced climate change has failed epically. Recent events give a perfect example of why this is. However, reactions to the fall-out show that some of the so-called ‘good guys’ also need to take a good, hard look at themselves. Science without standards ain’t science.

The current issue of Nature has an editorial covering the retraction of a paper co-authored by Yasmin Said and Ed Wegman that contained material from the 2006 Wegman Report. The retraction follows the discovery, by bloggers Deep Climate and John Mashey, of high levels of unattributed material in the paper.

One main criticism of mainstream climate science is the level of collaboration between scientists and their role with the Intergovernmental Panel on Climate Change (IPCC). This is portrayed as a conspiracy where the peer review system is abused by scientists giving each other favourable reviews and shutting out the opposition. This supposed lack of ethics in mainstream science is one of the main themes being used to foster doubt and confusion amongst the public. It wasn’t helped by the stolen CRU emails. However, if the contrarian view is to be supported by a respectable scholarly literature, shouldn’t the same high standards apply?

The controversy goes back to the ‘hockey stick’ of reconstructed northern hemisphere (NH) temperatures, which showed that late 20th-century temperatures were warmer than those of the medieval warm period, itself largely a NH phenomenon. Wikipedia has a detailed account of the controversy.

This part of the story goes back to 2005–6, when Representatives Joe Barton and Ed Whitfield (US Republicans) of the House Committee on Energy and Commerce commissioned Ed Wegman to investigate the statistics of the Mann, Bradley and Hughes 600-year reconstruction published in Nature in 1998 (MBH98). This followed the publication of criticisms of MBH98’s statistical analysis by McIntyre and McKitrick in the pseudo-science journal Energy and Environment. M&M later produced hockey-stick re-analyses in Geophysical Research Letters in 2005, which were later shown to be missing a key step. For the House Committee report, Ed Wegman, of George Mason University, teamed up with his former grad student Yasmin Said and David Scott of Rice University.

The report had two parts. The first summarised the science of palaeo-reconstruction and reviewed the statistics behind the hockey-stick results. It concluded that the hockey stick was an artefact of the statistical analysis used to reconstruct temperature, a conclusion later shown to be wrong. The report did contain a legitimate criticism of aspects of the statistics in the MBH98 and MBH99 papers, and this was backed up by a National Academy of Sciences (NAS) assessment of the hockey stick. But the NAS and Wegman et al. (2006) came to different conclusions: the NAS report and other scientific studies found that the scientific conclusions remained valid (with some differences in interpretation), while Wegman et al. said they did not.

The other part of the Wegman Report was a social network analysis of the publications of the authors involved in palaeoclimate reconstructions, centring on Michael Mann. It showed a great deal of collaboration and cross-collaboration; prima facie evidence, of course, for a conspiracy. Two online sleuths, Deep Climate (DC) and John Mashey, forensically examined the Wegman Report after DC recognised that some of the text bore close similarity to a Ray Bradley text on tree-ring analysis. DC and Mashey then analysed the entire report, finding passages from other texts, Wikipedia and various online sources. DC also concluded that the statistical results were cherry-picked. Rather than the statistics showing frequent random hockey sticks, as was alleged, the results were taken from 100 of 10,000 in an archive, developed using a technique that had already been shown to be faulty. So Wegman et al.’s conclusion about the MBH98 and MBH99 statistics was a stitch-up.
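The cherry-picking point is easy to demonstrate in a toy simulation: generate enough random persistent (‘red’) noise series, score each one for hockey-stick shape, and keep only the top scorers, and you can manufacture hockey sticks out of pure noise. The sketch below is illustrative only – the index definition, AR(1) parameter and sample sizes are my assumptions for the demonstration, not the actual procedure at issue:

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1_series(n, rho, size):
    """Generate `size` AR(1) (red-noise) series of length `n`."""
    x = np.zeros((size, n))
    eps = rng.normal(size=(size, n))
    for t in range(1, n):
        x[:, t] = rho * x[:, t - 1] + eps[:, t]
    return x

def hockey_stick_index(x, blade=50):
    """How far the final 'blade' departs from the 'shaft', in shaft std units."""
    shaft = x[:, :-blade]
    return (x[:, -blade:].mean(axis=1) - shaft.mean(axis=1)) / shaft.std(axis=1)

series = ar1_series(n=600, rho=0.9, size=10_000)  # 10,000 random 600-'year' series
hsi = hockey_stick_index(series)

# Keep only the 100 most hockey-stick-shaped series of the 10,000.
top100 = np.argsort(np.abs(hsi))[-100:]
print(f"mean |index|, all: {np.abs(hsi).mean():.2f}, "
      f"top 100: {np.abs(hsi[top100]).mean():.2f}")
```

By construction, the retained 100 series are the most hockey-stick-shaped of the lot; exhibiting only those says nothing about how often such shapes actually arise by chance.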

In March 2010, Ray Bradley complained of plagiarism to George Mason University and to Rice University, where Scott was based. Rice very quickly dealt with the complaint, clearing Scott and referring responsibility back to Wegman. The GMU investigation is still unresolved.

The investigation then turned to the social network analysis. Lo and behold! More cut and paste, more Wikipedia and more unattributed sources.

The social network analysis was then written up as a paper: Said, Wegman, Sharabati and Rigsby, ‘Social Networks of Author–Coauthor Relationships’, published in January 2008 in the journal Computational Statistics and Data Analysis. It was obviously an excellent paper, because the editor, Stanley Azen, turned the review around in five days. When USA Today picked up the plagiarism story, it asked experts to comment on the quality of the paper. Network analysis expert Kathleen Carley of Carnegie Mellon said it read like an opinion piece and would have needed a major revision to be publishable. USA Today also reported that the paper was to be retracted on the advice of the journal’s legal team.

The paper hasn’t yet been retracted. There is no retraction notice on the journal site and the paper is still available online. Stanley Azen, the journal’s current editor, who approved the paper in the first place, suggested to Dan Vergano of USA Today that:

he must have overseen an earlier, more extensive review of the paper involving outside reviewers. But he says he has no records of this earlier review, because his records were destroyed in an office move. “I would never have done just a personal review,” he says.

External reviews in five days? Remarkable. I’m lucky to get a response to a review request in five days in my role as Associate Editor for Atmospheric Science Letters.

But wait, there’s more. Wegman claims that any plagiarism was accidental and that the social networking material had been put together by a grad student in any case. That material was collated for the original Wegman Report. Whoops! One grad student thrown under the bus.

Except said former grad student refuses to be roadkill. Dan Vergano asked Denise Reeves, the person involved, and received the following response:

I was Dr. Wegman’s graduate student when I provided him with the overview of social network analysis, at his request. My draft overview was later incorporated by Dr. Wegman and his coauthors into the 2006 report. I was not an author of the report.

The format of the 2006 report involved a limited amount of citations. The social network material that I provided to Dr. Wegman followed the format of the report.

This habit of using grad students as lackeys and then taking on authorship is a particularly odious aspect of academia. By placing oneself as author, the academic should take full responsibility for whatever is written. Wegman has argued that the scholarly standards expected of a congressional committee report are lower than those of refereed work. Why, then, did the same material make its way into a refereed paper that apparently received less than full scrutiny in review?

John Nielsen-Gammon is another scientist who has blogged on the issue. He asked Gerald North and Tom Crowley what they thought. Both researchers took part in the NAS study on the hockey stick and testified in the same committee hearings as Wegman following the release of the NAS and Wegman studies. I used to admire both these guys until I read their responses. North said:

 While I cannot excuse the academic crime of plagiarism, I do feel somewhat sad that this episode has reached this stage. I think Wegman is a well meaning person who was a victim of plagiarism by a foreign student who probably did not understand this ‘strange’ American custom.

Could this be a ‘gotcha’ for ClimateGate? Institutions cannot take this kind of heat without throwing someone under the bus. I hope George Mason University can take it.

So North throws an unnamed ‘foreign’ student under the bus in sympathy. What is it about authorship and responsibility that we’re missing here?

Crowley said:

Now you will see the ugly side of the pro-warming faction – I too met Wegman, talked to him a bit, and had dinner with him and some others afterwards. I liked him.

I do not agree with some of his climate conclusions (cannot vouch for statistics), but after meeting him I just do not believe he is dishonest – and such a fate could have happened to a lot of us (“there but for the grace of God go I”…..).

Many, many professors on both sides of the climate fence need to be wary of the dangers of overly trusting a student – esp. an Asian student, as they seem more unaware of the seriousness of the issue (probably because their own culture is most different than ours).

Oh bugger, now we’re throwing Asian students under the bus because they don’t understand the quaint American custom of academic integrity. Is that careless Asian student Sharabati, the third author of the Said et al. paper, or someone else? Not satisfied with throwing grad students under buses, we need to engage in some casual racism as well. Old white guys have to stick together, you know. Except now John N-G is probably off the Christmas card list. Good on that man.

Let’s get this clear. If you are the author of an academic paper, you are responsible for what is in the paper. Not the grad student, the janitor, or the next door neighbour’s cat. It is the person or persons who have signed the publisher’s release who carry the can. No ifs, no buts, no maybes.

This whole episode does say something about social networks. North, Crowley and Wegman survived a committee grilling together. They broke bread together. That seems to be a stronger bond than the obligations that senior academics perhaps should have to junior academics.

Some commentators have suggested the Said et al. paper is self-refuting, but it is quite the opposite. Social networks are very strong and can become aligned with communities of practice. Without strong ethical and professional reinforcement, these practices may develop lax standards and self-referential bias.

In the broad sense, the networks of authorship that comprise the Mann et al. grouping and the Wegman-Said grouping probably aren’t all that different. What may be different is the level of scrutiny applied in peer review, the ethical diligence of the authors, and the ability of independent review to confirm, falsify or amend earlier findings. In the case of the hockey stick, a number of independent studies have now shown the result to be robust, even though the original method contained flaws.

The same can’t be said of the Wegman-Said group, whose output is essentially a text-recycling exercise. For example, a recent WIREs Computational Statistics overview article on colour theory and design by Wegman and Yasmin Said is full of unattributed decade-old material from various websites.

The institutional framework of good scholarship is no accident. It is underpinned by professional and ethical standards. These standards are not absolute; they have been developed by trial and error and are continually evolving. Every university has a written version as do Academies and many professional societies. If these standards are not observed, poor scholarship is the result. This isn’t an amateur vs professional thing – anyone can produce good science if they apply the scientific method within an ethical framework. If there is to be a debate about climate science, or aspects of it, the rules should apply to all participants.

Update: DC’s correction & edits for clarity


Written by Roger Jones

May 28, 2011 at 3:00 pm

7 Responses

  1. Hello Roger,

    A couple of slight corrections.

I was the one who first found the plagiarism in the Wegman Report in December 2009, eventually detailing 10 pages of background material (section 2) that contained unattributed material strikingly similar to its antecedents. This led (unbeknownst to me) to Raymond Bradley’s complaint to George Mason University. John Mashey later found another 25 problematic pages in the supplementary “summaries” of so-called “important papers”.

    Also it’s Yasmin (not Jasmin) Said.

    I don’t necessarily concur with each and every one of your observations, but I do agree that the same rules should apply to all. Sauce for the goose and all that …

    Deep Climate

    May 28, 2011 at 5:06 pm

  2. Thanks DC,

corrections made. I think your work is especially valuable in making the playing field a bit more even. John Mashey’s work has also been really useful.

I actually don’t mind contrarian science, being pretty contrary myself at times, but there are standards. I don’t think that Wegman et al. knew quite what they were getting into. Those of us who work on politically charged science know that one mistake in an IPCC report makes headlines, while error-ridden not-the-IPCC reports pass with little comment. The sources they rely on often have even less scrutiny.

    Roger Jones

    May 28, 2011 at 6:11 pm

3. Yes, generally good comments, and I’m glad you fixed the attribution: none of this would have happened without DC’s finds.

    Note, since I might have started “self-refuting”, I disagree with “Some commentators have suggested the Said et al. paper is self-refuting but it is quite the opposite.”

    See p.150 of SSWR.

    They made claims that there can be abuse of peer review, which I certainly don’t disagree with.

    The self-refuting tag was really applied to the idea that the entrepreneurial networks were more prone to abuse than mentor (Wegman-like) networks, for which they gave zero evidence, and the low quality of the paper (done by Wegman + 3 of his students) showed exactly the opposite. Also, I noted that Said had become an Associate Editor of CSDA fairly quickly.

    John Mashey

    May 31, 2011 at 7:22 am

    • Aah. Thanks.

Re the self-refuting thing: I hadn’t picked up on that – didn’t follow their argument that closely because I found it risible. I agree with that totally, then.

What were they thinking? They obviously believed their argument despite the fact that their own conduct contradicted their conclusions. I had noticed Said’s rise but not mentioned it – it looks very much like a patronage system one would find in Italy or the Philippines.

      I haven’t found this in the more physical sciences – not too much anyway. There is one group I can think of that award each other a lot at conferences but the peer reviewed stuff is of good quality.

      I work in a centre for economics research and have heard some stories in that discipline that make my hair (the remaining one) uncurl.

      I would like to do a companion post on the rigours of good peer review, but have a pile of tasks ahead. Recent protests made because inexperienced contrarian authors run afoul of the review system seem to have a common thread. Most people are unaware of the rigours of review. Even good friends slice and dice each other if the work demands it. It is one key area where mentoring really matters.

      Roger Jones

      May 31, 2011 at 8:11 am

  4. What were they thinking?
    I think Wegman believes statisticians can jump into a different field and quickly do credible work, but really, I think the conclusion was mandated from the beginning, and they wanted to get a peer-reviewed paper that said so.

    See the history in STaE, Appendix B.3 (last few lines), which clarifies some connections discussed in SSWR, pp.185-186.

I.e., I speculate that Jerry Coffey discussed this with Wegman, using McIntyre+McKitrick’s May 11, 2005 presentation for the George Marshall Institute, and I think Wegman got the idea to put in a fancy SNA facade to motivate the “paleo peer review is bad” meme that MM & thinktanks had been pushing.
    Put another way, I’d guess that this idea was rolling early, like Sept/October 2005, not an outcome of their research, and at some point, hopefully people will get asked under oath when certain things happened.

    John Mashey

    June 1, 2011 at 12:58 pm

    • That’s my view also, but I gained that pretty much from what you and DC had uncovered. The whole episode was developed to prevent any US and Canadian entry into the Kyoto Protocol, which every player named so far seems to think was a “bad idea”. The MO was to discredit the science in the manner that Oreskes and Conway document in the Merchants of Doubt.

The statistician-as-instant-expert came up at the Greenhouse 2011 conference in April. After a presentation by Scott Power, where he showed a decadal line of best fit and correlation for decadal variability in the Pacific, a statistician rose in question time, announced his prestigious UK school, declared that good statistics was vital for good science, and attempted a take-down of the presentation on the grounds that one can’t correlate smoothed data. Scott responded that the smoothing was for communication and ran through their statistical validation criteria – 1, 2, 3, 4 – finishing off with the observation that the questioner could have asked what procedure was used before declaring it in error.

      Statistics can’t be good statistics without good theory.
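For what it’s worth, the statistical point at issue – that correlating smoothed series is hazardous – is real enough: smoothing induces autocorrelation, shrinking the effective sample size, so apparent correlations between even unrelated series grow. A minimal simulation illustrates it (the window length and series lengths are arbitrary choices of mine, not anything from the exchange above):

```python
import numpy as np

rng = np.random.default_rng(42)

def smooth(x, window):
    """Simple running-mean smoother."""
    return np.convolve(x, np.ones(window) / window, mode="valid")

# Correlate pairs of completely independent white-noise series,
# before and after smoothing, over many trials.
raw, smoothed = [], []
for _ in range(200):
    a = rng.normal(size=500)
    b = rng.normal(size=500)
    raw.append(abs(np.corrcoef(a, b)[0, 1]))
    smoothed.append(abs(np.corrcoef(smooth(a, 30), smooth(b, 30))[0, 1]))

# The smoothed pairs show systematically larger |r|, despite the
# underlying series being independent by construction.
print(f"mean |r| raw: {np.mean(raw):.3f}, smoothed: {np.mean(smoothed):.3f}")
```

Which is exactly why validation criteria of the kind Scott described, rather than a naive correlation on the smoothed curves, are the right response.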

      Roger Jones

      June 1, 2011 at 1:44 pm
