Via Prometheus I find von S's testimony on the Hockey Stick and related issues. Interesting point number 1 is that von S has clearly noticed he is being used (or selectively quoted) by the septics, and so starts his testimony with "Based on the scientific evidence, I am convinced that we are facing anthropogenic climate change brought about by the emission of greenhouse gases into the atmosphere." Maybe that will be enough to stop too many septics pointing to it.
We also have "I conclude that the claim of 'detection of anthropogenic climate change' is valid independently of which historical temperature reconstruction one chooses to believe in" and "It should also been taken notice that the claims of successful detection on non-natural warming trends and its attribution to chiefly elevated greenhouse gas concentrations in the atmosphere in the Third Assessment report were not based on the historical reconstructions but on the analysis of the instrumental temperature record as well as on numerical experiments with climate models." Which is what RC has been saying for a while but no-one listens 😦
But it's not all good news for MBH…
So what's wrong with MBH? von S says "The Wegman-report claims that a major problem in studies such as MBH would be an insufficient engagement by mainstream statisticians. I think a major problem with this study and its transformation into a policy-relevant issue is an insufficient comprehension of the social dynamics of the post-normal process of (not only) climate science." Um. Whenever I hear "post normal" I think "Sokal"; I'm not entirely sure what von S means by this, but if it's his leading criticism of MBH, then they can be very happy, because that's effectively a criticism of the whole process, and of how the study was used, not of them.
So what else? If you're interested, read the report; I can only pick out a few bits: "Nevertheless, attempts like those by MBH are useful and should be explored. They may provide useful estimates. The problem with MBH was that the result was presented by the IPCC and others in a manner so that one could believe a realistic description of historical temperature variations had successfully been achieved. The NRC report published in June 2006 has made clear that such a belief was incorrect." Whoompf. More a crit of the TAR than MBH, though.
Then there is "the danger is that a few scholars may become powerful gatekeepers, for example as reviewers who are regularly called upon or as editors of scientific journals. The primary goal of such gatekeepers is to fend off publications which may contradict their own thinking… Unfortunately this seems to have happened in the field of historical global climate reconstructions, where a small group of scientists has exerted an undue control of the entire field." No names, but a harsh criticism of somebody.
And "a further mechanism more closely tied to the substance of research is used to quality control scientific knowledge claims, namely reproducibility. This mechanism has ceased to operate in some quarters of paleo-climate science, since some scientists consider 'their' data as their personal property and not that of the scientific community, so that others are unable to challenge conclusions drawn from these data by analysing the raw data in their own manner… Data must be become public; the methods employed must be described in algorithmic detail." I agree that the data should become public (after a suitable period for the initial investigator to get a return on investment), without taking a position on whether this is a problem. Mind you, I know the BAS ice core folk tend not to like giving out their data even to other BAS people…
He also, fairly, ticks off Nature and Science for including "newsworthiness" amongst their criteria for publication.
And lastly… what about being able to replicate work? Von S says:
[Scientists] should document what they have done, so that others can replicate. However, this documentation often can not take the form of keeping runnable old codes of the applied algorithms, simply because the software is no longer consistent with quickly replaced hardware. For instance, most of the state-of-the-art coupled AOGCMs used in the mid 1990s are simply no longer available and running at, for instance, the German Climate Computer Center. After replacing a high performance computer with a new system, the standard model codes, including community models, need to be adapted to the requirements and possibilities of the new system, and the old code will often no longer run. This has nothing to do with the norms of the community but simply with technological progress. Also specific commercial libraries of specialized algorithms may no longer be accessible. Data and codes written on old magnetic tapes or even floppies are usually no longer readable. Therefore the documentation must take the form of a mathematical description of the algorithms used. This is in many if not most cases sufficient for replication. Also, the intention of replicability is not to exactly redo somebody’s simulation and analysis, but to find the same result with a similar code and different but statistical equivalent samples.
The argument about no-longer-runnable code doesn't really apply to the MBH stuff, as it was standard Fortran. But the GCM question is interesting. von S is saying the obvious: many old GCMs are unrunnable now. But to say that you *should* keep the mathematical description seems to me very highly idealised. I get the strong impression that much of the basics of models is (in theory) written down, and much of the initial versions too (HadCM3 was pretty good at that), but that as the model develops the documentation often gets left behind. You might well be able to recreate 95% of the scientific content of a model from its documentation, but that's not nearly enough! And, of course, getting independent people to do this would be a huge task. And also a soul-destroying one, since it's so totally pointless.
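To make von S's last point a bit more concrete, here is a toy Python sketch (entirely my own illustration, nothing from the testimony or from MBH) of what replication with "statistically equivalent samples" means: two independently written pieces of code estimate the same quantity from different noise realisations, and replication succeeds if the answers agree within sampling uncertainty, not if the original run is reproduced bit for bit.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_sample(n_years=150, trend=0.02, noise_sd=0.3, rng=rng):
    # A toy "climate record": a linear warming trend plus noise.
    # Each call gives a different, statistically equivalent sample.
    t = np.arange(n_years)
    return t, trend * t + rng.normal(0.0, noise_sd, n_years)

def fit_trend_group_a(t, y):
    # "Group A": least-squares slope via polyfit
    return np.polyfit(t, y, 1)[0]

def fit_trend_group_b(t, y):
    # "Group B": the same statistic, coded independently from its definition
    t_c, y_c = t - t.mean(), y - y.mean()
    return (t_c @ y_c) / (t_c @ t_c)

t1, y1 = make_sample()
t2, y2 = make_sample()  # a different realisation, not the original data

print(fit_trend_group_a(t1, y1))  # ~0.02, plus sampling noise
print(fit_trend_group_b(t2, y2))  # should agree within uncertainty
```

The design choice being illustrated: what survives is the mathematical description (fit a linear trend), not any particular binary or run, which is exactly the sense in which von S says documentation "must take the form of a mathematical description of the algorithms used".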
Did you notice I stopped warning you about boring HS posts? Based on the comments, it's fairly clear that people don't find it boring at all 🙂