Kuhn, Paradigm shifts, String Theory and Observations

Venturing onto thin ice for me, and giving the stringy folk a chance to patronise me in return. So… I’ve been reading Kuhn’s The Structure of Scientific Revolutions. An interesting book, which I shall blog about in a bit. In some ways it’s a bit like reading Leviathan, but in reverse.

So: the basic point is that scientists do “normal” science most of the time, until enough observations pile up that simply cannot be explained under the current theory, or until the complications in the theory needed to explain the obs pile up, and eventually someone comes up with a blindingly novel viewpoint, the “paradigm” shifts, and everyone hares off down the new track, except for the old fogeys stuck in their ways.

And the obvious examples are Newtonian gravity/mechanics; SR/GR; and QM. Fair enough.

Now, stop me if you’ve heard this before, but are we today, with String Theory, for the very first time desperately *searching* for a paradigm shift? It’s as if they’ve all read the book, said “sod all this normal science, we’re off on a new track” and deliberately set out to do so, *despite* a total lack of observations that contradict relativity or QM. Although of course ST is by now normal science. But ignore that.

The catalyst for this post is “The List” by Steinn; which again supports my contention that the current attempt-at-shift is unusual in not being driven by *any* inexplicable obs (let alone lots of them piling up; OK, there is a possible Pioneer effect, but that’s pretty marginal). Of course, what it is driven by is GR and QM not being compatible as theories; however, that is really a rather different thing.

GRACE puzzle

The paper-for-today is Isabella Velicogna and John Wahr, Measurements of Time-Variable Gravity Show Mass Loss in Antarctica. Both Chris Mooney and Kevin Vranes have things to say about this washingtonpost write-up; I’ll try to talk about some different aspects.

First of all, this is only 3 years of data; to make any sense of it, you have to assume that it is representative of the long-term trend. Suppose we do. What washpost, and the Grauniad, and everyone else seem to have missed is that this just tells you, if we take the paper at face value, that Antarctica is losing mass at 0.4 mm/y of sea-level-equivalent. I can’t remember what the TAR thought; this suggests that they were unable to decide on a good value, though this fig suggests they settled on a net negative contribution. But a change from -0.2 mm/y to +0.4 mm/y doesn’t mean the true value has changed, just that there is a new estimate of what the value might be; if you read the newspapers you get the impression that Antarctica has suddenly started melting.
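As a rough sanity check on that 0.4 mm/y figure (my numbers, not the paper’s): take roughly 150 km³/y of ice loss, which is about the “36 cubic miles a year” in the press coverage below, and spread it over the oceans using standard round values for ice density and ocean area. A sketch only:

```python
# Back-of-envelope check of the ~0.4 mm/y sea-level-equivalent figure.
# Assumes ~150 km^3/y of ice loss (roughly the "36 cubic miles a year"
# quoted in the press coverage); densities and ocean area are standard
# round values, not anything taken from the paper itself.
ICE_DENSITY = 917.0      # kg/m^3
WATER_DENSITY = 1000.0   # kg/m^3
OCEAN_AREA = 3.61e14     # m^2, total ocean surface area

ice_loss_m3_per_yr = 150.0 * 1e9                       # 150 km^3/y in m^3/y
water_equiv_m3 = ice_loss_m3_per_yr * ICE_DENSITY / WATER_DENSITY
sle_mm_per_yr = water_equiv_m3 / OCEAN_AREA * 1000.0   # metres -> millimetres
print(f"{sle_mm_per_yr:.2f} mm/y sea-level equivalent")  # ~0.38 mm/y
```

Which comes out at about 0.4 mm/y, consistent with the headline number.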

Another point (see last IPCC link) is that it’s quite hard to match the *observed* sea level change to the sum of the various components; switching Ant from -ve to +ve would help close that gap.
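Just to make the arithmetic explicit (the numbers below are purely illustrative, not the TAR budget): if the sum of the other components falls short of the observed rise, moving Antarctica from -0.2 to +0.4 mm/y takes 0.6 mm/y off the unexplained gap.

```python
# Illustrative sea-level budget, NOT the TAR's numbers: just showing why
# flipping the Antarctic term from negative to positive helps close the gap
# between observed sea level rise and the sum of the estimated components.
observed = 1.8           # mm/y, illustrative observed rise
other_components = 1.1   # mm/y, illustrative thermal expansion + glaciers + Greenland
for antarctica in (-0.2, +0.4):
    gap = observed - (other_components + antarctica)
    print(f"Antarctica {antarctica:+.1f} mm/y -> unexplained gap {gap:.1f} mm/y")
```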

How does this affect predictions that Antarctica will become a net sink in the future, due to extra precipitation outweighing increases in melt? Not at all, I should say. That prediction remains valid.

Lastly, how much should we trust these results? 3 years of data is not much; longer records are needed. Also, James Annan has some wise words about relying too heavily on the latest paper; these things really need time to settle. But all that aside, note that the results depend very heavily on the adjustment for isostatic rebound. This is because Antarctica is smaller now than at the height of the last glacial, hence lighter, hence the rock underneath is slowly moving upwards (post-glacial rebound; PGR). GRACE measures gravity anomalies, ie the weight of rock and snow (errm, and actually also the weight of nearby water masses, which could be a problem on so short a timescale as 3 years). So if the rock is going up, that needs to be subtracted to get the mass of ice. The “problem” is that this is not a small term; see the paper’s figure 2, which shows that without the correction for the rebound term the trend is flat. As the authors say:

The main disadvantage of GRACE is that it is more sensitive than other techniques to PGR; in fact, our error estimates are dominated by PGR uncertainties. As more GRACE data become available, it will become feasible to search for longterm changes in the rate of mass loss. A change in the rate would not be contaminated by PGR errors, since the PGR rates will remain constant over the satellite’s lifetime.

This doesn’t make the results invalid, of course; it’s just that an accurate assessment of PGR is hard, too.
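A minimal sketch of what that subtraction looks like, with made-up numbers rather than anything from the paper: GRACE sees the ice and the rebounding rock together, so the ice trend is whatever is left after removing the modelled PGR signal, and the PGR uncertainty dominates the combined error. A *change* in the rate, by contrast, differences out a constant PGR term, which is the authors’ point about the longer record.

```python
import math

# Illustrative only: the trend GRACE sees is (ice mass trend) + (apparent
# mass trend from rock uplift), so the ice trend is recovered by subtracting
# a modelled PGR term, and the uncertainties combine in quadrature.
observed_trend = 0.0    # roughly flat, per the paper's fig. 2 (illustrative units)
pgr_model = +0.4        # apparent "mass gain" from rock uplift (illustrative)
pgr_error = 0.35        # dominant uncertainty (illustrative)
grace_error = 0.10      # measurement uncertainty (illustrative)

ice_trend = observed_trend - pgr_model
total_error = math.sqrt(grace_error**2 + pgr_error**2)
print(f"ice trend ~ {ice_trend:+.2f} +/- {total_error:.2f} (PGR dominates the error)")

# A *change* in the rate between two epochs is not contaminated by PGR errors,
# because the (constant) PGR term cancels when the two trends are differenced.
```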

[Update: the Scotsman’s Antarctic ice sheet is melting at rate of 36 cubic miles a year, says study sez much the same as the Grauniad (cos they got the same press release…) but does quote “our” David Vaughan: “However, Professor David Vaughan, of the British Antarctic Survey, said the figures produced were not radically different from previous estimates, despite being produced using a totally different method. Normally scientists base their calculations on measurements of the height of the ice.” And he added: “The record [showing the loss of ice] that’s been published is over a very short period. We’d really like several decades of record to be confident the changes we’re seeing are long-term trends.”

Oh, and I should have mentioned my thanks to John Fleck]

[Another update: I should have realised that the TAR fig has Gr/Ant (recent) as well as a *positive* contribution from Gr/Ant (long term), which is mixed up with the PGR term. So this may not help as much as I suggested :-(]

Wild wiki excitement!

Wikipedia reached its one millionth article today, and I was there on IRC watching as it happened… although actually I’d popped into the kitchen to do the washing up at the crucial moment.

Predictably enough the millionth article itself is not very exciting: Jordanhill railway station.

My own minor attempt to get article 1M was to create Ray Bradley and Phil Jones. But I was then astonished to find myself in a delete-undelete war over the RB entry, on the grounds that he wasn’t notable… unlike Jordanhill railway station. Sometimes wiki is a bit odd. To be fair, if you follow the RB link now it’s a better article than it was then. OTOH it may have been deleted again…

Wild IPCC excitement!

Well yes indeed, someone has leaked bits of the upcoming IPCC AR4 report to the BBC. The only odd thing is that it’s taken this long. The draft has “do not cite or quote” written on it, of course, but so many people have access to it that it’s hard to believe the media don’t. Chris Mooney has noticed the BBC; but the Grauniad had much the same a day earlier. RP predictably enough uses this as a peg to hang his favoured IPCC-is-politicised hat on; but this is nonsense: there is no evidence at all to connect the leak to anyone IPCC-ish.

The Grauniad leads with “The Earth’s temperature could rise under the impact of global warming to levels far higher than previously predicted, according to the United Nations’ team of climate experts”. This is probably nonsense, and a bad paraphrase of something that might actually be in the report: “scientists are now unable to place a reliable upper limit on how quickly the atmosphere will warm as carbon dioxide levels increase” (if I sound unsure here it’s because I have an early draft, but I’m guessing this is from a later one; and it may well be an attempt at the SPM. Who knows, this is all fluff…).

What I would like to draw your attention to is lower down in the Grauniad, because it’s less exciting: “James Annan, a British climate scientist who works on the Japanese Earth simulator supercomputer in Yokohama, says the risks of extreme climate sensitivity and catastrophic consequences have been overstated. He is about to publish a study showing that the chance of climate sensitivity exceeding 4.5C is less than 5%.” In fact James’s study is more interesting than that; see the 2006 in-press GRL paper for details. I’m allowed to say that now cos his press conf is finished 🙂

The Beeb take is somewhat different: “The global scientific body on climate change will report soon that only greenhouse gas emissions can explain freak weather patterns” (ie, this is *attribution* not rate-of-change, for those of you who haven’t been following…). I’m sort-of assuming that all this freak-weather hype includes the boring but far more statistically do-able temperature increase. If true, this is a distinct strengthening over the TAR, wot said only “In the light of new evidence and taking into account the remaining uncertainties, most of the observed warming over the last 50 years is likely[7] to have been due to the increase in greenhouse gas concentrations” (7 is a footnote ref to the technical meaning of likely in this context: 66-90%). (Again, RP’s post is bizarre, since he purports to believe the new language isn’t stronger than the TAR; however, I don’t trust the Beeb to have paraphrased correctly.)

Of course, even UK efforts to reduce CO2 are still a mess: read the end of the Beeb: [Blair] … would strive to meet his unilateral target of cutting Britain’s CO2 emissions by 20% by 2010…. Central figures in the review process are now admitting that the 20% target will be virtually impossible to hit, and are looking for a “respectable” near miss.