I was – I still am – going to write a post about my recent adventures in “skeptic” land, but I’ve got distracted by Atmospheric Layers, The Biosphere, The Boundary Layer, Microclimate and Inadequate IPCC Models, which is comically incompetent – to a degree that I found hard to believe. There’s an open goal there waiting for shots, yet Tim Ball hits every one wide. Let’s go: (oh, but don’t miss the update at the end)
TB begins by complaining
During a university presentation I said the climate models do not include the Milankovitch Effect… My mistake was I forgot to say I was talking about Intergovernmental Panel on Climate Change (IPCC) models.
He then continues – just to prove that he’s already been told the correct answer, but is too dumb to understand it – “An IPCC climate modeller told me the time scale was not appropriate to include the Milankovitch Effect”. Which is correct: Milankovitch cycles have their shortest period at ~20 kyr; over the typical IPCC runtime of 1850 to 2100, the change is too small to be worth worrying about. Incidentally, I could point out that in this rare case the reason for not adding the effect is only that it’s too trivial to be worth it, not because of increased runtime or suchlike.
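If you want the back-of-the-envelope version of why the timescale argument works, here’s a minimal sketch. The ~20 kyr shortest period is from the source; the 1 W/m² peak-to-peak forcing swing is an illustrative assumption of mine, chosen generously, just to show the order of magnitude:

```python
import math

# Shortest Milankovitch period (precession) is roughly 20,000 years.
PERIOD_YEARS = 20_000

# Typical IPCC historical + scenario run: 1850 to 2100.
run_years = 2100 - 1850  # 250 years

# Fraction of a full precession cycle covered by the run.
fraction = run_years / PERIOD_YEARS
print(f"Fraction of shortest cycle: {fraction:.4f}")  # ~1.25%

# Even an (assumed, generous) 1 W/m^2 peak-to-peak swing over a full
# half-cycle implies a drift of only a few hundredths of a W/m^2 over
# the run -- far smaller than the ~2 W/m^2 of CO2 forcing change.
assumed_swing_wm2 = 1.0
drift = assumed_swing_wm2 * math.sin(math.pi * fraction)
print(f"Rough forcing drift over run: {drift:.3f} W/m^2")
```

So even before worrying about runtime, the effect is lost in the noise of a centennial-scale simulation.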
Climate below 2 m
We’re now talking about the boundary layer, and then the near-surface layer. Most people when talking about GCM resolution focus on the horizontal (typically ~100-300 km) and forget about the vertical. Martin et al., in Geosci. Model Dev. Discuss., provide some helpful info about The HadGEM2 family of Met Office Unified Model Climate configurations, so we can discover that the lowest model level is at 10 m and (scroll down to fig 2, and bump up the magnification) the next one up is at 50 m. The problem here isn’t the time-lag that TB identifies; it’s just that you can’t throw too much resolution at the near-surface because it would use up too much processing time. How much of a problem is it? Not as much as you might think, because the boundary layer is parameterised – the model is not relying on its own native resolution to get things right.
There’s a more interesting issue with the thickness of the ocean surface layers and the effects on ?tropical evaporation? but I forget the details; that one *is* related to the time-lag problem.
That one was a bit so-so. But TB totally blows it with:
Other near surface measures like CO2 are taken above 2 meters. “Air samples at Mauna Loa are collected continuously from air intakes at the top of four 7-m towers and one 27-m tower.” How does that help understand energy flows in the atmosphere?
WTF? GHGs like CO2 are well mixed – at least, they’re well mixed enough for the purposes of a GCM. Yes, I know, if you look closely you can see variations in space or time, but look at the scale. We’re back to his failure to understand the Milankovitch forcing; it’s the new Aristotelians all over again. If you want to understand something, and make progress, you have to not insist on getting every teensy tiny thing right down to the finest level of detail. Trying to do that leaves you as stuck as Greek Science.
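“Look at the scale” can be made concrete with round numbers (illustrative figures of the right order, not measured data: a ~400 ppm background and a few-ppm seasonal wiggle at Mauna Loa):

```python
# Rough scale argument for "well mixed enough for a GCM".
# Illustrative round numbers: background CO2 ~400 ppm; the seasonal
# cycle at Mauna Loa has a peak-to-peak amplitude of roughly 6 ppm,
# and spatial differences between stations are of similar few-ppm order.
mean_co2_ppm = 400.0
seasonal_amplitude_ppm = 6.0

relative_variation = seasonal_amplitude_ppm / mean_co2_ppm
print(f"Relative variation: {relative_variation:.1%}")  # 1.5%
```

A percent-or-so wobble around the mean is why a GCM can treat CO2 as a single well-mixed number for radiative purposes.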
(This is TB’s section header, not mine) TB then helpfully explains that models run on a discretised grid, and that this is a problem because
There is virtually no weather data for some 85 percent of the world’s surface. Virtually none for the 70 percent that are oceans… It’s worse in the vertical with virtually no data in space and time and constantly changing very complex conditions.
The more data we have the better of course, but the lack of “weather” data isn’t a great problem. The data we already have (on, say, sea ice extent) is enough to know that the models have flaws. Describing this as a “fundamental” problem is wrong. Or, less esoterically, you can just look at the seasonal MSLP (mean sea level pressure) pattern, which we know very well from the reanalyses. Or warm biases in the tropopause. None of these are strongly affected by lack of obs; all of them are enough to show up model errors.
Not having a proper stratosphere is a bit of a model problem. But TB manages to mangle even this. After a brief – presumably obligatory, but irrelevant – fling at Mann, coupled with a brief genuflection towards the adored Lamb, he continues
We know from Pinatubo and all other major eruptions a significant factor is the amount of dust injected into the stratosphere. The IPCC models don’t appear to include the stratosphere as they state “Due to the computational cost associated with the requirement of a well-resolved stratosphere, the models employed for the current assessment do not generally include the QBO.”
Yes, that’s right: TB doesn’t really know if the models have a stratosphere or not; he has to rely on the IPCC to tell him, obliquely. But he’s missed “well-resolved”. The Martin et al. paper does discuss how increasing stratospheric resolution improves the QBO. But at this point, TB is talking about aerosols, which are prescribed in the models, so the need for higher resolution in the stratosphere isn’t obvious.
Lower Layers of the Atmosphere
We’ve been up, we come back down (I’m following TB’s sequence, not anything logical :-). TB complains:
Boundary layer; Surface fluxes are computed from bulk relationships with transfer coefficients according to Monin-Obukhov similarity theory… What are they using to create parameterised values? Most of the references in reports are to 1990s material.
Errm, yes: Euclid was written more than 2000 years ago, but is still valid. The point is that boundary layer theory was largely worked out some time ago, though I’m sure many exciting problems still remain.
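In case “bulk relationships with transfer coefficients” sounds mysterious: here’s a minimal sketch of the idea, for neutral stability only (real schemes add the Monin-Obukhov stability corrections the quote mentions). The roughness length, reference height and example values are illustrative assumptions of mine, not anything from a particular model:

```python
import math

# Minimal sketch of a bulk surface-flux parameterisation, neutral
# stability only. All parameter values are illustrative assumptions.

VON_KARMAN = 0.4   # von Karman constant
RHO_AIR = 1.2      # air density, kg/m^3
CP_AIR = 1004.0    # specific heat capacity of air, J/(kg K)

def neutral_transfer_coefficient(z_ref_m: float, z0_m: float) -> float:
    """Neutral transfer coefficient from the log law: C = (k / ln(z/z0))^2."""
    return (VON_KARMAN / math.log(z_ref_m / z0_m)) ** 2

def sensible_heat_flux(wind_ms: float, t_sfc_k: float, t_air_k: float,
                       z_ref_m: float = 10.0, z0_m: float = 0.01) -> float:
    """Bulk formula: H = rho * cp * C_H * U * (T_sfc - T_air), in W/m^2."""
    c_h = neutral_transfer_coefficient(z_ref_m, z0_m)
    return RHO_AIR * CP_AIR * c_h * wind_ms * (t_sfc_k - t_air_k)

# Example: 5 m/s wind at 10 m, surface 2 K warmer than the air.
print(f"H = {sensible_heat_flux(5.0, 290.0, 288.0):.1f} W/m^2")
```

The point being: the model never resolves the bottom two metres explicitly; it computes the fluxes through them from quantities it does carry, which is what “parameterised” means here.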
The top two meters of the Earth’s surface and the bottom two meters of the atmosphere are the most critical layers… Too bad they are the least measured or understood of all the layers and omitted from the IPCC models.
Well they’re certainly not the least measured. The fact that they’re so near the ground tends to mean they get measured a lot. Nor are they least understood – they are more complex, though, than the more free-flowing stuff higher up. And “omitted” isn’t quite right either, as discussed above.
All you “skeptics” out there… I’m a little shaky on all this, it’s been seven years after all. If you can’t pick decent holes in this post, you may as well give up.
There’s another even more like totally-wacky-maaaan post at WUWT, Steve Burnett’s Hard vs. the Soft Sciences Essay; An Ongoing Debate Central To Climate, which is Tim Ball trying to explain to the Watties why they shouldn’t look down on him for being a geographer. But in the course of it he goes even further overboard in demonstrating he knows nothing about parameterisation in GCMs.