Saturday, June 6, 2009

Back when both my arms were fully functional (a time that will hopefully return soon) I would spend one of the two posts I wrote on Saturdays looking at what was being said on some of the climate debate websites. I visit sites on both sides of the debate, being particularly interested in the evidence that is called forth and the conclusions drawn from it. (The current list is given by the CC sites on the blogroll.) These usually lead on to others, and so one can, to a degree, filter out new evidence from that which is being rehashed. However, it can require a bit more due diligence when both sides claim that the rational answer lies only with their side of the argument.

Consider, for example, the little debate this week between Real Climate and Climate Audit over the analysis of the Antarctic data that I have mentioned in a couple of earlier posts. The debate relates to how many Principal Components (PCs)* should be considered in deriving the predictive equation from which to generate the model results that decide whether the Antarctic is warming or cooling.

In the original paper Steig et al. used 3 PCs to generate the model, and thus to conclude:
Here we show that significant warming extends well beyond the Antarctic Peninsula to cover most of West Antarctica, an area of warming much larger than previously reported. West Antarctic warming exceeds 0.1 °C per decade over the past 50 years, and is strongest in winter and spring. Although this is partly offset by autumn cooling in East Antarctica, the continent-wide average near-surface temperature trend is positive.
The paper had a number of flaws, related to the source data, but I initially assumed that the statistical analysis was carried out using reasonably widely accepted methods, an assumption of which I am increasingly unsure.

In an earlier post at Climate Audit the number of PCs used was shown to have a very dramatic effect on the conclusions that could be drawn from the data, with Antarctica getting either warmer or cooler depending on how many PCs were chosen (ranging up to 15). And so it seems that it is time to “bite the bullet” and talk a little about what proxies, PCs and the other terms that crop up in these discussions mean. (And then to point to the misdirection in a couple of stories where such analysis is missing.)


I originally sought wisdom from a review of the debate over the use of recognized statistical methods that was performed for the House Energy and Commerce Committee.
The Chairman of the House Committee on Energy and Commerce along with the Chairman of the Subcommittee on Oversight and Investigations have been interested in discovering whether or not the criticisms of Mann et al. are valid and if so, what are the implications. To this end, Committee staff asked for advice as to the validity of the complaints of McIntyre and McKitrick [MM] and related implications. Dr. Wegman formed an ad hoc Committee (Drs. Edward J. Wegman – George Mason University, David W. Scott – Rice University, and Yasmin H. Said – The Johns Hopkins University). The Committee was organized on its own initiative as a pro bono committee.
That ad hoc committee submitted its report to the Committee on Energy and Commerce.

As part of that report they explained PC analysis thus:
*Principal component analysis is a method often used for reducing multidimensional datasets to lower dimensions for analysis. In this context, dimensions refer to the number of distinct variables. The time series proxy data involved are transformed into their principal components, where the first principal component is intended to explain most of the variation present in the variables. Each subsequent principal component explains less and less of the variation. In the methodology of MBH98/99, the first principal component is used in the temperature reconstruction, and also has the highest explained variance. This method is intended for dimension reduction. In most datasets, the first principal component should be the least smooth (because of the higher variance).
(MBH98/99 refers to the Mann paper that produced the “hockey stick” curve.)
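
To make that description a little more concrete, here is a minimal sketch of the idea in Python. It is not the MBH98/99 code or data; the proxy matrix below is purely synthetic, invented only to show how the first principal component picks up most of the shared variation and each later component picks up less.

```python
# A minimal sketch (not the MBH98/99 code) of what the quoted passage describes:
# principal component analysis reduces a set of proxy series to a few components,
# the first of which captures the largest share of the variance.
import numpy as np

rng = np.random.default_rng(0)
n_years, n_proxies = 100, 20

# Fake proxy network: one shared "climate" signal plus independent noise.
signal = np.cumsum(rng.normal(size=n_years))            # a slowly wandering series
proxies = np.outer(signal, rng.uniform(0.5, 1.5, n_proxies))
proxies += rng.normal(scale=2.0, size=(n_years, n_proxies))

# Center each proxy on its own mean, then take the singular value decomposition.
centered = proxies - proxies.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)

explained = s**2 / np.sum(s**2)
for i, frac in enumerate(explained[:5], start=1):
    print(f"PC{i}: {frac:.1%} of the variance")
# Typically PC1 carries most of the variance, and each later PC carries less.
```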

The report's authors discuss the effects of temperature change on tree rings, ice cores and coral (of which more anon), but that discussion doesn't really help those of us who would like this explained in layman's language. So let me see if I can do that without getting too many folk offended.

When we want to find out the surface temperature of the Earth at some time in the past, and in places where there were neither thermometers, nor folk who kept records, we have to find some other measure that shows what the temperature was. We call these measures “proxies” in part because the behavior of the selected measure approximates the changes in the value we’re interested in. For example, from the list above, the structure, density and width of the ring that a tree grows in a year will vary with the local temperature. And so, by taking a core that recovers the section through a tree that has been around for a long time, we might get an estimate, from the size and structure of the individual tree rings, of the temperature when each ring was formed.

Unfortunately it is not that simple; consider, for example, a Scots pine in Finland:
Earlywood width is controlled by precipitation in June and temperatures in mid winter (December/January) and March. Low mean temperature in April, adequate precipitation in May and a warm July results in wide rings. A long and warm vegetation period results in high latewood density, the strongest correlations occurring with July and August temperature and precipitation.

Tree ring structure (Source: NOAA)

Since the density and width of the ring can vary from more than one cause (the obvious two cited above being temperature and rainfall), we need to know how much each contributes to the change in the ring. This is not that easy, since the above quote shows that you can't just use an average value for the year, but have to look at seasonal variables within the year, and this imposes additional variation on the result that we are using as our measure. Further, there are some things that may change around the tree that we don't know about. As the Wegman report puts it:
Each tree ring is composed of large thin walled cells called early wood and smaller more densely packed thick walled cells called late wood. The average width of a tree ring is a function of many variables including the tree species, tree age, stored carbohydrates in the tree, nutrients in the soil, and climatic factors including sunlight, precipitation, temperature, wind speed, humidity, and even carbon dioxide availability in the atmosphere. Obviously there are many confounding factors so the problem is to extract the temperature signal and to distinguish the temperature signal from the noise caused by the many confounding factors.

Given that complexity, the first thing that we need to do is to create a model from data for which we already have sufficient information, so that we can estimate the impact of changing different values. And this is where our Principal Components come in. (And to the purists, again my apologies; I am trying to keep this simple.) We can take, for example, data from the last 100 years for a site where we know the variations in temperature, rainfall, relative sunlight etc., and plot the variation in our ring values against these various factors to see which ones best explain the changes in the tree value. The ones that correlate best are called the Principal Components. For example, we might find that we explain 60% of the change in tree value by change in temperature, and then, when we add rainfall, that we have an equation that explains 80% of the variation. By adding further factors (say the number of sunny days) we can improve our model so that it predicts with increasing accuracy (smaller variance) the actual values that we found from measuring the rings. So we build our predictive model, adding additional PCs (say soil nutrition) in order of their effectiveness in reducing the variation of the model from the measured data. At some point the improvement is too small to be considered significant, and variables that fall below that value (perhaps insect density) can be neglected.
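
As a rough illustration of that build-up, here is a small sketch with invented numbers. The predictors (temperature, rainfall, sunny days) and the relationship tying them to ring width are assumptions made purely for the example, not real calibration data.

```python
# A minimal sketch of the "add factors in order of usefulness" idea described
# above, using made-up numbers and ordinary least squares.
import numpy as np

rng = np.random.default_rng(1)
n = 100  # years of overlap with instrumental records

temperature = rng.normal(10, 2, n)
rainfall    = rng.normal(600, 100, n)
sunny_days  = rng.normal(180, 20, n)

# Pretend ring width responds mostly to temperature, somewhat to rainfall,
# barely to sunshine, plus noise we cannot explain.
ring_width = (0.8 * temperature + 0.004 * rainfall
              + 0.002 * sunny_days + rng.normal(0, 1.0, n))

def r_squared(predictors):
    """Fraction of ring-width variance explained by a least-squares fit."""
    X = np.column_stack([np.ones(n)] + predictors)
    coef, *_ = np.linalg.lstsq(X, ring_width, rcond=None)
    resid = ring_width - X @ coef
    return 1 - resid.var() / ring_width.var()

print("temperature only:", round(r_squared([temperature]), 2))
print("+ rainfall:      ", round(r_squared([temperature, rainfall]), 2))
print("+ sunny days:    ", round(r_squared([temperature, rainfall, sunny_days]), 2))
# Each added factor raises the explained variance; once the gain becomes
# negligible, further factors can be left out of the model.
```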

Having got, or “calibrated”, the model, we can then go back from the times and values that we know to look at times when we don't. Knowing the inter-relationship between the factors and the values, it is possible to estimate what the temperatures were back when the tree rings were grown. (The process is also described by NOAA.)
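
A minimal sketch of that calibrate-and-extend step, again with invented numbers, might look like the following: fit the proxy against the measured temperatures over the instrumental period, then invert the fit for the years where only the proxy survives.

```python
# A minimal sketch of calibration followed by reconstruction. All numbers here
# are synthetic, invented purely for illustration.
import numpy as np

rng = np.random.default_rng(2)

# "True" temperatures: 200 years, only the last 100 of which were measured.
true_temp = 10 + 0.5 * np.sin(np.linspace(0, 8, 200)) + rng.normal(0, 0.3, 200)
ring_width = 1.0 + 0.2 * true_temp + rng.normal(0, 0.1, 200)

measured_temp = true_temp[100:]     # instrumental record
calib_rings   = ring_width[100:]    # proxy over the same years

# Calibration: least-squares line ring_width = a + b * temperature.
b, a = np.polyfit(measured_temp, calib_rings, 1)

# Reconstruction: invert the line for the pre-instrumental rings.
old_rings = ring_width[:100]
reconstructed = (old_rings - a) / b

err = np.abs(reconstructed - true_temp[:100]).mean()
print(f"mean absolute reconstruction error: {err:.2f} °C")
```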

At least that is the basis of the procedure. The problem comes in the complexity of some of the proxies that have been used to assess what the temperatures were back then. To make calculations simpler, the analysis does not use the actual values, but rather looks at the variation of the proxy value from the value averaged over the time period. The variation from that mean is then the measured value against which the models are calibrated. (The process is known as centering the model, since it is the variation from the central value of the data that is used in the subsequent analysis.) One of the reasons for doing this is to cope with conditions that are below the average. (By using variation from the mean one can more easily note that a drop in temperature below the average can create negative growth in the ring that year.) It is important that this be carried out correctly, since when it is not (and the Wegman report notes that the “hockey stick” paper did not do so) the results obtained can be flawed.
Our committee believes that the assessments that the decade of the 1990s was the hottest decade in a millennium and that 1998 was the hottest year in a millennium cannot be supported by the MBH98/99 analysis. As mentioned earlier in our background section, tree ring proxies are typically calibrated to remove low frequency variations. The cycle of Medieval Warm Period and Little Ice Age that was widely recognized in 1990 has disappeared from the MBH98/99 analyses, thus making possible the hottest decade/hottest year claim. However, the methodology of MBH98/99 suppresses this low frequency information. The paucity of data in the more remote past makes the hottest-in-a-millennium claims essentially unverifiable.
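
The sketch below illustrates the centering step itself and, in a deliberately simplified way, why the period over which the mean is taken matters. It is not a reproduction of the MBH98/99 procedure; the data are synthetic, and the contrast between full-period and recent-period centering is only an assumed toy example of the kind of criticism discussed above.

```python
# A minimal sketch of centering: each series is expressed as its departure from
# a mean before the principal components are computed. Centering on only the
# most recent years, rather than the full record, changes the result.
import numpy as np

rng = np.random.default_rng(3)
n_years, n_series = 200, 50

# Mostly trendless noisy series, plus a handful with a late upturn.
data = rng.normal(size=(n_years, n_series))
data[150:, :5] += np.linspace(0, 3, 50)[:, None]

def first_pc_share(mean_window):
    """Variance share of PC1 after subtracting the mean taken over mean_window."""
    centered = data - data[mean_window].mean(axis=0)
    s = np.linalg.svd(centered, compute_uv=False)
    return s[0]**2 / np.sum(s**2)

print("centered on the full period:  ", round(first_pc_share(slice(None)), 2))
print("centered on the last 50 years:", round(first_pc_share(slice(150, None)), 2))
# Centering on only part of the record shifts the series' apparent means and
# changes how much of the variance the leading component appears to carry.
```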

The Wegman report also comments on something else that I find quite troubling, namely the very small circle formed by the “climate scientists” who publish in this area, and their inter-relationships, with most of the major authors in the field being tied to Dr. Mann (but that is an issue for another day). However, this "clique" does not include many with a strong grounding in statistics, and it is this weakness, and the errors generated in the predictions as a result, that make the papers the group generates (and which get significant international publicity) very vulnerable to criticism, which may well be justified.

I want to go back, however, to the comments on coral, since the growth of coral is one of the parameters that are used to assess Climate Change. The Wegman Report noted:
Reef-building corals are strongly affected by temperature and, as temperature drops, the rate of calcification drops with lower temperature potentially presaging the death of the colony. Coral growth rates vary over a year and can be sectioned and x-rayed to reveal high- and low-density bands. High density layers are produced during times of higher sea surface temperatures. Thus not unlike tree rings, data on corals also can be calibrated to estimate (sea) surface temperatures.

I draw attention to that since there is current interest in coral data, with stories commenting on the “ravages” that will occur to coral with increasing temperature. And yet, even in those stories dealing with coral damage, one finds:
Sea level rise, which is projected to occur this century as the world's glaciers melt, would not necessarily kill coral reefs, Tamelander said, since the reefs can grow as waters get higher.

"A healthy reef should be able to keep up," Tamelander said.
Climate Audit points out the inadequate statistical analysis behind the headline, and that when the statistics are properly applied, the conclusion does not hold.

But then there are these facts, and the impact of land sinking:
"It doesn't matter who's causing global warming. Sea-level rise is something we can measure," said Rob Young, a geosciences professor at Western Carolina University. "You can't argue that sea level isn't rising."

And it has been rising faster in the mid-Atlantic because the land here is sinking. Understanding this phenomenon requires thinking of the Earth as an enormous balloon. Push down in one spot on the ball's surface and surrounding areas are raised up. Glaciers did this to Earth's surface during the last ice age: They pressed down on northern North America and areas to the south tilted up, like the other end of a seesaw. Today, thousands of years after the glaciers retreated, the seesaw is tipping back the other way, and the region from New York to North Carolina is falling about six inches per century.

And this is my gripe for the day: we see anecdotal headlines about how this disaster or that is coming, courtesy of global warming, but these warnings are rarely accompanied by an adequate statistical analysis showing that the facts discussed are real and correlated to global warming in the manner projected.

One can see this, for example, in the recent story on Kofi Annan's comments about the level of death caused by global warming. This is rebutted by the statistics, as noted in the Wall Street Journal (Hat tip Irv.) There is no mention of the hundreds of thousands of acres of the sub-Sahara that have been converted to agriculture in the last decade, thereby prolonging, rather than shortening, lives. Rather, those stories are hidden behind headlines of starvation:
Oxfam expects cereal production across five countries in the dry Sahel belt south of the Sahara -- Burkina Faso, Mali, Mauritania, Niger and Senegal -- will be a record 18.5 million tonnes this year, but the food on sale will be beyond the budget of many in these, some of the world's poorest countries.
