An objective analysis of the evidence for global warming suggests little if any anthropogenic effect beyond the direct radiative effect of increased CO2. It is also obvious that global temperature and ocean heat content should be related, so it is somewhat surprising to see OHC rising so fast around 2002–3 while ocean temperature remains relatively stable (upper line below).

# Month: October 2009

## Robustness of Natural Variation

Here, out-of-sample tests are used to assess the robustness of the linear regression models of natural variation in global temperature. Previous models were developed on the whole data set; here we develop them on part of the data and examine how well they predict temperatures in the remainder. These are also called independent tests.

The models that do well on the unseen data are in some sense more robust and reliable, and the tests give a feel for the constraints the data place on the models: you can see what conditions are needed to produce particular results.

The results are placed in the animated gif above, where the blue temperatures are the out-of-sample values.

## Natural Variation Predicted the Flat Temps

To continue our excursion into natural variation models of global temperature: What do they predict?

Here are a couple of different models fit with data up to the year 1990 only, in order to compare their projections with out-of-sample reality after 1990. The year 1990 is also the start of the major IPCC projections from the TAR WG1, available here.
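
A minimal sketch of the procedure in R (the series name `gtemp`, the synthetic stand-in data and the 60-year period are my assumptions; substitute the real GISS or HadCRUT series):

```r
# Stand-in annual global temperature series -- replace with real data
set.seed(1)
yrs   <- 1900:2008
gtemp <- ts(0.004 * (yrs - 1900) + 0.15 * sin(yrs * 2 * pi / 60) +
            rnorm(length(yrs), sd = 0.02), start = 1900)

# Fit the natural variation model on data up to 1990 only
train <- window(gtemp, end = 1990)
x <- as.numeric(time(train))
f <- x * 2 * pi / 60                                 # 60-year periodic term
fit <- lm(train ~ x + sin(f) + cos(f))

# Project onto the held-out years after 1990 and compare with observations
x.new <- as.numeric(time(window(gtemp, start = 1991)))
pred  <- predict(fit, newdata = data.frame(x = x.new, f = x.new * 2 * pi / 60))
```

The out-of-sample residuals `window(gtemp, start = 1991) - pred` then measure how well natural variation alone anticipated the post-1990 record.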

## Natural Variation vs Human Influence

One simple way to separate the influence of humans from natural variation is to fit a simple linear regression containing sinusoidal terms, as shown in previous posts.
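
In outline, with a 60-year period (a sketch; `temp` and the stand-in data are placeholders for the actual series):

```r
# Stand-in annual temperature series -- replace with GISS or HadCRUT data
set.seed(1)
yrs  <- 1900:2008
temp <- ts(0.004 * (yrs - 1900) + 0.15 * sin(yrs * 2 * pi / 60) +
           rnorm(length(yrs), sd = 0.01), start = 1900)

x <- as.numeric(time(temp))
f <- x * 2 * pi / 60
fit <- lm(temp ~ x + sin(f) + cos(f))
co  <- coef(fit)
linear   <- co[1] + co["x"] * x                            # candidate human influence
periodic <- co["sin(f)"] * sin(f) + co["cos(f)"] * cos(f)  # natural variation
total    <- linear + periodic                              # their sum
```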

The figure below shows the result: linear (dotted red), periodic (dashed red) and their sum (solid red) applied to global temperature data sets (A) GISS and (B) HadCRUT and (C) to a selection of simulation models.

## Natural Variation – 60 year cycle

Below is a quick review of some of the evidence for, and consequences of, a 60-year climate cycle. According to Roy Spencer, the argument that increasing carbon dioxide concentrations alone are sufficient to explain global warming is reasoning in a circle: by ignoring natural variability, they end up claiming that natural variability is insufficient.

However, the recent paper by Craig Loehle finds that only a very small linear warming trend (potentially attributable to AGW) is left after subtracting the 60–70 yr cycle. While the cause of the 60-yr cycle is unexplained at present, he claims the small trend disproves AGW because it is:

clearly inconsistent with climate model predictions because the linear trend begins too soon (before greenhouse gases were elevated) and does not accelerate as greenhouse gases continue to accumulate with no acceleration in recent decades.

That oscillations are persistent features of the climate has been known for a long time. Stocker and Mysak in 1992 reviewed ice cores, tree-ring index series, pollen records and sea-ice extents over the last 10,000 years, finding:

The traditional interpretation that decadal-to-century scale fluctuations in the climate system are externally forced, e.g. by variations in solar properties, is questioned. A different mechanism for these fluctuations is proposed on the basis of recent findings of numerical models of the ocean’s thermohaline circulation. The results indicate that this oceanic circulation exhibits natural variability on the century time scale which produces oscillations in the ocean-to-atmosphere heat flux. Although global in extent, these fluctuations are largest in the Atlantic Ocean.

Even a paper by Michael Mann in 2000 identifies the cycle:

Analyses of proxy based reconstructions of surface temperatures during the past 330 years show the existence of a distinct oscillatory mode of variability with an approximate time scale of 70 years.

As far back as 1995 Mann published a paper in Nature stating:

THE recognition of natural modes of climate variability is essential for a better understanding of the factors that govern climate change. Recent models suggest that interdecadal (roughly 15–35-year period) and century-scale (roughly 50–150-year period) climate variability may be intrinsic to the natural climate system.

The issue is: how large is the cycle relative to potential warming due to AGW? Klyashtorin and Lyubushin (2003) demonstrated that a 50–60 year period temperature signal is dominant from about 1650 (the end of the Little Ice Age) in Greenland ice core records, in several very long tree-ring records, and in sardine and anchovy records in marine sediment cores. This result was also reported by Biondi et al. (2001), who made the pithy remark:

Anthropogenic greenhouse warming may be either manifested in or confounded by alterations of natural, large-scale modes of climate variability.

A wide range of phenomena move in sync with this cycle. Long-term changes in Atlantic spring-spawning herring and Northeast Arctic cod commercial stocks also show 50–70-year fluctuations: sufficient to predict the probable trends of basic climatic indices and populations of major commercial fish species up to 20–30 years into the future.

Zhen-Shan and Xian (2007) found that China's temperature from 1881 can be completely decomposed into four quasi-periodic oscillations: an ENSO-like mode, a 6–8-year signal, a 20-year signal, and a prominent 60-year oscillation of temperature variation. While they found that the CO2 concentration contributed a small trend, its influence on global temperature variation accounted for no more than 40.19% of the total increase.

Perhaps it's all a coincidence. Or perhaps we have yet to see much global warming from CO2, and it's all going to suddenly leap out and ambush us in 20 years' time.

Maybe, but speculation is a mug's game. Just the facts, please. The last 50 years coincide with an upswing in the 60-year cycle, and the recent flat global temperatures coincide with the peak and subsequent downturn.

## Is OHC Accelerating (II)?

To test the rigor of the Steffen/Wong statement that “not only is the OHC increasing, it is increasing faster”, we previously used a linear regression model including natural cycles. The question was raised of confounding between an upward trend and the part of the quadratic term representing ‘acceleration’. This risk is increased by the short run of data (only 54 years), and also because the phase of the periodic terms is a free variable: the phase is free because both sin() and cos() are used.

The phase can be bounded easily by the simplification below. I introduce 1976 as the start date for the sin() periodic: the date of the Great Pacific Climate Shift, a widely recognized change in ocean and atmospheric phenomena. The code for obtaining the probability that the model is improved by a quadratic term is then:
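
Something along these lines (a sketch; the stand-in `OHC` series is a placeholder for the NODC data):

```r
# Stand-in annual OHC series, 1955-2008 -- replace with the NODC data
set.seed(1)
OHC <- ts(0.1 * (1:54) + sin((1955:2008 - 1976) * 2 * pi / 60) + rnorm(54),
          start = 1955)

x <- time(OHC)
f <- (x - 1976) * 2 * pi / 60            # phase bound at the 1976 climate shift
M1 <- lm(OHC ~ x + sin(f))               # linear trend + phase-bound cycle
M2 <- lm(OHC ~ x + sin(f) + I(x^2))      # add the quadratic 'acceleration' term
anova(M1, M2)                            # Pr(>F) tests the quadratic improvement
```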

## Is OHC Accelerating?

Code and figures to quantify the answer to the question “Is ocean heat content accelerating?” are below. The idea is that ‘acceleration’ is synonymous with the significance of a quadratic term in a regression:

1. Annual OHC data from NODC.

2. Fit a regression model (M1) incorporating linear and periodic terms of period 60 years (to account for Pacific Decadal Oscillation):

```r
x <- time(OHC)
f <- x * pi * 2 / 60
M1 <- lm(OHC ~ x + sin(f) + cos(f))
```

3. Fit another regression model with the addition of a quadratic term:

```r
M2 <- lm(OHC ~ x + sin(f) + cos(f) + I(x^2))
```

4. Compare the reduction in the regression sum of squares due to the incorporation of the quadratic term, taking into account the loss of degrees of freedom due to autocorrelation (see http://en.wikipedia.org/wiki/F-test for tests of nested models).
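
Step 4 can be sketched as follows (my version of the adjustment, not necessarily the one used here: the effective sample size is reduced using the lag-1 autocorrelation of the residuals; the models are restated with stand-in data so the snippet runs on its own):

```r
# Stand-in annual OHC series -- replace with the NODC data
set.seed(1)
OHC <- ts(0.1 * (1:54) + sin((1955:2008) * 2 * pi / 60) + rnorm(54), start = 1955)
x <- time(OHC)
f <- x * pi * 2 / 60
M1 <- lm(OHC ~ x + sin(f) + cos(f))
M2 <- lm(OHC ~ x + sin(f) + cos(f) + I(x^2))

rss1 <- sum(resid(M1)^2)
rss2 <- sum(resid(M2)^2)
r1   <- acf(resid(M2), plot = FALSE)$acf[2]       # lag-1 autocorrelation
n    <- length(resid(M2))
neff <- n * (1 - r1) / (1 + r1)                   # effective sample size
Fstat <- (rss1 - rss2) / (rss2 / (neff - 5))      # M2 has 5 parameters
p <- pf(Fstat, 1, neff - 5, lower.tail = FALSE)   # p value for the quadratic term
```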

The result below shows M1 as a solid line and M2 as a dashed line. The p value for the F test is a marginally significant 0.052 (not significant at the 95% CL) for an improvement in the model due to addition of a quadratic term.

## Links for October

Innovation is not dead as Dyson eliminates the fan (maybe there is one inside though), showing the power of careful attention to shape.

Rational policy analysis is not an oxymoron, as Peter Gallagher deconstructs the emissions trading scheme (ETS).

## Ocean Heat Content Stumbles

We’ll be watching the drop in ocean heat content (OHC) raised by the brilliant Bob Tisdale for a potential follow-up to the Recent Climate Observations: Disagreement With Projections paper, where observations disproved speculations.

To some, the OHC represents a change in alarmist direction that became evident as a result of the due diligence activities of Senator Fielding and the Minister for Climate Change and Water, Penny Wong. According to Joanne Nova,

the alarmists have abandoned air temperatures as a measure of global temperature, because the air temperature graphs are just too hard to argue with, and switched to ocean temperatures, which they often disguise as ocean heat content (a huge number like 15×10²² Joules sounds much more scary than the warming it implies of 0.003 °C/year).
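
As a rough check of the quoted conversion (the layer depth, ocean area, seawater properties and the ~50-year accumulation period are my assumptions):

```r
area   <- 3.6e14    # ocean surface area, m^2
depth  <- 700       # NODC OHC covers the 0-700 m layer, m
rho    <- 1025      # seawater density, kg/m^3
cp     <- 4000      # specific heat of seawater, J/(kg K)
joules <- 15e22     # quoted rise in ocean heat content, J
dT <- joules / (area * depth * rho * cp)   # total warming of the layer, ~0.15 K
dT / 50                                    # over ~50 years: ~0.003 K/yr
```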

The next step would be to check the unattributed graphic from the Climate Minister’s response to Senator Fielding below.

Surely there is robust statistical evidence and the Climate Minister is not trying to mislead when she states that:

not only is the heat content of the ocean increasing,

it is increasing faster.

The citation for the graph on OHC in the error-ridden Copenhagen Synthesis Report is Improved estimates of upper-ocean warming and multi-decadal sea-level rise which says nothing about an acceleration in the OHC.

I might start to gather information for a possible note along these lines, so if anyone can point me to a study justifying her claims of accelerating OHC I would appreciate it. My eye doesn’t see unusual rates on the graph at any scale.

## Briffa McIntyre tree-rings etc

My comments on the topical ‘Yamal’ issue:

My AIG article demonstrating the reconstruction of a hockey stick from red noise neatly illustrated the possibility of circular reasoning in screening trees by their response to temperature. Around 20% of random series (or 40% if you count the inverted ones) correlate significantly with the instrumental temperature record of the last 150 years, and when averaged back beyond that period they create the straight handle of the stick.
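
The effect is easy to reproduce (an illustrative sketch, not the AIG article's code: the series count, AR coefficient and screening rule are my choices):

```r
set.seed(42)
n.series <- 1000; n.yrs <- 600; n.cal <- 150
target <- seq(0, 1, length.out = n.cal)        # rising 'instrumental' record
noise  <- replicate(n.series,
                    as.numeric(arima.sim(list(ar = 0.9), n.yrs)))  # red noise
cal  <- noise[(n.yrs - n.cal + 1):n.yrs, ]     # last 150 'years' of each series
keep <- apply(cal, 2, function(s)
              cor(s, target) > 0 && cor.test(s, target)$p.value < 0.05)
recon <- rowMeans(noise[, keep])  # flat 'handle' ending in an upturned 'blade'
```

Averaging only the series that pass the screen produces a curve that is flat before the calibration period and rises within it: a hockey stick manufactured from noise.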

## Magnetism Science Exercise

This numeracy exercise for schools can be adapted to all grades from prep up. You need:

1. A number of strong magnets, at least two per group

2. Iron filings

3. A selection of nuts of various metals (iron, copper, brass) and some lead sinkers.

The hour-long exercise is presented as an introduction to scientific thinking in the higher grades. I don’t worry about this for lower grades.

I usually choose an assistant to help hand out magnets, though the assistant changes throughout.

1. Discovery – I introduce the magnets as my “floater rays”. I let them explore the repulsion of the magnets, showing how the magnet can “float” above another if in the correct orientation. I also do the moving the magnet under the desk trick.

2. Theory formation/induction – I hand out more magnets and suggest they find out what magnets are attracted to. After exploring for a while I solicit a theory, invariably “metal”. I ask them to put up their hands if they think this is a good theory. (All usually agree with the theory.)

3. Falsification/tests – I hand out nuts of various metals, indicating that we are going to test this theory. They soon get the point that their theory is wrong, and I cross it out on the board.

4. Refinement – Together we refine the theory that only certain metals are attracted, etc.

5. Visualization/schematics – I place a bar magnet (or the group of 8 or so magnets) under a piece of paper and sprinkle iron filings on top. “Awesome!” and “Cool!” are the usual responses at this stage. I introduce it by asking, “Would you like to see my floater rays?”

I get them to draw the rays on a piece of paper, but then go around and draw the schematic lines of magnetic force (the familiar magnetic field of a bar magnet). I explain the difference between an artist’s impression and a schematic diagram.

6. Measurement/Numeracy – For prep and lower classes, I will get them to number the field lines in their schematic. For higher grades I go into the measurement phase, asking them to design an apparatus for measuring the strength of one or two magnets.

The way I do it is to attach one of the iron nuts to three or so rubber bands and measure the extension as a magnet pulls the nut against the bands. This can be done next to a ruler, so it is very easy to set up. I am sure there are other ways.

As I collect the (3) lengths from the groups, I list them in a table, calculate the differences (to get the extensions), then add the numbers up. This is an opportunity to test their mental arithmetic and encourage agility.

The strength of one, two, three magnets, etc. can be estimated from the overall result. More elaborate approaches are possible with higher-level classes.