The value of tau

Admin: Posted up for Steve, with an initial response by Miklos. The slides Steve referred to are here. My bad for not telling Miklos that.

Link to TF&K08

Miskolczi's theory proposes a tau (Ta, if you will) significantly different from that found by at least a dozen other studies published in the peer-reviewed literature over more than a decade, as well as a number of other new relations (A_A = E_D, f = 2/3, and so on).

GBR recovery

A feel-good story of nature’s resiliency, “Doom and Boom on a Resilient Reef: Climate Change, Algal Overgrowth and Coral Recovery” has been making press with headlines focusing on the state of mind of the authors:

Marine scientists say they are astonished at the spectacular recovery of certain coral reefs in Australia’s Great Barrier Reef Marine Park from a devastating coral bleaching event in 2006.


Errors of Global Warming Effects Modeling

Since 2006, in between promoting numeracy in education and using topical issues from the theory of Anthropogenic Global Warming (AGW) to illustrate simple statistics, I have asked the question "Have these models been validated?" in blog posts and occasionally in submissions to journals. This post summarizes those efforts.

Species Extinctions

Predictions of massive species extinctions due to AGW came into prominence with a January 2004 paper in Nature called Extinction Risk from Climate Change by Chris Thomas et al. They made the following prediction:

“we predict, on the basis of mid-range climate-warming scenarios for 2050, that 15–37% of species in our sample of regions and taxa will be ‘committed to extinction’.”

Subsequently, three communications appeared in Nature in July 2004. Two raised technical problems, including one by the eminent ecologist Joan Roughgarden. Opinions ranged from "Dangers of Crying Wolf over Risk of Extinctions", concerned with the damage done to conservation by alarmism through poorly written press releases by the scientists themselves, to "Extinction risk [press] coverage is worth the inaccuracies", which stated that "we believe the benefits of the wide release greatly outweighed the negative effects of errors in reporting".

As one of those who believe that gross scientific inaccuracies are not justified, and that such attitudes diminish the standing of scientists, I was invited to a meeting of a multidisciplinary group of 19 scientists, including Daniel Botkin from UC Santa Barbara, mathematician Matt Sobel, Craig Loehle and others, at the Copenhagen base of Bjørn Lomborg, author of The Skeptical Environmentalist. This resulted in "Forecasting the Effects of Global Warming on Biodiversity", published in BioScience in 2007. We were particularly concerned by the cavalier attitude to model validation in the Thomas paper, and in the field in general:

Of the modeling papers we have reviewed, only a few were validated. Commonly, these papers simply correlate present distribution of species with climate variables, then replot the climate for the future from a climate model and, finally, use one-to-one mapping to replot the future distribution of the species, without any validation using independent data. Although some are clear about some of their assumptions (mainly equilibrium assumptions), readers who are not experts in modeling can easily misinterpret the results as valid and validated. For example, Hitz and Smith (2004) discuss many possible effects of global warming on the basis of a review of modeling papers, and in this kind of analysis the unvalidated assumptions of models would most likely be ignored.

The paper observed that few mass extinctions have been documented during recent rapid climate changes, suggesting that something must be wrong with models that produce such high rates of extinction. They speculated that species may survive in refugia: suitable habitats below the spatial scale of the models.

Another example of an unvalidated assumption that could bias results in the direction of extinctions was described in chapter 7 of my book Niche Modeling.

[Figure: range_shift]

When climate change shifts a species' niche over a landscape (dashed to solid circle), the response of that species can be described in three ways: dispersing to the new range (migration), contracting to the overlap (intersection, i.e. local extirpation), or expanding across both (union). Since the probability of extinction rises as range size shrinks, there will be either no change (migration), an increase (intersection), or a decrease (union) in extinctions, depending on the dispersal type. Thomas et al. failed to consider range expansion (union), a behavior that predominates in many groups. Consequently, the methodology was inherently biased towards extinctions.

One of the many errors in this work was a failure to evaluate the impact of such assumptions.
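To make the effect of that assumption concrete, here is a minimal R sketch (with made-up numbers, not the Thomas et al. data or method) comparing the projected range area, and a simple species-area style extinction estimate, under the three dispersal types for a circular niche shifted across a landscape.

```r
# Projected range area and a species-area style extinction estimate under the
# three dispersal assumptions (illustrative numbers only).

circle_overlap <- function(r, d) {
  # area of intersection of two circles of radius r whose centres are d apart
  if (d >= 2 * r) return(0)
  2 * r^2 * acos(d / (2 * r)) - (d / 2) * sqrt(4 * r^2 - d^2)
}

r  <- 1           # niche radius (arbitrary units)
d  <- 1.2         # distance the niche shifts under climate change
A0 <- pi * r^2    # original range area

areas <- c(
  migration    = A0,                            # full dispersal: occupy the shifted niche
  intersection = circle_overlap(r, d),          # no dispersal: only the overlap remains
  union        = 2 * A0 - circle_overlap(r, d)  # expansion: old range plus new range
)

# species-area style risk, E = 1 - (A/A0)^z with z = 0.25; negative values mean reduced risk
z <- 0.25
round(cbind(area = areas, ratio = areas / A0, risk = 1 - (areas / A0)^z), 3)
```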

The prevailing view now, according to Stephen Williams (coauthor of the Thomas paper, Director of the Centre for Tropical Biodiversity and Climate Change, and author of such classics as "Climate change in Australian tropical rainforests: an impending environmental catastrophe"), may be found here.

Many unknowns remain in projecting extinctions, and the values provided in Thomas et al. (2004) should not be taken as precise predictions. … Despite these uncertainties, Thomas et al. (2004) believe that the consistent overall conclusions across analyses establish that anthropogenic climate warming at least ranks alongside other recognized threats to global biodiversity.

So how precise are the figures? Williams suggests we should simply trust the beliefs of Thomas et al., an approach referred to disparagingly in the forecasting literature as a judgmental forecast rather than a scientific forecast (Green & Armstrong 2007). These simple models also gloss over numerous problems in validating extinction models, including the propensity of so-called extinct species to reappear; usually they are small, hard to find, and no one is really looking for them.

Hockey-stick

One of the pillars of AGW is the view that 20th-century warmth is exceptional in the context of the past 1200 years, illustrated by the famous hockey-stick graph, as seen in movies and government reports to this day.

Claims that 20th-century warming is 'exceptional' rely on selection of so-called temperature 'proxies' such as tree rings, and on statistical tests of the significance of changes in growth. I modelled the proxy selection process here and showed that you can get a hockey-stick shape using random numbers (with serial correlation). When the numbers trend, and are then selected based on correlation with recent temperatures, the result is inevitably 'hockey stick' shaped: a distinct uptick where the random series correlate with recent temperatures, and a long straight shaft as the series revert to the mean. My reconstruction was similar to many other reconstructions, with a low-variance medieval warm period (MWP).

[Figure: from-clipboard-2]
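The selection effect is easy to reproduce. Below is a minimal R sketch along the lines of the simulation described above (my illustration, not the original code): generate many serially correlated red-noise series, keep only those that correlate with a recent 'temperature' uptrend over the calibration period, and average the survivors. The composite acquires a hockey-stick shape even though every input series is pure noise.

```r
# Hockey stick from screened red noise: a crude CPS-style composite of AR(1) series.
set.seed(42)
n_series <- 1000; n_years <- 1000; calib <- 901:1000

# 'instrumental temperature': an uptrend over the calibration period only
temp <- seq(0, 1, length.out = length(calib))

# red-noise proxies: AR(1) with strong serial correlation
proxies <- replicate(n_series, as.numeric(arima.sim(list(ar = 0.95), n = n_years)))

# screening: keep only proxies whose calibration-period correlation with 'temperature' is high
r <- apply(proxies[calib, ], 2, cor, y = temp)
selected <- proxies[, r > 0.3]

# average the survivors: an uptick in the calibration period, a flat shaft before it
composite <- rowMeans(scale(selected))
plot(composite, type = "l", xlab = "Year", ylab = "Composite (sd units)",
     main = sprintf("Composite of %d screened red-noise series", ncol(selected)))
```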

It is an error to underestimate the effect on uncertainty of ex-post selection based on correlation, or 'cherry picking'. Cherry picking has been much criticised on ClimateAudit. In February 2009, Steve McIntyre and Ross McKitrick published a comment criticising an article by Michael Mann; it cited my AIG article, saying:

Numerous other problems undermine their conclusions. Their CPS reconstruction screens proxies by calibration-period correlation, a procedure known to generate ‘‘hockey sticks’’ from red noise (4).

Michael Mann's response acknowledged that such screening was common and was used in some of their reconstructions, but claimed the criticism was 'unsupported' in the literature:

McIntyre and McKitrick’s claim that the common procedure (6) of screening proxy data (used in some of our reconstructions) generates ‘‘hockey sticks’’ is unsupported in peer-reviewed literature and reflects an unfamiliarity with the concept of screening regression/validation.

In fact, it is supported in the peer-reviewed literature: Gerd Bürger raised the same objection in a comment in Science (29 June 2007) on "The Spatial Extent of 20th-Century Warmth in the Context of the Past 1200 Years" by Osborn and Briffa, finding 20th-century warming not exceptional:

However, their finding that the spatial extent of 20th-century warming is exceptional ignores the effect of proxy screening on the corresponding significance levels. After appropriate correction, the significance of the 20th-century warming anomaly disappears.

The National Academy of Sciences agreed that the uncertainty was greater than appreciated, and shortened the hockey stick of the time by 600 years (contrary to assertions in the press).

Long Term Persistence (LTP)


Here is one of my first PHP applications, a fractional differencing climate simulation. Reload to see a new simulation below, together with measures of correlation (r2 and RE) with some monthly climate figures of the time.

This little application gathered a lot of interest, I think because fractional differencing is an inherently interesting technique, creates realistic temperature simulations, and is a very elegant way to generate series with long term persistence (LTP), a statistical property that generates natural ‘trendiness’. One of the persistent errors in climate science has been the failure to take into account the autocorrelation in climate data, leading to inflated significance values.
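The original application was written in PHP and is not reproduced here, but the idea can be sketched in a few lines of base R: simulate an ARFIMA(0,d,0) series (fractional differencing), then note how strong the lag-1 autocorrelation is and how easily a naive regression finds a 'significant' trend in what is pure noise.

```r
# Minimal ARFIMA(0,d,0) (fractional differencing) simulator in base R.
# 0 < d < 0.5 gives long term persistence (LTP); d = 0 is white noise.
fd_sim <- function(n, d, burn = 500) {
  e   <- rnorm(n + burn)
  # MA weights psi_k = prod_{j=1..k} (j - 1 + d)/j, truncated at 'burn' terms
  psi <- cumprod(c(1, (seq_len(burn - 1) - 1 + d) / seq_len(burn - 1)))
  x   <- stats::filter(e, psi, sides = 1)   # convolve the noise with the weights
  as.numeric(tail(x, n))
}

set.seed(1)
lt <- fd_sim(360, d = 0.4)   # 30 'years' of monthly values with LTP
wn <- rnorm(360)             # white noise of the same length, for comparison

# the LTP series has strong lag-1 autocorrelation and 'trendiness' despite no real trend
c(acf_lt = acf(lt, plot = FALSE)$acf[2], acf_wn = acf(wn, plot = FALSE)$acf[2])
summary(lm(lt ~ seq_along(lt)))$coefficients[2, 4]   # naive (inflated) trend p-value
```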

It has been noted that there are no requirements for verified accuracy for climate models to be incorporated into the IPCC. Perhaps if I got my random model published it would qualify. It would be a good benchmark.

Extreme Sensitivity

“According to a new U.N. report, the global warming outlook is much worse than originally predicted. Which is pretty bad when they originally predicted it would destroy the planet.” –Jay Leno

The paper by Rahmstorf et al. must rank as one of the most quotable of all time.

The data available for the period since 1990 raise concerns that the climate system, in particular sea level, may be responding more quickly to climate change than our current generation of models indicates.

This claim, made without the benefit of any statistical analysis or significance testing, is widely quoted to justify claims that the climate system is "responding more strongly than we thought". I debated this paper with Stefan at RealClimate, and succeeded in demonstrating that they had grossly underestimated the uncertainty.

His main defense was that the end-point uncertainty would only affect the last 5 points of the smoothed trend line with an 11-point embedding. Here the global temperatures were smoothed using a complex method called Singular Spectrum Analysis (SSA). I gave examples of SSA and other methods where the end-point uncertainty affected virtually all points in the smoothed trend line, and certainly more than the last 5. Stefan clearly had little idea of how SSA worked. His final message, without an argument, was:

[Response: If you really think you’d come to a different conclusion with a different analysis method, I suggest you submit it to a journal, like we did. I am unconvinced, though. -stefan]
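Without reproducing the SSA calculation, the general point about end-point sensitivity can be illustrated with any smoother. The sketch below uses lowess in R (a generic smoother, not SSA, and simulated data rather than the GISS series) to show that adding a few more years of data changes the smoothed trend well beyond the last few points.

```r
# End-point sensitivity of a smoothed trend (generic lowess smoother, not SSA).
set.seed(7)
yrs  <- 1:30
temp <- 0.01 * yrs + as.numeric(arima.sim(list(ar = 0.7), n = 30, sd = 0.15))

fit_short <- lowess(yrs[1:25], temp[1:25], f = 0.5)   # smooth using data up to 'year 25'
fit_full  <- lowess(yrs,       temp,       f = 0.5)   # smooth again with 5 more 'years'

# change in the smoothed values over the common years after adding new data
round(fit_full$y[1:25] - fit_short$y, 3)
# the largest changes are near the end, but they are not confined to the last few points
```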

To add insult to injury, this paper figured prominently in the Interim Report of the Garnaut Review, to which I had put in a submission.

“Developments in mainstream scientific opinion on the relationship between emissions, accumulations and climate outcomes, and the Review’s own work on future business-as-usual global emissions, suggest that the world is moving towards high risks of dangerous climate change more rapidly than has generally been understood.”

As time moves on and more data become available, a trend line using the same technique regresses to the mean. It is increasingly clear that the apparent upturn was probably due to the 1998 El Nino. It is an error to regard a short-term deviation as an important indication of heightened climate sensitivity.

More Droughts

The CSIRO Climate Adaptation Flagship produced a Drought Exceptional Circumstances Report (DECR), suggesting among other things that droughts would double in the coming decades. Released in the middle of a major drought in southern Australia, this glossy report had all the hallmarks of promotional literature. I clashed with CSIRO, first over the release of their data, and then over attempts to elicit a formal response to the issues raised. My main concern was that there was no apparent attempt to demonstrate that the climate models used in the report were fit for the purpose of modeling drought, particularly rainfall.

One of the main results of my review of the data is summed up in the following graph, comparing the predicted frequency and severity of low rainfall over the last hundred years with the observed frequency and severity. It is quite clear that the models are inversely related to the observations.

[Figure: image003]

A comment submitted to the Australian Meteorological Magazine was recently rejected. Here I tested the models and observations following the approach of Rybski of analyzing differences between the discrete periods 1900-1950 and 1950-2000. The table below shows that while observed drought decreased significantly between the periods, modeled drought increased significantly.

Table 1: Mean percentage area of exceptionally low rainfall over the time periods suggested by KB09. A Mann-Whitney rank-sum test (wilcox.test(x, y) in R) shows significant differences between periods.

                             1900-2007   1900-1967   1951-2007   P (1900-2007 vs 1951-2007)   P (1900-1950 vs 1951-2007)
Observed % area in drought   5.6±0.5     6.2±0.7     4.9±0.6     0.10                         0.004
Modelled % area in drought   5.5±0.1     4.8±0.2     6.2±0.2     0.006                        <0.001
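For readers who want to reproduce the style of test in Table 1, a minimal R sketch follows. The drought-area series here is a random placeholder, not the DECR data; the point is only the mechanics of comparing the two periods with wilcox.test.

```r
# Sketch of the rank-sum comparison used in Table 1, with placeholder data.
set.seed(123)
years <- 1900:2007
pct_area_drought <- pmax(rnorm(length(years), mean = 5.5, sd = 2), 0)  # stand-in series

early <- pct_area_drought[years <= 1950]
late  <- pct_area_drought[years >= 1951]

# two-sided Mann-Whitney (Wilcoxon rank-sum) test of 1900-1950 vs 1951-2007
wilcox.test(early, late)

# means with a simple standard error, as reported in the table
c(mean_early = mean(early), se_early = sd(early) / sqrt(length(early)),
  mean_late  = mean(late),  se_late  = sd(late)  / sqrt(length(late)))
```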

Moreover I showed that while similar results were reported for temperature in the DECR (where models and observations are more consistent), they were not reported for rainfall.

The reviewers did not comment on the statistical proof that the models were useless at predicting drought. Instead, they pointed to Fig 10 in the DECR, a rough graphic, claiming “the models did a reasonable job of simulating the variability”. I am not aware of any statistical basis for model validation by the casual matching of the variability of observations to models. The widespread acceptance of such low standards of model validation is apparently a feature of climate science.

Former Head of the Australian Bureau of Statistics, Ian Castles, solicited a review by independent Accredited Statisticians at the ANU (Brewer and others). They concurred that the models in the DECR required validation (along with making other interesting points).

Dr Stockwell has argued that the GCMs should be subject to testing of their adequacy using historical or external data. We agree that this should be undertaken as a matter of course by all modelers. It is not clear from the DECR whether or not any such validation analyses have been undertaken by CSIRO/BoM. If they have, we urge CSIRO/BoM make the results available so that readers can make their own judgments as to the accuracy of the forecasts. If they have not, we urge them to undertake some.

A persistent error in climate science is using models when they have not been shown to be ‘fit for purpose’.

Miskolczi

Recently a paper came out that potentially undermines the central assumptions of climate modeling. Supported by extensive empirical validation, it suggests that 'optical depth' in the atmosphere is maintained at an optimal, constant value (on average over the long term). Finding an initially negligible sensitivity of 0.24C surface temperature increase for a doubling of CO2, it goes on to suggest constraints that ensure equilibrium will eventually be established, giving no increase in temperature, due to reversion to the constant optical depth. The paper by Ferenc Miskolczi (2007), "Greenhouse effect in semi-transparent planetary atmospheres", was published in the Quarterly Journal of the Hungarian Meteorological Service, January–March 2007.

I was initially impressed by the extensive validation of his theory using empirical data. Despite a furious debate online, there has been no peer-reviewed rebuttal to date. The pro-AGW blog site RealClimate promised a rebuttal by "students" but to date has produced none. This suggests either that it is being carefully ignored, or that it is transparently flawed.

Quite recently Ken Gregory encouraged Ferenc to run his model using actual recorded water vapor data, which show a decline in the upper atmosphere over the last few decades. While there are large uncertainties associated with these data, they do show a decline consistent with Ferenc's theory that water vapor (a greenhouse gas) will decline to compensate for increased CO2. The results of Miskolczi's calculations using his line-by-line HARTCODE program are given here.

The theoretical aspects of Ferenc's theory have been furiously debated online. I am not sure that any conclusions have been reached, but nor has his theory been disproved.

Conclusions

What often happens is that a publication appears and gets a lot of excited attention. Then some time later, rather quietly, subsequent work is published that questions the claim or substantially weakens it. But that doesn't get any headlines, and the citation rate is typically 10:1 in favor of the alarmist claims. It does not help that the IPCC report selectively cites studies and presents unvalidated projections as 'highly likely', which shows they are largely expert forecasts, not scientific forecasts.

All of the 'errors' here can be attributed to exaggeration of the significance of the findings, due to inadequate rigor in the validation of models. The view that this is a growing problem is shared by new studies of rigor from the intelligence community, and it applies even more to data derived so easily from computer modeling.

The proliferation of data accessibility has exacerbated the risk of shallowness in information analysis, making it increasingly difficult to tell when analysis is sufficient for making decisions or changing plans, even as it becomes increasingly easy to find seemingly relevant data.

I also agree with John P. A. Ioannidis, who in a wide-ranging study of medical journals found that Most Published Research Findings Are False. To my mind, when the methodologies underlying AGW are scrutinized, the findings seem to match the prevailing bias. To make matters worse, in most cases the response of the scientific community has been to carefully ignore, dissemble, or attack dissenters ad hominem, instead of initiating vigorous programs to improve rigor in problem areas.

We need to adopt more practices from clinical research, such as the structured review, whereby the basis for evaluating evidence for or against an issue is well defined. In this view, the IPCC is simply a review of the literature, one among reviews by competing groups (such as the 2008 NIPCC report, Nature, Not Human Activity, Rules the Climate). In other words, stop pretending scientists are unbiased, and put systems in place to help prevent 'group-think' and promote more vigorous testing of models against reality.

If the very slow to nonexistent rate of increase in global temperature continues, we will be treated to the spectacle of otherwise competent researchers clinging to extreme AGW while the public become more cynical and uninterested. This could have been avoided if they had been confronted with: "Are these models validated? If they are, by all means make your forecasts; if not, don't."

Newcastle Lecture Update

Still on my way home after the lecture at Newcastle University by Miklós Zágoni and myself, this will be short. The lecture was well attended, with around 50 people, which was surprising considering the campus is on a break and parking is at a premium. The lectures were well received, with a very engaged and relevant question time. There were some suggestions of disruption by anti-skeptics, but they did not eventuate.

Transcript: errors-of-agw-science

Powerpoint: newcastle-presentation

Press Release: herald-article-14-4-09

Miklós Zágoni talk: long version


Newcastle Lecture Wednesday 15th April

Miklós Zágoni and I will be speaking in a public lecture at 1pm on Wednesday the 15th of April at the Engineering faculty, Newcastle University, in lecture theater ES203. Miklós will speak on the theory of Ferenc Miskolczi, and I will give a short introduction to the work on global warming from this blog over the last 3 years.

A much longer version of my talk is incorporated into a new “Highlights” page.


Jan Pompe Science Project

Some time ago I had a brief discussion with Leif Svalgaard on the ClimateAudit blog, inspired by an exchange between Leif and David Archibald in which the latter complained that Leif's TSI reconstruction was "too flat".

The point was that sunspots exhibit cyclic variability in the frequency of their cycles, that most thermostats work by pulse width modulation, and that some digital music uses pulse frequency modulation. Both work in a similar manner: the thermal inertia of whatever the thermostat is controlling smooths out the temperature variability, and the demodulator for pulse frequency modulation is a simple low-pass filter, often just a series resistor and shunt capacitor. In both cases only the duty cycle or the frequency varies, not the amplitude. Below is a description of how this behaviour can be simulated with an electrical circuit emulator called 'qucs'.
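The qucs schematic itself is in the full post; as a rough numerical analogue (my own sketch, not Jan's circuit), the same behaviour can be reproduced in a few lines of R: a fixed-amplitude square wave whose duty cycle is slowly modulated is passed through a first-order RC low-pass filter, and the filtered output tracks the duty cycle rather than the amplitude.

```r
# Numerical sketch of pulse-width modulation demodulated by an RC low-pass filter.
dt    <- 1e-5                               # time step (s)
t     <- seq(0, 1, by = dt)                 # 1 second of signal
f_pwm <- 1000                               # switching frequency (Hz)
duty  <- 0.5 + 0.4 * sin(2 * pi * 2 * t)    # duty cycle slowly modulated at 2 Hz

phase <- (t * f_pwm) %% 1
v_in  <- ifelse(phase < duty, 1, 0)         # fixed-amplitude square wave

# first-order RC filter, v_out' = (v_in - v_out)/(R*C), integrated with Euler steps
RC    <- 0.005                              # time constant (s), well above the switching period
v_out <- numeric(length(t))
for (i in 2:length(t)) v_out[i] <- v_out[i - 1] + dt * (v_in[i] - v_out[i - 1]) / RC

plot(t, v_out, type = "l", xlab = "time (s)", ylab = "filtered output",
     main = "RC-filtered PWM follows the duty cycle, not the amplitude")
lines(t, duty, col = "red", lty = 2)        # the modulating duty cycle, for comparison
```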


Oceanic Cyanobacteria in the Modern Global CO2 Cycle

There appears to be a very interesting fine structure to great Southern Ocean (SO) atmospheric CO2 levels if one calculates residuals relative to the ‘official’ NOAA global average. This also applies to individual Northern Hemisphere (NH) and Southern Hemisphere (SH) monitoring stations such as Mauna Loa (MLO) and Easter Island (EIC) respectively, sited in the Northeastern and Southwestern Pacific Gyres.

The purpose of calculating % residuals relative to the (smoothly rising) global average is that this helps factor out the net effects of (temporal) trends in anthropogenic emissions or oceanic upwelling and downwelling across the planet. The graph below illustrates this point. It was obtained by analysing all NOAA monthly near-surface CO2 data from 1982 to 2007, computing the annual average for all global stations, and then computing annual residuals relative to that global average for:

(a) the Mauna Loa (MLO) station only;
(b) the Easter Island (EIC) station only; and
(c) the (unweighted) pooled average of all SO stations from below 30 S to the South Pole.

Please note that residuals above the x = 0 axis are negative (meaning SO or EIC total CO2 levels are below the global mean) and residuals below the x = 0 axis are positive (meaning MLO total CO2 levels are above the global mean). Note also that the residuals of the annual average CO2 at all SO stations are shown with one standard deviation error bars (on the mean for all stations). These have reduced in magnitude over the years as the number of SO CO2 monitoring stations has risen from only 3 in 1982 to a contemporary maximum of 9 stations. Only data were used where a full (monthly) annual record was accredited by NOAA, so that at stations where a full 12-month record was sometimes not available (e.g. due to equipment problems), any estimation of monthly CO2 levels by extrapolation was avoided entirely. In other words, this graph contains no additional 'data massaging' whatsoever.

In this graph we may clearly see that over the period 1982 – 2007 CO2 levels at MLO were always greater than the global average. For MLO 1998 was an obvious peak in exceedance of the global CO2 average but despite the fading of the large 1998 El Nino, at least until 2007 the trend for MLO seemed to be for an increasing margin above the global average.

In this graph we may also clearly see that over the period 1982 – 2007 CO2 levels over the SO were always lower than the global average. For the SO 1998 was not a special year with respect to the negative residual for SO CO2 level below the global average.

However, it can be clearly seen that over the period 1982 – 2007 the (negative) residual of CO2 levels over the SO relative to the global mean has trended towards greater values. In other words over 1982 – 2007 CO2 levels over the entire SO have slowly lagged increasingly below the (rising) global average CO2, falling from about 0.35% below the global average to about 0.55% in recent years – a trend of about 0.1%/decade against the (always rising) global average CO2 level.

Additionally, it can also be seen that the CO2 residual below the global average over the SO has been approaching the long term average residual at the Easter Island Station (EIC), which has always typically lagged about 0.65% below the global average CO2 level since records commenced in 1994.

[Figure: image003]
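For readers who want to see the arithmetic behind these residuals, here is a small R sketch of the calculation described above. The data frame is a placeholder (the station codes and CO2 values are illustrative only); the NOAA station data would need to be downloaded and arranged as annual means by station.

```r
# Sketch of the residual calculation: percentage deviation of a station's (or group's)
# annual mean CO2 from the global annual mean. 'co2' is a placeholder data frame with
# columns year, station, and annual mean CO2 (ppm), built from complete 12-month years.

pct_residual <- function(co2, stations) {
  global <- tapply(co2$ppm, co2$year, mean)          # unweighted global annual mean
  sub    <- co2[co2$station %in% stations, ]
  grp    <- tapply(sub$ppm, sub$year, mean)          # group (e.g. SO stations) annual mean
  yrs    <- intersect(names(global), names(grp))
  100 * (grp[yrs] - global[yrs]) / global[yrs]       # residual as % of the global mean
}

# usage with placeholder numbers
co2 <- data.frame(year = rep(1982:1984, each = 3),
                  station = rep(c("MLO", "EIC", "SPO"), times = 3),
                  ppm = c(341.4, 340.1, 340.0, 342.9, 341.6, 341.4, 344.2, 343.0, 342.8))
pct_residual(co2, "MLO")            # MLO relative to the global average
pct_residual(co2, c("EIC", "SPO"))  # a crude 'SO' group relative to the global average
```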

Following my identification of the above residual and trends in residuals about global CO2 levels I embarked on a spare time investigation to try to understand what may be going on with respect to CO2 dynamics in the SO. My investigations focussed on looking at the important role of cyanobacteria (formerly known as blue-green algae) in the oceans.

Cyanobacteria are very important organisms in the global biosphere: they comprise about 48% of the global living biomass, live in the top 50 m or so of the oceans, and are photosynthetic. They absorb carbon in the form of dissolved CO2 and bicarbonate from seawater and emit oxygen. They are of course the micro-organisms which more or less gave us our 21% oxygen atmosphere following their evolution about 3 Gy ago. After land plants (which evolved from cyanobacteria and now make up about 52% of the world's living biomass) they are the most important photosynthesizers on the planet. After simple water temperature effects on CO2 solubility, the biotic cycle of uptake of CO2 by cyanobacteria is the next most important mechanism which may affect atmospheric CO2 levels over the oceans.

The following text attempts to summarize the outcomes of my investigations thus far. It is not intended to be a definitive or dogmatic statement but is submitted simply to try to raise interest in the very important issue of global cyanobacterial productivity and its relationship to the global carbon cycle, and to make a few speculative comments that readers may wish to comment on and take further.

Below is a two-component graph showing average monthly daytime Chlorophyll a (black) over all oceans over the last ten and a half years, and average monthly daytime sea surface temperatures (SSTs; green) over the last five and a half years, for the latitude band 0 (Equator) to 30 N (i.e. the Sub-Equatorial NH). Chlorophyll a is a (satellite sea-surface colour-sensed) measure of cyanobacterial density ('productivity') at the sea surface.

Note the pronounced 1998 El Nino sea surface temperature (SST) effect on cyanobacterial productivity. Furthermore, please especially note the presence of a bimodal population of cyanobacteria in each annual cycle i.e. a ‘Consortium S’ which blooms more-or-less in summer and a ‘Consortium W’ which blooms more-or-less in winter. Note also the peaks and troughs in annual SSTs.

Note also the increased strength of the winter 2006 and winter 2008 Consortium W blooms.

[Figure: image004]

Now here below is the equivalent graph for the Equatorial oceanic latitude band along the Equator i.e. 15 N – 15 S.

Note again the presence of a bimodal pattern of populations of cyanobacteria in each annual cycle i.e. a ‘Consortium S’ which blooms in summer and a much weaker ‘Consortium W’ which blooms in winter.

[Figure: image006]

Similarly, here below is the equivalent graph for the Sub-Equatorial SH latitude band just below the Equator i.e. 0 – 30 S. Note the almost complete absence of a 1998 El Nino SST effect.

Note also the almost complete absence of a bimodal pattern of population of cyanobacteria in each annual cycle i.e. Consortium W dominates completely – unlike the situation with the Sub-Equatorial NH oceans.

[Figure: image008]

Now here below is the equivalent plot for the mid-NH latitudes 30 N – 60 N. Note again the almost complete absence of a 1998 El Nino SST effect. Note well the now marked and very consistent presence of 'Consortium S', and a shift of the (now stronger) 'Consortium W' to warmer waters later in each year, relative to more equatorial waters. Note also how the weaker 'Consortium S' has nevertheless increased in activity, from a peak Chlorophyll a level of about 0.6 mg/m3 in 1997 to approximately 0.7 mg/m3 in 2006-07.

[Figure: image010]

The final graph shows the equivalent SH plot for the mid-latitudes of 30 S – 60 S. Note the weak but still-evident 1998 El Nino SST effect. However, most importantly, note the complete absence of ‘Consortium S’ unlike the equivalent oceanic band of the NH (30 N – 60N).

There is also a shift of the (now solitary) ‘Consortium W’ to warmer waters later in each year (relative to more equatorial waters).

[Figure: image012]

These graphs are sufficient to demonstrate that the behaviour of two vast crops of oceanic cyanobacteria, which I have arbitrarily labelled 'Consortium S' and 'Consortium W' according to the part of the year in which they approximately bloom, is markedly different in the NH Equatorial and mid-latitude oceans from that in the SO below 30 S (which has only a 'Consortium W' type population).

As far as I know this observation does not appear anywhere in the modern scientific literature on the ocean.

Why there should be two distinct NH oceanic cyanobacterial consortia producing two annual phases of blooming in the NH oceans (i.e. blooming over two fairly distinct water temperature ranges) is an interesting and as yet unresolved question.

Is it a modern adaptation to the hemisphere where most anthropogenic CO2 has been increasingly generated over the last 200 or so years?

Or is it (say) a past-evolved consequence of the timing of (say) iron and silica export in dusts (noting these are limiting nutrients for cyanobacterial growth) from the (proportionately larger) NH continents e.g. Sahara, Gobi etc?

It has long been known that cyanobacterial productivity tends to be higher in NH oceans because of the higher iron and nitrogen nutrient levels in those oceans by comparison with SH oceans. This is a consequence of the greater proportion of land in the NH and the consequently higher observed nutrient levels in the surface layers of NH oceans.

My personal inclination is to infer that mixed populations of NH oceanic cyanobacteria might well already be adapting to the more rapidly increasing NH atmospheric CO2 levels, the higher SSTs there, and probably the larger anthropogenic fixed-nitrogen pollution of the NH, by establishing a stronger 'Consortium S' population to consume those elevated CO2 and fixed-nitrogen nutrient levels.

Yet despite the lack of evidence for two annual consortia in the great SO, the increasing negative deviations of CO2 levels over the Southern Ocean from the global mean CO2 level (the 'residuals'; refer to my first graph above) still strongly suggest that this 'CO2 fertilization effect' is occurring in the SH too.

Possibly we simply can't discern it in the Northern Hemisphere by looking just at annual CO2 residuals relative to the global mean, because the Northern Hemisphere is where the CO2 flux to the atmosphere (both through land- and sea-based aerobic decay of natural organic matter and through anthropogenic emissions) is much greater.

Are these data a modern example of evolution in action? They certainly appear to indicate evolution of the vast crop of oceanic cyanobacteria in the direction of increasing adaptation-to and attenuation-of elevated atmospheric CO2 (from whatever source) and increasing SSTs (regardless of their cause).

There is therefore clear evidence that the modern capacity of the oceans for CO2 removal is regionally variable and, in some regions, is likely increasing.

In my view, this is a result of the effects on cyanobacterial primary productivity of increasing CO2 fertilization, perhaps delayed fertilization from iron and silicon fallout/washout from volcanos like Pinatubo, Chaiten etc, but perhaps most importantly, the massive and rapidly increasing input to the coastal shelves of anthropogenic fixed nitrogen.

It is not often appreciated that the total export of fixed forms of nitrogen into coastal shelf waters by mankind is massive and approximately equal to the sum of all natural exports.

In fact the anthropogenic fraction of total nitrogen emitted from the land to the oceans and the atmosphere is much greater than for the anthropogenic fraction of total carbon emitted from the land to the oceans and the atmosphere. Somehow, in the midst of all the hysteria about anthropogenic carbon emissions we rarely get to consider this extremely significant fact.

Cyanobacterial primary productivity has a negative feedback effect on SST which in turn increases CO2 solubility etc.

This arises through:
• increased sea surface reflectivity (albedo) induced by the blooming of biogenic calcite-secreting cyanobacteria (‘coccolithophores’);
• increased sea surface reflectivity and reduced sea surface evaporation rates caused by mono- and multi-layers of lipids formed on the sea surface during cyanobacterial blooming (via zooplankton predation and the action of cyanobacteriophages); and
• enhanced cloudiness and increased reflectivity of low level clouds by cyanobacteria-emitted dimethylsulfide (DMS) and isoprene-based aerosol Cloud Condensation Nuclei (CCN).

Finally, in the Australian context, I note (a little mischievously) that the reduced SSTs which Cai and Cowan (2006) recently noted for the seas to the north of Australia might be induced by increasing cyanobacterial primary productivity in the coastal shelf zones of South East Asia and the Indonesian archipelago, itself driven by the known increasing anthropogenic nutrient pollution of those shallow seas.

If that were the case then this effect could well be a subtle but key driver of the increasing dryness of the Murray Darling Basin.

Causes of Global Warming

Here is a graph that suggests something intriguing about climate dynamics: global temperature from 1979-2009 from UAH satellite records for land, southern hemisphere ocean, and the globe, each fitted with a 3rd order polynomial. Also plotted are the difference between SH Ocean and Global temperatures, and the difference between SH Ocean and Land temperatures. Notice that the 3rd order polynomial fit of the differences is almost dead straight! The Ocean-Land difference drops a bit in the last 5 years.

[Figure: 3polyglobals]

This shows that despite the ups and downs of temperature, the divergence between global and ocean temperatures has been almost linear over 30 years.
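A sketch of how such a figure can be constructed in R is below. The `uah` data frame here is a synthetic placeholder (with a linear land-ocean divergence built in, just to exercise the code); the actual monthly UAH lower-troposphere anomalies for land, SH ocean, and the globe would need to be read in from the published files.

```r
# Sketch: 3rd order polynomial fits to monthly anomaly series and to their differences.
# 'uah' is a placeholder data frame with columns time (decimal year), globe, land, sh_ocean.

fit3 <- function(t, y) fitted(lm(y ~ poly(t, 3)))   # cubic polynomial trend

plot_poly <- function(uah) {
  with(uah, {
    plot(time, fit3(time, globe), type = "l", ylim = c(-0.6, 0.6),
         xlab = "Year", ylab = "Anomaly (C)")
    lines(time, fit3(time, land), col = "red")
    lines(time, fit3(time, sh_ocean), col = "blue")
    lines(time, fit3(time, sh_ocean - globe), col = "darkgreen", lty = 2)  # near-linear?
    lines(time, fit3(time, sh_ocean - land),  col = "purple",    lty = 2)
  })
}

# placeholder data, for illustration only
set.seed(2)
time <- seq(1979, 2009, by = 1 / 12)
uah  <- data.frame(time = time,
                   globe    = 0.013 * (time - 1979) + rnorm(length(time), sd = 0.10),
                   land     = 0.018 * (time - 1979) + rnorm(length(time), sd = 0.12),
                   sh_ocean = 0.008 * (time - 1979) + rnorm(length(time), sd = 0.10))
plot_poly(uah)
```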

Cosmic ray flux (CRF) effects are postulated to be much stronger over clear air with insufficient condensation nuclei. Hence the oceans, and particularly the southern hemisphere oceans, could be most sensitive to variations in CRF.

There is evidence from various measures of CRF, such as the Be10 concentration in ice cores, that the CRF has been increasing. If that is the case, could the land-ocean divergence be attributed to CRF? Another plausible explanation is that land temperatures have increased due to the urban heat island (UHI) effect during that time, owing to the placement of thermometers in burgeoning urban areas.

Also of note is the sigmoidal shape of the curve, present even with higher-order polynomials. Could it be attributed to the influence of the approximately 22-year solar magnetic activity cycle, over which the polarity of the Sun's magnetic dipole reverses?

Making too much of short time periods is not advisable when talking about climate, which changes over longer periods. Still, it is fascinating to see how easily patterns emerge in support of solar-modulated, CRF-based climate change.