Upper Atmosphere Inflow of Moisture?

For those interested in the theory that upper atmosphere inflow of moisture from the Indian Ocean is a major determinant of rain in Australia, check out the satellite loop for the last 4 hours right now.

Here is the development of the inflow over the last few days.

Note this is in the presence of a high pressure system with an upper atmosphere ridge and trough, as can be seen by the slight deformation of the isobars over Queensland.

Continue reading Upper Atmosphere Inflow of Moisture?

Australian Temperature Records in Question

Ken Stewart is engaged in the first ever independent study of the complete High Quality Australian Site Network. Ken has a series of posts, the first including a lot of background information and explanation. Subsequent posts are not as long, and part 6, covering the data from the Victorian sites, has just been posted.

Like many people, he thought that the analysis of climate change in Australia, and information given to the public and the government, was based on the raw temperature data. He was wrong. He averaged maxima and minima for all stations at each site, then compared the result with the High Quality means. By these calculations (averaging the trend at each site in Victoria), the raw trend is 0.35 degrees C per 100 years, while the High Quality state trend is 0.83 degrees C. That’s a warming bias of 133%!
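
As a rough illustration of the comparison being made, here is a minimal sketch; the station series below are synthetic stand-ins for the raw data, not Ken’s actual numbers, and only the 0.83 figure comes from his post.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1910, 2010)

# Synthetic annual mean temperatures for two hypothetical stations;
# illustrative only, not Ken Stewart's raw data.
raw_series = {
    "station_a": 14.0 + 0.0030 * (years - 1910) + rng.normal(0, 0.3, years.size),
    "station_b": 15.5 + 0.0040 * (years - 1910) + rng.normal(0, 0.3, years.size),
}

def trend_per_century(y):
    """Least-squares linear trend, in degrees C per 100 years."""
    return np.polyfit(years, y, 1)[0] * 100.0

# Average the per-station trends, as in the state-level comparison.
raw_trend = np.mean([trend_per_century(y) for y in raw_series.values()])
hq_trend = 0.83  # the published High Quality trend for Victoria

print(f"raw: {raw_trend:.2f} C/100yr, HQ: {hq_trend:.2f} C/100yr")
print(f"warming bias: {100 * (hq_trend - raw_trend) / raw_trend:.0f}%")
```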


Continue reading Australian Temperature Records in Question

Page-Proofs of the DECR Paper

Corrected the page-proofs of my drought paper today.

CRITIQUE OF DROUGHT MODELS IN THE AUSTRALIAN DROUGHT EXCEPTIONAL CIRCUMSTANCES REPORT (DECR)

ABSTRACT
This paper evaluates the reliability of modeling in the Drought Exceptional Circumstances Report (DECR) where global circulation (or climate) simulations were used to forecast future extremes of temperatures, rainfall and soil moisture. The DECR provided the Australian government with an assessment of the likely future change in the extent and frequency of drought resulting from anthropogenic global warming. Three specific and different statistical techniques show that the simulation of the occurrence of extreme high temperatures last century was adequate, but the simulation of the occurrence of extreme low rainfall was unacceptably poor. In particular, the simulations indicate that the measure of hydrological drought increased significantly last century, while the observations indicate a significant decrease. The main conclusion and purpose of the paper is to provide a case study showing the need for more rigorous and explicit validation of climate models if they are to advise government policy.
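
Not necessarily one of the three techniques used in the paper, but a minimal sketch of the kind of validation check involved: compare the sign and significance of trends in a drought measure between simulations and observations. The series below are synthetic stand-ins for the DECR data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
years = np.arange(1900, 2000)

# Hypothetical annual percentage of area in (low-rainfall) drought.
observed  = 12 - 0.04 * (years - 1900) + rng.normal(0, 3, years.size)
simulated = 10 + 0.05 * (years - 1900) + rng.normal(0, 3, years.size)

for name, series in [("observed", observed), ("simulated", simulated)]:
    res = stats.linregress(years, series)
    print(f"{name}: trend {res.slope:+.3f} %area/yr, p = {res.pvalue:.4f}")

# Opposite-signed, statistically significant trends, as the paper reports
# for droughted area, are a straightforward way a simulation fails validation.
```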

Meanwhile, scientists are finding new ways to communicate worthless forecasts to decision makers.

These models have been the basis of climate information issued for national and seasonal forecasting and have been used extensively by Australian industries and governments. The results of global climate models are complex, and constantly being refined. Scientists are trialling different ways of presenting climate information to make it more useful for a range of people.

Conducting professional validation assessment of models would be a start, followed by admitting they are so uncertain they should be ignored.

Continue reading Page-Proofs of the DECR Paper

Watts Tour at Emerald

Anthony’s Tour continues at a breakneck pace this week — with only four venues to go.

The talks at Emerald that I organized went quite well, considering this is a small regional town. About 80-100 people attended a teaser session at the Property Rights Australia meeting during the day, and around 40 attended at night. We got a standing ovation during the day — the first time for me! The crowd was a mixture of ages and sexes and I think messages of bureaucratic sloth and opportunism resonated with them. Central Queensland turned on one of its trademark sunsets for Anthony:

It was good to spend a bit of time with Anthony and catch up on the goss — well not really gossip, but about bloggers and the people behind the curtain. You know how it is, you tend to get a certain view of the people involved, but when you learn more about them, it turns out they are just regular people who put their hand up for something they believe in.

Continue reading Watts Tour at Emerald

Niche Logic

The ‘strongest male’ is itself a highly variable component.

How do we formalise this as a niche? A preamble: all we have, really, are observations. To put niches into a statistical framework, we have only the expected distributions of those observations (both singly and jointly). Selection (either natural or through our study design) changes the distribution of features, and we observe those changes.

For example, if the sample of breeding males is generally taller than the population of males as a whole, then we could presume there is selective pressure on this feature — an important item of information. This could be detected as statistically significant (e.g. the distributions differ in a chi-squared test).
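
A minimal sketch of such a test, with made-up counts: bin the trait (height, say) and ask whether the breeding sample is distributed differently from the population.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Made-up counts of males in three height bins (short, medium, tall).
population     = np.array([50, 80, 40])   # all males surveyed
breeding_males = np.array([10, 30, 35])   # males observed breeding

# Test whether the two distributions differ (selection on height).
chi2, p, dof, expected = chi2_contingency(np.array([population, breeding_males]))
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")

# A small p-value indicates the breeding sample is not a random draw from
# the population, i.e. detectable selective pressure on the feature.
```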

Continue reading Niche Logic

Niche Theory

A couple of questions from the last nichey post prompted this post. Geoff said that:

I’m not even sure what is meant by an optimal environment for a species/genus/whatever.

while Andrew said that:

it wouldn’t surprise me if a lot of species tend to live at the margins of their “ideal” habitat.

We need a bit of abstraction to address these questions. In a laboratory, a plant would be expected to show a humped response to the main variables of temperature and water availability. The parameterisation of this function can be termed the ‘fundamental niche’ of the species, and may be equated with a physiochemical optimum unaffected by competition.
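
A minimal sketch of such a humped response, assuming a Gaussian parameterisation (one common choice among several); the optima and tolerances below are hypothetical values.

```python
import numpy as np

def fundamental_niche(temp, water, t_opt=22.0, t_tol=5.0, w_opt=0.6, w_tol=0.2):
    """Humped (Gaussian) response surface: performance peaks at the
    physiochemical optimum and falls away with distance from it."""
    return np.exp(-0.5 * ((temp - t_opt) / t_tol) ** 2
                  - 0.5 * ((water - w_opt) / w_tol) ** 2)

print(fundamental_niche(22.0, 0.6))   # 1.0 at the optimum
print(fundamental_niche(30.0, 0.3))   # much lower at a hot, dry site
```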

Continue reading Niche Theory

Sceptics Tour Update

Having just returned from my leg of the tour, I have been offline for a while, but expect to catch up this week. Here is my PowerPoint presentation “Tweeter and the Monkey M(e)an — Negating Climate Change Policy” (4.3MB).

The title comes from a song by the Traveling Wilburys. The message is that without proper validation, climate models are no more credible than Tweets, and from my (and others’) validation testing, the model forecasts are not fit-for-forecasting, showing no more accuracy than the “Monkey Mean” — the average temperature and rainfall. I critique CSIRO and BoM reports and conclude with an example of how to make rational business decisions under climate forecast uncertainty.
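
The “Monkey Mean” comparison amounts to a standard skill score against a climatological-mean benchmark. A minimal sketch with synthetic series (in-sample climatology used for simplicity):

```python
import numpy as np

rng = np.random.default_rng(2)
observed = 20 + rng.normal(0, 1.0, 30)          # hypothetical observations
forecast = observed + rng.normal(0.5, 1.2, 30)  # hypothetical model forecasts

mse_model  = np.mean((forecast - observed) ** 2)
mse_monkey = np.mean((observed.mean() - observed) ** 2)  # climatological mean

# Skill > 0 means the model beats the "Monkey Mean"; <= 0 means it does not.
skill = 1 - mse_model / mse_monkey
print(f"skill score vs climatological mean: {skill:.2f}")
```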

Continue reading Sceptics Tour Update

Extinction artifact in coarse scales

CO2 Science reviews a study showing that the appearance of high levels of extinction due to shifts in climate is an artifact of the coarse resolution of the grid cells used in the simulations. This is another vindication of the conclusion of our 18-author collaboration.

When grid cells are coarse, a one degree shift in temperature, say, affects a large area, and can appear to eliminate all habitat for a species in the grid cell. The virtual species must move a long way to find another suitable grid cell. In actuality each coarse grid cell contains a range of temperatures. When the grid cells are finer, there will most likely be areas within the grid cell with suitable habitat for the species, enabling it to persist through large climate variations.
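
A minimal sketch of the artifact, with made-up numbers: within-cell temperature variation lets some habitat survive a warming shift that eliminates the habitat implied by the cell average.

```python
import numpy as np

rng = np.random.default_rng(3)

# A coarse cell containing 100 fine sub-cells whose local mean temperatures
# span 19-21 C (e.g. through elevation differences).
fine_temps = rng.uniform(19.0, 21.0, 100)
coarse_temp = fine_temps.mean()   # ~20 C, the coarse-cell value

def suitable(t):
    """Species' hypothetical tolerance window."""
    return (t >= 18.5) & (t <= 20.5)

shift = 1.0  # a one-degree warming
print("coarse cell habitable after shift:", bool(suitable(coarse_temp + shift)))
print("fine cells habitable after shift: ",
      int(suitable(fine_temps + shift).sum()), "of 100")

# The coarse model shows total habitat loss; the fine grid retains refugia.
```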

Refugia are well known to have a crucial role in species’ persistence, and may be characterized as areas of high spatial heterogeneity. It is easy to see that choice of scale would have a large effect on determinations of species persistence, and great caution would be needed in interpreting results of simulations conducted on coarse grids.
Continue reading Extinction artifact in coarse scales

Australia's Government Debt

Below is a graph of the blow-out in Australian Government Debt.

I don’t know why everyone is blaming Rudd. Growing the State is what tax-and-spend liberals do.

The idea that the budget should be in deficit for the next four or five years when the economy is at near-full employment, should be laughable. But Rudd and Abbott would prefer to test the electorate’s mendacity than complete our rise as a world-beating economy by paying our own way in recovery.

But the article makes an interesting observation related to Hauser’s Law.

In a speech last year, Treasury secretary Ken Henry identified a little known fact. Government spending exploded under Gough Whitlam, from 18.9 per cent of GDP in 1971-2 to 24.8 per cent in 1975-6. And it has stayed at about that level ever since, through Fraser-Howard, Hawke-Keating and Howard-Costello.

“In the 3 1/2 decades since, while there have been significant annual fluctuations, the average level of spending by the Australian government has changed little, to be around 25.25 per cent of GDP.”

Continue reading Australia's Government Debt

On the Use of the Virial Theorem by Miskolczi

Virial Paper 6_12_2010 submitted by Adolf J. Giger.

Allow me to make some more comments on the Virial Theorem (VT) as used by Ferenc Miskolczi (FM) for the atmosphere.

As I said on this blog back in February, a very fundamental derivation of the VT was made by H. Goldstein in Section 3-4 of “Classical Mechanics”, 1980, Ref. [1]: PE = 2*KE (potential energy = 2 x kinetic energy). He then also derives the Ideal Gas Law (IGL), P*V = N*k*T, as a consequence of the VT, and shows that PE = 3*P*V and KE = (3/2)*N*k*T. The two laws, IGL and VT, are therefore two ways to describe the same physical phenomenon. Despite its seemingly restrictive name, we know that the IGL is a good approximation for many gases, monatomic, diatomic, polyatomic and even water vapor, as long as they remain very dilute. Goldstein’s derivations are made for an enclosure of volume V with constant gas pressure P and temperature T in a central force field like the Earth’s gravitational field. They also hold for an open volume V anywhere in the atmosphere. As to FM, he points out that the VT reflects the fact that the atmosphere is gravitationally bounded.
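
For reference, the algebra linking the two statements, using the relations just quoted, is a one-line derivation:

```latex
% With PE = 3PV and KE = (3/2)NkT, the ideal gas law PV = NkT makes
% the virial relation an identity:
PE = 3PV = 3NkT = 2 \cdot \tfrac{3}{2} NkT = 2\,KE
```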

Ferenc Miskolczi in his papers [2,3] relates the total potential energy of the atmosphere, PEtot, to the total IR upward radiation Su at the surface. This relationship has to be considered a proportionality rather than an exact equality, or Su = const * PEtot. We see that this linkage makes sense, since Su determines the surface temperature Ts through the Stefan-Boltzmann law, Su = (5.6703×10^-8)*Ts^4, and finally the IGL ties together Ts, P(z=0) and PEtot.
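
As a quick numeric check of that Stefan-Boltzmann link, a minimal sketch with a typical global-mean surface flux (illustrative values, not FM’s exact figures):

```python
SIGMA = 5.6703e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

Su = 390.0                  # W m^-2, a typical global-mean surface IR flux
Ts = (Su / SIGMA) ** 0.25   # invert Su = SIGMA * Ts^4
print(f"Ts = {Ts:.1f} K")   # about 288 K, the familiar 15 C

# FM's relation Su = 2 * Eu would then put the upward atmospheric
# emittance near Eu = 195 W m^-2 for this case.
print(f"Eu = {Su / 2:.0f} W m^-2")
```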

FM then assigns the kinetic IR energy KE (temperature) in the atmosphere to the upward atmospheric IR emittance Eu, or Eu = const*KE. The flux Eu is made up of two terms, F + K, where F is due to thermalized absorption of short wave solar radiation in atmospheric water vapor, and K is due to heat transfers from the Earth’s surface to air masses and clouds through evaporation and convection. Neither F nor K is directly radiated from the Earth’s surface; they represent radiation from the atmosphere itself. There is an obvious limitation to such an assignment, mainly because for the VT, or the IGL in general, the temperature (the KE) has to be measured with a thermometer, whereas Eu represents the radiative temperature (flux) that has to be measured with a radiometer, and these two measurements can give vastly different results, as we see for the two following extreme cases:

In between these two extremes we have the Earth, where FM’s version of the VT, Su = 2*Eu, applies reasonably well. We will see next, in a discussion of FM’s exact solution, how closely, and for what types of atmospheres, FM’s VT (Eu/Su = 0.5) holds, but we can say already that no physical principle is violated if it doesn’t. The VT, which always holds for gases, is not being violated; it is simply not fully captured by FM’s fluxes, which have to be measured by radiometers. This may be an indication that the VT is less important for FM’s theory than normally assumed.

On the other hand, the IPCC assumes a positive water vapor feedback and arrives at very imprecise predictions for the climate sensitivity, ranging from 1.5 to 5 K (and even more). It is clear that this wide range of numbers is caused by the assumed positive feedback system, which apparently is close to instability (or singing, as the electrical engineer would call it in an unstable microphone-loudspeaker system). With such large uncertainties in their outputs, true scientists should be reluctant to publish their results.
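
The width of that range follows from the standard linear-feedback relation; a short worked note (the numbers below are illustrative, not specific IPCC figures):

```latex
% Linear feedback amplification of a no-feedback response \Delta T_0:
% as f -> 1 the response diverges, so modest uncertainty in f near that
% point spreads \Delta T widely; e.g. with \Delta T_0 = 1.2 K,
% f = 0.20 gives \Delta T = 1.5 K while f = 0.76 gives \Delta T = 5 K.
\Delta T = \frac{\Delta T_0}{1 - f}
```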

Continue reading On the Use of the Virial Theorem by Miskolczi

No evidence of global warming extinctions

My rebuttal of Thomas’ computer models of massive species extinctions has been mentioned in a statement by Sen. Orrin G. Hatch before the United States Senate, on June 10, 2010.

1. Stockwell (2004) observes that the Thomas models, due to lack of any observed extinction data, are not ‘tried and true,’ and their doctrine of ‘massive extinction’ is actually a case of ‘massive extinction bias.’

[Stockwell, D.R.B. 2004. Biased Toward Extinction, Guest Editorial, CO2 Science 7 (19): http://www.co2science.org/articles/V7/N19/EDIT.php]

The one extinct species mentioned in the Thomas article is now thought to have fallen victim to the 1998 El Nino.

Continue reading No evidence of global warming extinctions

New Miskolczi Manuscript

Ferenc sent out preprints of his upcoming manuscript, and graciously acknowledges a number of us for support, help and encouragement. I particularly like the perturbation and statistical power analysis, checking that a change in the greenhouse effect due to CO2 would likely have been detected if it had been present in the last 61 years.

The Stable Stationary Value of the Earth’s Global Average Atmospheric Planck-weighted Greenhouse-Gas Optical Thickness
by Ferenc Miskolczi,
Energy & Environment, 21:4 2010.

ABSTRACT
By the line-by-line method, a computer program is used to analyze Earth atmospheric radiosonde data from hundreds of weather balloon observations. In terms of a quasi-all-sky protocol, fundamental infrared atmospheric radiative flux components are calculated: at the top boundary, the outgoing long wave radiation, the surface transmitted radiation, and the upward atmospheric emittance; at the bottom boundary, the downward atmospheric emittance. The partition of the outgoing long wave radiation into upward atmospheric emittance and surface transmitted radiation components is based on the accurate computation of the true greenhouse-gas optical thickness for the radiosonde data. New relationships among the flux components have been found and are used to construct a quasi-all-sky model of the earth’s atmospheric energy transfer process. In the 1948-2008 time period the global average annual mean true greenhouse-gas optical thickness is found to be time-stationary. Simulated radiative no-feedback effects of measured actual CO2 change over the 61 years were calculated and found to be of magnitude easily detectable by the empirical data and analytical methods used. The data negate increase in CO2 in the atmosphere as a hypothetical cause for the apparently observed global warming. A hypothesis of significant positive feedback by water vapor effect on atmospheric infrared absorption is also negated by the observed measurements. Apparently major revision of the physics underlying the greenhouse effect is needed.
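
A minimal sketch of the kind of power check the abstract describes (not Miskolczi’s actual computation; the mean, noise level and injected trend below are illustrative assumptions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
years = np.arange(61)          # 1948-2008
tau0, noise_sd = 1.87, 0.02    # assumed mean optical thickness and noise
trend = 0.0005                 # hypothetical per-year trend from a
                               # no-feedback CO2 effect

detected = 0
runs = 1000
for _ in range(runs):
    tau = tau0 + trend * years + rng.normal(0, noise_sd, years.size)
    if stats.linregress(years, tau).pvalue < 0.05:
        detected += 1

# Fraction of simulations in which the injected trend is detected.
print(f"power: {detected / runs:.0%}")
```
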
Continue reading New Miskolczi Manuscript

Species extinction by Johnston

It’s gratifying to see the essay by Johnston getting the attention it deserves (at WUWT and JoNova) after Pielke brought it to our attention. Johnston reviews many areas of climate science in 82 pages of readable prose and concludes:

Insofar as establishment climate science has glossed over and minimized such fundamental questions and uncertainties in climate science, it has created widespread misimpressions that have serious consequences for optimal policy design.

Apparently somebody asked “What does a lawyer know about climate science?”. Well, firstly, he is an environmental law professor. Secondly, in the areas he writes about where I am knowledgeable, he shows surprising insight. His assessment of estimates of species loss (20-30%) due to global warming restates exactly what I said at the time in Biased Towards Extinction:

Given the extensive and foundational criticism by biologists of the methodology underlying the species loss probability prediction generated by Thomas et al., the IPCC’s publication of that probability without qualification seems dangerously misleading, and in any event clearly exemplifies the rhetoric of adversarial persuasion, rather than “unbiased” assessment.

Of the five “problematic” uncertainties and complications that he raises with the Thomas et al. study (and there are many more I might go into), one that I mentioned in the CO2 Science editorial is particularly offensive:

v) Finally, and perhaps most strikingly to my layperson’s sensibilities, the methodology employed by Thomas et al. will “inevitably detect extinctions. Negative changes in the size of a species’ range contribute to an increased extinction risk overall, while positive changes have no net effect on extinctions,” this despite the fact that locally, “the net effect on diversity at any one locality might well be positive, as species spread towards the poles from the most species-rich habitats near the equator.”

Thomas and coauthors achieved this statistical sleight-of-hand by ‘cherry picking’ all species whose home ranges were reduced by warming, and removing those whose home ranges increased. The change in the size of the home range is assumed to affect the survival of the species.

When I questioned Thomas about this, his defense was that the method had been approved by a number of eminent conservation biologists who found it perfectly fine. He said that those species with reduced ranges were at greater risk of extinction from global warming, while those with increasing ranges were of course going to be OK.

However, it apparently takes a layman’s sensibilities to see that if, for every species that decreases its range, another increases its range, then the overall rate of extinctions does not change. And if the overall rate of extinctions does not change, then no increase in extinctions can be expected from global warming.
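
Thomas et al. derived extinction risk from range change via a species-area relationship; a minimal sketch of the asymmetry being objected to, with made-up range changes and an assumed exponent z:

```python
# Species-area extinction risk: E = 1 - (A_new / A_old)**z, z ~ 0.25.
z = 0.25

def extinction_risk(a_old, a_new):
    return 1 - (a_new / a_old) ** z

# Two hypothetical species: one loses half its range, one doubles it.
loser  = extinction_risk(100, 50)    # positive "commitment to extinction"
winner = extinction_risk(100, 200)   # negative value

print(f"range loser:  {loser:+.3f}")
print(f"range winner: {winner:+.3f}")

# If winners are dropped (risk floored at zero), the aggregate can only
# rise; averaging both signed values instead leaves it near unchanged.
print(f"losers only: {loser:.3f};  both averaged: {(loser + winner) / 2:+.3f}")
```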

I tried to explain this trivial point to the coauthors and three rounds of reviewers without success. The final straw was the assertion of one reviewer to the effect that “We know that global warming is going to increase extinctions, so your analysis must be wrong.” I concluded, as Jason Johnston did, that the field “exemplified the rhetoric of adversarial persuasion, rather than ‘unbiased’ assessment”, which is, I suppose, code for ‘green advocacy’.

So, read the summary of the state of play on species losses by Johnston. He mentions the article I helped prepare with 17 scientists in related fields rebutting Thomas et al., which has largely been ignored by the conservation field. He also touches on the murky origins of the iconic statement that:

“[a]pproximately 20-30% of plant and animal species assessed thus far are likely to be at increased risk of extinction if increases in global temperature exceed 1.5 – 2.5°C.”

In what could be called a Species-gate, this statement was eventually attributed to Thomas et al. by the IPCC, despite the fact that the Thomas et al. methodology and manuscript do not mention probability. The determination of species ‘committed to extinction’ is not quantified by probability.

As if anybody cares about accuracy and precision anymore. The sad part is that such papers as Thomas et al., and the minimization of fundamental questions and uncertainties even when legitimate problems are raised, lead people to believe that “the science is settled”, and take us further from knowing the real truth about climate change and survival of species.

Problem 4: Why has certainty not improved

Problem 4. Why has a community of thousands or tens of thousands of climate scientists not managed to improve certainty in core areas in any significant way in more than a decade (eg the climate sensitivity caused by CO2 doubling as evidenced by little change in the IPCC bounds)?

This problem has been the hardest, probably because it takes enormous hubris to claim a solution to a problem that has defeated thousands of usually intelligent people. One man who does just that is Dr Roy Spencer, claiming a huge ‘blunder’ pervades the whole of climate science regarding the direction and magnitude of ocean-cloud feedback, the subject of his upcoming book and paper.

What I want to demonstrate is one of the issues that is almost totally forgotten in the global warming debate: long-term climate changes can be caused by short-term random cloud variations.

The main reason this counter-intuitive mechanism is possible is that the large heat capacity of the ocean retains a memory of past temperature change, and so it experiences a “random-walk” like behavior. It is not a true random walk because the temperature excursions from the average climate state are somewhat constrained by the temperature-dependent emission of infrared radiation to space.

As I showed previously, an AR(1) coefficient of 0.99 is sufficient to change random walk behavior (AR = 1) into the kind of mean-reverting behavior his model shows. This difference is virtually undetectable using the usual tests on the available 150 years of global temperature data. Global temperature cannot be a true random walk, but it can be ‘almost a random walk’. It can also respond to random shocks, such as volcanic eruptions, sudden injections of GHGs, and oscillating solar forcings, while still retaining the near-random-walk character.
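
A minimal sketch of that undetectability claim (an illustrative simulation, not Spencer’s model): simulate AR(1) series at the two coefficients and apply a standard unit-root test.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(5)
n = 150  # roughly the length of the instrumental record, in years

def ar1(phi, n):
    """Simulate an AR(1) series x_t = phi * x_(t-1) + noise."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

for phi in (1.0, 0.99):  # true random walk vs near-random-walk
    pvals = [adfuller(ar1(phi, n))[1] for _ in range(200)]
    rejected = np.mean(np.array(pvals) < 0.05)
    print(f"phi = {phi}: unit root rejected in {rejected:.0%} of runs")

# With phi = 0.99 the test rejects a unit root barely more often than with
# a true random walk: over 150 points the two are nearly indistinguishable.
```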

Continue reading Problem 4: Why has certainty not improved

Frequency dependent climate sensitivity

Nicola Scafetta published another paper today, confirming the period dependence of climate sensitivity. (I would have loved to have written this, but he attributes the original idea to a book chapter by Wigley in 1988, so it’s not original anyway.)

In his words, climate sensitivity is frequency dependent:

However, the multiple linear regression analysis is not optimal because the parameters k_i and τ_i might be time-dependent and, in such a case, keeping them constant would yield serious systematic errors in the evaluation of the parameters k_i. Moreover, climate models predict that the climate sensitivity to cyclical forcing increases at lower frequencies because of the strong frequency-dependent damping effect of ocean thermal inertia [Wigley, 1988; Foukal et al., 2004].
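
The quoted mechanism can be seen in a one-box energy balance model, C dT/dt = F(t) − λT: sinusoidal forcing of angular frequency ω produces a temperature response of amplitude 1/√(λ² + (Cω)²) per unit forcing, which rises toward the equilibrium sensitivity 1/λ at long periods. A minimal sketch, with illustrative values for C and λ:

```python
import numpy as np

C   = 3.0e8   # J m^-2 K^-1, illustrative ocean mixed-layer heat capacity
lam = 1.25    # W m^-2 K^-1, illustrative feedback parameter

def sensitivity(period_years):
    """Amplitude response (K per W m^-2) of C dT/dt = F - lam*T
    to sinusoidal forcing of the given period."""
    omega = 2 * np.pi / (period_years * 3.156e7)   # rad s^-1
    return 1.0 / np.sqrt(lam ** 2 + (C * omega) ** 2)

for p in (1, 11, 60, 1000):
    print(f"period {p:>4} yr: {sensitivity(p):.3f} K per (W m^-2)")

# Sensitivity approaches the equilibrium value 1/lam = 0.8 at long periods
# and is strongly damped at short ones -- the Wigley/Foukal point.
```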

When the signal is properly decomposed, solar forcing is significantly stronger at longer periods of oscillation:

Continue reading Frequency dependent climate sensitivity