“This is commonly referred to as ‘research’” – Gergis

Just what is the ‘research’ that Gergis et al. claim to have done? And what are the sceptics complaining about?

The ‘research’ claimed by the Gergis et al. team is captured in the following graphical representation of the past temperature of the Australasian region.

The hockey stick shape has also been produced using similar methods and random data, as shown in my AIG news article in 2006, and also in chapter 11 of my 2007 book “Niche Modeling”.

It is obvious that if the same result is achieved with random data as with real-world data, the real-world data may as well be random. That is, whatever patterns are seen have not been shown to be significant by the yardsticks of rigorous statistical methods.

These problems have been widely discussed at ClimateAudit since 2006, and my publications probably grew out of those discussions. Moreover, the circular argument has become commonly known as the “Screening Fallacy” and widely discussed in relation to this area of research ever since.
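
The circular argument can be demonstrated in a few lines of Python (a minimal sketch; the series count, lengths and screening threshold are all arbitrary choices of mine): screen pure-noise “proxies” against a rising calibration series, average the survivors, and a hockey-stick blade appears even though no series contains any signal.

```python
import numpy as np

rng = np.random.default_rng(0)
n_series, length, cal = 1000, 200, 50        # hypothetical sizes
proxies = rng.standard_normal((n_series, length))  # pure noise, no signal

target = np.arange(cal, dtype=float)         # rising "instrumental" record

def corr(a, b):
    # Pearson correlation of two 1-D arrays
    a = a - a.mean(); b = b - b.mean()
    return (a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum())

# Screening step: keep only proxies that correlate with the calibration trend
selected = [p for p in proxies if corr(p[-cal:], target) > 0.2]

composite = np.mean(selected, axis=0)        # the "reconstruction"

# The blade: the screened composite rises in the calibration period,
# although every input series was random noise.
print(composite[:-cal].mean(), composite[-25:].mean())
```

The screening step guarantees the composite tracks the calibration trend, which is exactly the circularity at issue: agreement in the calibration period carries no evidential weight.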

To claim results when they could equally be achieved by random numbers would get you laughed off the podium in most areas of science. Gergis et al. informed Steve McIntyre superciliously, however, that this is commonly referred to as ‘research’.

One of the co-authors, Ailie Gallant, stars in the cringe-worthy We Are Climate Scientists, a pretentious rap-video proclaiming they are “fucking climate scientists” and “their work is peer reviewed” in dollar-store sunglasses and lab coats. They have no reason to act superior, and this recent effort proves the point.

Of course, Gergis et al. claimed to have detrended the data before performing the correlations, and whether this ad-hocery would mitigate the circularity above is questionable. Whether by oversight or intent, it appears the detrending was not performed anyway. I don’t know whether this is the reason for the paper being pulled. We shall find out in time. The paper appears to be the result of a three-year research program, announced on Gergis’ personal blog.

The project, funded by the Australian Research Council’s Linkage scheme, is worth a total of $950K and will run from mid-2009 to mid-2012.

It gives me a job for three years and money to bring a PhD student, research assistant and part time project manager on board.

More importantly, it will go a long way in strengthening the much needed ties between the sciences and humanities scrambling to understand climate change.

Who is contributing more to research: unpaid bloggers, or a million-dollar, three-year fiasco tied to the humanities?

Gergis’ hockeystick “on hold”

You may by now have heard here or here that “Evidence of unusual late 20th century warming from an Australasian temperature reconstruction spanning the last millennium” by Joelle Gergis, Raphael Neukom, Stephen Phipps, Ailie Gallant and David Karoly, has been put “on-hold” by the Journal, due to “an issue” in the processing of the data used in the study.

It is illuminating to review the crowing commentary by the Australian science intelligentsia and the press reaction to the paper.

ABC’s AM show, “Australia’s most informative (government funded) morning current affairs program. AM sets the agenda for the nation’s daily news and current affairs coverage.”

TONY EASTLEY: For the first time scientists have provided the most complete climate record of the last millennium and they’ve found that the last five decades in Australia have been the warmest.

He then speaks for the IPCC:

The Australian researchers used 27 different natural indicators like tree rings and ice cores to come to their conclusion which will be a part of the next report by the United Nations Intergovernmental Panel on Climate Change.

The Gergis paper was proof enough for the ABC Science Show, which gives “fascinating insights into all manner of things from the physics of cricket”.

Robyn Williams: Did you catch that research published last week analysing the last 1,000 years of climate indicators in Australia? It confirmed much of what climate scientists have been warning us about.

Here is another via ABC Statewide Drive tweet.

Dr Joelle Gergis from @unimelb: We are as confident that the warming in the post 1950 period is unprecedent in the past 1000 years.

Such shallow and gullible commentary is no better than blogs such as gerry’s blogistic digression (“I’ve got a blog and I’m gonna use it.”)

It’s official: Australia is warming and it is your fault.

The tone of the Real Scientists from realclimate is no better, jubilant that the “hockey-stick” has now been seen in Australia.

First, a study by Gergis et al., in the Journal of Climate uses a proxy network from the Australasian region to reconstruct temperature over the last millennium, and finds what can only be described as an Australian hockey stick.

As Steve Mosher said, such papers cannot be trusted. Putting aside questions of the methodology (which I will get to later), the reviewers in climate science don’t check the data, don’t check that the numbers produce the graphs and tables published, and don’t check that the numbers actually do what the text describes.

Yet they approve the paper for publication.

He is stunned this has to be explained to anyone. Apparently it does.

10 of 11 emails not threats

ANU Climate Science girlie-men exaggerated the threats as well:

In a six-page ruling made last week, Mr Pilgrim found that 10 of 11 documents, all emails, “do not contain threats to kill” and the other “could be regarded as intimidating and at its highest perhaps alluding to a threat”.

Levitus data on ocean forcing confirms skeptics, falsifies IPCC

The IPCC, in the AR4 working group one, stated what could be called the central claim of global warming, the estimate of the net radiative forcing.

“The understanding of anthropogenic warming and cooling influences on climate has improved since the TAR, leading to very high confidence that the effect of human activities since 1750 has been a net positive forcing of +1.6 [+0.6 to +2.4] W m–2.”

Remember a forcing is an imbalance that causes heating, like a hot plate heating a saucepan of water. While the forcing continues, the temperature of the water will continue to rise. Global warming is the theory that increases in anthropogenic CO2 in the atmosphere are producing a radiative imbalance, or forcing, causing the earth to warm dangerously.

The IPCC level of forcing equates to the stated estimates of warming per doubling of CO2 of around 1.5 to 6°C, and to the central estimate of warming to the end of the century from increasing CO2 of about 3°C.
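
For readers who want the arithmetic, the conventional simplified expression for CO2 forcing, F = 5.35 ln(C/C0) W m-2, links the forcing and warming figures. The sketch below is purely illustrative: the 3°C-per-doubling sensitivity is an assumed value, not a claim of this post.

```python
import math

# Simplified CO2 radiative forcing: F = 5.35 ln(C/C0), in W m-2
def co2_forcing(c, c0=280.0):              # concentrations in ppm
    return 5.35 * math.log(c / c0)

f2x = co2_forcing(560.0)                   # forcing for a doubling of CO2
print(round(f2x, 2))                       # ~3.71 W m-2

# Equilibrium warming for a given forcing, assuming a sensitivity S per doubling
def warming(forcing, sensitivity_per_doubling=3.0):
    return sensitivity_per_doubling * forcing / f2x

# The IPCC central net anthropogenic forcing of +1.6 W m-2 then implies
print(round(warming(1.6), 1))              # ~1.3 C at equilibrium, for S = 3 C
```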

The paper by Levitus et al. uses the array of ARGO floats, and other historic ocean measurements, to determine the change in the heat content of the ocean from 0 to 2000m, and so derive the actual net radiative forcing that has caused it to warm over the last 50 years.

“The heat content of the world ocean for the 0-2000 m layer increased by 24.0×1022 J corresponding to a rate of 0.39 Wm-2 (per unit area of the world ocean) and a volume mean warming of 0.09ºC. This warming rate corresponds to a rate of 0.27 Wm-2 per unit area of earth’s surface.”
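
The quoted rates can be roughly reproduced from the stated heat gain. The area and period figures below are my assumptions (ocean area ~3.6×10^14 m², Earth surface ~5.1×10^14 m², a 55-year analysis window), so the result only approximates the paper’s exact numbers.

```python
SECONDS_PER_YEAR = 365.25 * 86400
OCEAN_AREA = 3.6e14      # m^2, approximate world-ocean area
EARTH_AREA = 5.1e14      # m^2, approximate Earth surface area
years = 55               # Levitus et al. analyse roughly 1955-2010

heat = 24.0e22           # J, 0-2000 m ocean heat gain quoted above

flux_ocean = heat / (OCEAN_AREA * years * SECONDS_PER_YEAR)
flux_earth = heat / (EARTH_AREA * years * SECONDS_PER_YEAR)

print(round(flux_ocean, 2))   # ~0.38 W m-2, close to the paper's 0.39
print(round(flux_earth, 2))   # ~0.27 W m-2, matching the paper
```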

To compare these figures, say the continuous top-of-atmosphere forcing is 1 Wm-2, a figure given by Meehl and Hansen and consistent with the IPCC estimates. The forcing of the ocean from a TOA forcing of 1 Wm-2 is a lower 0.6 Wm-2 due to losses, estimated by Hansen.

The best, recent measurements of the forcing, of 0.3 Wm-2, are half these IPCC estimates. The anthropogenic component of the forcing is even less, as a large part of the 0.3 Wm-2 in the last 60 years is due to increased solar insolation during the Grand Solar Maximum.

This mild forcing is right in the ballpark that skeptic scientists such as Lindzen, Spencer, Loehle and Idso (and myself) have been consistently saying is all that is justified by the evidence. It appears that Levitus et al. confirms the skeptics, and the IPCC has been falsified.

What commentary on Levitus do we hear from the alarmists? Skeptical Science ignores that the IPCC has been exaggerating the net forcing, and attempts to save face:

“Levitus et al. Find Global Warming Continues to Heat the Oceans”

Skeptical Science tries to “put this amount of heat into perspective” in a vain attempt to sound an alarm, quoting a scenario that is almost insane, with an infinitesimally small probability of happening.

“We have estimated an increase of 24×1022 J representing a volume mean warming of 0.09°C of the 0-2000m layer of the World Ocean. If this heat were instantly transferred to the lower 10 km of the global atmosphere it would result in a volume mean warming of this atmospheric layer by approximately 36°C (65°F).”

To do this, heat would have to defy all known physics and move backwards, from the boiling water to the hot plate.

The ocean is a big place. The best evidence is that it’s heating very slowly, much slower than the IPCC projected, and just as the skeptics predicted. The ARGO floats are arguably the most important experiment in climate science. It is all about good science: directly measuring the phenomenon of interest with sufficient accuracy to resolve the questions.

UPDATE: data1981 explains.

It’s definitely a confusing issue. What we’re talking about here is basically the amount of unrealized warming, whereas the radiative forcing tells you the total net energy imbalance since your choice of start date (the IPCC uses 1750). So they’re not directly comparable figures.

The unrealized warming has been fairly constant over the past ~50 years whereas the radiative forcing increases the further back in time you choose your initial point. So if you look at the unrealized warming starting at any date from 1950 to 2010, it will be a fairly constant number. But the radiative forcing from 1950 to 2010 is larger than the forcing from 1990 to 2010, for example.

Hopefully I got that right.

No he didn’t.

UPDATE: Roger Pielke Sr has a post on this topic.

Finkelstein the new face of totalitarianism

Members of the Independent Inquiry into Media Regulation at Sydney University: Chair of Inquiry Ray Finkelstein (centre), Chris Young (left) and Prof. Matthew Ricketson (right).

When I started in 2005 fighting to defend normal scientific standards over the exaggerations and biases of climate science extremism, I never thought it would end up in a fight for free speech over left-wing totalitarianism. Apparently, based on the Finkelstein Media Inquiry, it has come to this.

Some comments from blogs, proposed in the report to be regulated by a new Ministry of Truth:

So it can’t happen here, can’t it? by Steve Kates:

But surely they cannot be thinking of looking at my opinion to decide whether I can be prosecuted? But if not that, what, precisely, do they have in mind? This is more than just a thin edge of the wedge. This is how it starts and this is not where it will end.

It’s worth skimming the report to see what social scientists are up to these days. It’s something called “social responsibility theory” and its origins are apparently in totalitarianism, as well described on page 50.

Authoritarian theory, the oldest and through history the most pervasive, reflected societies which held that all persons were not equal, that some were wiser than others and it was those persons whose opinions should therefore be preferred; societies in which fealty to the monarch or ruler or tyrant was demanded of all and where the people were told what their rulers thought they ought to know. Totalitarian theory shared many of these characteristics, but contained one important additional dimension: the education of the people in the ‘correct’ truth.

Note that the Review proposes regulating blogs “with a minimum of 15,000 hits per annum” – a minuscule traffic threshold that would include Niche Modeling.

If a publisher distributes more than 3000 copies of print per issue or a news internet site has a minimum of 15 000 hits per annum it should be subject to the jurisdiction of the News Media Council, but not otherwise. These numbers are arbitrary, but a line must be drawn somewhere.

The figure of 40 hits a day is even more ridiculous considering around 90% of hits are from bots, crawlers, spammers and the like. So basically any blog that is read by anybody would be captured in the Green/Labor Government net.
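
The arithmetic behind the 40-hits-a-day figure (the 90% bot share is the rough estimate stated above, not a measured value):

```python
hits_per_annum = 15_000          # the report's proposed threshold
bot_fraction = 0.9               # rough share of bot/crawler traffic (assumption)

hits_per_day = hits_per_annum / 365
human_per_day = hits_per_day * (1 - bot_fraction)

print(round(hits_per_day))       # ~41 hits a day
print(round(human_per_day))      # ~4 human visits a day
```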

A government regulator installed to monitor and control the free press. What could possibly go wrong? – Gab

If it moves, tax it. If it keeps moving, regulate it. And if it stops moving, subsidise it. – Ronald Reagan

The Finkelstein Report, in over 400 pages, attempts to justify its intervention into free speech, devoting considerable space to the topic of “market failure”, where it is claimed that a free press has led to undesirable results, such as the creation of monopolies, unjustified harm to people, and unjust coverage of issues like global warming. In raising the notion of “market failure”, the authors never acknowledge the moral and productive superiority of capitalism, to which we owe a free press, nor the moral bankruptcy and destructive economic consequences of Green environmentalism.

Sign the Menzies House Free Speech Petition

Scafetta vs the IPCC

A great new application from WUWT contrasts the predictions of two models of global warming: Scafetta’s empirical resonance model and the IPCC general circulation models.

I was asked to make sense of this from Rahmstorf and Foster:
http://iopscience.iop.org/1748-9326/6/4/044022, referenced here at RC: http://www.realclimate.org/index.php?p=10475.

I haven’t read the paper in detail, and I find I have to do that to really assess it. So I can only comment on the general approach. Although it seems superficially plausible, it’s also somewhat novel, and so I am uncomfortable with it, as I don’t fully appreciate the statistical limitations.

IMHO the only really scientific way to approach a question is to contrast competing hypotheses, e.g. the null versus the alternative, or some other combination, such as Scafetta vs the IPCC above. It’s clear, easy to understand and not so prone to biases.

But it seems climate scientists are very creative in coming up with novel ways to justify their theory, and almost always fail to clearly compare and contrast the alternatives. That is their weakness: they are so damn convinced of CAGW, and it shows they are generally ill-equipped with the expertise and training for conducting rigorous scientific analysis.

And of course, “creative” is meant not in the good sense.

Snowballing Bias and Corruption

One of the themes we have dealt with repeatedly is bias: in global warming reporting and research, especially statistical bias. Bias in the media against the Ron Paul campaign is becoming a big issue.

Above is a clip of people telling Dana Bash, a CNN news reporter, what they thought of her biased reporting. One incident was where Dana reported that Republicans in general were worried, “like I am”, that RP would continue on and hurt her presumptive nominee Mitt Romney.

In another hilarious clip from Jon Stewart, a pundit from MSNBC reports with a straight face on RP’s second place in New Hampshire: that if you take RP out, then Jon Huntsman is the REAL second. “You can’t do that!” says Jon. “It’s physics! (mocking on) And you know, if you add two zeros to the end of Huntsman’s total he would have been in first place by thousands of votes. Why is no-one talking about this? (mocking off) Because it didn’t happen!”

Moving on to climate science and the IPCC, Steve McIntyre has written a couple of posts on the blatant bias in the AR5 draft, one of the most obvious being the suppression of his and McKitrick’s peer-reviewed paper demolishing a Santer defense of climate models.

As CA readers are aware, key findings of Santer et al 2008 do not hold using updated data. Ross and I submitted a comment to IJC showing this. The comment was rejected twice, with one of the reviewers (as in the case of the comment on Steig et al) being a Santer coauthor (who was not identified to us as such). Ross eventually managed to get similar results published in another journal.

Jean S points out in a comment on the Steig thread that our findings were completely misrepresented by IPCC chapter 10 (also the source of disinformation about Steig).

Our article stated that there was a statistically significant difference between models and observations in the tropical troposphere. Instead of citing our articles as rebutting Santer’s assertions, IPCC cites us as endorsing Santer’s false assertions:

That the IPCC authors would ignore peer-reviewed papers that contradict the consensus view comes as no surprise, which is why the IPCC should be added, in the words of Richard Muller, to the list of people to no longer read.

But passively not reading is not enough, which is the point of the Dana Bash clip above.

Steve McIntyre now reports that he, and others, have received a cease-and-desist order for publishing excerpts from the AR5 draft. It’s a long story, but apparently after the brouhaha over the lack of transparency in the IPCC, and the IAC recommendations for greater transparency, they have been beavering away, changing the IPCC rules.

The IPCC Procedures in Article 4.2 of the Principles Governing IPCC Work state that “The IPCC considers its draft reports, prior to acceptance, to be pre-decisional, provided in confidence to reviewers, and not for public distribution, quotation or citation.” (http://www.ipcc.ch/pdf/ipcc-principles/ipcc-principles-appendix-a-final.pdf). We therefore request the immediate removal of the ZOD chapters from your website.

The response of government scientists to calls for greater transparency is less transparency. After all, how could they promulgate their biased warming-related agendas if they were required to be fair? How could they satisfy their task-masters?

The parallel with the Ron Paul campaign is that we see a closing of the ranks, and an intensification of the bias. Fortunately there are plenty of alternative media outlets now, and opportunities to “snowball” bias and corruption.

The capacity for “snowballing” is why Steve McIntyre has been such an inspiration to all of us too.

Ron Paul on Global Warming

November 20, 2008 Ron Paul said in a New York Times / Freakonomics interview:

“I try to look at global warming the same way I look at all other serious issues: as objectively and open-minded as possible. There is clear evidence that the temperatures in some parts of the globe are rising, but temperatures are cooling in other parts. The average surface temperature had risen for several decades, but it fell back substantially in the past few years.

Clearly there is something afoot. The question is: Is the upward fluctuation in temperature man-made or part of a natural phenomenon. Geological records indicate that in the 12th century, Earth experienced a warming period during which Greenland was literally green and served as rich farmland for Nordic peoples. There was then a mini ice age, the polar ice caps grew, and the once-thriving population of Greenland was virtually wiped out.

It is clear that the earth experiences natural cycles in temperature. However, science shows that human activity probably does play a role in stimulating the current fluctuations.

The question is: how much? Rather than taking a “sky is falling” approach, I think there are common-sense steps we can take to cut emissions and preserve our environment. I am, after all, a conservative and seek to conserve not just American traditions and our Constitution, but our natural resources as well.

We should start by ending subsidies for oil companies. And we should never, ever go to war to protect our perceived oil interests. If oil were allowed to rise to its natural price, there would be tremendous market incentives to find alternate sources of energy. At the same time, I can’t support government “investment” in alternative sources either, for this is not investment at all.

Government cannot invest, it can only redistribute resources. Just look at the mess government created with ethanol. Congress decided that we needed more biofuels, and the best choice was ethanol from corn. So we subsidized corn farmers at the expense of others, and investment in other types of renewables was crowded out.

Now it turns out that corn ethanol is inefficient, and it actually takes more energy to produce the fuel than you get when you burn it. The most efficient ethanol may come from hemp, but hemp production is illegal and there has been little progress on hemp ethanol. And on top of that, corn is now going into our gas tanks instead of onto our tables or feeding our livestock or dairy cows; so food prices have been driven up. This is what happens when we allow government to make choices instead of the market; I hope we avoid those mistakes moving forward.”

I think that is a fairer assessment than I have seen from a climate scientist. The problem is that when you dig into the field of climate science there is data and there are models and then there’s 50 feet of Climategate crap and big-government science funding. Below that there’s the IPCC.

Precautionary Principle as Deus ex machina

Andrew Bolt picked up on one of the emails from Climategate 2 that gives a fascinating insight into the collusion between academia and the WWF in the manufacturing of environmental scares.

It also shows the role of the precautionary principle as a “deus ex machina”, a device of dubious merit for solving tricky problems in a plot.

Step 1. At 12:21 13/09/99 -0400, Adam Markham, director of the World Wildlife Fund’s Climate Change Campaign, writes to Mike Hulme at UEA:

Meanwhile, Hurricane Floyd seems to be heading for Florida. WWF offices are keen that we have something to say on this.

Step 2: Mike says there is no evidentiary link between Hurricanes and AGW, only theory. Suggests a number of scientists with “precautionary sense” including:

Barrie Pittock at CSIRO is always good for some precautionary sense, and/or Kevin Hennessy from the same group.

Step 3: Two days later the WWF report from Adam Markham appears.

For Release: Sep 15, 1999
Adam Markham

Growing evidence suggests that global warming may be a factor in the formation of hurricanes in the Atlantic Ocean – a prospect that makes it likely that super storms like Hurricane Floyd will occur with increasing frequency in the future.

Interesting how the precautionary principle is used to justify an environmental scare in full knowledge that there is no evidence to support it. The main criticism of the principle is well summarized (from wikipedia) in Sancho vs. DOE, by Helen Gillmor, Senior District Judge, dismissing a lawsuit which included a worry that the LHC could cause “destruction of the earth” by a black hole:

Injury in fact requires some “credible threat of harm.” Cent. Delta Water Agency v. United States, 306 F.3d 938, 950 (9th Cir. 2002). At most, Wagner has alleged that experiments at the Large Hadron Collider (the “Collider”) have “potential adverse consequences.” Speculative fear of future harm does not constitute an injury in fact sufficient to confer standing. Mayfield, 599 F.3d at 970.

Climategate Bits

Have you listened to the Ray Hadley audio here ripping into Tim Flannery?

Bolt doesn’t go into detail, but it’s well worth the 20-minute listen. Apparently Flannery fabricated the whole Crikey story about a media setup over his Hawkesbury River house, with interviews with the local resident in question to prove it. Ray calls him “low scum” and a “liar” repeatedly. Strong stuff. Off to court we go again.

Climategate2: This from Mike Hulme is interesting, recommending CSIRO as being reliable for providing “precautionary sense” on hurricanes, and mentioning Kevin Hennessy, lead author of the alarmist drought reports from the BoM and CSIRO. It’s the same theme in many of the emails: bemoaning the lack of evidence, and casting around for people who will follow the party line and provide the appropriate level of alarm. Conspiracy theories, anyone?

Climategate 2: What Climate Scientists Really Think About You


<601> “David Jones”
subject: RE: African stations used in HadCRU global data set
to: “Phil Jones”

Thanks Phil for the input and paper. I will get back to you with comments next week. Fortunately in Australia our sceptics are rather scientifically incompetent. It is also easier for us in that we have a policy of providing any complainer with every single station observation when they question our data (this usually snows them) and the Australian data is in pretty good order anyway.

IPCC Report on Extreme Events

IPCC scientists test the exit doors.

Professor Palutikof said it would take a while for the effects of climate change to become visible. But without action, she said, “gradually, over time, that signal will emerge with resounding clarity”.

Well that’s inconvenient, isn’t it? So how about anomalous heat in the ocean, or the melting of glaciers and Arctic ice? How about the surface temperature and the sea level? How about droughts, floods, hurricanes and cyclones? You know, like the ones that are *not* found to be significantly increasing.

Jean Palutikof, director of the National Climate Change Adaptation Research Facility at Griffith University, in Queensland, said the findings of the UN report would “not surprise anyone involved in climate science”.

Condescending ass. Most people know little about the whole field of climate science and do not have the time of day for it until its practitioners demonstrate robust predictions and the rigor of “mainstream” science. We attack it in our spare time because the claims are so incredibly extravagant, the practices so furtive and evasive, and so little of what they say makes scientific sense.

However, Professor Steffen told the Ten Network’s The Bolt Report at the weekend that most experts agreed we would see an increase in intensity in cyclones as the warming continued.

Sorry, but I don’t see anything more than the same premature speculations. There is no clear plot of increasing energy over a long period. Sorry but I have no interest in what most climate experts agree on anymore.

Professor Steffen recently said:

”Well over 90 per cent of scientists in the area are quite clear: the Earth is warming and human activity is the major cause.” The blame for this ”phoney debate”, he believes, lies squarely with the media. ”A very small, very vocal minority is given the same weight,” he says.

You can call me a phoney but unless you’re omniscient, you don’t know the answer. And according to the report nobody does in most areas. So basically, you’re just calling anyone who questions the magnitude and cause of global warming a liar. Nice. Very professional.

You and Clive Hamilton make a good pair. On JoNova and Anthony Cox:

“But, hell, if like these two muppets you can pretend that thousands of scientists have made up two decades of research about global warming, you can attribute anything to anyone without any hesitation. It’s what they do. “

Both of these people are experts in their respective fields. I have developed new computational and statistical methods that helped to bring a new field of niche modeling to fruition. We don’t know science and its limitations? We don’t recognize the self-serving strategies of mediocre minds? Of course we do. Very very well.

Once and for all, I am not in any way, shape or form against some effects of CO2 somewhere. Show me exactly where I said that, or please don’t mention it again. I am campaigning for proper testing and proper critiques of claims, instead of the fawning acceptance of AGW and the undeserved praise and adulation the IPCC has received from too many people in the CSIRO, the BoM and the government without proper evidence. If you want to raise objections to doing science properly, please feel free to justify them.

Herman Cain 20% Up on GOP Contenders

Wow. Herman Cain is now 20% in front of his nearest GOP rival Mitt Romney. That’s 10% up in the last week.

His message: 9-9-9.

999 is the emergency number for a number of countries (but not the US where the number is 911).

Herman Cain has a degree in mathematics.

Cain’s 9-9-9 video concludes with text saying, “If 10% is good enough for God, then 9% should be just fine for the Federal Government.”

Praised by supporters for both its simplicity and its specificity, Cain’s plan drops the current 35 percent corporate tax rate to 9 percent, swaps the 6-bracket personal income tax system for a 9 percent flat tax and creates a 9 percent national sales tax.

The 9-9-9 plan eliminates the payroll tax and estate tax, which brought in a combined $883 billion in 2010, or about 41 percent of the $2.16 trillion collected by the federal government last year. Cain’s proposal also wipes out taxes on capital gains and repatriated corporate profits.
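
A quick sanity check of the quoted 41 per cent figure, using the dollar amounts above:

```python
payroll_estate = 883e9       # payroll + estate tax receipts, 2010, as quoted
total_federal = 2.16e12      # total federal receipts, 2010, as quoted

share = payroll_estate / total_federal
print(round(share * 100))    # ~41 percent, matching the figure quoted
```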

Cain is also a denier.

Watch the video. See the Left squirm.

Reflections on the Carbon Tax Vote

With the Carbon Tax legislation up for a vote on Wednesday, this is a critical week. Skeptics should embrace the challenge of communicating common-sense and logic over Green ideology. I feel strongly we should not give up, wave a white flag and surrender.

Every week there are more reasons not to pass this legislation. The global temperature continues not to rise. The ocean heat remains stable. Global warming appears to have peaked. More research is published showing flaws in previous studies used by the IPCC.

Global warming is not a problem: hundreds of billions of taxpayer dollars have found little more of concern than reduced Arctic ice and reduced hurricanes, both of which are beneficial anyway.

But if it were, new technologies that are safe, cheap and produce abundant energy without CO2 emissions are right around the corner. Among the climate bloggers, I was the first to bring the Rossi E-Cat technology to your attention, and have done so consistently.

The E-Cat is a simple combination of nickel powder, hydrogen, a heater and a secret catalyst that produces abundant nuclear energy with very little radiation or radioactive by-products. People don’t get the importance of this yet. But you can buy one now. In 10 years this or a similar process will probably supply the majority of our power for a tenth of the cost of present sources.

The Greens say that it’s good to reduce our consumption of energy for the planet’s sake. That is utter nonsense. Energy use is directly correlated with human welfare, health, a cleaner environment, and nature conservation.

Even if it does pass, the victory for the burgeoning bureaucracy and nervous ninnies will be temporary for two reasons:

1. With every passing year, as it has for the past decade, the temperature fails to meet the bogus model predictions, and the poor science that bolstered this policy will be revealed to a poorer but wiser public.

2. As cold fusion technology takes off, the energy sources that need massive government subsidies like wind and solar will be first to go, then increasingly hydrocarbons, purely through market mechanisms.

Government intervention, research and carbon pricing will have contributed nothing. While ideologically motivated climate-science has left a bad taste, it will not last. The fruit of human ingenuity is sweet.

Labor-Greens-Independent message to 4500 Australians: P*** Off

Menzies House reports on the latest efforts to suppress opposition to the Carbon Tax, as around 4500 Australians opposed to the carbon tax have had their submissions to the Joint Select Committee on Australia’s Clean Energy Future Legislation rejected out of hand.

The 4500 number was stated by Senator Birmingham on his website:

This rushed ‘shotgun’ inquiry received more than 4500 emails in barely a week, but in an extraordinary abuse of democratic process the Labor-Greens majority opted not to receive them as submissions or publish them.

The Coalition dissenting report details the disgraceful treatment of the Australian people here:

In this report of Coalition Members and Senators we have included the comments of hundreds of Australians – not just those few who appeared before the committee in its select few days of hearings in south-eastern Australia, or those professional organisations who made detailed submissions, but also many comments from the more than 4,500 people who made submissions to this inquiry, which the Labor-Greens-Independent majority refused to have published.

To the thousands of people who feel like Noel Bowman, who stated in his submission that ‘I suppose no one will ever read this submission and in consequence I am wasting my time’, the Coalition members say we have tried to give you a voice. We could not quote or reference everybody, but in contrast to Labor’s determination to shut people out of this process we were even more determined to ensure that as many voices as possible from across Australia were heard.

Anecdotal evidence shows an overwhelming bias towards submissions in favor of the tax. According to the conversation at Bolt's blog, submissions in opposition were largely binned, leaving a distorted public record.

If this is so, the comrades responsible for conveniently "disappearing" the opposition to Australia's Clean Energy Future Legislation may get a "well done" from the Brown/Gillard government, but the similarities to The Road to Serfdom by Friedrich Hayek are unsettling.

Renewable Energy

Which is better for the environment: renewable energies, oil, gas, coal or nuclear energy? The environmental damage caused by energy sources can be measured by their ‘footprint’ — the area required to produce a specific amount of energy.

An article in Forbes lists the energy produced per unit area of major energy sources, from which I have calculated the area required to produce a specific amount of energy.

Source         W/m2    m2/W
Biofuels       0.05    20
Wind power     1.2     0.8
Solar PV       6.7     0.15
Natural Gas    27      0.04
Oil            28      0.04
Nuclear        56      0.02
LENR           100     0.01

Table 1. Relative environmental damage from power sources in square meters destroyed per Watt of energy produced (m2/W).

Simple math shows that a gas or oil well has a power density at least 22 times that of a wind turbine, and so uses 22 times less area for the same power generation. If damage to the environment were the only concern, oil and gas would be 22 times more friendly.

The big environmental saviors are nuclear power which has 1000 times greater power density than biofuels, and so is 1000 times more environmentally friendly, and potentially LENR, low energy nuclear reactions, which could pack even more power into a smaller area due to low shielding requirements.
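The ratios quoted above follow directly from the table; a quick sanity check in Python (the figures are just those in Table 1, and the footprint column is simply the reciprocal of the power density):

```python
# Power densities from Table 1 (W/m^2); the footprint per watt (m^2/W)
# is simply the reciprocal
density = {"Biofuels": 0.05, "Wind power": 1.2, "Solar PV": 6.7,
           "Natural Gas": 27, "Oil": 28, "Nuclear": 56, "LENR": 100}
footprint = {k: 1 / v for k, v in density.items()}

gas_vs_wind = density["Natural Gas"] / density["Wind power"]    # 22.5
nuclear_vs_biofuel = density["Nuclear"] / density["Biofuels"]   # ~1120
```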

It could be argued that distributed power sources are more efficient because they are located closer to where the power is used. In fact, this is not the case, as a source with low power density requires more resources for transmission lines and storage, reducing its economic viability and potential to scale.

It is obvious from this basic analysis, and has been shown by experience, that renewable energies such as biofuels, wind and solar are bad for native fauna and flora.

The inevitable conclusion is that advocates of renewable energy do not care about the environment.

The progress of civilization is characterized by the utilization of ever denser energy sources. The environment has benefited from the reservation of larger areas of land in their natural state. The inclination toward dispersed energy sources is a form of neo-Luddism: opposition to modern technology.

Shaviv and Pielke on Climate Science in 2011

Nir Shaviv is an astrophysicist who wrote some of the more interesting studies showing the role of the Cosmic Ray Flux (CRF) in climate change, now belatedly being acknowledged by the climate establishment.

He gives some advice to students here: Stay away from Climate Science until you are tenured or retired!

My point is that because climate science is so dogmatic students do risk burning themselves because of the politics, if they don’t follow the party line. Since doing bad (“alarmist”) climate science is not an option either, I advise them to do things which are not directly related to global warming. (In fact, all but one of the graduate students I had, work or worked on pure astrophysical projects). I, on the other hand, have the luxury of tenure, so I can shout the truth as loud as I want without really being hurt.

This is shown by Roger Pielke Jr.'s revelation of how GRL has given up all pretense of due process in its review of a manuscript on tropical cyclone frequency and intensity, addressing the misrepresentations of increased damages due to climate change.

Cyclones were among those misrepresentations made by Chief Scientist Prof. Chubb in front of the Senate Inquiry.

UPDATE: ACM provided a transcript from Hansard.

The Cat asks How Credible is the Chief Scientist?, and Judith Sloan suggests it is a long time since he worked as one.

I can't resist this quote, directly contradicted by Pielke and other evidence.

Mr HUSIC: What would those weather events be?
Prof. Chubb: The argument at the moment is that there will be, for example, much more intense cyclones and whatever they are called in the Northern Hemisphere, and more intense rain and flooding. There will be a lot more intense and focused events of that type and that character as the climate changes. That is where the current view is.

Sea level rise projections bias

Sea levels, recently updated with 10 new data-points, reinforce the hiatus described as a ‘pothole’ by Josh Willis of NASA’s Jet Propulsion Laboratory, Pasadena, Calif., who says you can blame the pothole on the cycle of El Niño and La Niña in the Pacific:

This temporary transfer of large volumes of water from the oceans to the land surfaces also helps explain the large drop in global mean sea level. But they also expect the global mean sea level to begin climbing again.

Attributing the ‘pothole’ to a La Nina and the transfer of water from the ocean to land in Australia and the Amazon seems dubious, given that many land areas experienced reduced rainfall at the same time, as shown above.

A quadratic model of sea-level indicates deceleration is now well-established and highly significant, and if present conditions continue, sea level will peak between 2020 and 2050 between 10mm and 40mm above present levels, and may have stopped rising already.

Reference to a ‘pothole’ in a long-term trend caused by short-term La Nina, while ignoring statistically significant overall deceleration, is another example of bias in climate science.

lm(formula = y ~ x + I(x^2))

Residuals:
     Min       1Q   Median       3Q      Max
-8.53309 -2.39304  0.03078  2.45396  9.17058

Coefficients:
              Estimate Std. Error t value Pr(>|t|)
(Intercept) -2.264e+05  3.517e+04  -6.438 7.40e-10 ***
x            2.230e+02  3.513e+01   6.348 1.21e-09 ***
I(x^2)      -5.490e-02  8.772e-03  -6.258 1.98e-09 ***

Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 3.448 on 222 degrees of freedom
Multiple R-squared: 0.9617, Adjusted R-squared: 0.9613
F-statistic: 2786 on 2 and 222 DF, p-value: < 2.2e-16
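The peak year implied by the fitted quadratic can be read straight off the coefficients in the regression summary above (a quick Python check; the coefficient values are those printed by lm):

```python
# Coefficients of the fitted quadratic y = b0 + b1*x + b2*x^2,
# taken from the lm() summary above
b1 = 2.230e2     # linear term
b2 = -5.490e-2   # quadratic term

# A downward-opening parabola peaks where the derivative b1 + 2*b2*x is zero
peak_year = -b1 / (2 * b2)
print(round(peak_year))  # 2031, within the 2020-2050 window quoted above
```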

And the code:

figure5 <- function() {
  # Fit the quadratic and plot the projection with its confidence interval
  x <- time(SL); y <- SL
  l <- lm(y ~ x + I(x^2))
  new <- data.frame(x = 1993:2050)
  pred.w.clim <- predict(l, new, interval = "confidence")
  matplot(new$x, pred.w.clim, lty = c(1, 2, 2), type = "l", ylab = "Sea Level",
          main = "Quadratic Projection of Sea Level Rise", ylim = c(-10, 100),
          lwd = 3, col = c(2, 2, 2), xlab = "Year")
}

FFT of TSI and Global Temperature

This is the application of the work-in-progress Fast Fourier Transform algorithm by Bart coded in R on the total solar irradiance (TSI via Lean 2000) and global temperature (HadCRU). The results show (PDF) that the atmosphere is sufficiently sensitive to variations in solar insolation for these to cause recent (post 1950) warming and paleowarming.

The mechanism, suggested by the basic energy balance model, but confirmed by the plots below, is accumulation. That is, global temperature is not only a function of the magnitude of solar anomaly, but also its duration. Small but persistent insolation above the solar constant can change global temperature over extended periods. Changes in temperature are proportional to the integral of insolation anomaly, not to insolation itself.
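The accumulation mechanism can be sketched in a few lines of Python (a toy model with invented numbers, not the actual TSI/HadCRU calculation; the gain k and the TSI series are arbitrary illustrations):

```python
# Toy accumulation model: temperature responds to the running integral of
# the insolation anomaly (deviation from the long-term mean), not to
# insolation itself.  All numbers here are invented for illustration.
tsi = [1365.0, 1365.5, 1365.5, 1365.5, 1365.0]  # invented TSI series
mean_tsi = sum(tsi) / len(tsi)
k = 0.1  # arbitrary gain

temp_anom, acc = [], 0.0
for s in tsi:
    acc += s - mean_tsi        # accumulate the anomaly
    temp_anom.append(k * acc)

# A small but persistent positive anomaly keeps pushing temperature up;
# the temperature peak lags the insolation peak.
```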

The figure below is the smoothed impulse response resulting from the Fourier analysis using TSI and GT. This is the simulated result of a single spike increase in insolation. The result is a constant change, or step, in the GT. This is indicative of a system that ‘remembers shocks’, such as a ‘random walk’. Because of this memory, changes in TSI are accumulated. (I am not sure why it's negative.)

Below is the Bode plot of the TSI and GT data (still working on this). The magnitude response shows a negative, straight trend, indicative of an accumulation amplifier. This is also consistent with the spectral plots of temperature that cover paleo timescales in Figure 3 here.

Bart’s analysis is going to be very useful for doing this sort of dynamic systems analysis in a very general way. Up to now I have been using spectral plots and ARMA models.

The analysis above is an indication of the robustness of the method, as it gives a different but appropriate result on a different data set. It's going to be a very useful tool in arguing that the climate system is not at all like it's made out to be.

I will post the code when it's further along.

Global Atmospheric Trends: Dessler, Spencer & Braswell

Starting the S&B story at the beginning, as did Steve McIntyre, with Dessler 2010 in Science, I’ll put a new spin on the satellite data uploaded by Steve, using the accumulation theory. Although I am not familiar with the data, it turns out to be easily interpretable.

In black is the replication of Steve’s Figure 1 and Dessler’s 2010 Figure 2A, the scatter plot of monthly average values of ∆R_cloud (eradr) versus ∆T_s (erats) using CERES and ECMWF interim data. There is extremely little correlation as noted by Steve. In fact, it is not statistically significant in the conventional sense, Science apparently adopting the new IPCC-speak qualitative standard of ‘likely’.


Estimate Std. Error t value Pr(>|t|)
(Intercept) 0.01751 0.04599 0.381 0.704
X 0.54351 0.36184 1.502 0.136

Residual standard error: 0.5036 on 118 degrees of freedom
Multiple R-squared: 0.01876, Adjusted R-squared: 0.01045
F-statistic: 2.256 on 1 and 118 DF, p-value: 0.1358

The points in red are the sequential differences of temperature against the cloud radiance. While these have a lower slope, unlike the former they are conventionally significant, almost to the 99% CL.


            Estimate Std. Error t value Pr(>|t|)
(Intercept)  0.01269    0.04524   0.280   0.7796
dX           1.07071    0.42782   2.503   0.0137 *
---

Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.4954 on 118 degrees of freedom
Multiple R-squared: 0.0504, Adjusted R-squared: 0.04236
F-statistic: 6.263 on 1 and 118 DF, p-value: 0.01369

So why plot the sequential temperature differences and not the temperature directly? Firstly, while the autoregression coefficient (AR) of atmospheric temperature, erats (using arima(dess[,5], order=c(1,0,0))), is AR=0.65, for eradr it is AR=0.16. This tells you that the two are different types of processes. The low AR is like a bunch of random numbers; the high AR is like a sequential accumulation of random numbers. Using different terminology, they do not cointegrate, as one can trend strongly (non-stationary) while the other stays around its mean (is stationary). Nor do they necessarily correlate, though they can causally determine each other.
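The distinction between the two AR regimes is easy to reproduce with synthetic series (a Python sketch; these are random numbers, not the CERES/ECMWF data):

```python
import random

random.seed(0)
noise = [random.gauss(0, 1) for _ in range(5000)]  # white-noise-like series
walk, total = [], 0.0
for e in noise:                                    # accumulation of the
    total += e                                     # same shocks
    walk.append(total)

def lag1_autocorr(x):
    # Correlation between the series and itself shifted by one step
    n = len(x) - 1
    a, b = x[:-1], x[1:]
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sa = sum((u - ma) ** 2 for u in a) ** 0.5
    sb = sum((v - mb) ** 2 for v in b) ** 0.5
    return cov / (sa * sb)

print(round(lag1_autocorr(noise), 2))  # near 0: like eradr
print(round(lag1_autocorr(walk), 2))   # near 1: like an accumulation
```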

Differencing temperature is explained in accumulation theory, which pays close attention to heat accumulating in the ocean. Overlooking this basic physical model of the system causes many problems. Interpreting the data in terms of the physical model clears a lot of things up, as shown by the significant result above.

Above is the time-series plot of cloud radiance (black) and differenced global temperature (red) showing the relationship.

What does this say about cloud dynamics? The way to get intuition of dynamic relationships is to imagine the output from three types of input: impulse, step and periodic.


On an impulse of radiation, the surface (and lower atmosphere) warm and then revert. The differenced variable (like the first derivative) surges positive while temperature is rising, then surges negative while temperature is falling.

Electrical engineering buffs will appreciate this as the current-opposing behavior of an inductor. Clouds, in this view, could be compared with the electromagnetic field set up by the changing current. (The ocean heat capacity is comparable to capacitance).

The peak of the differenced pulse will lead the peak of the forcing. This shows that lag/lead relationships are not reliable indicators of the direction of causation in dynamic systems.


On a step increase in radiation, the surface (and lower atmosphere) will ramp up for as long as the forcing persists in accumulating heat in the ocean. The differenced variable will step up and remain constant while temperature is rising at a constant rate.

This is a fundamentally different view of climate sensitivity, with different units. From the results above, the positive feedback from clouds is 1.1 W/m^2/K^2 and not 0.54 W/m^2/K. This means that clouds provide back-radiation (the feedback is positive) while temperature is rising, but when the temperature stops rising, the back-radiation stops too. The rate of change is part of the process.

I do not see how it is possible to interpret this in terms of a particular climate sensitivity. In the alternative view, cloud feedback is twice as strong as the conventional view while temperature is rising, but drops to zero when temperature is stable.


Finally, a periodic forcing is phase shifted 90 degrees (as shown by the impulse example). By simple calculus, the derivative of a sine function is a cosine function.

Could this explain the approximately 4 month lag in terms of an annual cycle (12/4 = 3 months)? Possibly. It may also explain the negative correlation found by Steve McIntyre at a 4 month lag, as a 90 degree lead of a peak produces a 90 degree lag between peak and trough.
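The 90 degree relationship between a periodic series and its difference can be verified numerically (a Python sketch with a synthetic sine, not the CERES data):

```python
import math

# One cycle of a sine, finely sampled
n = 1000
t = [2 * math.pi * i / n for i in range(n)]
x = [math.sin(v) for v in t]
dx = [b - a for a, b in zip(x, x[1:])]  # discrete derivative, ~cos(t)

# The differenced series peaks a quarter cycle (90 degrees) before the
# original series does
peak_x = t[x.index(max(x))]
peak_dx = t[dx.index(max(dx))]
lead = peak_x - peak_dx  # ~pi/2
```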

Here is my code (you need to download the data from link above).


figure1 <- function(X, Y) {
  # Scatter of temperature (and differenced temperature) against cloud radiance
  plot(X, Y, cex = 1, col = 2, xlab = "Global Temp (black) and diffTemp (red)",
       ylab = "Clouds R", type = "p", xlim = c(-0.4, 0.4), ylim = c(-2, 2))
  # Time-series of cloud radiance
  plot(Y, col = 1, ylab = "Cloud R and diff(Temp)", type = "l")
}


Sinusoidal Wave in Global Temperature

A dynamic way of looking at global temperature is to plot it in phase space, which is usually with the position on the x axis and velocity on the y axis. Below is the phase space graph of global temperature since 1996 with temperature on the x axis and change in temperature on the y axis.

The graph of position versus velocity displays an inward spiral. In dynamical systems theory, this is described as an “attractor” and shows that the system is trapped in a potential well from which it cannot escape, as in this animation of a damped harmonic oscillator.

Graphs of temperature versus time show that the temperature since 1996 should be seen as a sinusoidal wave decaying exponentially since the sudden impulse from the “El Nino of the Century” in 1997-8.

The phase diagram provides another way of showing that the global temperature has stopped warming. With average temperature acting as an “attractor”, the system has been trapped in a potential well for 15 years, that’s as good a description of “stopped” as you find in dynamics.
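The inward spiral is easy to reproduce with a damped sinusoid (a Python sketch with an invented decay rate and period, not the HadCRU data):

```python
import math

# Damped sinusoid standing in for the post-1998 temperature (synthetic)
n = 2000
t = [20 * i / n for i in range(n)]
x = [math.exp(-0.15 * v) * math.sin(2 * math.pi * v / 4) for v in t]
v = [b - a for a, b in zip(x, x[1:])]  # "velocity": change in x

# Phase-space radius shrinks over time: an inward spiral to the attractor
r = [math.hypot(xi, vi) for xi, vi in zip(x, v)]
early, late = max(r[:200]), max(r[-200:])
print(early > late)  # True
```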

NIPCC Report on Species Extinctions due to Climate Change

The NIPCC – Interim Report 2011 updates their last 2009 Report, with an overview of the research on climate change that the IPCC did not see fit to print. It's published by the Heartland Institute with lead authors Craig D. Idso, the Australian Robert Carter, and S. Fred Singer, with a number of other significant contributors.

I am grateful for inclusion of some of my work in Chapter 6 on the uncertainty of the range-shift method for modeling biodiversity under climate change.

The controversy centered on a paper by Thomas et.al. 2004 called “Extinction Risk from Climate Change“, which received exceptional worldwide media attention for its claims of potentially massive extinctions from global warming.

Briefly, the idea is to simulate the change in the range of a species under climate change by ‘shifting’ the range using a presumed climate change scenario.

Daniel Botkin said of the Thomas et.al. 2004 study

Yes, unfortunately, I do consider it to be the worst paper I have ever read in a major scientific journal. There are some close rivals, of course. I class this paper as I do for two reasons, which are explained more fully in the recent article in BioScience:

… written by 17 scientists from a range of fields and myself (here).

While there are many problems with this paper, the most amazing, as I see it, is the way they used changes in the size of species-ranges to determine extinctions. It's generally believed that contracting a species-range increases the probability of extinction.

Consider the case of species that disperse freely under climate change. While the range-sizes of individual species change, the average range-size should stay the same, unless there is a major obstruction like an ocean or mountain range. Species whose range size decreases are balanced by species whose range size increases. Overall, the net rate of extinction should be unchanged.

However, Thomas et.al. 2004 simply deleted all species whose range expanded. A massive increase in extinctions was therefore a foregone conclusion, even assuming free dispersion.

There are a number of other ways a bias towards range-reduction can be introduced, such as edge effects and over-fitting assumptions, that I show in my book “Niche Modeling“. In a normal science this would have been a cautionary tale of the dangers of ad-hoc methodologies.

It’s an example of the intellectual bankruptcy of the IPCC report that the uncertainties of Thomas et.al. 2004 and other similar studies were ignored by Working Group II. For example, in Impacts, Adaptation and Vulnerability, 13.4.1 Natural ecosystems:

Modelling studies show that the ranges occupied by many species will become unsuitable for them as the climate changes (IUCN, 2004). Using modelling projections of species distributions for future climate scenarios, Thomas et al. (2004) show, for the year 2050 and for a mid-range climate change scenario, that species extinction in Mexico could sharply increase: mammals 8% or 26% loss of species (with or without dispersal), birds 5% or 8% loss of species (with or without dispersal), and butterflies 7% or 19% loss of species (with or without dispersal).

And in 19.3.4 Ecosystems and biodiversity:

… up to 30% of known species being committed to extinction * (Chapter 4 Section 4.4.11 and Table 4.1; Thomas et al., 2004;

And in other summaries, such as Table 4.1.

Clearly the major difficulty with all this work, something that turned me off it but few acknowledge, is that the lack of skill of simulations of climate change renders fraudulent any claim to skill at the species habitat scale. Only now is the broader climate community finally starting to accept this about multi-decadal climate model predictions, such as those contained in the 2007 IPCC WG1 climate assessment. The NIPCC illustrates the broader opinion which should have been integral to the IPCC process from the beginning, IMHO.

NIWA’s Station Temperature Adjustments – CCG Audit

The New Zealand Climate Conversation Group have released their report and reanalysis of the NIWA 7-Station Review. CCG claim NIWA misrepresented the statistical techniques it used, and exaggerated warming over the last hundred years.

The CCG results (Figure 20 above) prove there are real problems in the adjustments to temperature measurements for moves and equipment changes in NZ (also seen in Australia).

As any trained scientist or engineer knows, failure to follow a well-documented and justified method is a sign of pseudoscience. The New Zealand Climate Conversation Group is correct to examine whether Rhoades & Salinger (1993) has been followed, as advertised.

In 2010, NIWA published their review of their 7-station temperature series for New Zealand. The review was based upon the statistically-based adjustment method of Rhoades & Salinger (1993) for neighbouring stations. In this report, we examine the adjustments in detail, and show that NIWA did not follow the Rhoades & Salinger method correctly. We also show that had NIWA followed Rhoades & Salinger correctly, the resultant trend for the 7-station temperature series for New Zealand would have been significantly lower than the trend they obtained.

Despite searching, I cannot find a methodology section in NIWA’s report, which is a disjoint analysis of each of the seven sites, although it is clear in a number of places that they imply Rhoades and Salinger (1993) forms the basis. For example, page 145 on Dunedin:

In February 2010, NIWA documented the adjustments in use at that time (see web link above). These adjustments to the multiple sites comprising the ‘seven-station’ series were calculated by Salinger et al. (1992), using the methodology of Rhoades and Salinger (1993), which extended the early work on New Zealand temperatures by Salinger (1981). Subsequent to 1992, the time series have been updated regularly, taking account of further site changes as circumstances required.

The Climate Conversation Group summarize the differences between Rhoades and Salinger (1993) and the method actually used by NIWA. The R&S method for comparing a station with neighbouring stations involves the use of:

– Monthly data
– Symmetric interval centred on the shift
– A 1-2 year period before and after the shift
– Weighted averages based on correlations with neighbouring stations
– Adjustments only performed if results are significant at the 95% confidence level

In contrast, the NIWA method uses:

– Annual data
– Asymmetric intervals
– Varying periods of up to 11 years before and after the shift
– No weighted averages
– No evidence of significance tests – adjustments are always applied.

Any of these methodological deviations could create substantial differences between the results, but neither the Climate Conversation Group nor I could find a rationale or discussion in the NIWA review reports for not implementing the R&S method as stated.

What are the details of the methods? The CCG report compares a single station at Dunedin, using NIWA and R&S methods in their Table 1. There were five site moves — 1913, 1942, 1947, 1960, and 1997 — with five potential adjustments. The NIWA method adjusts at each of the moves, resulting in an increasing trend of 0.62C/century for Dunedin. The R&S method only implements two adjustments resulting in a 0.24C/century increasing trend.

The other six stations are similar (among them Masterton, Wellington, Nelson, Hokitika, and Lincoln), with the NIWA method making generally more frequent and more negative adjustments, resulting in exaggerated trends, as shown in Figure 20 at the top of this post.

It would seem that significance tests and the weighting of neighboring sites are very important. They ensure that the nearby sites used to calibrate the site moves actually provide information on the site in question. A longer comparison period of up to 11 years would probably confound short-term changes with the long-term warming trend, and may bias the adjustments to exaggerate the trend.

To ignore significance tests and weightings, and to modify the method arbitrarily, whether through sloppiness or intent, is bad practice, and would not be favorable to NIWA in the upcoming court case brought by CCG.

Global Warming Trends – Gimme Some Truth

Richard Treadgold from the New Zealand Climate Conversation Group reports on the Statistical Audit of the NIWA 7-Station Review, claiming that New Zealand’s National Climate Center, NIWA, misrepresented the statistical techniques it used (Rhoades & Salinger – Adjustment of temperature and rainfall records for site changes) in order to fabricate strong warming over the last hundred years.

NIWA shows 168% more warming than Rhoades & Salinger – the method NIWA betrayed. The blue dashed line shows the warming trend when the method is used correctly. The red line reveals NIWA’s outrageous fraud – it’s much stronger warming, but it’s empty of truth.

The results of this audit corroborate the results of Ken Stewart’s audit of the Australian temperature record.

As yet, Ken has received an apology from the Australian BoM for tardiness, but no explanation for the 140% exaggeration of warming trends in Australia.

I have been begging BOM- or anyone- to check my analysis but to no avail.

Are we getting value from our public-funded science?

Just Gimme Some Truth original and HD version.

No short-haired, yellow-bellied, son of Tricky Dicky; Is gonna mother hubbard soft soap me; With just a pocketful of hope; It’s money for dope; Money for rope

Phase Lag of Global Temperature

Lag or phase relationships are to me one of the most convincing pieces of evidence for the accumulative theory.

The solar cycle varies over 11 years on average, like a sine wave. This property can be used to probe the contribution of total solar irradiance (TSI) to global temperature.

Above is a plot of two linear regression models of the HadCRU global temperature series since 1950. The period since 1950 is chosen because it is the period for which the IPCC states that most of the warming was caused by greenhouse gases (GHGs), like CO2, and because the data is more accurate.

The red model is a linear regression using TSI and a straight line representing the contributions of GHGs. This could be called the conventional IPCC model. The green model is the accumulated TSI only, the model I am exploring. Accumulative TSI is calculated by integrating the deviations from the long-term mean value of TSI.

You can see that the two models are practically indistinguishable by their R2 values (CumTSI is slightly better than GHG+TSI, at R2=0.73 and 0.71 respectively).

You can also see a lag, or shift in phase, of the TSI between the direct solar influence (the red model) and the accumulated TSI (the green model). This shift comes about because integration shifts a periodic signal, such as a sine wave, by 90 degrees.

While there is nothing to distinguish between the models on fit alone, the shift provides independent confirmation of the accumulative theory. Volcanic eruptions in the latter part of the century obscure the phase relation over this period somewhat, so I look at the phase relationships over the whole period of the data since 1850.

Above is the cross-correlation of HadCRU and TSI (ccf in R) showing the correlation at all shifts between -10 and +10 years. The red dashed line is at 2.75 years, a 90 degree shift of the solar cycle, or 11 years divided by 4. This is the shift expected if the relationship between global temperature and TSI is an accumulative one.

The peak of the cross-correlation lies at exactly 2.75 years!
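That an accumulative relationship produces a quarter-cycle lag can be checked with a synthetic 11-year cycle (a Python sketch; the TSI here is an invented sine wave, not Lean 2000, and the "temperature" is just its running integral):

```python
import math

# Idealized 11-year solar cycle sampled quarterly, and its running integral
dt = 0.25
n = 880                                        # 220 years of quarterly data
tsi = [math.sin(2 * math.pi * i * dt / 11) for i in range(n)]
temp, acc = [], 0.0
for s in tsi:
    acc += s * dt
    temp.append(acc)                           # accumulated TSI

def corr(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sa = math.sqrt(sum((u - ma) ** 2 for u in a))
    sb = math.sqrt(sum((v - mb) ** 2 for v in b))
    return cov / (sa * sb)

# Correlate tsi(t) against temp(t + lag) for lags of 0 to 5 years,
# using a fixed-length window so every lag is comparable
w = n - 21
corrs = [corr(tsi[:w], temp[k:k + w]) for k in range(21)]
best_lag = corrs.index(max(corrs)) * dt
print(best_lag)  # close to 11/4 = 2.75 years
```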

This is not a result I thought of when I started working on the accumulation theory. The situation reminds me of the famous talk by Richard Feynman on “Cargo Cult Science“.

When you have put a lot of ideas together to make an elaborate theory, you want to make sure, when explaining what it fits, that those things it fits are not just the things that gave you the idea for the theory; but that the finished theory makes something else come out right, in addition.

Direct solar irradiance is almost uncorrelated with global temperature partly due to the phase lag, and partly due to the accumulation dynamics. This is why previous studies have found little contribution from the Sun.

Accumulated solar irradiance, without recourse to GHGs, is highly correlated with global temperature, and recovers exactly the right phase lag.

Accumulation of TSI comes about simply from the accumulation of heat in the ocean, and also the land.

I think it is highly likely that previous studies have grossly underestimated the Sun’s contribution to climate change by incorrectly specifying the dynamic relationship between the Sun and global temperature.

Climate Sensitivity Reconsidered

The point of this post is to show a calculation, by guest poster Pochas, of the decay time that should be expected from the accumulation of heat in the mixed layer of the ocean.

I realized this prediction gives another test of the accumulation theory of climate change, which potentially explains high climate sensitivity to variations in solar forcing without recourse to feedbacks or greenhouse gases, described in more detail here and here.

The analysis is based on the most important parameter in all dynamic systems, called the time constant, Tau. Tau quantifies two aspects of the dynamics:

1. The time taken for the response to an impulse forcing of the system, such as a sudden spike in solar radiation, to decay by 63% (that is, to 1/e of the original response).

2. The inherent gain, or amplification. That is, if Tau=10, the amplification of a step increase in forcing will be x10. This is because at Tau=10, around one tenth of the excess above the equilibrium level is released per time period. So the new equilibrium level must be 10 times higher than the forcing before the energy output equals the energy input.
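The xTau step gain can be seen in a two-line simulation (a Python sketch; the retention fraction and unit forcing are invented for illustration):

```python
# Leaky accumulator with Tau = 10: each step retains fraction a of the
# previous level (leaking about a tenth of it) and adds a unit forcing.
# The steady state is Tau times the forcing.
a = 0.9       # retained fraction per step, so Tau = 1/(1-a) = 10
level = 0.0
for _ in range(200):
    level = a * level + 1.0
print(round(level, 2))  # 10.0: a x10 amplification of the unit forcing
```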

I previously estimated Tau from global temperature series, simply from the correlation between successive temperature values, a. The Tau is then given by:

Tau = 1/(1-a)
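The relation between the AR(1) coefficient and the time constant can be checked directly (a Python sketch; the coefficient value a = 0.9 is invented for illustration):

```python
import math

# For an AR(1) process x[t] = a*x[t-1] + noise, a disturbance decays as
# a^t, so the e-folding time is -1/ln(a), which for a near 1 is
# approximately 1/(1-a), the formula in the text.
a = 0.9
tau_approx = 1 / (1 - a)      # 10
tau_exact = -1 / math.log(a)  # ~9.49: close, for a near 1
```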

Pochas posted the theoretical estimate of the time constant, Tau, below, that results from a reasonable assumption of the ocean mixed zone depth of 100m.

The input – output = accumulation equation is:

q sin ωt /4 – kT = nCp dT/dt

where q = input flux signal amplitude, W/m^2. The factor 4 corrects for the disk-to-sphere surface geometry.

k = relates thermal flux to temperature (see below), J/(sec m^2 ºK)

T = ocean temperature, ºK

n = mass of ocean, grams

Cp = ocean heat capacity, J/(g ºK)

t = time, sec or years

Rearranging to standard form (terms with T on the left side):

nCp dT/dt + kT = q sin ωt /4

Divide by k

nCp/k dT/dt + T = q sin ωt /(4k)

The factor nCp/k has units of time and is the time constant Tau in the solution via Laplace Transform of the above.

n = mass of water 100 m deep and 1 m^2 surface area = 1E8 grams.

Cp = Joules to heat 1 gram of water by 1 ºK = 4.187 J/(g ºK).

k = thermal flux per degree at the blackbody temperature, J/(m^2 sec ºK).

Solution after inverse transform, after transients die out:

Amplitude Ratio = 1/(1 + ω²Tau²)^½

where ω = frequency, rad/yr

Derivation of k from the Stefan-Boltzmann equation:

q = σT^4

k = dq/dT

Differentiating: dq/dT = 4σT^3

Evaluating at T = blackbody temp of the earth, -18 ºC = 255 ºK

k = 4 (5.67E-8) 255^3 ≈ 3.8 J/(sec m^2 ºK)

Calculating Time Constant Tau

Tau = nCp/k = 1E8 (4.187) / 3.8 = 1.10E8 sec

Tau = 1.10E8 / 31,557,000 sec/yr = 3.4857 yr
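Pochas' arithmetic is easy to check (a Python transcription of the numbers above; the only inputs are the mixed-layer mass, the heat capacity of water, and the Stefan-Boltzmann constant):

```python
# Time constant of a 100 m ocean mixed layer, from the derivation above
n = 1e8          # grams of water under 1 m^2, 100 m deep
Cp = 4.187       # J/(g K)
sigma = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)
T_bb = 255.0     # blackbody temperature of the earth, K (about -18 C)

k = 4 * sigma * T_bb ** 3         # ~3.8 J/(sec m^2 K)
tau_yr = n * Cp / k / 31_557_000  # seconds per year
print(round(tau_yr, 1))  # ~3.5 years
```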


The figure of Tau=3.5 yrs is in broad agreement with the empirical estimates of 6 to 10 from the autocorrelation of the actual global surface temperature data. The effective mixed zone may be closer to 150m, which would explain part of the difference.
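For reference, the amplitude ratio formula above can be evaluated for the 11-year solar cycle with this time constant (a Python sketch; tau = 3.5 yr is the figure just derived):

```python
import math

# Attenuation of the 11-year solar cycle by a first-order system with
# time constant tau (from the mixed-layer calculation above)
tau = 3.5                                      # years
omega = 2 * math.pi / 11                       # rad/yr
ratio = 1 / math.sqrt(1 + (omega * tau) ** 2)  # amplitude ratio
print(round(ratio, 2))  # ~0.45: the cycle is attenuated to about 45%
```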

This confirms another prediction of the theory that amplification of solar forcing can be explained entirely by the accumulation of heat, without recourse to feedbacks from changing concentrations of greenhouse gases.

Solar Supersensitivity – a new theory?

Do the results described here and here constitute a new theory? What is the relationship to the AGW theory? What is a theory anyway?

The models I have been exploring, dubbed solar supersensitivity, predict a lot of the global temperature observations: the dynamics of recent and paleoclimate variations, the range of glacial/interglacial transitions, the recent warming coinciding with the Grand Solar Maximum, and the more recent flattening of warming.

They make sense of the statistical character of the global temperature time series as an ‘almost random walk’, the shift in phase between solar insolation and surface temperature, and the range of autoregressive structure of temperature series in the atmosphere. These are all dynamic phenomena.

Conventional global warming models, based on atmospheric radiative physics, explain static phenomena such as the magnitude of the greenhouse effect, and are used to estimate the equilibrium climate sensitivity. The climate models, however, have very large error bands around their dynamics, and describe shorter-term dynamics as chaotic. Does this mean they are primarily theories of climate statics, while supersensitivity is concerned with dynamics?

No. I see no reason why the accumulation theory could not be reconciled with coupled ocean/atmosphere general circulation models, once the parameterisation of these models is corrected, particularly the gross exaggeration of ocean mixing. Similarly there is no reason a model based on the accumulation of solar anomaly could not recover equilibrium states.

The difference between AGW theory and solar supersensitivity (SS) might lie more in the mechanisms. SS treats the ocean as a conventional greenhouse: shortwave solar insolation is easily absorbed, but the release of heat by convection at the ocean/atmosphere boundary is suppressed, gradually warming the interior. In contrast, conventional AGW theory is focused more on mechanisms in the atmosphere, the direct radiative effects of gases and water vapor. It combines many theories: of CO2 cycling, water relations, and meteorology.

If mechanisms differentiate the theories, then the issue is the relative balance of the two mechanisms. Which is more responsible for recent warming? Which is more responsible for paleoclimate variations?

From basic recurrence matrix theory, the subsystem with the largest eigenvalue dominates the long-term dynamics of the whole system. This suggests the ocean-related, low-loss accumulative mechanism would dominate the short time-scale, high-loss, low-sensitivity atmospheric mechanisms.
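A minimal numerical illustration of this point (my own sketch, with invented eigenvalues standing in for the two mechanisms): in a linear recurrence x[t+1] = A x[t], the mode whose eigenvalue is closest to 1 decays slowest, and so outlives the others.

```python
import numpy as np

# Hypothetical two-mode system: a slow "ocean" mode (eigenvalue 0.99,
# low loss) and a fast "atmosphere" mode (eigenvalue 0.5, high loss).
A = np.diag([0.99, 0.5])
x = np.array([1.0, 1.0])   # equal initial excitation of both modes

for _ in range(50):        # iterate the recurrence for 50 steps
    x = A @ x

ratio = x[0] / x[1]        # the low-loss mode dominates by a huge factor
print(round(x[0], 3), x[1])
```

After 50 steps the low-loss mode still retains about 60% of its amplitude, while the high-loss mode is effectively gone.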

If this view is correct, then what we have is a completion of an incomplete theory that promises to increase understanding and improve prediction by collapsing the range of uncertainty in the current crop of climate models.

Solar Supersensitivity – a worked example

Below is a worked example of the theory of high solar sensitivity, supersensitivity if you will, explained in detail in manuscripts here and here.

The temperature increase of a body of water is:

T = Joules/(Specific Heat water x Mass)

The accumulation of 1 Watt per sq meter on a 100 metre column of water for one year gives an expected temperature increase of

T = 32 x 10^6/(4.2 x 10^8)

= 0.08 C

Given that only about one third of a top-of-atmosphere radiation anomaly reaches the surface, and a solar cycle duration of 11 years, the increase in temperature due to the solar cycle will be:

Ta = 0.08 x 11 x 0.3 = 0.26 C

The expected temperature increase for the direct forcing (no accumulation), using the Planck relationship of 0.3 ºC per W/m^2, would be 0.09 ºC. So the gain is:

Gain = Accumulated/Direct = 0.26/(0.3×0.3) = 3
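The worked example's arithmetic can be re-run directly (same assumed values: a 100 m mixed layer, one third of the anomaly reaching the surface, and a 0.3 ºC per W/m^2 Planck response):

```python
# Re-running the worked example's arithmetic (values as assumed above).
joules_per_year = 32e6     # 1 W/m^2 accumulated for one year, J (rounded)
heat_capacity = 4.2e8      # 100 m water column: mass x Cp, J/K

dT = joules_per_year / heat_capacity   # warming per year of 1 W/m^2
Ta = dT * 11 * 0.3                     # 11-yr cycle, ~1/3 reaches surface
direct = 0.3 * 0.3                     # Planck response to the same forcing
gain = Ta / direct

print(round(dT, 2), round(Ta, 2), round(gain, 1))
```

This reproduces the numbers quoted above: roughly 0.08 ºC per year, about 0.25 ºC over the cycle, and a gain close to 3.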

For a longer accumulation of solar anomaly, from a succession of strong solar cycles such as we saw late last century, the apparent amplification will be greater. From the AR correlation of surface temperature you get an estimate of gain of 10. But this is only apparent amplification: because the system is accumulative, the calculated gain increases with the duration of the forcing. For long time scales, gain (and hence solar sensitivity) approaches infinity, a singularity, and ceases to be useful. Hence the term ‘supersensitivity’. For long periods the non-linearity of the Stefan-Boltzmann law will become dominant.

Sensitivity then cannot be represented in Watts/K (or K/Watt); it must be expressed as a rate, in units like K/Watt/year.

Extend this calculation over 1000 years and a small solar forcing can cause a transition between ice ages with no other input. The role of GHGs, water vapor and albedo in this theory is to maintain the heat state of the system: for example, solar forcing increases temperature, which causes CO2 concentrations to change. But this does not mean an increase in CO2 necessarily increases temperature, because the system is being heated by the accumulation of solar anomaly. The reason a forcing from CO2 has apparently very low sensitivity, but solar very high, would be due to other issues that I haven’t worked through fully yet (coming soon).