Using theory, applications, and examples of inferences, Niche Modeling: Predictions from Statistical Distributions demonstrates how to conduct and evaluate niche modeling projects in any area of application. It features a series of theoretical and practical exercises for developing and evaluating niche models.
Yesterday’s post noted the appearance of station summaries at the BoM adjustment page attempting to defend their adjustments to the temperature record at several stations. Some I have also examined. Today’s post compares and contrasts their approach with mine.
The figures below compare the minimum temperatures at Deniliquin with neighbouring stations. On the left, the BoM compares Deniliquin with minimum temperatures at Kerang (95 km west of Deniliquin) in the years around 1971. The figure on the right from my Deniliquin report shows the relative trend of daily temperature data from 26 neighbouring stations (i.e. ACORN-SAT minus neighbour). The rising trends mean that the ACORN-SAT site is warming faster than the neighbours.
The BoM's caption:
Deniliquin is consistently warmer than Kerang prior to 1971, with similar or cooler temperatures after 1971. This, combined with similar results when Deniliquin’s data are compared with other sites in the region, provides a very clear demonstration of the need to adjust the temperature data.
Problems: Note the cherry-picking of a single site for comparison and the hand-waving about “similar results” with other sites.
In my analysis, the ACORN-SAT version warms at 0.13C/decade faster than the neighbours. As the spread of temperature trends at weather stations in Australia is about 0.1C/decade at the 95% confidence level, this puts the ACORN-SAT version outside the limit. Therefore the adjustments have made the trend of the official long term series for Deniliquin significantly warmer than the regional neighbours. I find that the residual trend of the raw data (before adjustment) for Deniliquin is -0.02C/decade which is not significant and so consistent with its neighbours.
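The residual-trend check can be sketched in a few lines. This is a minimal illustration on synthetic data, not the code used in the reports; the function name and the toy series are my own.

```python
import numpy as np

def residual_trend(acorn, neighbour, years):
    """OLS slope of the difference series (ACORN-SAT minus neighbour),
    in degrees C per decade."""
    diff = np.asarray(acorn) - np.asarray(neighbour)
    slope_per_year = np.polyfit(years, diff, 1)[0]
    return slope_per_year * 10.0

# Synthetic illustration: a site warming 0.013 C/year faster than its neighbour
years = np.arange(1913, 2013)
neighbour = 10.0 + 0.010 * (years - 1913)
acorn = neighbour + 0.013 * (years - 1913)
print(round(residual_trend(acorn, neighbour, years), 2))  # 0.13 C/decade
```

A residual trend near zero, as with the raw Deniliquin data, indicates consistency with the neighbours; 0.13 C/decade sits outside the roughly 0.1 C/decade spread.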
Now look at the comparison of minimum temperatures for Rutherglen with neighbouring stations. On the left, the BoM compares Rutherglen with the adjusted data from three other ACORN-SAT stations in the region. The figure on the right from my Rutherglen report shows the relative trend of daily temperature in 24 neighbouring stations (i.e. ACORN-SAT minus neighbour). As at Deniliquin, the rising trends mean that the ACORN-SAT site is warming faster than the neighbours.
The BoM's caption is:
While the situation is complicated by the large amount of missing data at Rutherglen in the 1960s, it is clear that, relative to the other sites, Rutherglen’s raw minimum temperatures are very much cooler after 1974, whereas they were only slightly cooler before the 1960s.
Problems: Note the cherry-picking of only three sites but, more seriously, the versions chosen are from the adjusted ACORN-SAT. That is, already adjusted data is used to justify an adjustment – a classic circularity! This is not stated in the other BoM reports, but probably applies to the other station comparisons. The loss of data due to aggregation to annual values is also clear.
In my analysis, the ACORN-SAT version warms at 0.14C/decade faster than the neighbours. As the spread of temperature trends at weather stations in Australia is about 0.1C/decade at the 95% confidence level, this puts the ACORN-SAT version outside the limit. Once again, the adjustments have made the trend of the official long term series for Rutherglen significantly warmer than the regional neighbours. As with Deniliquin, the residual trend of the raw data (before adjustment) is not significant and so consistent with its neighbours.
The raw data is not always more consistent, as Amberley shows. On the left, the BoM compares Amberley with Gatton (38 km west of Amberley) in
the years around 1980. On the right from my Amberley report is the relative trend of daily temperature to 19 neighbouring stations (ie ACORN-SAT – neighbour). In contrast to Rutherglen and Deniliquin, the mostly flat trends mean that the ACORN-SAT site is not warming faster than the raw neighbours.
The BoM's caption:
Amberley is consistently warmer than Gatton prior to 1980 and consistently cooler after 1980. This, combined with similar results when Amberley’s data are compared with other sites in the region, provides a very clear demonstration of the need to adjust the temperature data.
Problems: Note the cherry-picking and hand-waving.
In my analysis, the ACORN-SAT version warms at 0.09C/decade faster than the neighbours. As the spread of temperature trends at weather stations in Australia is about 0.1C/decade at the 95% confidence level, I class the ACORN-SAT version as borderline. The residual trend of the raw data (before adjustment) is -0.32C/decade which is very significant and so there is clearly a problem with the raw station record.
More cherry-picking, circularity, and hand-waving from the BoM – excellent examples of the inadequacy of the adjusted ACORN-SAT reference network, and justification for a full audit of the Bureau's climate change division.
Last night George Christensen MP gave a speech accusing the Bureau of Meteorology of “fudging figures”. He waved a 28-page list of adjustments around and called for a review. These adjustments can be found here. While I don't agree that adjusting to account for station moves can necessarily be regarded as fudging figures, I am finding issues with the ACORN-SAT data set.
The problem is that most of the adjustments are not supported by known station moves, and many may be wrong or exaggerated. It also means that if the adjustment decreases temperatures in the past, claims of current record temperatures become tenuous. A maximum daily temperature of 50C written in 1890 in black and white is higher than a temperature of 48C in 2014, regardless of any post-hoc statistical manipulation.
But I do take issue with a set of summaries being released as blatant “cherry-picking”.
Scroll down to the bottom of the BoM adjustment page. Listed are station summaries justifying the adjustments to Amberley, Deniliquin, Mackay, Orbost, Rutherglen and Thargomindah. The overlaps with the ones I have evaluated are Deniliquin, Rutherglen and Amberley (see previous posts). While the BoM finds the adjustments to these stations justified, my quality control check finds problems with the minimum temperature at Deniliquin and Rutherglen. I think the Amberley raw data may have needed adjusting.
WRT Rutherglen, BoM defends the adjustments with Chart 3 (my emphasis):
Chart 3 shows a comparison of the raw minimum temperatures at Rutherglen with the adjusted data from three other ACORN-SAT stations in the region. While the situation is complicated by the large amount of missing data at Rutherglen in the 1960s, it is clear that, relative to the other sites, Rutherglen’s raw minimum temperatures are very much cooler after 1974, whereas they were only slightly cooler before the 1960s.
WRT Deniliquin, BoM defends the adjustments on Chart 3 (my emphasis):
Chart 3 shows a comparison of minimum temperatures at Kerang (95 km west of Deniliquin) and Deniliquin in the years around 1971. Deniliquin is consistently warmer than Kerang prior to 1971, with similar or cooler temperatures after 1971. This, combined with similar results when Deniliquin’s data are compared with other sites in the region, provides a very clear demonstration of the need to adjust the temperature data.
My analysis is superior to the BoM's flawed analysis in three important ways:
1. I compare the trend in Rutherglen and Deniliquin with 23 and 27 stations respectively, not 3 and 1 neighbouring stations respectively (aka cherry-picking).
2. I also use a rigorous statistical panel test to show that the trend of the Rutherglen minimum exceeds the neighbouring group by 0.1C per decade, which is outside the 95% confidence interval for Australian station trends – not a visual assessment of a chart (aka eyeballing).
3. I use the trends of daily data and not annual aggregates, which are very sensitive to missing data.
Been looking forward to doing Kerang as I knew it was another dud series from ACORN-SAT. The report is here:
The first thing to notice in plotting up the time series data for the raw CDO and ACORN-SAT is that while the ACORN-SAT data goes back to 1910 the CDO data is truncated at 1962.
The monthly data, however, goes back almost to 1900. This is inexplicable, as the monthly data is derived from the daily data! Here is proof that, contrary to some opinion pieces, not all of the data needed to check the record is available at the Bureau of Meteorology website, Climate Data Online.
The residual trends of the Kerang ACORN-SAT series are at the benchmark for maximum temperature, and greatly exceed the benchmark for minimum temperature.
While on the subject of opinion pieces, the statement from No, the Bureau of Meteorology is not fiddling its weather data:
Anyone who thinks they have found fault with the Bureau’s methods should document them thoroughly and reproducibly in the peer-reviewed scientific literature. This allows others to test, evaluate, find errors or produce new methods.
So you think skeptics haven't tried? A couple of peer-reviewed papers of mine on quality control problems in the Bureau of Meteorology's use of models have had no response from the Bureau in over two years – just the sound of crickets chirping. Talk is cheap in climate science, I guess. Here they are:
Three more quality tests of stations in the ACORN-SAT series have been completed:
The test measures the deviation of the trend of the series from its neighbours since 1913 (the residual trend). A deviation of plus or minus 0.05 degrees per decade is within tolerance (green), 0.05 to 0.1 is borderline (amber), and greater than 0.1 is regarded as a fail (red) and should not be used.
| Cape Otway Lighthouse | -0.17 | 0.02 | -0.05 | 0.01 |
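The traffic-light scheme above can be expressed as a simple classifier. A sketch; the exact handling of the 0.05 and 0.1 boundaries is my own assumption:

```python
def classify_residual_trend(rt):
    """Classify a residual trend (degrees C/decade) against the benchmarks:
    within +/-0.05 is within tolerance, 0.05 to 0.1 is borderline,
    beyond 0.1 is a fail and should not be used."""
    mag = abs(rt)
    if mag <= 0.05:
        return "green"   # within tolerance
    elif mag <= 0.1:
        return "amber"   # borderline
    return "red"         # fail

print(classify_residual_trend(-0.02))  # green (e.g. raw Deniliquin)
print(classify_residual_trend(0.09))   # amber (e.g. Amberley ACORN-SAT)
print(classify_residual_trend(0.14))   # red (e.g. Rutherglen ACORN-SAT)
```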
There are more inconsistent stations among the raw CDO data – as would be expected, as it is “raw”. However, the standout problems in this small sample are in the ACORN-SAT minimums.
Results so far suggest the Bureau of Meteorology has a quality control problem with its minimum temperatures, with almost all borderline residual trends and very large deviations from the neighbours in Rutherglen and Deniliquin.
Williamtown RAAF is one case where the quality control test indicates that the adjustments were justified, even though the homogenization in ACORN-SAT produced a strong change in trend. My report is here and Ken has posted graphs for Williamtown, along with a number of sites with large changes in trend by ACORN-SAT.
This illustrates neatly that a series is not rejected just because ACORN-SAT increases the warming trend. An ACORN-SAT series should be rejected, however, if its trend is inconsistent with its raw neighbours.
You may have read Ken Stewart’s excellent blog on the official Australian temperature record. With the publication of the “adjustments.xls” file of the official adjustments to the raw data in the ACORN-SAT dataset, as reported on JoNova’s blog, there has been a flurry of work behind the scenes, so to speak.
Jennifer Marohasy has also been leading the charge to audit, or at least review, the practices of the Bureau of Meteorology in adjusting raw temperatures to produce the synthetic ACORN-SAT series. Rutherglen in particular has been in the news for a massive warming adjustment to the minimum temperature trend.
The story has gone national in The Australian with articles like Climate records contradict Bureau of Meteorology. Even I have been quoted as saying that the BoM may be “adding mistakes” with their data modelling.
Well, all of this kerfuffle has been enough to get me out of hiding and start working on some stuff. I thought it would be good to have a quality assessment method that could reliably test the ACORN data. The idea I came up with is to test the trend of the ACORN-SAT series against the trends of its raw-data neighbours. In principle, if the trend of the synthetic series exceeds the overall trend of all the neighbours, then something must be wrong.
The difficulty is that the neighbours all start and stop at different times, so a slightly more complex test is needed than a simple ordinary least squares regression. The answer is a panel test using pooled OLS (POLS). It's all explained in the reports on the first three stations below.
The test seems to work remarkably well. The ACORN-SAT minimum temperatures for Rutherglen and Deniliquin fail the benchmark residual trend of 0.05 degrees C per decade – that is they warm at a much greater rate than their neighbours. ACORN-SAT at Williamtown, by contrast, is consistent with its neighbours and the raw CDO series are not, indicating that the single large adjustment applied around 1969 was warranted.
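The reports describe the actual panel test; the following is only a rough sketch of the idea, using pooled OLS with a fixed intercept per station, a common trend, and an extra trend term for the target series, run on a synthetic unbalanced panel. All names and the toy data are illustrative.

```python
import numpy as np

def pols_residual_trend(panel):
    """Pooled OLS: station intercepts + common trend + extra trend for the
    target series.  `panel` rows are (station, t, temp, is_target), with t
    in years since 1913; stations may start and stop at different times.
    Returns the extra trend of the target in degrees C per decade."""
    stations = sorted({row[0] for row in panel})
    col = {s: i for i, s in enumerate(stations)}
    X, y = [], []
    for station, t, temp, is_target in panel:
        dummies = [0.0] * len(stations)
        dummies[col[station]] = 1.0          # station fixed effect
        X.append(dummies + [t, t * is_target])
        y.append(temp)
    beta, *_ = np.linalg.lstsq(np.array(X), np.array(y), rcond=None)
    return beta[-1] * 10.0                   # extra slope, per decade

# Toy panel: two neighbours and a target warming 0.01 C/year faster,
# with one neighbour starting later (unbalanced, as in the real data)
rows = []
for station, (start, extra) in enumerate([(0, 0.0), (27, 0.0), (0, 0.01)]):
    for t in range(start, 100):
        rows.append((station, t, 10 + 0.01 * t + extra * t, int(extra > 0)))
print(round(pols_residual_trend(rows), 2))  # 0.1 C/decade: fails the benchmark
```

Because every station gets its own intercept, a neighbour that only reports for part of the period still contributes to the common trend without biasing the level.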
I intend to keep working through the stations one by one and uploading them to the viXra archive site as I go. I will also release the data and code soon for those who are interested. It's almost at the stage where you enter a station number and it spits out the analysis. Wish it would write the reports too, though LaTeX goes a great deal towards that end.
As previously covered here, Andrea Rossi appears to have delivered the goods…
Cold fusion reactor verified by third-party researchers, seems to have 1 million times the energy density of gasoline

Andrea Rossi’s E-Cat – the device that purports to use cold fusion to generate massive amounts of cheap, green energy – has been verified by third-party researchers, according to a new 54-page report. The researchers observed a small E-Cat over 32 days, where it produced net energy of 1.5 megawatt-hours, or “far more than can be obtained from any known chemical sources in the small reactor volume.”…
Follow the link to story in full
The third-party report is here.
Significant transmutation of Lithium and Nickel isotopes is broadly consistent with the energy produced. This leaves no doubt that the source of the energy is nuclear. But the authors are perplexed, nay, dumbfounded, nay, flabbergasted at the possible physics involved, as all known nuclear reactions of this kind have large Coulomb barriers to overcome.
They found it “very hard to comprehend” how these fusion processes could take place at such low energies – 1200 to 1500 degrees C. While the transmutations are remarkable in themselves, they found no trace of radiation during the test, nor residual radiation after the reactor had stopped – almost inevitable in a reaction of nuclear origin.
What is the possible reaction(s) then? Speculations from the vortex discussion list:
> Li7 + Ni58 => Ni59 + Li6 + 1.75 MeV
> Li7 + Ni59 => Ni60 + Li6 + 4.14 MeV
> Li7 + Ni60 => Ni61 + Li6 + 0.57 MeV
> Li7 + Ni61 => Ni62 + Li6 + 3.34 MeV
> Li7 + Ni62 => Ni63 + Li6 – 0.41 MeV (Endothermic!)
>
> This series stops at Ni62, hence all isotopes of Ni less than 62 are [consumed] and Ni62 is strongly enriched.
>
> I have only briefly skimmed the report, but the basic reaction appears to be a neutron transfer reaction where a neutron tunnels from Li7 to a Nickel [nucleus]. The excess energy of the reaction appears as kinetic energy of the two nuclei (i.e. Li6 & the new Ni isotope), rather than as gamma rays. Because [there] are two daughter nuclei, momentum can be conserved while dumping the energy as kinetic energy in a reaction that is much faster than gamma ray emission. Because both nuclei are “heavy” and slow moving, very little to no bremsstrahlung is produced. There is effectively no secondary gamma from [Li6] because the first excited state is too high. (I haven’t checked Li7.) There is unlikely to be anything significant from Ni because the high charge on the nucleus combined with the “3” from Lithium tend to keep them apart (minimum distance 31 fm).
>
> It would be nice to know if the total amounts of each of Li & Ni in the [fuel] were conserved (I’ll have to study the report more closely).
>
> Robin van Spaandonk
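The quoted Q-values can be cross-checked against standard tabulated atomic mass excesses. The figures below are the commonly tabulated values rounded to the nearest keV; this is my own verification sketch, not part of the report.

```python
# Standard atomic mass excesses in MeV (rounded to keV)
mass_excess = {
    "Li6": 14.087, "Li7": 14.907,
    "Ni58": -60.227, "Ni59": -61.156, "Ni60": -64.472,
    "Ni61": -64.221, "Ni62": -66.746, "Ni63": -65.513,
}

def q_value(target, product):
    """Q of the neutron-transfer reaction Li7 + target -> product + Li6."""
    return (mass_excess["Li7"] + mass_excess[target]) - \
           (mass_excess["Li6"] + mass_excess[product])

for t, p in [("Ni58", "Ni59"), ("Ni59", "Ni60"), ("Ni60", "Ni61"),
             ("Ni61", "Ni62"), ("Ni62", "Ni63")]:
    # Matches the quoted 1.75, 4.14, 0.57, 3.34 and -0.41 MeV to within rounding
    print(t, "->", p, round(q_value(t, p), 3), "MeV")
```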
Fascinating new world of materials science opening up.
The Climate Council mini-statement called Bushfires and Climate Change in Australia – The Facts states in support of their view that “1. In Australia, climate change is influencing both the frequency and intensity of extreme hot days, as well as prolonged periods of low rainfall. This increases the risk of bushfires.”
Southeast Australia is experiencing a long-term drying trend.
A moment of fact-checking against the BoM's recorded rainfall for southeastern Australia reveals no trend in rainfall.
Another moment of fact-checking against the BoM's recorded rainfall for Australia as a whole reveals an increasing rainfall trend.
When the storied Tesla Motors CEO promoted the Hyperloop, a proposed California high-speed link taking passengers between San Francisco and Los Angeles in 30 minutes instead of the 2 hours and 40 minutes on the VFT, people naturally got excited. But there are three questions. First, will the ticket price be competitive with existing air travel? Second, will the novel technology meet problems in research and development? Third, would consumers like being shot along a tube at almost supersonic speeds?
Given the price of an LA-SF link would be comparable with air travel, and the technology is conventional, the largest question is the third – consumer acceptance.
An alternative way to test the third would be to build a smaller mass-transit system to augment or replace an existing airport shuttle service from check-in to terminal, or even between gates. Such a system would operate in a mode where the capsules spend half the time accelerating and half decelerating. It would not reach the high speeds proposed in the Hyperloop of 1000 km per hour, and so would provide an opportunity to trial consumer reactions and refine the technology.
How fast? A 0.5g force is an acceleration of around 5 m/sec/sec. Consider a 1 km run from the baggage check-in to a remote terminal. Integrating twice, the distance travelled is 5/2 times time squared. Solving for the 500 m half-way point gives a time of about 14 sec. The top speed will be 5t, or about 70 m/sec (roughly 255 km per hour). The entire trip with deceleration would take about 28 sec.
If travelers are prepared to accept a 1g force in both acceleration and deceleration, the entire trip would take 20 sec with a top speed of 100 m/sec, or 360 km per hour.
This would be sufficient to test the system even on these short runs.
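The timings above follow from constant-acceleration kinematics; a quick sketch (the function name is mine):

```python
import math

def shuttle_trip(distance_m, accel_ms2):
    """Capsule accelerates for half the run, then decelerates symmetrically.
    Returns (total trip time in seconds, top speed in km/h)."""
    t_half = math.sqrt(distance_m / accel_ms2)  # from d/2 = (1/2) a t^2
    top_speed = accel_ms2 * t_half              # v = a t at the halfway point
    return 2 * t_half, top_speed * 3.6

print(shuttle_trip(1000, 5))   # 0.5 g: about 28 s, about 255 km/h
print(shuttle_trip(1000, 10))  # 1 g: 20 s, 360 km/h
```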
But we all know the feeling of being treated like cattle that comes with the existing shuttle systems at Dulles and other major hubs.
Private, individual or dual pods may be the most desirable aspect for consumers, as they allow transport on demand with no waiting, and would take the ‘mass’ out of mass transport. This might be the major selling point.
The semi-technical document on the Hyperloop mass transport system, recently produced by Elon Musk, estimated the price of a one-way ticket as $20.
Transporting 7.4 million people each way and amortizing the cost of $6 billion over 20 years gives a ticket price of $20 for a one-way trip for the passenger version of Hyperloop.
Multiply 7.4 million trips by two (each way), then by $20, then by 20 years, and you get $5.92 billion – about the $6 billion estimated construction cost of the Hyperloop. So $20 is the price at which the cost of construction (very simplistically) is returned over 20 years.
The amortized cost is not the ticket price, which must also include costs such as management, operations and maintenance, and financial costs such as interest on loans and profits to shareholders. Thus the actual ticket price of a fully private venture would be comparable to an airfare – at least $100, say.
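The arithmetic above as a one-liner (construction cost only, ignoring operations and financing):

```python
def amortized_ticket_price(capital_cost, passengers_each_way_per_year, years):
    """Ticket price that repays the construction cost alone over the period;
    no operations, maintenance or financing costs included."""
    total_trips = passengers_each_way_per_year * 2 * years
    return capital_cost / total_trips

print(round(amortized_ticket_price(6e9, 7.4e6, 20), 2))  # 20.27, i.e. about $20
```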
The Musk document is poorly worded at best, or misleading at worst. Major media outlets universally quoted a ticket price of $20.
According to New Scientist
He also estimates that a ticket for a one-way Hyperloop trip could cost as little as $20, about half what high-speed rail service is likely to charge.
Hyperloop would propel passengers paying about $20 (£13) in pods through a 400-mile series of tubes that would be elevated above street…
The Washington Post
How the Hyperloop could get you from LA to San Francisco in 30 minutes for $20.
USA Today, HuffPost, Fox News, and all of the internet tech blogs simply repeated the same story. While this is one more example of the total absence of research in the media, the blame also surely rests on Musk, who should correct the misrepresentation immediately.
Elon Musk unveiled his concept for a new mass transport system consisting of capsules shot along a partially evacuated pipe at very high speed.
The details contain estimates of a capital cost of less than $10 billion and the cost of a one-way ticket of $20 — not bad. Compare that to the estimated capital cost of $100 billion for a very fast train (VFT) system, a reduction in the transit time between Los Angeles and San Francisco from 3 hours to 30 minutes, and the proposal looks very attractive.
The numbers would be similar for an equivalent system in Australia. The VFT has been costed at over $100 billion for a Melbourne to Brisbane link – but given this estimate is probably optimistic, it comes in at the same price for a similar distance as the Californian VFT proposal.
The savings on capital cost come largely from the greatly reduced land acquisition of an elevated system. It has been the high capital cost (that would have to be borne by the taxpayer) that has made the VFT uneconomic in the past. (Of course, a colossal waste of public money never stopped the Greens from advocating it.)
The Hyperloop would radically change that part of the equation. As reported of Musk:
It was born from frustration at his state’s plan to build a bullet train that he called one of the most expensive per mile and one of the slowest in the world.
Tickets on the Hyperloop comparable with air and bus fares of $100 – or more, given the travel time between Brisbane and Sydney would be around 60 minutes – would provide an adequate margin for an entirely privately-funded venture.
Free marketers and global warming alarmists alike should be heartened by the handful of companies that claim a zero-carbon-emissions commercial energy plant based on a safe cold fusion (CF) reaction. An Italian company demonstrated a product called E-Cat in 2011, and a Greek company named Defkalion also provided a professional demonstration of their Hyperion product.
The disdain for CF by the mainstream government-funded research community, and the lack of government funding support, is well known. Cold fusion results are routinely and categorically rejected by physics and engineering journals, and there has been virtually no support from government funding agencies, except for the military.
Meanwhile, the lack of public benefit from government subsidies of green energy sources is an embarrassment. Subsidies for renewable sources such as wind and solar – $88 billion in 2011 – are dropping due to political backlash from increasing electricity prices. Hot fusion research over the last 50 years – $50 billion – is no closer to break-even, let alone a working power plant.
One could argue that funding research on government priorities has been deeply harmful to research. If young faculty members in physics find a field promising, but can only secure grants in government-determined priority areas, they are incentivized to focus on politically motivated fields. Keep activists out of research funding!
Nevertheless, the field has progressed through the efforts of professionals working in their spare time and amateurs experimenting in their garages, though marked by contradictory experimental results, outright mistakes, and secrecy and paranoia from wanna-be entrepreneurs. There are dozens of theories, but none of them properly tested. Defkalion's ICCF18 slides show a real-time mass-spec system being designed, which they hope will nail down what is happening in the NiH fusion processes.
Note to global warming alarmists:
“Science is our way of describing — as best we can — how the world works. The world works perfectly well without us. Our thinking about it makes no important difference. When our minds make a guess about what’s happening out there, if we put our guess to the test and we don’t get the results we expect, as Feynman says, there can be only one conclusion: we’re wrong.”
In general, there are only two ways to prove something in science.
1. Prove a singular (fact) with an observation such as “black swans exist”.
2. Disprove a universal (theory) such as “all swans are white” with a singular fact.
The inability to disprove a singular, or to prove a universal, is due to the finite limits of our observations. In general, we cannot gather the infinite observations required to disprove a singular (1) or to prove a universal (2).
Scientists need to be rigorous and strict, particularly in the initial stages of formulating a study, about whether it is a singular or a universal that is being tested, and how the observations will bear on it.
A case in point: the impact of observations of global temperatures on the climate model projections plotted below. By a strict interpretation of scientific method, the observed “slow rise in global temperature” is a fact that disproves the universal “all possible trajectories of climate models under AGW warming”.
The only appropriate scientific response is to throw away all of those falsified models and all of the work based on them – extinction predictions, extreme events, agricultural trends, and so on – as it is scientifically worthless. You must go back to the drawing board.
The rules of science were illustrated recently in a post on Vortex about the Wright Brothers’ first flight:
To give another dramatic example, suppose at 1:00 pm on the afternoon of December 17, 1903, you were to take a poll about whether man can fly. Suppose you asked people to place bets as to whether airplanes exist. Out of the 1.6 billion people in the world alive on that day, at that moment, the only ones who had ANY KNOWLEDGE of that question were Wilbur and Orville Wright and the members of the Kitty Hawk coast guard who had helped them fly that morning. In all the world, there was not another soul who knew the facts or was qualified to address the question. The opinions of other people were worthless. Meaningless. All the money in the world placed in a bet would mean nothing. There was an undeveloped glass plate photograph showing the first flight:
That photograph was proof. It overruled all opinions, all money, all textbooks, and the previous 200,000 years of human technology. A thermocouple reading from a cold fusion experiment in 1989 overrules every member of the human race, including every scientist. Once experiments are replicated at high signal to noise ratios, all bets are off. The issue is settled forever. There is no appeal, and it makes no difference how many people disagree, or how many fail to understand calorimetry or the laws of thermodynamics. The rules of science in such clear-cut cases are objective and the proof is as indisputable as that photograph.
Axil Axil suggested in the Vortex discussion list – about the only list I read these days – the name nanoplasmonics for developments in cold fusion (while referencing a very funny mockery of how academics will revise the history of cold fusion in 2015 – “History is written by the losers”).
The field is so new that Wikipedia has yet to have an entry dedicated to “nanoplasmonics”, except as a subheading of the entry Surface Plasmon Polaritons. An effect seen in bulk Nickel powder is not a surface effect, however. If the reactors of Rossi and Defkalion are based on a plasma phenomenon like polaritons in a nano-structured bulk medium, the headings should by rights be reversed.
The climate scare is collapsing, it seems, as climate scientists everywhere are renouncing their previous certainty.
Skeptics OTOH have been consistent. This blog in particular has been challenging the establishment global warming views since 2005, on such predictions as mass extinctions, the significance of warming, and decreasing rainfall and droughts.
It is instructive to look into ourselves and ask – how could the skeptics have been right – when the consensus of the learned experts thought differently? As a recent post at WUWT asked – what was my personal path to climate skepticism? Particularly when one has never before been at odds with the scientific mainstream.
The answer for me was elegantly expressed by A.O. Scott in his New York Times review of the Disney film Chicken Little. He said the film is:
“a hectic, uninspired pastiche of catchphrases and clichés, with very little wit, inspiration or originality to bring its frantically moving images to genuine life.”
My theory is that due to their scholarship in other fields – such as engineering, the hard sciences, and economics – skeptics are attuned to genuine scientific insight and not deceived by the “uninspired pastiche of catchphrases and clichés” that constitutes the majority of global warming research.
1. Powdered nickel is loaded with hydrogen and heated to its Debye temperature – the temperature that maximizes the vibration of the individual atoms in the nickel lattice.
2. The hydrogen molecules (H2) are dissociated into a plasma by a spark from a spark plug. In the plasma the H atoms (consisting of a proton and an electron) are excited into elliptical orbits. Due to the elliptical orbit, the electron comes very close to the proton at one end, so the atom is screened to appear like a neutron (no charge).
3. Driven by the lattice vibration and the pulse of plasma from the spark, the screened H atom is driven into the nucleus of a Ni atom, producing Copper, Zinc, and other transmuted byproducts, and copious heat.
That’s their theory.
Today's demonstration by Defkalion Energy of their Hyperion Nickel/Hydrogen reactor showed their technology is ready for industrialization.
This technology promises to lower fuel costs by a factor of 1000. Energy will soon be abundant and safe, and burning carbon unnecessary. Coal and renewables are dead men walking.
See the series of articles on LENR since 2010.
John Cook, Climate Communication Fellow from the Global Change Institute at the University of Queensland, is on the record saying:
“animal species are responding to global warming by mating earlier in the year. This isn’t because animals are getting randier, it’s because the seasons themselves are shifting”
IMHO science is in need of a major shakeup.
Bjerknes compensation assumes a constant total poleward energy transport (and an inverse relation between oceanic and atmospheric heat transport fluxes (Bjerknes, 1964)). Contrary to this assumption, there is empirical evidence of a simultaneous increase in poleward oceanic and atmospheric heat transport during the most recent warming period since the mid-1970s (aka the Great Pacific Climate Shift). This paper argues that TSI directly modulates ocean–atmospheric meridional heat transport.
Solar irradiance modulation of Equator-to-Pole (Arctic) temperature gradients: Empirical evidence for climate variation on multi-decadal timescales, Willie Soon and David R. Legates. PDF
This paper raises more questions than it addresses. How sensitive is the estimate of global temperature to a change in the equator to pole temperature gradient? Can a change in the gradient produce an apparent ‘amplification’?
Another thought that has occurred to me is that climate models overestimate global warming but they underestimate Arctic melting. Could both failures be due to underestimating the response of meridional heat transfer from the equator to the poles?
Dick Smith, an Australian retail millionaire, has offered a $1M prize first to an Italian inventor, and now to a Greek company, Defkalion, if they can demonstrate a commercial LENR (low energy nuclear reaction, aka cold fusion) to the satisfaction of third-party scientific observers.
As I am convinced this is a scam similar to Firepower International (make sure you look it up on Wikipedia) I am not prepared to waste money on this until the test conditions have been agreed on.
As with the Rossi challenge, the test must be one where the result will be accepted by reasonable people in the scientific community.
I hope the Swedish scientists will be involved. If not I feel sure we can get equivalent independent experts.
Thanks for the suggestion. I would like to do this live on international television – say, the US 60 Minutes.
To get up to speed on this quickly moving story, read here.
Two things I have noticed while tracking polls differentiate Ron Paul’s support from the others’.
The first is the way support for Ron Paul is firming in the Twitter tracking. E.g., support for Mittens has a tendency to droop suddenly, with tweets about him dropping almost to zero, but RP stays firm throughout. I think people are running out of things to say about him, as they have with the other candidates. But RP at least gives people a hopeful, positive conversational thread. Perhaps that is how elections are fought these days: a series of posts, building a following, like building a web site.
The other is in the Gallup daily tracking polls: while the other candidates “go up like skyrockets and fall to earth as dead sticks” (Paul Keating), RP’s growth, OTOH, is steady. It’s like, when a person decides to vote for him, they don’t change their mind. Maybe it is something to do with the barrier to committing to him.
Once you cross it you don’t go back. A sequence with this property is called “non-decreasing” in mathematics.
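The “non-decreasing” property – each poll number at least as large as the one before it – is easy to check programmatically. A minimal sketch (the poll numbers below are hypothetical, purely for illustration):

```python
def is_non_decreasing(values):
    """True if each value is >= the one before it."""
    return all(a <= b for a, b in zip(values, values[1:]))

# Hypothetical weekly poll percentages, for illustration only.
steady = [7, 9, 9, 12, 14]    # support that never drops back
spiky = [7, 20, 12, 25, 10]   # "skyrockets and dead sticks"

print(is_non_decreasing(steady))  # → True
print(is_non_decreasing(spiky))   # → False
```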
Mittens has peaked at 35% in South Carolina. Will he fall to Earth?
Click here. Libertarian Dr Ron Paul gets more than all of the others combined.
Congratulations Julia and Wayne, on your new milestone – Australia’s national debt has topped $200 billion after Labor’s borrowing of $100 million per day.
Australia now has its largest debt in history, after we borrowed $3.2 billion over the last week. On 11 March 2009, Treasurer Wayne Swan invoked “special circumstances” to increase the debt ceiling to a “temporary” level of $200 billion. In the last budget the government increased the debt ceiling permanently to $250 billion.
There are 12.3 million taxpayers in Australia, so that’s $16,260 of debt on behalf of each of us. Are you any better off?
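The per-taxpayer figure is simple division, and can be checked with a short script (the $200 billion debt and 12.3 million taxpayer figures are those quoted above):

```python
# Check the per-taxpayer debt figure quoted in the post.
total_debt = 200e9   # $200 billion national debt
taxpayers = 12.3e6   # 12.3 million Australian taxpayers

per_taxpayer = total_debt / taxpayers
print(f"${per_taxpayer:,.0f} per taxpayer")  # → $16,260 per taxpayer
```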
h/t Senator Barnaby Joyce (LNP)
When the MSM reports on the commercialization of Ni-H cold fusion energy generation, they see parallels to the scientific treatment of AGW sceptics, citing “follow the money”.
If this new technology is real, it should be easy to prove and past failures – and outside agendas – shouldn’t stand in the way. Still, scientific discovery is expensive and money is often the X factor. Fortunes and reputations are made and lost based on results. Orthodoxies develop that discredit ideas posing a threat to the money flow, whether from government sources or from private investment. In the debate over “global warming,” scientists and politicians alike have resorted to repeating the mantra “the science is settled” as a means of freezing out researchers whose climate findings undermine public acceptance of the warming-planet credo and jeopardize billions in research funds.
This could be regarded as an example of basic monopoly theory, where the producers have an advantage in getting together and dividing up the higher profits. However, the more cartel members there are, the more difficult it is to maintain the “consensus”, and the smaller each slice of the pie. This is why mainstream climate science has tried to limit and marginalize skeptical scientists who “undercut” the alarmist claims.
As always, the market will decide, but not without considerable sacrifice and dedication over a long period on the part of the skeptics and the truly innovative. Another solution would be to discourage academic cartels by opening peer review and grant applications to a wider range of participants.
And the list of failures keeps growing.
1. Carbon Tax Lie – ‘There will be no carbon tax under the Government I lead.’
2. NBN – $50 billion Telstra subsidy
3. Building the Education Revolution – The school halls fiasco
4. Home Insulation Plan (Pink Batts) – Dumped after 3 deaths, and x house fires.
5. Citizens Assembly – Dumped
6. Cash for Clunkers – Dumped
7. Hospital Reform – Nothing
8. Digital set-top boxes – almost redundant technology that is cheaper at Harvey Norman
9. Emissions Trading Scheme – Abandoned
10. Mining Tax – Continuing uncertainty for our miners
11. Livestock export ban to Indonesia: – over-reaction, without trouble-shooting, that almost sent an industry broke
12. Detention Centres – Riots and cost blowouts
13. East Timor ‘solution’ – Announced before agreed
14. Malaysia ‘solution’ – In shambles
15. Manus Island ‘solution’ – On the backburner
16. Computers in Schools – $1.4 billion blow out; less than half delivered
17. Cutting Red Tape – 12,835 new regulations, only 58 repealed
18. Asia Pacific Community – Another expensive Rudd frolic. Going nowhere
19. Green Loans Program – Abandoned. Only 3.5% of promised loans delivered
20. Solar Homes & Communities plan – Shut down after $534 million blow out
21. Green Car Innovation Fund – Abandoned
22. Solar Credits Scheme – Scaled back
23. Green Start Program – Scrapped
24. Retooling for Climate Change Program – Abolished
25. Childcare Centres – Abandoned. 260 promised, only 38 delivered
26. Take a “meat axe”‘ to the Public Service – 24,000 more public servants
27. Murray Darling Basin Plan – back to the drawing board
28. 2020 Summit – Meaningless talkfest
29. Tax Summit – Deferred and downgraded
30. Population Policy – Sets no targets
31. Fuel Watch – Abandoned
32. Grocery Choice – Abandoned
33. $900 Stimulus cheques – Sent to dead people and overseas residents
34. Foreign Policy – In turmoil with Rudd running riot
35. National Schools Solar Program – Closing two years early
36. Solar Hot Water Rebate – Abandoned
37. Oceanic Viking – Caved in
38. GP Super Clinics – 64 promised, only 11 operational
39. Defense Family Healthcare Clinics – 12 promised, none delivered
40. Trade Training Centres – 2650 promised, 70 operational
41. Bid for UN Security Council seat – An expensive Rudd frolic
42. My School Website – Revamped but problems continue
43. National Curriculum – States in uproar
44. Small Business Superannuation Clearing House – 99% of small businesses reject it
45. Indigenous Housing Program – way behind schedule
46. Rudd Bank – Went nowhere
47. Using cheap Chinese fabrics for ADF uniforms – Ditched
48. Innovation Ambassadors Program – Junked
49. Six new Submarines – none operational
50. Copenhagen Climate Summit:- Rudd took 112 advisors on a big “carbon footprint”; for nothing.
51. Took a $20 billion surplus and turned it into a $57 billion deficit:- a $77 Billion turnaround. Took the $60 billion ‘Futures Fund’ and turned it into a $100 billion debt:- a $160 Billion turnaround.
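The “turnaround” figures in point 51 are the sum of the starting position and the finishing position. A quick check, using the figures as quoted in the list:

```python
# Verify the fiscal "turnaround" arithmetic in point 51 (all figures in $billion).
surplus_to_deficit = 20 + 57   # $20b surplus swung to a $57b deficit
fund_to_debt = 60 + 100        # $60b Futures Fund swung to a $100b debt

print(surplus_to_deficit)  # → 77
print(fund_to_debt)        # → 160
```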
From Tomy Gomme of the Climate Sceptic Party
The main candidate theories for low energy nuclear reactions involving Nickel-Hydrogen:
Polyneutron Theory of Fisher
Piantelli Hydride Capture Theory
Review of Possible Cold Fusion Mechanisms