How Many Millions More Do Governments Intend To Herd Into The Death Trains Of Junk Science?

The parallels are absolutely stunning…it’s all repeating in climate science.

To me this is a must watch. (If short on time, watch from 24:30 to 27:45).

History repeats. Like the lipid hypothesis, man-made global warming is fraud throughout.

The following chart at the 36:18 mark tells it all:



Expert Blasts Alfred Wegener Institute Ocean Acidification Claim: “Clear Falsification Of Scientific Facts”

Ocean acidification: The terrible little brother of global warming

By Dr. D. E. Koelle
(Translated/edited by P Gosselin)

The alleged global warming, which has now failed to appear for 18 years, has just received a “terrible little brother”. It was high time to find such a brother, especially since the older climate sister was becoming so weak.

That little brother is the familiar “ocean acidification”, which the Alfred Wegener Institute (AWI) recently elevated, in a press release dated 8 October 2014, to the status of new global danger, one that comes with “dramatic impacts”, “costs in the billions” and the claim that the pH value today is dropping 10 times faster than in the past.

No mention was made, however, of the fact that the ocean is not “acidic” in any way. Rather, with a pH value between 7.8 and 8.1, it is clearly alkaline. This is a clear falsification of scientific facts (though citizens won’t notice at all). Viewed objectively, a reduction in alkalinity has nothing to do with an “acidification”, which would begin at a pH value of 6.9.
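For readers who want to check the scale arithmetic, here is a minimal sketch (my own illustration, not from the AWI release) of how pH relates to hydrogen-ion concentration; the 8.1 and 7.8 bounds are the ones quoted above.

```python
def h_ion_concentration(ph):
    """Hydrogen-ion concentration in mol/L, from pH = -log10[H+]."""
    return 10 ** (-ph)

# Both ends of the quoted ocean range lie above pH 7, i.e. alkaline:
for ph in (8.1, 7.8):
    assert ph > 7.0

# A drop from pH 8.1 to 7.8 roughly doubles [H+], yet the water stays alkaline:
factor = h_ion_concentration(7.8) / h_ion_concentration(8.1)
print(round(factor, 2))  # 2.0
```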

Figure: CO2 concentration in the atmosphere over the last 300 million years (Retallack) – completely without any anthropogenic impact.

The claim of a supposed pH-value drop is hardly a serious one, because there is no global pH measurement network that could back it up. Local datasets show pH fluctuations of +/- 0.1. What is confirmed is that over more than 90% of the earth’s history, the atmospheric CO2 concentration was considerably higher than the very modest 400 ppm level we have today. The average over the last 300 million years was near 2000 ppm (see the diagram from Retallack, which is based on changes in the stomatal pores of Ginkgo leaves). Neither the considerably higher CO2 levels over the earth’s history nor the maximum of 6000 parts per million ever led to an “acidification of the oceans”.

If the claims of a damaging influence on coral reefs were true, then the corals would have died millions of years ago.

So just where is this kind of acidification supposed to come from? Approximately 11 Gt of CO2 per year (a third of anthropogenic emissions) is taken up by the oceans, but it is ignored that at least the same amount (with some estimates as high as 20 Gt CO2 per year) gets stored as CaCO3 on the seabed. The complete CO2 cycle in the oceans is anything but fully understood: large quantities of CO2 originate from hundreds of underwater volcanoes along tectonic plate boundaries, and without this source, covering the huge CO2 demand of underwater vegetation (assumed to exceed even that on land) would be unimaginable. Nor do the minimum pH values occur at the sea surface, as would be expected from an atmospheric impact (and as is falsely assumed in the IPCC report), but rather at approximately 1000 meters below the surface.

At the surface, low pH values are measured in areas where deep-water currents well up. The CO2 absorbed from the atmosphere, which is supposed to cause the “acidification”, has to be seen in relation to the roughly 39,000 Gt of CO2 already dissolved in the ocean. Since we are talking about 11 Gt of CO2 per year, this amounts to only about 0.028%! The natural impacts of annual ENSO activity and ocean currents, with their temperature changes, can alone be considerably larger.
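The 0.028% figure can be reproduced directly from the numbers in the paragraph above; a quick sketch, assuming the article’s round figures of 11 Gt/yr uptake and 39,000 Gt already dissolved:

```python
annual_uptake_gt = 11.0        # Gt CO2 absorbed by the oceans per year (article's figure)
dissolved_total_gt = 39_000.0  # Gt CO2 already dissolved in the ocean (article's figure)

fraction_percent = annual_uptake_gt / dissolved_total_gt * 100
print(f"{fraction_percent:.3f}%")  # 0.028%
```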

Let’s hope that the AWI-conjured “terrible little brother” soon disappears and that research in the field of ocean sciences gets back to being serious.


Austrian Daily Reports: “Huge Ice Growth Surprises Climate Scientists” … “Like One Not Seen In Decades”!

Photo: Antarctica, NASA.

The Austrian online Kronen Zeitung here has an article about something most German-language media outlets have been too red-faced to report on: the sudden growth in polar sea ice.

The Kronen Zeitung opens with:

A huge growth in ice at the poles has surprised scientists and is raising questions. Is global warming taking a break? [...] For the prophets of climate change the new figures pose questions: at the poles of Mother Earth, in complete contradiction to prognoses of a complete polar melt, there is ice growth like one not seen in decades.”

Almost the entire mainstream media has been quiet about this development. So it is refreshing to see that some media are reporting the “good” news that the planet is not warming alarmingly.

Antarctic ice growth “problem for penguins”

The Kronen Zeitung reports that Antarctic sea ice is growing at an average annual rate of 16,500 square kilometers since 2007. The case is pretty much the same for Arctic sea ice, the online Austrian daily reports.

The Kronen Zeitung also writes that the rapidly growing sea ice surrounding Antarctica is a “huge problem” for penguins, who need open water.

“Climate science turned on its head”

Moreover, the Kronen Zeitung mentions the surprise of the National Snow and Ice Data Center (NSIDC) in Colorado concerning the growth in the Arctic:

Scientists at the National Snow and Ice Data Center (NSIDC) in Boulder (Colorado) in any case have to admit that instead of a global warming, a global cooling is taking place. [...]

At the moment this development appears to have turned climate science on its head globally.”

The Kronen Zeitung then explains how the climate models have failed in that they predicted the very opposite to happen and that some scientists even desperately claimed that the measurements were wrong.

Max Planck scientists: “colder winters and cooler summers”

To explain what is happening, Kronen Zeitung turned to Professor Anastasios Tsonis of the University of Wisconsin. Tsonis says there are many factors at play. “Currents, winds, precipitation and foremost the upper and lower water layers.”

At the end of its article, Kronen Zeitung explains how the recent slowdown in overall solar activity may be playing a major role in the climate.

For years few sunspots could be observed. Colder winters and cooler summers could once again be the consequences, Max-Planck scientists say.”

Reported or not, the polar sea ice is there, and it cannot be ignored.

Hat-tip: Die kalte Sonne.

GISS Targeted Data Truncation And Tricks Alone Produce Half Of The Warming Trend Since 1880

It Is Even Worse Than I Thought!
By Ed Caryl

First, I must apologize for an error. In How Much Global Warming, I assumed that the old GISS file from 1999 was a global temperature file. I had missed the clear label on the file itself stating that it covered surface stations only. As Bob Tisdale pointed out, I was comparing apples to oranges.

But when I found the correct land-only file to compare, I found the situation was even worse than I thought. Here is the correct comparison.


Figure 1 is the comparison of land-only (surface stations) data from 1999 and the current data.

GISS has more than doubled the warming trend in its published data over the last 15 years. This has been done by dropping the years before 1880, cooling the readings before 1965, and warming the measurements after 1965. They essentially shifted the measurement period 14 years to the right, doubling the warming trend. Here is the difference plot:


Figure 2 is a plot of the difference between the plotted data in Figure 1.

It doesn’t take much data manipulation to radically change the temperature trend. Truncating the early 14 years and adding the recent 14 years was half the change, and “adjusting” the station records slightly furnished the rest.
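The start-year effect described above is easy to demonstrate on synthetic data. The sketch below is my own illustration (the series is invented, not GISS data): a least-squares line is fitted to an anomaly record with an early cool dip, once over the full span and once with the first 14 years dropped, so the record starts nearer the bottom of the dip.

```python
import numpy as np

def trend_per_century(years, temps):
    """Least-squares slope, converted to degrees per century."""
    slope = np.polyfit(years, temps, 1)[0]
    return slope * 100

# Synthetic anomaly record: a slow rise with a cool dip around 1910
# (shape loosely inspired by Figure 1; the values are made up).
years = np.arange(1866, 2000)
temps = 0.004 * (years - 1866) - 0.3 * np.exp(-((years - 1910) / 15.0) ** 2)

full = trend_per_century(years, temps)
truncated = trend_per_century(years[14:], temps[14:])  # drop 1866-1879

print(round(full, 2), round(truncated, 2))
assert truncated > full  # starting nearer the dip steepens the fitted trend
```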

Surface station data are subject to many error sources. First, when thermometers are read by eye, the readings are in whole degrees, and a judgment is made on the spot as to whether the reading is the lower or the higher number. The condition of the weather shelter and the thermometer is also a factor. Then, when the reading is recorded, mistakes can be made.

More mistakes can be made when that recording is transcribed at the central office. Then higher authorities get involved with adjustments for missing records and UHI. For all these reasons, trends of less than one degree per century should be taken with a whole shovel-full of salt.

My thanks to Bob Tisdale. His expertise and experience in analyzing ocean temperatures and GISS records allowed him to instantly recognize the data I used. This is an example of open peer review in the climate blogosphere.


Activists All Pissy About Water Consumption…Call For Urinating In The Morning Shower To Save On Toilet Water!

Germany’s popular daily Bild here reports on a pair of English environmental activists from the University of East Anglia in Norwich who have a novel idea for cutting back on water consumption: urinating during your early morning shower rather than relieving yourself in the toilet bowl.

Also read here in English.

Bild writes that Debs Torr and Chris Dobson calculated that if their 15,000 classmates all did the same, enough water to fill 26 Olympic-size pools would be saved each year.
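Bild’s 26-pool figure is roughly reproducible. A sketch, assuming one avoided toilet flush per student per day, an older-style 12-litre cistern, and the standard 2,500 m³ Olympic pool volume (all three are my assumptions, not numbers from the article):

```python
students = 15_000
litres_per_flush = 12            # assumed older-style cistern; modern ones use ~6 L
days_per_year = 365
olympic_pool_litres = 2_500_000  # 50 m x 25 m x 2 m

saved_litres = students * litres_per_flush * days_per_year
pools = saved_litres / olympic_pool_litres
print(round(pools, 1))  # ~26 pools per year
```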

This is a rather strange idea for a country where rain and water are hardly in short supply. Let’s not kid ourselves. This is just the latest nutjob idea to regulate humans back to prehistoric lifestyles.

The activist students would like to see their “GoWithTheFlow” water-saving initiative implemented nationwide across Britain. Thankfully the two leakers are not calling for it to be made mandatory. Bild writes:

Chris asks that people be considerate of others. “When students share the shower stalls, then you should only do it when it’s okay for everyone.”

Thanks for asking.

My God, can you imagine how the shower would smell after a few months’ time? Who brought up these two slobs?

To me it’s truly mystifying as to why there would be any urgency at all “to save water” in a rain-drenched country like Great Britain, as if water is some kind of disappearing resource. Is civilization now at “peak water”?

Even here in wet, damp Germany there are constant calls by enviro-lunatics not to waste water. Granted, some places have seen the local groundwater table drop due to agriculture and industry, so I suppose in some areas there are legitimate arguments for conserving it.

However, in many municipalities city officials are begging for an end to all the water-saving madness. Wastewater from homes and businesses has dropped to dangerously low levels and put city sewage systems at risk.

Wastewater levels are so low that solid waste material ends up stalling and plugging the system. There is simply not enough water to keep everything flowing. Indeed municipalities are pleading: how about a “GoWithTheFlow” initiative for sewage systems!

A number of German public utilities are forced to regularly pump millions of gallons of additional water just to keep the system from breaking down, clogging up and rotting out.

And don’t expect this kind of proposal to be the last. Look for calls for composting as a way to save water for the “other business”.


Spiegel Sees Potential Climatic Cooling From Icelandic Volcano As Its SO2 Emissions Reach “Historic Dimensions”

Volcanic activity in Iceland has risen dramatically over the past few weeks.

Yet, thankfully, the big eruption many feared never materialized and signs show that the pressure has been subsiding. Good news, many among us may think.


Bárðarbunga Volcano, September 4, 2014. Picture taken by Peter Hartree, CC BY-SA 2.0.

Yet science journalist and geologist Axel Bojanowski at Spiegel warns that there’s still enough to worry about. According to Bojanowski, concentrations of sulfur dioxide (SO2) have “never been higher since measurements began in the 1970s“. The amount of SO2 emitted by the recent volcanic activity is surpassed only by the “largest of eruptions”.

What’s more, Bojanowski adds:

Seldom does so much sulfur gas get into the air. It could even cool the climate.”

Photo number 12 of Spiegel’s spectacular photo series here is a NASA computer model simulation depicting the spread of the sulfur dioxide cloud over Europe. The growing concentration of sulfur dioxide is a reason for “more concern”, Spiegel reports. High concentrations of sulfur dioxide in the air are corrosive and pose a threat to human health. Bojanowski writes:

Gradually it is posing an additional threat: to the climate. The emitted amounts of gas have already reached historic dimensions, reports the country’s environmental authority, the Icelandic Environmental Agency. Up to 60,000 tonnes of SO2 are released daily from the lava chasm.”

Bárdarbunga has already emitted approximately two million tonnes of SO2. Only the largest eruptions surpass this amount.”

Bojanowski adds that although the SO2 haze in the atmosphere is not visible to the naked eye, it is seen by NASA satellite, and it extends over parts of Europe. SO2 is an effective sunblock that acts to cool the atmosphere. Spiegel also describes the Laki eruption of 1783 and 1784, which led to a marked cooling and European crop failures.

According to Spiegel, the Bárdarbunga eruption and its gas emissions are nowhere near the scale of Laki, which spewed 122 million tons of SO2 into the atmosphere. But Spiegel compares Bárdarbunga’s 2 million tons of SO2 to other major 20th-century volcanic eruptions: El Chichón emitted 7 million tons, enough to cause global cooling, and Pinatubo spewed 20 million tons, cooling the planet by 0.5°C for two years.

Though Bárdarbunga’s SO2 so far has not been shot up into the stratosphere, Spiegel warns that “two factors could make the volcano’s impact detectable: At high latitudes such as those of Iceland, the stratosphere is several kilometers lower than in the tropics, thus allowing the gas to reach it more quickly. Also chasm eruptions such as those at Bárdarbunga produce hot air upward currents over the volcano, which can carry the gases up to the stratosphere.”

Note that the SO2 gas has been carried in the air over to the European continent. Though Bárdarbunga’s SO2 may not have any real impact on cooling the planet, it certainly will not help to warm it either.


Using 1999 GISS Data, Global Warming Trend Since 1866 Only 0.5°C Per Century!



How Much Global Warming?
By Ed Caryl

We are told over and over again that the globe has warmed by 0.8°C since 1880 or 1850. Lately we have seen article after paper after publication that states this number in Fahrenheit, 1.44°F, because that sounds larger. But is this number correct? What is it based on?

GISS and Google “way-back machine”

Recently, a file from GISS in Google’s “way-back” machine came to my attention. This file of global temperature dates from 1999, before James Hansen became more rabid in promoting global warming. Here is a plot of the 1999 data, along with the current file from GISS:


Figure 1 is a plot of global temperature as published by GISS in 1999 versus the current publication.

Note that GISS has removed the data from 1866 to 1880, placing the beginning of their published data closer to the bottom of the early 1900s cool period. This changes the trend from 0.42°C per century to 0.66°C per century, a 50+% increase in the trend. This alone changes the warming from 0.6°C from 1866 to the present, to 0.8°C from 1880 to the present, resulting in the higher trend. Here is a chart of the difference between the two files.


Figure 2 is a plot of the difference between the two plots in Figure 1.

In Figure 2 we can see that the cool period around 1910 was cooled further by 0.2 degrees, while the cool period around 1970 was warmed slightly. They also minimized the cool 1880s and ’90s by warming those years by 0.1 to 0.2°C. So what was the real global temperature from 1866 to the present? I took the 1999 file and spliced on the satellite data from UAH from 1979 to the present, using the period of overlap from 1979 to 1999 as a baseline, thereby avoiding the recent GISS adjustments. The result is this:

Figure 3 is a plot of GISS global temperature from 1999 with UAH satellite TLT global temperature spliced on from 1979.

The trend in Figure 3 is half a degree C per century, with a total rise since 1866 of about 0.6°C. Because of the year-to-year variation, and the sparse station data in the early years, both the trend and the total rise have errors that are in the neighborhood of ±0.3°C. So the bottom line is that the warming since the mid-19th century is about 0.6°C ±0.3°C, or somewhere between 0.3°C and 0.9°C. Much of that warming, about 0.4°C ±0.2°C has taken place since 1980. But some of that is due to the cyclic nature of temperature.
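The splice described above, aligning two records over their overlap period before joining them, can be sketched as follows; the series here are synthetic stand-ins, not the actual GISS or UAH data:

```python
import numpy as np

def splice(years_a, temps_a, years_b, temps_b, overlap):
    """Append series B to series A after shifting B so that both series
    share the same mean anomaly over the overlap years."""
    lo, hi = overlap
    mask_a = (years_a >= lo) & (years_a <= hi)
    mask_b = (years_b >= lo) & (years_b <= hi)
    offset = temps_a[mask_a].mean() - temps_b[mask_b].mean()
    keep = years_a < lo
    years = np.concatenate([years_a[keep], years_b])
    temps = np.concatenate([temps_a[keep], temps_b + offset])
    return years, temps

# Tiny synthetic demo: series B runs warm by a constant 0.5 degrees.
ya = np.arange(1866, 2000)
ta = 0.005 * (ya - 1866)
yb = np.arange(1979, 2015)
tb = 0.005 * (yb - 1866) + 0.5

years, temps = splice(ya, ta, yb, tb, overlap=(1979, 1999))
trend_per_century = np.polyfit(years, temps, 1)[0] * 100
print(round(trend_per_century, 2))  # 0.5 -- the offset is removed by the baseline
```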

The cycle from 1866 to 1940 had an amplitude of about 0.3°C, which, if extended to the present, means that the present temperature is at the peak of a cycle, or 0.15°C too high. This puts the total rise between 0.15°C and 0.75°C, or from almost nothing to something less than has been stated, with a center at 0.45°C. The recent solar maximum has also inflated the temperature. In the next 30 years, decreasing ocean cyclic temperature and a waning solar input will likely reduce the global temperature by about 0.4°C ±0.2°C, either back to the 1990s or to the 1960s. If the latter, there will have been no warming in the last 160 years.


Climate Change Dying As An Issue In German Media…Empty Seats Pack Hamburg “9th Extreme Weather Congress”!

This past week the 9th Extreme Weather Congress took place in Hamburg. Curiously this year there was very little coverage by the German media. Doing a Google search of the event turned up very few stories from the mainstream media.

9th ExtremWetterKongress 2014

Empty seats pack Day 3 of the Hamburg 9th Extreme Weather Conference, just minutes before starting. Source: here (11:12 mark).

The above photo is a snapshot from a Youtube video, just minutes before the start of Day 3 of the Congress.

Looks like the German media have grown fatigued by climate science in general and have sensed that something isn’t right with what the “experts” have been claiming. Record high sea ice, lack of hurricanes, low tornado activity, spectacularly failed climate models and bitter cold winters have a way of sobering them up.

Some German public television networks showed up on the first day, see here for example, but there too we see many empty seats – unusual given the opening first day hype.

Not a peep about Antarctic sea ice record

We begin to sense the media is feeling increasingly embarrassed about the climate issue overall. Any reminder of how they’ve been duped gets avoided altogether. Little wonder when Googling “rekord eis antarktis 2014“, we quickly find that the German mainstream media have totally ignored this year’s record high south polar sea ice event altogether. Too embarrassing! Besides, German viewers are totally bored by the climate issue.

Some small sites have reported the event, though. The online German weather site Wetteronline writes that Arctic sea ice is “considerably greater than the record low year of 2012” and that the German Polarstern research vessel of the Alfred Wegener Institute “did not succeed in crossing the Northwest Passage of the American continent in the second half of August“. The site also looked at the situation in the Antarctic, which this year smashed the all-time satellite-era sea ice record. The site, however, avoided the word “record” and wrote:

…the sea ice around Antarctica reached 20 million square kilometers. Thus the 30-year maximum of last year was exceeded by about 0.4 million square kilometers.”

Of course this new satellite-era record high has baffled climate scientists, who are left to speculate about what is behind the unexpected trend. Wetteronline writes:

The reason for this, scientists suspect, among other factors, is a weakening sea current around Antarctica. Thus there is less mixing of the water masses which favours the growth of the sea ice.”

There’s no data to back this up, and so it just means the scientists don’t have a clue, are just shooting in the dark, and they should just say so.

Meanwhile, alarmist site Klimaretter presented its polar sea ice summary for 2014, but forgot to mention anything at all about the South Pole.


IPCC Models Fail Abominably In Projections of Northern And Southern Hemisphere Temperature

What follows is a modestly abbreviated English version of the original German post. The first part is the brief solar activity report, and the second part is about IPCC model failure.

The Sun in September 2014. Attention: X-Flares!

By Frank Bosse and Prof. Fritz Vahrenholt
(Translated/edited by P Gosselin)

The sun in September was considerably more active than in the previous months. The sunspot number was 87.6, which is 89% of what is typical for the 70th month of a cycle. The current solar cycle 24 (SC 24) began in December 2008. Figure 1 shows the current cycle compared to the mean of SC 1-23, and to solar cycle no. 1:

Fig. 1: The current SC 24 is shown in red, the mean of the previous 23 cycles is depicted by the blue curve, and the current cycle SC 24 strongly resembles SC 1, which is shown by the black curve.

The current cycle resembles SC 1, and should it continue to behave like SC 1, a trailing off of activity cannot be expected anytime soon. Indications, however, do point to a longer-than-normal cycle. Japanese researcher Hiroko Miyahara and colleagues examined this in 2013 (Influence of the Schwabe/Hale solar cycles on climate change during the Maunder Minimum). They were able to show that the length of the solar cycle correlates with solar activity: “The mean length of the Schwabe cycle during the Maunder Minimum was approx. 14 years, and during the Medieval Warm Period the average cycle length was only about 9 years.”

The sun today is relatively active, though slightly below normal. On September 10 there was an X1.6 flare, a high-category explosion on the sun. Flares are classified as follows: C for common, M for medium, and X for strong. See the following image, Figure 2:

Figure 2: X-flare on 10 September 2014. Source:

With such powerful explosions, material gets ejected from the sun in what is known as a Coronal Mass Ejection (CME). When such plasma strikes the earth’s atmosphere, it produces polar lights and other effects. An X1.6 flare, however, is too weak to have any massive impact on the earth’s atmosphere and magnetic field.
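As background (the thresholds below are standard GOES classification facts, not taken from the article): each flare class letter spans one decade of peak soft X-ray flux in W/m², and the number is a multiplier within that decade, so X1.6 corresponds to a peak flux of 1.6 x 10^-4 W/m². A sketch:

```python
def flare_class(peak_flux_wm2):
    """GOES soft X-ray flare class from peak 1-8 Angstrom flux in W/m^2.
    Each letter spans one decade of flux; the number is the multiplier."""
    thresholds = [("X", 1e-4), ("M", 1e-5), ("C", 1e-6), ("B", 1e-7), ("A", 1e-8)]
    for letter, base in thresholds:
        if peak_flux_wm2 >= base:
            return f"{letter}{peak_flux_wm2 / base:.1f}"
    return "sub-A"

print(flare_class(1.6e-4))  # X1.6 -- the 10 September 2014 event
print(flare_class(5.0e-6))  # C5.0
```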

Models fail to project temperature

Now let’s take a detailed look at the warming scenarios of the earth’s surface temperature, this time taking local particularities into account:

Fig. 3: The mean surface temperature since 2000 compared to the period 1950-1980. Source: GISS

Most of the warming took place in the northern extra-tropics at latitudes between 25°N and 90°N. This is consistent with the expectations one would have with the effects of greenhouse gases.  However, let’s take a look at the temperature series of the northern hemisphere extra-tropic region. Here we also see a “pause” since about the year 2000.

Figure 4: The temperature curve of the northern extra-tropics according to GISS.

Indeed the trend from 1983 to 2013 (0.33 +/- 0.06°C/decade) differs significantly from the 2000 to 2013 period (0.09 +/- 0.14°C/decade), which no longer represents significant warming. This hefty deceleration has occurred even though greenhouse gas emissions have continued to rise unabated. We have reported multiple times on what the reasons for this could be.

How do models handle the asymmetry in warming observed between the two hemispheres? A 2013 paper by four authors from the universities of Berkeley and Washington, led by Andrew R. Friedman, examined the temperature difference between the northern hemisphere (NH) and the southern hemisphere (SH), called the interhemispheric temperature asymmetry (ITA), and summarized what the newest models anticipate:

 Fig. 5: The temperature difference between the NH and SH determined with the CMIP5 models. Source: Figure 2 of the just mentioned paper.

The text of the paper states: with today’s emissions scenario (close to the IPCC scenario named RCP8.5) there is a highly linear rise of 0.17 K/decade (Point 3, “future projections“ of the aforementioned paper).

But let’s do a reality check and compare it to the actual surface observations since 1900:

Fig. 6: ITA according to GISS since 1900 (data: GISS); the model is shown as the thick green curve.

The 1982-2013 period is indeed at 0.165 +/- 0.04°C/decade, but the trend has since weakened considerably. The 1998-2013 period shows a trend of only 0.055 +/- 0.067°C/decade, not significantly different from zero, and barely 30% of what was registered since 1982. No one can claim that this trend is “highly linear”.

The cause? One cause is offered up by the 2013 paper: The drop in the late 1960s was not replicated by the models and was likely caused by internal variability, very likely by the AMOC (see Fig. 8b of the paper), the authors maintain.

The steep rise after 1915 is likewise attributed to internal variability, as is at least part of the rise beginning in 1985, as we have often maintained at this blog.

So it remains: models have thus far been unable to adequately reproduce internal variability. The dependency of temperatures on greenhouse gas forcing is stronger in the models than it is in reality. The models overstate the anthropogenic impact and thus yield exaggerated prognoses for the future.


Rossi’s E-Cat Verified, But Mystifies Independent Reviewers…The Dawn Of An Energy Revolution?

Photo: The sun in X-ray, NASA

By Ric Werme

Many have been watching the gradual development of Andrea Rossi’s “E-Cat,” a device Rossi claims produces heat by fusing nickel and hydrogen at ordinary temperatures, as opposed to those in the core of a star.

Photo: NASA

The next big event, the release of a paper reporting on a month-long test in March by a group independent from Rossi and his partner, Industrial Heat, happened today. The results are pretty much what I was expecting and essentially completely positive.

In a nutshell, the device produced so much energy that only a nuclear reaction can explain it, reaction products were seen, but no nuclear radiation was detected.

The test ran with an E-Cat cell in three phases:

1) No fuel charge. This was to verify that the test setup’s measurement equipment could accurately measure both the electrical power into the cell and the heat released from the cell via convection and black-body radiation.

2) Approximately 800 W input power for 10 days; this produced some 1600 W of excess power.

3) Approximately 900 W for the rest of the test; this produced some 2300 W of excess power.

This confirms what supporters expected. While the COP (ratio of output power to input power) was lower than hoped, the authors make it clear that they deliberately ran the cell at low power to reduce the chance of thermal runaway. They point out that adding a little more than 100 W of input power increased output by about 700 W. That incremental ratio is more in line with what was expected.
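The COP figures implied by the report’s numbers can be checked in a few lines; the wattages are those quoted above, the rest is simple arithmetic:

```python
def cop(input_w, excess_w):
    """Coefficient of performance: total output power over electrical input."""
    return (input_w + excess_w) / input_w

phase2 = cop(800, 1600)   # 10-day run
phase3 = cop(900, 2300)   # remainder of the test

# Incremental gain when input was raised from ~800 W to ~900 W:
incremental = (2300 - 1600) / (900 - 800)

print(round(phase2, 2), round(phase3, 2), round(incremental, 1))
# 3.0 3.56 7.0
```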

That’s mostly all that’s important: put power in, get significantly more power out. From what I’ve read, Industrial Heat has not yet used E-Cats to make high-pressure steam and then electricity. That may merely mean they haven’t settled on the mechanical design of the reactor; there’s no point in making a boiler until then.

The most interesting part of the report is the isotopic analysis of the fuel before the test run and the “ash” afterwards. The bottom line is that the reviewers have no idea what is happening during the test run.  They are utterly mystified and reject most of their speculation.

The fuel charge, only one gram, was assayed before the start of the test. The key components were determined to be nickel (Ni), lithium (Li), aluminum (Al), iron (Fe), and hydrogen (H).  (Two assay methods found carbon (C) and oxygen (O), but the paper seems to dismiss them citing the tiny granules of powder they used.)  The Ni and H were expected per Rossi’s descriptions in the past.

He also referred to a catalyst, saying it was inexpensive and not an impediment to wide-scale deployment. The assay suggests the catalyst is LiAlH4, which releases monoatomic hydrogen when heated, fitting the speculation about the catalyst’s role.

Each element was found to have the naturally occurring ratios of its isotopes.

There had been speculation that Rossi used nickel enriched with particular isotopes, but apparently not.

The ash after the test run was also assayed. The small samples involved seem to preclude measuring the actual weight of various isotopes, so the paper concentrates on the percentages.  It would have been nice to have accurate weights.

Natural nickel is primarily 58Ni and 60Ni. Those were nearly completely consumed, and the nickel in the ash was nearly all 62Ni. I had expected Ni + H leading to Cu, but several of the relevant Cu isotopes are radioactive, whereas 62Ni is stable.

Lithium may not be a catalyst at all: natural Li is nearly all 7Li, yet a surface assay of the ash showed the lithium was nearly all 6Li. I’m no nuclear physicist, so I’ll refrain from any speculation. The authors explore a couple of paths, but ultimately throw up their hands and simply say more study is needed. Hydrogen wasn’t assayed; did it even participate?

All in all, this is a great, maybe historic, result. There has been plenty of evidence that the E-Cat works, but Rossi has always been directly involved.

Now we have an independent team working in their own space and with tools from their universities. They see it work and present multiple lines of evidence confirming it is a nuclear process.

That there is no explanation for the process is annoying, but won’t block commercialization of the E-Cat. The shouting isn’t over, the science has barely begun, but we may be at the start of civilization’s next major energy source.

Interesting times.

The paper is at

The best starting point is report-released/


German Federal Analysis Sees “Massive Threats To Security And Reliability Of Electric Power Supply System”!

So much for Germany’s transformation to “green” energies.

Germany’s Bundesnetzagentur (Federal Network Agency for Electricity, Gas, Telecommunications, Post and Railway) is the federal authority overseeing and regulating the German electrical power grid, among other networks.

At its site it has a link to an expert assessment report that analyzes the needs of and risks to the German power grid for the coming 2014/15 winter.

The name of the report: “Examinations for the winter of 2014/15 with respect to risks for system security and the necessity for reserve power plants”.

The 102-page highly technical assessment examines a variety of scenarios in order to see how well Germany’s electrical power grid will hold up this winter. Looking at the report’s conclusion, one can only conclude that the power grid is more unstable and prone to a collapse than at any time in Germany’s post-war period. It’s a debacle knocking at the door.

In the summary on page 97 for example it writes (link added):

Scenarios were parameterized on the basis of historical data and realistically form expected critical situations, but do not necessarily show the worst-case scenario.
Considered scenarios show massive threats to the security and reliability of the electric power supply system which are not manageable without a substantial intervention by the ÜNB and the use of a secured redispatch-potential.
There are no safety reserves for managing additional critical or unexpected situations.”

On page 98 the report re-emphasizes:

In critical situations a substantial threat to system security is to be feared.”

The report’s summary adds: “Secure management of the expected critical situations requires comprehensive measures.”

This all means that on a cold winter day, Germany’s power grid could very well collapse, leaving citizens in the cold and dark for hours or even days. Parts of the report have been blacked out, which is hardly reassuring to the reader.

So why has Germany’s power grid, once one of the world’s most stable, become so vulnerable? An editorial piece at the Financial Times sums it up nicely: “Merkel’s decision to phase out nuclear power has been a huge mistake.”

The FT piece notes that Germany has added a huge amount of intermittent wind and solar energy. Not only does this energy act to destabilize the power grid, it also costs German citizens and the economy a bundle. What a bargain: poor quality at high cost! The FT writes that the Energiewende is “designed to make the economy predominantly dependent on renewable sources such as wind and solar power“, and adds that these are “burdens on households and businesses“, something that “Germany can ill afford”.

What’s worse for clean-energy-minded Germans is that the elimination of nuclear energy has led to an increase in coal burning. In the end, Germany’s power system is now dirtier and more unstable than ever, and costs consumers far more. Does that sound like a great deal? Sounds to me like monumental mismanagement.

Those of us living in Germany may want to consider installing a wood-burning stove in the weeks ahead as winter quickly approaches.

Hat-tip: 2 readers

Undeniable Mood Change With Regards To Quality Of Modelling Grips Climate Science…Trust Gone!

Mood change in climate modeling: Trust in the scientific community is disappearing
By Sebastian Lüning and Fritz Vahrenholt
(German text translated/edited by P Gosselin)

In the last few days we wrote two posts on the shocking deficits seen in the current climate models (see here and here). In our last part today we will look at how scientists estimate the modeling situation and look to see if there are new ideas to solve the problems.

In August 2014 a lead author of the 5th IPCC climate report, Richard Betts, publicly commented in a surprising manner. Betts directs the climate impact department of the UK Met Office, and at his website he describes himself as a climate modeling expert. In a comment at Bishop Hill, Betts wrote:

Bish, as always I am slightly bemused over why you think GCMs are so central to climate policy. Everyone* agrees that the greenhouse effect is real, and that CO2 is a greenhouse gas. Everyone* agrees that CO2 rise is anthropogenic. Everyone** agrees that we can’t predict the long-term response of the climate to ongoing CO2 rise with great accuracy. It could be large, it could be small. We don’t know. The old-style energy balance models got us this far. We can’t be certain of large changes in future, but can’t rule them out either.”

In a footnote Betts added the 2 comments:

*OK so not quite everyone, but everyone who has thought about it to any reasonable extent
**Apart from a few who think that observations of a decade or three of small forcing can be extrapolated to indicate the response to long-term larger forcing with confidence.”

Betts no longer gives climate models a central role in climate policy. There are still too many unknowns, he admits. Quite obviously even IPCC authors are now getting cold feet and can no longer rule out that CO2 may have only a minor impact on the climate.

A month earlier in July 2014 in the Wall Street Journal climate modeler Robert Caprara conceded that a variety of freely selectable parameters exist in climate models, which allow the desired result to be “modeled in”. Caprara writes:

My first job was as a consultant to the Environmental Protection Agency. I was hired to build a model to assess the impact of its Construction Grants Program, a nationwide effort in the 1970s and 1980s to upgrade sewer-treatment plants. […] When I presented the results to the EPA official in charge, he said that I should go back and “sharpen my pencil.” I did. I reviewed assumptions, tweaked coefficients and recalibrated data. But when I reran everything the numbers didn’t change much. At our next meeting he told me to run the numbers again. After three iterations I finally blurted out, “What number are you looking for?” He didn’t miss a beat: He told me that he needed to show $2 billion of benefits to get the program renewed. I finally turned enough knobs to get the answer he wanted, and everyone was happy.”

In the climate debate Caprara recommends having an open discussion and listening to the arguments of the other side instead of cursing the other side in an attempt to disqualify them:

So here is my advice: Those who are convinced that humans are drastically changing the climate for the worse and those who aren’t should accept and welcome a vibrant, robust back-and-forth. Let each side make its best case and trust that the truth will emerge. Those who do believe that humans are driving climate change retort that the science is “settled” and those who don’t agree are “deniers” and “flat-earthers.” Even the president mocks anyone who disagrees. But I have been doing this for a long time, and the one thing I have learned is how hard it is to convince people with a computer model.”

Already in an October 2012 paper in Nature Climate Change, a team of scientists led by Clara Deser admitted that strong natural climate variability had been underestimated and poorly accounted for by the climate models, and that the models therefore could not fulfill the high expectations of political decision-makers. The paper’s abstract states:

Communication of the role of natural variability in future North American climate
As climate models improve, decision-makers’ expectations for accurate climate predictions are growing. Natural climate variability, however, poses inherent limits to climate predictability and the related goal of adaptation guidance in many places, as illustrated here for North America. Other locations with low natural variability show a more predictable future in which anthropogenic forcing can be more readily identified, even on small scales. We call for a more focused dialogue between scientists, policymakers and the public to improve communication and avoid raising expectations for accurate regional predictions everywhere.”

Well-known climate scientist Judith Curry also has little trust in climate modeling. In October 2013 she complained on her blog that historical climate studies were being neglected in favor of climate models. Huge sums had been invested in the models without a correct result. The falsely claimed IPCC consensus set the climate sciences back at least a decade, said Curry:

My point is that ambitious young climate scientists are inadvertently being steered in the direction of analyzing climate model simulations, and particularly projections of future climate change impacts — lots of funding in this area, in addition to high likelihood of publication in a high impact journal, and a guarantee of media attention. And the true meaning of this research in terms of our actual understanding of nature rests on the adequacy and fitness for purpose of these climate models. And why do these scientists think climate models are fit for these purposes? Why, the IPCC has told them so, with very high confidence. The manufactured consensus of the IPCC has arguably set our true understanding of the climate system back at least a decade, in my judgment. The real hard work of fundamental climate dynamics and development and improvement of paleo proxies is being relatively shunned by climate scientists since the rewards (and certainly the funding) are much lower. The amount of time and funding that has been wasted by using climate models for purposes for which they are unfit, may eventually be judged to be colossal.”

A more precise knowledge of paleoclimatology is essential and should take absolute priority over free-style modeling, because historical data provide important calibration and validation checks for climate models. When the formulae are wrong, even the largest supercomputers cannot deliver anything useful.

Also astrophysicist Richard Lindzen of the Massachusetts Institute of Technology (MIT) has no trust in climate models, as he explained at an event at Sandia National Labs, a research and development facility of the US Department of Energy.

The IPCC should finally open itself up to alternative models. In our book “The Neglected Sun” we presented a semi-quantitative approach in which solar and ocean cycles play an important role. The awful accuracy rate of the IPCC models shows that it is time for a change. A serious examination of the ideas of IPCC critics has to be conducted. Here models by Nicola Scafetta and Frank Lemke, which reproduce the temperature curve better than the IPCC forecasts, must be given serious attention. When it comes to ocean cycles, scientists have already given in and have even started to insert them into the models, dramatically improving the reliability of climate prognoses. One example is the approach of DelSole et al. (2013) in a paper in Geophysical Research Letters.


World’s Largest Re-Insurer “Munich Re” $ponsors 2014 “Extreme Weather Congress” In Hamburg!

Today parts of the German mainstream media began reporting on the 9th Extreme Weather Congress in Hamburg, which opened today and is slated to end on Friday. The direct link to the program PDF is here. Of course the focus of the Congress will be on the claimed “increasing frequency of extreme weather events”.

In all, 3000 experts are attending, along with 2500 Hamburg school pupils.

What especially raises eyebrows is the fact that the event is sponsored by the Munich Re reinsurer, the biggest in the world. The Munich-based reinsurer of course stands to profit handsomely from the spreading of extreme weather fear; it makes it a lot easier to jack up premiums (see “Spiegel Online doubts the catastrophe scenarios of the Munich Re“).

Also very murky is the identity of the organizer of the Congress: the “Institute for Weather and Climate Communication” (IWK). At their Die kalte Sonne website here, German skeptics Sebastian Lüning and Fritz Vahrenholt looked into who is behind the mysterious IWK. They write:

The result is sobering: Apparently the institute really does not have its own Internet platform. So we took a look at the legal page. There indeed the “Institute for Weather and Climate Communication GmbH” is given. Listed as the managing directors are Frank Böttcher and Alexander Hübener. Frank Böttcher? Indeed we’ve written about him as well: “Extreme weather ‘expert’ Frank Böttcher does not know about the latest literature: Latest research results on global cyclone activity are damaging his climate-alarmist business“. Without climate fear and extreme weather alarm, the number of visitors for the commercialized Extreme Weather Congress would certainly be limited. Thus it’s little wonder that Böttcher, for promotional reasons, is fervently preaching climatic doomsday to attract visitors into his auditorium.”

Lüning and Vahrenholt also look at a list of persons slated to appear: “Paul Becker of the German Weather Service, a climate alarmist to the bone. Peter Höppe of the Munich Re reinsurer will attend the introductory press conference, thus getting his sponsoring money’s worth.”

No skeptic was invited to present at the Extreme Weather Congress. For the second year in a row, Lüning and Vahrenholt were denied the opportunity to present there.

Vahrenholt and Lüning also ask themselves whether Mojib Latif will be able to muster the courage to disclose some inconvenient things that up to now he only has quietly admitted in the scientific literature. See “Mojib Latif: the proof of an anthropogenic climate contribution is difficult because the natural ocean cycles dominated” and “Mojib Latif in the presentation in USA: CO2 sensitivity was set too high by the IPCC“.

There is a bit of hope: “honest broker” and alarmism-critic Hans von Storch has also been invited, and so maybe he will infuse a little sobriety into the Hamburg climate panic-fest, see “Climate scientist Hans von Storch: Climate models possibly do not take solar activity sufficiently into account” and “Judith Curry prognosticates warming pause until the 2030s: Hans von Storch in such a case demands a vote of no-confidence against CO2“.

Overall, however, in view of the sponsorship by the world’s biggest player in the reinsurance industry, the murkiness surrounding the event’s organizers, and the exclusion of scientists with other views and data, the event has everything to do with serving corporate special interests rather than those of science.


Vienna Is Actually Now Cooling…And Not Warming As Media And Some Scientists Are Claiming

Vienna Climate Waltz More Data
By Ed Caryl

This is my comment on the Vienna Climate Waltz article Pierre just posted. I found the Vienna temperature data from GISS. It is listed under Wien/Hohe Warte, whatever that is.

My map application drops the pin nearly in the center of downtown from the GISS latitude and longitude numbers (which are notoriously inaccurate). The data file is complete from 1880 to the present, which is unusual. There are only two files in the GISS database, GHCN data before and after GISS homogenization. Here is a plot of those two files.


Figure 1 is a plot of Vienna annual temperature data before and after GISS homogenization, with the difference (the green trace). The difference scale is on the right, all scales in °C.

The homogenization has warmed the past and left the last 10 years unadjusted. This is unusual, as GISS usually cools the past and warms the present with their adjustments. The adjustment is probably for urban heat island effects, though they should be cooling the present and leaving the past alone.


Figure 2 is a plot of the last 16 years with trend lines.

There is a tenth of a degree adjustment for homogenization in the years 1998, 1999, and 2000. These adjustments change the cooling trend by more than 50%.
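To see why a 16-year trend is so sensitive to such small changes, here is a minimal sketch with invented numbers (not the actual GISS/Vienna data): an ordinary least-squares trend fitted before and after adding a 0.1°C bump to the first three years of a short series.

```python
# Sketch with invented numbers (NOT the actual GISS/Vienna values): how a
# 0.1 C homogenization bump in the early years of a 16-year record can
# change the fitted trend by more than half.

def ols_slope(years, temps):
    """Ordinary least-squares slope, in degrees C per year."""
    n = len(years)
    mx = sum(years) / n
    my = sum(temps) / n
    num = sum((x - mx) * (y - my) for x, y in zip(years, temps))
    den = sum((x - mx) ** 2 for x in years)
    return num / den

years = list(range(1998, 2014))                     # 16 years, 1998-2013
raw = [0.30 - 0.01 * i for i in range(16)]          # toy anomaly series, -0.1 C/decade
adj = [t + 0.1 if y <= 2000 else t for y, t in zip(years, raw)]  # warm 1998-2000

raw_slope = ols_slope(years, raw)   # -0.010 C/yr, i.e. -0.10 C/decade
adj_slope = ols_slope(years, adj)   # roughly -0.016 C/yr, i.e. -0.16 C/decade
# warming the first three years steepens this short cooling trend by over 50%
```

With these toy values, a 0.1°C shift in just three early years moves the fitted slope from about −0.10 to about −0.16°C per decade — a change of more than 50%, of the same order as described above.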

Curious George requested a 10-year average plot. Here is a centered 9-year average of the annual data instead. (It preserves the time accuracy better.)


Figure 3 is the annual homogenized temperatures with a 9-year centered average.
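For reference, a centered 9-year average of the kind used here can be sketched as follows (values invented, not the GISS series): each smoothed point is the mean of the four years before, the year itself, and the four years after, so the first and last four years of the record get no value.

```python
# Sketch of a centered 9-year running mean (hypothetical values, not the
# actual GISS series). Each smoothed point averages the 4 years before, the
# year itself, and the 4 years after; the record's first and last 4 years
# stay None because their window would run off the ends.

def centered_mean(values, window=9):
    half = window // 2
    out = [None] * len(values)
    for i in range(half, len(values) - half):
        out[i] = sum(values[i - half:i + half + 1]) / window
    return out

temps = [9.5, 9.8, 10.1, 9.7, 10.0, 10.2, 9.9, 10.3, 10.1, 9.8, 10.4]
smooth = centered_mean(temps)
# smooth[4] is the mean of temps[0:9]; smooth[0] through smooth[3] remain None
```

Because the window is centered rather than trailing, each smoothed value lines up with the year it describes, which is the "time accuracy" advantage mentioned above.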


Figure 4 is a plot of seasonal temperatures (homogenized data).

While all seasons have gotten warmer over the last century, winter has the largest variation but not much of a trend. I didn’t add the trends to Figure 4, but the numbers are: spring 1.29°C/century, summer 1.41°C/century, fall 0.73°C/century, and winter 1.14°C/century.


Figure 5 is a plot of the seasonal trends over the last 16 years.

The cooling in the last 16 years is all from the winters getting colder, at -0.382°C per decade.

Viennese Climate Waltz…Austrian Media/State Officials Still Using Faulty Models, Misleading The Public

UPDATE: Ed Caryl provided the following:


Based on GISS data


Even though IPCC climate models and expected climate trends have proven themselves to be completely false and useless, see here and here, parts of the Austrian media and state sector have no qualms using them, and in doing so they are misleading the public.

A recent example is the climate-alarmist Vienna-based Der Standard online daily in a recent piece titled: Climate in Vienna: More heat days, new plants.

The whole premise of the story is that the climate models are right, when in fact we now know they have been universally wrong.

Palm trees in Vienna in a few decades!

In the article written by Christa Minkin and Julia Schilly, it is claimed that palm trees are to be expected in “a few decades in the Viennese forest – thanks to climate change,” citing ecologist Franz Essl of the Austrian Federal Ministry of Environment.

Minkin and Schilly also warn that Vienna is going to be hot in the future, all exacerbated by the urban heat island effect, citing “a new Austrian expert report on climate change“.

In 1910 there were only two heat days – i.e. temperatures over 30°C. In 2000 already 17 were measured.”
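A "heat day" in this sense is simply a calendar day whose maximum temperature exceeds 30°C, so the annual count is a straightforward tally over daily maxima. A minimal sketch, with an invented sample week rather than real station data:

```python
# Counting "heat days" (daily maximum above 30 C) from a list of daily maxima.
# The sample week below is invented, not Vienna station data.

def count_heat_days(daily_max_c, threshold=30.0):
    """Number of days whose maximum temperature strictly exceeds the threshold."""
    return sum(1 for t in daily_max_c if t > threshold)

sample_week = [27.5, 31.2, 30.0, 33.1, 29.8, 30.6, 28.9]
# 31.2, 33.1 and 30.6 exceed the threshold; 30.0 exactly does not count
heat_days = count_heat_days(sample_week)   # -> 3
```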

Der Standard also looks very deeply into the climate crystal ball…all the way to the year 2070 to 2100. Minkin and Schilly write:

For the period of 2070 to 2100 researchers anticipate a rise to more than 35 heat days per year on average. At the same time nights in Vienna will cool down less.”

Moreover, foreign plants will begin their invasion and displace domestic ones, the experts warn.

Slight cooling over the last 16 years

So with all the warnings of more unbearable heat days in the future, one might assume that temperatures in Austria must currently be on the rise. I searched the Internet for the temperature data series for Vienna, but unfortunately I wasn’t successful in finding it. So I contacted the European Institute for Climate and Energy to see if they might be able to help out. They answered promptly by e-mail (slightly paraphrased):

Unfortunately we do not have the more recent data because the Austrian Weather Service does not make them public, only up to 2003. That’s why it’s not possible to show the last 15 years graphically, and so climatologists in Austria can claim whatever they want.”

Fortunately, EIKE was able to provide the recent data for Graz city center. Here we see that, despite the urban location, there has been a slight cooling.

Graz temperature trend

Mean annual temperature for Graz city center over the last 16 years.

The trend in Graz matches the overall trend of a slight cooling over central Europe over the last two decades.

So with the IPCC models having performed so horrendously, and in view of the fact there has been no warming trend in Austria for 16 years, it is truly a mystery how anyone could claim that summer heat days will just keep on rising linearly until the end of the century.

When the models are failures, then the future projections based on them are worthless.