Another New Paper Dismantles The CO2 Greenhouse Effect ‘Thought Experiment’

3 Atmospheric Scientists: Greenhouse Effect

Based On ‘Physically Irrelevant Assumptions’


Yet another new scientific paper has been published that questions the current understanding of the Earth’s globally averaged surface temperature and its relation to the theoretical greenhouse effect.

Perhaps the most fundamental equation in climate science is the “thought experiment” that envisions what the temperature of the Earth would be if it had no atmosphere (or greenhouse gases).

Simplistically, the globally averaged surface temperature is assumed to be 288 K. In the “thought experiment”, an imaginary Earth that has no atmosphere (or greenhouse gases to absorb and re-emit the surface heat) would have a temperature of 255 K. The difference between the real Earth and the imagined Earth with no atmosphere is 33 K, meaning that the Earth would be much colder (and uninhabitable) without the presence of greenhouse gases. Of that 33 K, it is assumed that CO2 concentrations in the range of 200 – 280 ppm (the pre-industrial range for the last 800,000 years) contribute 7.2 K (~20%), while water vapor concentrations (ranging between about 1,000 and 40,000 ppm for the globe) contribute 20.6 K to the 33 K greenhouse effect.
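The 255 K figure in the thought experiment comes from the standard planetary radiation balance, which sets absorbed sunlight equal to emitted thermal radiation. A minimal sketch of that textbook calculation (the solar constant of ~1361 W/m² and albedo of 0.30 are the commonly used values, assumed here for illustration):

```python
# Effective radiation temperature of an airless Earth that keeps Earth's albedo.
# Assumed values: solar constant S ~ 1361 W/m^2, planetary albedo 0.30.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def effective_temperature(solar_constant: float, albedo: float) -> float:
    """Te from the planetary radiation balance: absorbed flux = emitted flux."""
    absorbed = solar_constant * (1.0 - albedo) / 4.0  # averaged over the sphere
    return (absorbed / SIGMA) ** 0.25

te = effective_temperature(1361.0, 0.30)
print(f"Te = {te:.1f} K")                 # ~254.6 K, the familiar "255 K"
print(f"288 K - Te = {288 - te:.1f} K")   # ~33 K, the claimed greenhouse effect
```

This is the whole content of the "thought experiment" equation the post criticizes: a single globally averaged flux, converted to a single temperature.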

Dr. Gavin Schmidt, NASA

“The size of the greenhouse effect is often estimated as being the difference between the actual global surface temperature and the temperature the planet would be without any atmospheric absorption, but with exactly the same planetary albedo, around 33°C. This is more of a “thought experiment” than an observable state, but it is a useful baseline.”

Atmospheric scientists Dr. Gerhard Kramm, Dr. Ralph Dlugi, and Dr. Nicole Mölders have just published a paper in the journal Natural Science that exposes the physical and observational shortcomings of the widely accepted 288 K – 255 K = 33 K greenhouse effect equation. They conclude that this “thought experiment” is “based on physically irrelevant assumptions and its results considerably disagree with observations“.

The scientists offer a new approach to gauging the Earth’s surface temperature(s), and their results are significantly at variance with the 288 K – 255 K = 33 K  “thought experiment”.  For their calculations, they use observational measurements for the moon — which actually does not have an atmosphere — as their “testbed”.   Using moon data would appear to yield more reliable results than an imaginary-world Earth with no atmosphere.

The following is a very abbreviated summary of these scientists’ conclusions about calculating Earth’s mean temperatures.

Kramm et al., 2017

The planetary radiation balance plays a prominent role in quantifying the effect of the terrestrial atmosphere (spuriously called the atmospheric greenhouse effect). Based on this planetary radiation balance, the effective radiation temperature of the Earth in the absence of its atmosphere of Te ≅ 255 K is estimated. This temperature value is subtracted from the globally averaged near-surface temperature of about ⟨Tns⟩ ≅ 288 K resulting in ⟨Tns⟩ − Te ≅ 33 K. This temperature difference commonly serves to quantify the atmospheric effect. The temperature difference is said to be bridged by optically active gaseous components, namely H2O (20.6 K); CO2 (7.2 K); N2O (1.4 K); CH4 (0.8 K); O3 (2.4 K); NH3 + freons + NO2 + CCl4 + O2 + N2 (0.8 K) (e.g. Kondratyev and Moskalenko, 1984).
Since the “thought experiment” of an Earth in the absence of its atmosphere does not allow any rigorous assessment of such results, we considered the Moon as a testbed for the Earth in the absence of its atmosphere.  […] Based on our findings, we may conclude that the effective radiation temperature yields flawed results when used for quantifying the so-called atmospheric greenhouse effect.  The results of our prediction of the slab (or skin) temperature of the Moon exhibit that drastically different temperature distributions are possible even if the global energy budget is identical. These different temperature distributions yield different globally averaged slab temperatures. […] These [“drastically different temperature distributions” using the same global energy budget parameters, described in detail in the paper] values demonstrate that the power law of Stefan and Boltzmann provides inappropriate results when applied to globally averaged skin temperatures.
It is well known from physics that the mean temperature of a system is the mean of the size-weighted temperatures of its sub-systems. Temperature is an intensive quantity. It is not conserved. On the contrary, energy is an extensive quantity. Energies are additive and governed by a conservation law. Thus, one has to conclude that the concept of the effective radiation temperature oversimplifies the physical processes as it ignores the impact of local temperatures on the fluxes in the planetary radiative balance.
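The averaging point quoted above can be illustrated numerically: because emission goes as T⁴, two surfaces radiating exactly the same total energy can have very different mean temperatures. A toy two-region example (the 300 K hot side is an arbitrary choice for illustration, not a value from the paper):

```python
# Two planets with identical globally averaged emitted flux (same energy budget)
# but different temperature distributions, hence different mean temperatures.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

# Planet A: uniform 255 K everywhere.
t_uniform = 255.0
flux = SIGMA * t_uniform**4  # globally averaged emitted flux, ~240 W/m^2

# Planet B: half the surface at 300 K; solve for the cold half so the
# area-weighted emitted flux matches Planet A exactly.
t_hot = 300.0
t_cold = (2 * t_uniform**4 - t_hot**4) ** 0.25

mean_b = (t_hot + t_cold) / 2.0
print(f"cold side: {t_cold:.1f} K")               # ~137 K
print(f"mean of B: {mean_b:.1f} K vs A: {t_uniform} K")  # ~219 K vs 255 K
```

Identical energy budget, yet the mean temperature differs by roughly 36 K, which is larger than the entire claimed 33 K greenhouse effect. This is the sense in which the authors argue the Stefan-Boltzmann law cannot be applied to a globally averaged temperature.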

Instead of focusing on the technicalities of these authors’ Earth-temperature calculations using moon data, it’s important to call attention to the 5-point critique of the 288 K – 255 K = 33 K greenhouse effect equation outlined in the introduction to the Kramm et al. (2017) paper.   The very first criticism listed is, by itself, worth expounding upon in detail.   Here it is:

(1) “Only a planetary radiation budget of the Earth in the absence of an atmosphere is considered, i.e., any heat storage in the oceans (if at all existing in such a case) and land masses is neglected.”

This is crucial. Not only is the heating contribution of the water vapor-and-CO2 greenhouse effect viewed as a “thought experiment” because it uses an imaginary world without an atmosphere as its premise, but the 288 K – 255 K = 33 K greenhouse effect equation also only considers a radiation budget analysis that pertains to atmospheric heating, not ocean heating. This is theoretical negligence, as it is tantamount to claiming that we should measure the temperature of a person’s spit to accurately determine his overall body temperature.

According to the IPCC (citing Levitus et al., 2012), 93% of the Earth’s heat energy resides in the oceans. The atmosphere hosts just 1% of the Earth’s heat energy “trapped” by greenhouse gases. To be physically meaningful, then, the Earth’s energy budget and “mean global temperature” should be calculated by featuring measurements for the thousands-of-meters-deep oceans, and not the atmosphere vs. no-atmosphere conceptualization.
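The scale of that ocean-versus-atmosphere disparity can be sketched with standard textbook figures. The round numbers below (atmospheric mass ~5.1 × 10¹⁸ kg, ocean mass ~1.4 × 10²¹ kg, and typical specific heats) are assumptions for illustration, not values from the sources cited above:

```python
# Back-of-envelope comparison: heat capacity of the oceans vs the atmosphere.
# All figures are assumed round textbook values.
M_ATM = 5.1e18     # mass of the atmosphere, kg
CP_AIR = 1004.0    # specific heat of air at constant pressure, J/(kg K)
M_OCEAN = 1.4e21   # mass of the oceans, kg
CP_WATER = 3990.0  # specific heat of seawater, J/(kg K)

ratio = (M_OCEAN * CP_WATER) / (M_ATM * CP_AIR)
print(f"Ocean heat capacity ~ {ratio:.0f}x the atmosphere's")  # ~1000x
```

With roughly three orders of magnitude more heat capacity in the oceans, small shifts in ocean heat dominate any atmospheric bookkeeping.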

Furthermore, it is essential to consider that the heat flux for the Earth’s climate system nearly always goes from ocean to atmosphere, and not the other way around.   The atmosphere does not warm the oceans; the oceans warm the atmosphere.

Ellsaesser, 1984: “the atmosphere cannot warm until the oceans do”
Murray et al., 2000: “…net surface heat flux is almost always from ocean to atmosphere”
Minnett et al., 2011: “…the heat flux is nearly always from the ocean to the atmosphere”

And because the direction of the heat flux is from ocean to atmosphere, for greenhouse gases like water vapor and CO2 to warm the atmosphere by 33 K, they must necessarily heat the oceans by an equivalent amount first. In other words, for the Earth’s theoretical greenhouse effect to “work”, downwelling longwave infrared radiation (LWIR) from water vapor and CO2 must be fundamental players in heating the Earth’s oceans to depths of thousands of meters.

An unheralded problem with this conceptualization arises:  We have no physical measurements from a real-world scientific experiment that identify how much, if at all, parts per million (0.000001) increases (or decreases) in atmospheric CO2 concentrations heat (or cool) water bodies.

Even the anthropogenic global warming (AGW) advocacy blogs acknowledge that we have no real-world evidence identifying the extent to which heat changes occur in water bodies when CO2 concentrations are varied in volumes of +/-0.000001 above them. We have to use proxy evidence from clouds instead:

“Clearly it is not possible to alter the concentration of greenhouse gases in a controlled experiment at sea to study the response of the skin-layer. Instead we use the natural variations in clouds to modulate the incident infrared radiation at the sea surface.”

“Obviously, it’s not possible to manipulate the concentration of CO2 in the air to carry out real world experiments, but natural changes in cloud cover provide an opportunity to test the principle [that CO2 heats water].”

And the problem with using clouds as a proxy for CO2 is that even very small (1%) cloud cover variations can quite easily overwhelm and supersede the greenhouse effect associated with changes in CO2 concentrations due to the magnitude and dominance of cloud LWIR forcing.

Ramanathan et al. (1989): “The greenhouse effect of clouds may be larger than that resulting from a hundredfold increase in the CO2 concentration of the atmosphere.”

“Of course the range of net infrared forcing caused by changing cloud conditions (~100 W/m2) is much greater than that caused by increasing levels of greenhouse gases (e.g. doubling pre-industrial CO2 levels will increase the net forcing by ~4 W/m2)”
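The ~4 W/m² figure for a CO2 doubling quoted above follows from the widely used logarithmic approximation ΔF = 5.35 ln(C/C0) W/m² (Myhre et al., 1998). A quick check of that number, and of how it compares to the ~100 W/m² cloud range:

```python
import math

# Radiative forcing from a CO2 change, via the standard logarithmic
# approximation dF = 5.35 * ln(C/C0) W/m^2 (Myhre et al., 1998).
def co2_forcing(c_ppm: float, c0_ppm: float = 280.0) -> float:
    return 5.35 * math.log(c_ppm / c0_ppm)

doubling = co2_forcing(560.0)   # doubling pre-industrial 280 ppm -> ~3.7 W/m^2
cloud_range = 100.0             # net infrared forcing range of clouds, W/m^2

print(f"CO2 doubling: {doubling:.2f} W/m^2")
print(f"Cloud forcing range is ~{cloud_range / doubling:.0f}x larger")
```

So even a 1% swing within the cloud forcing range (~1 W/m²) is a sizable fraction of the entire forcing attributed to a full CO2 doubling, which is the post's point about clouds overwhelming the CO2 signal in any proxy experiment.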

Using clouds as a proxy for CO2 when assessing how CO2 concentration changes affect water temperatures is therefore not an apples-to-apples comparison of radiative significance, and thus any experimental results obtained using clouds cannot be generalized or assumed to simulate the heating effects of CO2 when varied over water bodies.

So we are left with an equation (288 K – 255 K = 33 K) that (a) is based upon a “thought experiment” using an imaginary world without an atmosphere; (b) claims to measure Earth’s temperatures, but doesn’t consider the temperatures of the Earth’s oceans as its primary parameter; and (c) assumes ppm changes in CO2 concentrations measurably heat or cool water bodies even though no physical measurements from a real-world scientific experiment exist to support such a claim.

And this is just point (1) in the Kramm et al. (2017) critique of the 288 K – 255 K = 33 K greenhouse effect equation.   Four other criticisms of the “inadequate” equation are also listed below.

As these three atmospheric scientists conclude, the 288 K – 255 K = 33 K equation underlying the theoretical greenhouse effect “lacks adequate physical meaning as do any contributions from optically active gaseous components calculated thereby“. 

Kramm et al. (2017) critical analysis of the 288 K  – 255 K = 33 K greenhouse effect “thought experiment” (here referred to as Equation 1.4):

Kramm et al., 2017

(1) Only a planetary radiation budget of the Earth in the absence of an atmosphere is considered, i.e., any heat storage in the oceans (if at all existing in such a case) and land masses is neglected.
(2) The assumption of a uniform surface temperature for the entire globe is rather inadequate. As shown by Kramm and Dlugi (2010), this assumption is required by the application of the power law of Stefan (1879) and Boltzmann (1884) because this power law is determined by (a) integrating Planck (1901) blackbody radiation law, for instance, over all wavelengths ranging from zero to infinity, and (b) integrating the isotropic emission of radiant energy by a small spot of the surface into the adjacent half space (e.g., Liou, 2002, Kramm and Molders, 2009). These physical and mathematical reasons do not justify applying the Stefan-Boltzmann power law to a statistical quantity like Tns [globally averaged near surface temperature]. Even in the real situation of an Earth with atmosphere, (near-)surface temperatures vary notably from the equator to the poles owing to the varying solar insolation at the top of the atmosphere and from daytime to nighttime. Consequently, the assumption of a uniform surface temperature is inadequate. Our Moon, for instance, nearly satisfies the requirements of a planet without atmosphere. It has a non-uniform surface temperature distribution with strong variation from lunar day to lunar night, and from its equator to its poles (e.g., Cremers et al., 1971, Vasavada et al., 2012). Furthermore, ignoring heat storage would yield a Moon surface temperature during lunar night of 0 K (or 2.7 K, the temperature of the space).
(3) The choice of the planetary albedo of αE = 0.30 is rather inadequate. This value is based on satellite observations. Hence, it contains not only the albedo of the Earth’s surface, but also the back scattering of solar radiation by molecules (Rayleigh scattering), cloud and aerosol particles (Lorenz-Mie scattering). Budyko (1977) already stated that in the absence of an atmosphere the planetary albedo cannot be equal to the actual value of αE ≅ 0.33 (at that time [1977], but today αE = 0.30). He assumed that prior to the origin of the atmosphere, the Earth’s albedo was lower and probably differed very little from the Moon’s albedo, which is equal to αM = 0.07 (at that time [1977], but today αM = 0.12). A planetary surface albedo of the Earth of about αE = 0.07 is also suggested by the results of Trenberth et al., 2009. Thus, assuming a planetary albedo of αE = 0.07 and a planetary emissivity of ε = 1 (black body) in Equation (1.4) yields Te ≅ 273.6 K. For αE = 0.12 and ε = 1, one obtains Te ≅ 270 K. Haltiner and Martin (1957) explained the so-called atmospheric greenhouse effect by the difference between the Moon’s surface temperature at radiative equilibrium and the globally averaged near-surface temperature of the Earth. They argued that the mean surface temperature of the Moon must satisfy the condition of radiative equilibrium so that Te ≅ 266 K.
(4) Comparing Te [Earth’s temperature without an atmosphere] with Tns [Earth’s globally averaged near surface temperature] is rather inappropriate because the meaning of these temperatures is quite different. The former is based on an energy-flux budget at the surface even though it is physically inconsistent because of the non-uniform temperature distribution on the globe. Whereas the latter is related to globally averaging near-surface temperature observations made at meteorological stations (supported by satellite observations).
(5) The Moon’s mean disk temperature of about 213 K retrieved at 2.77 cm wavelength by Monstein (2001) is much lower than Te ≅ 270 K, which can be derived with the Moon’s planetary albedo of αM ≅ 0.12. Even though the Moon’s mean disk temperature observed in 1948 by Piddington and Minnett (1949) is about 26 K higher than that of Monstein (2001), it is still 31 K lower than Te ≅ 270 K. Although the Moon is nearly a perfect example of a planet without atmosphere, some authors argued that Equations (1.3) and (1.4) are only valid for fast-rotating planets so that the Moon must be excluded. Other authors, however, applied these equations to Venus, which rotates a factor of four slower than the Moon. Pierrehumbert (2011), for instance, used Equation (1.4) to calculate the temperature of the planetary radiative equilibrium for Venus. With αV = 0.75 and εV = 1, he obtained Te ≅ 231 K. Choosing αV ≅ 0.12 for Venus in the absence of its atmosphere (which is similar to that of the Moon) yields Te ≅ 317 K, and for αV ≅ 0.90 as listed in NASA’s Venus Fact Sheet, Te ≅ 184 K.
(Equation 1.4) is based on physically irrelevant assumptions and its results considerably disagree with observations. Consequently, the difference of ΔTa ≅ 33 K [the alleged planetary temperature difference with the greenhouse effect] lacks adequate physical meaning as do any contributions from optically active gaseous components calculated thereby.
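The albedo sensitivity in point (3) is easy to reproduce by sweeping the same radiation-balance formula over the albedos mentioned in the quote. To match the paper's quoted values I assume the older standard solar constant S ≈ 1367 W/m² (my assumption, not stated in the excerpt):

```python
# Effective radiation temperature vs assumed planetary albedo, reproducing
# the values quoted from Kramm et al. (2017), point (3).
# Assumption: S = 1367 W/m^2 (older standard solar constant).
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1367.0

def effective_temperature(albedo: float) -> float:
    return (S * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

for albedo in (0.30, 0.12, 0.07):
    print(f"albedo {albedo:.2f}: Te = {effective_temperature(albedo):.1f} K")
# albedo 0.30 -> ~255 K; 0.12 -> ~270 K; 0.07 -> ~273.6 K
```

Simply swapping the satellite-era albedo of 0.30 for a Moon-like surface albedo of 0.07, as the quote suggests an airless Earth would have, shrinks the claimed 33 K difference to about 14 K.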

Germany Shifts To The Right – May Mean Significant Slowdown For Country’s “Green Energies”

The German election results are coming in, and one thing is clear: Angela Merkel’s coalition government lost big. The preliminary figures show: (UPDATED Monday)

CDU/CSU center right – 33.0%
SPD socialist – 20.5%
Left – 9.1%
FDP free democrats – 10.7%
Greens – 8.9%
AfD hard right – 12.6%
Other – 5.0%

Here we see that the German center-leftist parties (SPD socialists, Greens and Leftists) have seen their power erode further, pulling in a total of only 38.5%, while the more center-right oriented parties (Angela Merkel’s CDU/CSU, FDP Free Democrats and the hard right wing AfD) pulled in a total of 56.3%.

Four years ago in 2013 the result was far more balanced: 51% for the center/right side and 44.9% for the center/left parties, i.e. only a 6.1% margin.

The big winners are the business-friendly libertarian FDP Free Democrats and the right wing AfD.

The shift to the right means that the brakes are likely going to be put on the Energiewende and on efforts “to rescue the climate”. FDP leader Christian Lindner has been a vocal opponent to onshore wind park approvals in rural areas and forests and has also been critical of the subsidies paid out to green energies.

Though always ready to pay lip service to climate protection, the Paris Treaty and green energies, Angela Merkel also softened her rhetoric against fossil fuels during the campaign. Merkel’s record on the environment speaks far louder than her words do.

Germany has not cut back on greenhouse gas emissions in 7 years.

And even though she says she remains committed to greening up the country’s energy supply, Merkel clearly has shown to be content taking a middle way between green demands and the needs of the industry.

The latest election results will likely have Merkel taking an even more pragmatic, business-friendly approach.

AfD rises

Among the big winners of the election is the right wing, anti-immigration AfD party. During the campaign the AfD was committed to eliminating subsidies to wind and solar energy, and called for more support for fossil fuels and nuclear power.

Wind power has become a substantial issue among German environmentalists, as a number of those opposed to wind-park construction have sided with the AfD.

The AfD is also the only party that has dared to challenge global warming science. The emerging AfD party is a sure sign of growing opposition to Germany’s Climate Religion. Expect tougher times for green projects in the upcoming legislative period.

Tough coalition talks ahead to form new government

Environmental policy in Germany over the next four years will of course depend on the make-up of Merkel’s next government. The question that remains: Which party is Merkel going to form a coalition government with?

A coalition between her party and the business friendly FDP falls short of a majority, so expect Merkel to court the SPD into forming yet another grand coalition government. But this time don’t expect the SPD to play along, as they’ve announced they’re no longer interested.

Coalition with the Greens

So Merkel is left to try to entice the Greens to join the coalition together with the FDP. Merkel would have no qualms working with the climate-alarmist, wide-open border Greens. She’d be absolutely content leaving the Environment Ministry under the control of the Greens and putting the business-friendly FDP in charge of the Economics Ministry and letting the 2 junior parties slug it out while appearing to be above the fray for the next 4 years. That’s how Merkel works.

It’s going to be tough. The Greens have said they will accept being a coalition partner only if the CDU agrees to end coal power by 2030, a condition that hopefully the FDP will refuse.

The CDU/CSU has already excluded any chance of forming a coalition with the AfD due to “extreme right wing tendencies“. Doing so would mean the end of the love-affair the media have with Merkel.


Germany Expected To Shift To The Right In Today’s National Elections, Green Energies To Take Back Seat

Later today I’ll be posting on the results of Germany’s national elections, once they start coming in this evening.

Angela Merkel’s center-right Christian CDU/CSU (Union) is expected to win easily. But what the new government will look like remains totally open. The latest opinion poll shows:

Latest opinion polls show CDU/CSU (Union) in the lead. Source:

Personally I think the result will be a bit different from what the above prognosis shows. Here’s what I’m projecting:

CDU/CSU (Union): 36.4%
SPD (Socialists): 21.7%
FDP (Free Democrats): 10.4%
Greens: 7.3%
Linke (Leftists): 9.1%
AfD (Hard Right): 10.3%

Overall a rather substantial shift to the right (Union, AfD, FDP) is expected as concerns over immigration, cultural transformation, crime and uncertainty over Germany as an industrial base have swept across the working class.

Tougher times for green energies

Such a shift to the right compared to the last election will likely mean tougher times for those pushing environmental, clean energy and climate issues.

Opinion polls, it needs to be pointed out, have shown an unusually large number of voters still undecided, and so there’s plenty of room for surprises.

There’s been quite a bit of grumbling among German voters, and many may opt to express their discontent by secretly voting for the AfD. There’s potential for the party to come close to the 15% mark. Such a result would send shock waves across Europe, and force the established mainstream parties to do quite a bit of rethinking.

More later today!

Rapid ENSO Prediction Reversal: Now Global-Cooling La Niña Is Forecast Ahead!

Making forecasts concerning weather and climate is not an easy task. There is much we really do not understand, though some would like you to think everything is understood and settled.

This is why I get a kick out of people who claim they are able to predict decades into the future, yet have very little data about the oceans and their cycles, which play a huge role in climate, and who can’t even predict the El Niño Southern Oscillation (ENSO).

Nothing demonstrates the woeful lack of certainty regarding forecasting better than the ENSO  forecasts made this year.

Joe Bastardi here at Twitter showed the following 2 charts from the International Research Institute (IRI), where the first one shows the forecast made in April and the second one shows the forecast just made this month:

The forecast made in April 2017 above shows a powerful El Niño in the works, with equatorial Pacific surface temperatures at about 1°C above normal.

The second chart above shows the recent mid-September forecast, which is now completely the opposite. It indicates a La Niña in the pipeline, which means satellite sea surface temperatures will cool early next year and the overall global temperature trend will have stayed flat for some two decades.

The following chart from Tropical Tidbits here presents what sea surface temperatures are at the moment:

Note the relatively frigid equatorial Pacific sea surface temperatures. According to the April forecast (first chart above) by the IRI, we were supposed to be in warm El Niño territory by now, some 1°C warmer.

Also note how much the recent hurricanes have cooled parts of the Atlantic and Caribbean. Still, there’s much contrasting warmth south of the Gulf of Mexico and, as meteorologist Joe Bastardi says, the conditions and patterns continue to be ripe for more hurricane incubation.

Next we see the subsurface sea temperature anomalies for the equatorial Pacific down to 300 meters depth:

Chart: NOAA.

Four months ago subsurface temperatures were much warmer, see page 12 here. Today the NOAA says there is an increasing chance (~55%-60%) of La Niña during the Northern Hemisphere fall and winter 2017-18.

So what can we draw from all this? Expect continued global cool down in the months ahead. At the same time, never exclude the possibility of surprises.


German ARD Meteorologist: “Can’t Blame Climate Change” For This Year’s Hurricanes… “Many Factors”

Last Tuesday morning German flagship ARD public television meteorologist Donald Bäcker surprised some climate-realist viewers here with a very level-headed look at the factors behind hurricane development.

I use the word “surprise” here because the massive German public media system is generally staffed by devout warmists and vigilant gatekeepers against skeptic views. Open discussion here means discussion only among adherents and the like-minded. Anyone with a dissident view is usually branded and excluded.

Before telling viewers the day’s forecast for Germany, Bäcker gave some background information on the current hurricane situation during the first 42 seconds of the clip.


German ARD public television meteorologist Donald Bäcker tells viewers many factors are behind hurricanes, and not just climate change. Image cropped from ARD moma here.

He begins the segment by reminding us that there is “still a large need regarding research on climate and weather“, adding:

“We can of course make everything really easy and say the powerful hurricanes in the current season are caused by climate change. But you just can’t do that. With this you cannot explain the years 2008 – 2015, as during this period there were practically no strong hurricanes in the Caribbean region. It has to do with a number of factors, among them ocean currents are to blame, and you need triggering factors. These are the so-called easterlies.”

Bäcker then explained how this year all the factors are in place and the conditions for producing hurricanes are “optimal”. A refreshingly non-dogmatic analysis, and so hats off to the ARD in this one particular case.

Politics, or science?

Of course, it didn’t take long for an alarmist rebuttal to be made two days later. And what better person to provide it than a “scientist” from the ultra-alarmist, politically activist Potsdam Institute for Climate Impact Research (PIK): Manfred Stock.

A visibly desperate Stock tells viewers that man is mostly responsible for the terrible hurricanes and insists that the PIK scientists have the data to show it. Stock warns that we are at a crossroads (again) and that unless we change our ways quickly and drastically, we are going to see a catastrophe.

Stock added that hurricanes have become worse since 1980. But here he failed to explain the lack of hurricanes in the Caribbean from 2008 – 2015 mentioned by Bäcker and the overall downward trend of the past 100 years.

When it comes to hurricanes, and a number of other climate and weather issues, the PIK alarmists have a very pronounced habit of forgetting things and misleading the public.

Stefan Rahmstorf’s amnesia (or fraud?)

For example, just days ago Stefan Rahmstorf of the PIK claimed that this year’s spate of hurricanes was “unprecedented” – a statement I had a hard time believing. So I asked hurricane expert Phil Klotzbach at Twitter about this. He replied:

The PIK seems to have a big problem with data selection and processing. Moreover, it’s always either we have to drastically change how we live, and quickly, or we will see planetary disaster. It’s the old playbook used by charlatans again and again throughout the history of civilization.


New Paper: ‘Extremely High’ TSI, El Niño Episodes Since 1970s Exert ‘Robust Control Over Himalayan Glaciers’

Scientists Rebuke Claims Of Human

Control Over Glacier Mass Balance

“Natural climate variability still emerges as the key deciding element governing the Himalayan glacier mass balances.” – Shekhar et al., 2017

Yet another new paper has challenged the IPCC-endorsed conclusion that the Himalayan glaciers are melting rapidly due to anthropogenic climate change, and that these glaciers will very likely “disappear” by 2035.

The claim that the Himalayan glaciers would disappear by 2035 was included in the 4th IPCC report (2007) not for scientific reasons, but to put political pressure on world leaders.

However, a 2014 comprehensive analysis of over 2,000 glaciers in the region indicated that 88% of Himalayan glaciers are stable or advancing, with overall negligible change (0.2%) since 2000.

Prior to this apparent 21st century “pause” in Himalayan glacier melt, a substantial glacier retreat occurred between the 1970s and 1990s for the region.   There were also decadal-scale periods of severe glacier melt during the 1600s and 1700s (Little Ice Age), when atmospheric CO2 levels were significantly lower (~275 ppm) than they are now.  The amplitude of the glacier retreat during the 17th and 18th centuries sometimes even exceeded the melt rates achieved during the last few decades (Shekhar et al., 2017).

According to a new paper published in the Nature journal Scientific Reports entitled “Himalayan Glaciers Experienced Significant Mass Loss During Later Phases Of Little Ice Age“, the high/low temperatures and melt rates achieved during the Little Ice Age were determined by varying magnitudes of solar activity and El Niño episodes.   After the 1970s, Himalayan glacier melt and temperature changes were only “partly” influenced by human activity, but they were primarily driven by solar activity variations (the Modern Grand Maximum) and natural oceanic heat oscillations internal to the Earth’s climate system (El Niño Southern Oscillation [ENSO], North Atlantic Oscillation [NAO], and Atlantic Multidecadal Oscillation [AMO]).

The authors conclude that there is a “robust natural climatic control over the Himalayan glaciers“, even for recent decades.  This conclusion contradicts the IPCC’s contention that Himalayan glacier melt is controlled by human activity.

The Himalayan glaciers will not be “disappearing” by 2035.

Shekhar et al., 2017

Half Of The World’s Non-Polar Glaciers (Himalayas) Were Already Melting During The 1600s, 1700s

[T]he Hindu Kush-Himalaya (HKH) harbors ~50% (by area) of all the glaciers outside of the polar regions. … Our research is the maiden attempt to reconstruct the longest regional scale glacier mass balance records for the Western Himalaya based on tree-ring sampling at an unprecedented scale. Another highlight of our study is that it presents valid evidence of the significant mass loss experienced by the Himalayan glaciers even during the LIA [1500-1850].
[W]e believe that the episodes of significantly negative mass balances … were the result of an enhanced El Niño affecting the ISM [Indian Summer Monsoon] and increasing the temperatures … [and] a more direct relationship between the high TSI and more negative mass balances during the LIA in the years with potentially weaker El Niño.

‘Extremely High’ TSI, El Niño Since 1970s Resulted In ‘Severe Glacial Mass Loss’

In the case of the Himalaya, the […] phase of rising regional temperatures, and the start of the strong solar cycles that in later years (since the 1970s), started showing substantial coupling with strong El Niño episodes.  [M]ass balance periodicities of 9–12 years during ~1970–1990 [are] a representation of the response to a few of the strongest consecutive solar cycles in past 400 years. In fact, we see that ~50% of the years since 1970 experienced an exceptionally high TSI of >1361 W m−2, ~40% of which also underwent warm phases of ENSO.
Although the study acknowledges the contributions of anthropogenic drivers of climate change in the Himalayan region, it also highlights a strong effect from the increased yearly concurrence of extremely high TSI with El Niño in the past five decades, resulting in severe glacial mass loss.

Natural Variability, With TSI And ENSO, NAO, AMO As Drivers, Control Glacier Mass Balance

Although external anthropogenic forcing can partly control the glacial regime in the Himalaya, the natural climate variability still emerges as the key deciding element governing the Himalayan glacier mass balances. Similar to several other studies for the region, our study also identifies ENSO, NAO, and AMO as the primary drivers of the regional mass balance variability. The fact that the past few decades have experienced intensified episodes of NAO, closely correlating with rising temperature, also suggests a robust natural climatic control over the Himalayan glaciers.

Jose And Maria Frustrate Global Warming Ambulance Chasers, Media And Warmunistas

UPDATE 3: But NOAA keeps moving track further out to sea…

UPDATE 2: Joe Bastardi says Maria not hitting US “not a done deal”.

UPDATE 1: NOOOOOO! F?§#! …NOAA updates latest storm track…takes Maria even further out to sea…

A few days ago it looked as if the US coast could be hit by two large hurricanes: New England by Jose, and later the Southeast by Maria. Global warming activists and the haters of our modern industrial society were salivating.

For example on September 17, the Washington Post presented one model with Maria barreling straight into North Carolina.

After all, imagine all the wonderful media headlines proclaiming “unprecedented destruction and hurricane forces“. It would be a wonderful field day. With such destruction, how could Denier in Chief President Donald Trump possibly be able to dispute that man is the cause? The witch hunt for and purge of deniers could begin in earnest.

Wild 1933 hurricane year

Yet, reality shows us that hurricanes have always been just as violent and occurred just as often in the past, if not more often. For example, Ryan Maue here reminds us of the fury of the 1933 season:

That year, Maue tweeted, saw 15 of 20 storms hitting land “with 6 majors and 2 Cat 5’s“. Imagine if that were to happen today. This type of destruction is precisely what the global warming ambulance-chasing media and fake scientists are hoping for today. So it is only natural that some days ago Jose and Maria showed signs those glory days might be returning – possibly the chance of two hurricanes hitting the US at once!

But now the most recent computer models show that Jose is in its death throes, stuck off the coast of New England, crumbling and no longer posing any serious danger. At Twitter the outstanding wxcharts here shows the latest tracks for Maria and Jose:

A storm that protects us?

At the top of the graphic above we see the remnants of Jose. Ironically hurricane Jose, which alarmists had hoped would smash violently into the Northeast, is turning out to be a possible savior in that it could play a key role in deflecting powerful Maria away from land. Just imagine: a global-warming-produced hurricane that protects us!

As the chart above shows, Maria is projected to head out to sea, thus allowing us to be more hopeful. Yet it is still too early to call off the alarms. There’s still some chance that Maria could veer off the model-projected course and make landfall. Readers living on the East Coast must remain vigilant. Thankfully, most computations see the storm tracking out to sea.

Today at his Daily Update, Joe Bastardi cautions us and points out that Maria still has a considerable window to make landfall around North Carolina. Hurricane forecasts beyond 5 days harbour tremendous uncertainty.

NOAA also has Maria headed out to sea with its latest cone:

Maria projected to head out to sea. Source: NOAA.

The profiteers of bad news

The media and climate ambulance chasers will of course deny that they are disappointed by the latest tracks, insisting that people couldn’t be so mean as to wish deadly storms to strike land. But it’s not so. Much of the mainstream media are terrible people who are agenda-driven. They deceive their readers and try to manipulate public perception with fear. They make their livings with bad news. Bad news for them is good news. How often do you ever see them write about good news? How often do we see them present things on their bright side? They’re just nasty people.

But there is some good news out there for the media, climate ambulance chasers and mass destruction fantasists: the hurricane season still has a long way to go, and so they can hope for new hurricanes. It still remains an ideal year for hurricanes.


NOAA Models Project Harsh 2017/18 European Winter…Possibly Coldest This Century

Weather and climate analyst Schneefan here writes that the 2017/18 winter in Europe could be one of the coldest of the last 20 years.

In mid September NOAA’s CFSv2 weather model once again crunched out cold temperatures across Europe for all three winter months (December (left), January (center), February (right)) for the coming 2017/18 winter:

Meteociel/CFS prognosis dated 1 September 2017 for the temperature deviation from the long-term mean at 850 hPa (approx. 1500 m) in Europe for the 2017/18 winter. Source:

Schneefan writes one has to go back to the 1990s to find a negative 2.0°C deviation from the 1961-1990 mean like the one projected for Germany. That deviation translates to almost 3°C when compared to the warmer 1981-2010 mean. That would be awfully cold.
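The baseline arithmetic here is straightforward: the same projected winter temperature yields a larger negative anomaly when measured against a warmer reference period. A minimal sketch with invented numbers (the ~1°C offset between the two reference means is assumed purely for illustration; it is not an official figure):

```python
# All values invented for illustration; the ~1 degC offset between the
# reference periods is an assumption, not an official figure.
mean_1961_1990 = 0.2                       # hypothetical German winter mean, degC
mean_1981_2010 = mean_1961_1990 + 1.0      # assumed warmer reference period

projected_winter = mean_1961_1990 - 2.0    # a -2.0 degC anomaly vs 1961-1990

anomaly_vs_old = projected_winter - mean_1961_1990   # -2.0 degC
anomaly_vs_new = projected_winter - mean_1981_2010   # about -3.0 degC
print(anomaly_vs_old, anomaly_vs_new)
```

The point is simply that an anomaly quoted against the cooler 1961-1990 baseline understates how unusual the same winter would look against the warmer 1981-2010 baseline.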

The following chart shows the winter temperature anomalies for Germany for each year since 1901:

If projections come true, Germany would face one of its coldest winters in the last 50 years. Source:,3260663,3260663#msg-3260663

The latest CFSv2 model run confirms the earlier cold projections that have been calculated since mid June, 2017.

Cooler than normal autumn

Projections for this fall (September, October, November) are also on the cool side. An analysis from 17 September shows that Central Europe will see temperatures that are about 1°C below the 1981-2010 mean. So far the first three weeks have been right on the money.

Schneefan warns that it’s still too early to rely on the latest trend and to bank on it, but adds: “If these cold projections for the 2017/18 winter keep appearing in the next model runs this fall, then the probability increases.”

Schneefan also writes that we should not expect any general warming trend soon after the coming winter, due to the lowest solar activity in 200 years, the cooling La Niña that is beginning to take hold, and the already falling temperatures in the wake of the 2015/16 El Niño.

There are other signs that change is possibly in the works:

After ice mass growth in Greenland for the first time in the current century and a new record cold July temperature (-33°C) set in Greenland, no one should be surprised if the 2017/18 winter turns out to be the coldest in Europe and other parts of the northern hemisphere this century.

And to potentially make matters worse, the Bali volcano Agung is now at warning level “orange”. Its last eruption, in 1963, had a VEI of 5! That is how rapidly the global climate could change, unexpectedly and naturally.

Readers need to note that the projections involve considerable uncertainty, and the winter of course may develop completely differently. Yet many meteorologists had projected earlier this year a severe hurricane season based on oceanic patterns, and that has come true.


New Papers: Seismic Activity Explains 1979-2016 Temperatures, ENSO Events Better Than CO2

Could Earth’s Shifting Plates

Be Driving Modern Climate?

Within the last year, Dr. Arthur Viterito (geography professor) has published multiple scientific papers documenting the significant correlation (r=0.80) between the seismic activity changes in the Earth’s high geothermal flux areas (HGFA) and both El Niño events and global temperatures.

The HGFA/global temperature correlation has been found to be stronger than the correlation for CO2 concentration changes (r=0.74) for recent decades (1979-2016).

Other recent research has provided further support for the significant influence of seismic activity (i.e., there is a very high correlation [r=0.935] between geothermal flux and North Magnetic Dip Pole movement).

These robust and well-documented seismic activity associations have led Dr. Viterito to call for a reconsideration of the paradigm that says variations in atmospheric CO2 concentrations drive changes in global temperatures.

Viterito, 2016

Viterito, 2017

The Correlation of Seismic Activity and Recent Global Warming (CSARGW) demonstrated that increasing seismic activity in the globe’s high geothermal flux areas (HGFA) is strongly correlated with global temperatures (r=0.785) from 1979-2015. The mechanism driving this correlation is amply documented and well understood by oceanographers and seismologists.

Namely, increased seismic activity in the HGFA (i.e., the mid-ocean’s spreading zones) serves as a proxy indicator of higher geothermal flux in these regions. The HGFA include the Mid-Atlantic Ridge, the East Pacific Rise, the West Chile Rise, the Ridges of the Indian Ocean, and the Ridges of the Antarctic/Southern Ocean. This additional mid-ocean heating causes an acceleration of oceanic overturning and thermobaric convection, resulting in higher ocean temperatures and greater heat transport into the Arctic. This manifests itself as an anomaly known as the “Arctic Amplification,” where the Arctic warms to a much greater degree than the rest of the globe.

As illustrated in CSARGW, jumps in HGFA seismic activity can amplify an El Niño event, a phenomenon referred to as a SIENA or a Seismically Induced El Niño Amplification. Accurately predicting two of these amplified El Niños (i.e., the 2015/2016 event plus the 1997/1998 episode) is an important outcome of the HGFA seismicity/temperature relationship.

Applying the same methodology employed in CSARGW, an updated analysis through 2016 adds new knowledge of this important relationship while strengthening support for that study’s conclusions. The correlation between HGFA seismic frequency and global temperatures moved higher with the addition of the 2016 data: the revised correlation now reads 0.814, up from 0.785 for the analysis through 2015. This yields a coefficient of determination of 0.662, indicating that HGFA [high geothermal flux area] seismicity accounts for roughly two-thirds of the variation in global temperatures since 1979.
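As a sanity check on the quoted numbers: the coefficient of determination is simply the square of the correlation coefficient. A minimal sketch using only the values reported above:

```python
# Correlation values quoted in the text (the updated Viterito analysis)
r_through_2016 = 0.814
r_through_2015 = 0.785

# Coefficient of determination = r squared: the share of temperature
# variance statistically associated with HGFA seismicity.
r2_2016 = r_through_2016 ** 2   # ~0.66, i.e. roughly two-thirds
r2_2015 = r_through_2015 ** 2   # ~0.62

print(round(r2_2016, 2), round(r2_2015, 2))
```

Note that r² measures statistical association only; it says nothing by itself about the direction of causation, which is the separate argument the papers make.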

Viterito, 2017

[T]he idea that increased flux of oceanic geothermal heat (as indicated by increased seismic activity in these areas) can significantly alter temperature counters the hypothesis that increasing carbon dioxide has been the primary driver of recent global temperature change. Despite the general “non-acceptance” of this hypothesis, a recent study by Williams (2016) links a seemingly unrelated geophysical phenomenon to mid-ocean seismicity; thus a new paradigm may be emerging from this important association. Specifically, Williams shows that the speed at which the North Magnetic Dip Pole (NMDP) moves is highly correlated (r=0.935) with mid-ocean seismic activity.

More importantly, multiple regression analysis corroborates the findings of the previous studies: mid-ocean seismic activity is significantly correlated (p<0.05) with changing temperatures.

However, CO2 concentrations, along with NMDP [North Magnetic Dip Pole] displacements, do not explain a significant percentage of the total variance (p>0.05) when they are included and must be dropped from the analysis. This high degree of multicollinearity is a prominent finding.

However, it is important to note that, despite high correlations, CO2 increases cannot be causing an intensification of mid-ocean seismic activity nor can higher CO2 concentrations be driving the acceleration of the NMDP [North Magnetic Dip Pole]. There is simply no plausible mechanism that can be invoked here.

Clearly, there is far more in play than is currently accounted for in our understanding of earth’s climate. The shifting of plates, along with the concurrent shifts of earth’s NMDP, should spur the geophysical community to create a new and enduring paradigm that links these phenomena to changing global temperatures.

Climate Change, Natural Disasters Disappear From Ranking Of Germans’ Top 3 Fears!

Fear of natural catastrophes among German citizens has dwindled over the past 10 years. Back in 2007, just on the heels of Al Gore’s An Inconvenient Truth – the peak of the global warming scare – natural catastrophes took second place among the ranking of top fears for Germans.

Today, ten years later in 2017, natural catastrophes are not even in the top three, according to German ARD public television, which cited a study by R+V Insurances:


Chart source: R + V Versicherungen, via ARD German television screenshot.

Ranking in the top three are 1) terrorism, 2) political extremism and 3) tensions concerning the influx of foreigners.

South German SWR public broadcasting here cites the R + V study and writes that this year 56% of those surveyed said they feared “natural catastrophes”, putting that factor in fourth place in the ranking. A variety of other social and economic issues followed closely behind.

The ranking of fears is strongly linked to what issue happens to be dominating the news cycle at the time surveys are conducted. Coverage of climate and natural disasters comes and goes in cycles, and at times disappears for weeks or months from the German media radar.

Recently the Atlantic hurricane season was the top story in the news, and so a survey done last week would have reflected a greater fear of natural disasters. But once the hurricane season dies out later this fall and the La Nina-induced fall in global temperatures sets in, the media of course will move on to other bad news to feature.

Made-up news: Ice-free Arctic!

This year there has not been any record ice melt, and global temperatures have in fact begun to ease off. There really isn’t much left out there to report. And when facts aren’t there, some even make them up. For example, just before its prime time 8 p.m. news, meteorologist Karsten Schwanke of flagship ARD German television announced on 15 September 2017 that the northeast and northwest passages of the Arctic were ice-free, which is a flat-out lie:

No ice-free Arctic passages this year, according to the National Snow and Ice Data Center (NSIDC). See details here.

Little wonder that most Germans harbour irrational fears of climate change.

Fear a function of media coverage, not observation

German fear of climate and natural disasters is not really related to real-world observations made by citizens, but largely depends on media coverage. When the media cover it, or make it up, people get afraid. When they don’t cover it, the fear disappears.

Obviously there’s a risk involved in the media featuring climate and natural disasters constantly, namely that people would simply tune it out. So the German media instead focus only on the major natural disasters, trying not to overdo it but to keep it at a level that keeps the fear alive.

Keeping fear at high levels is a very tough and challenging job, especially when reports of growing doom don’t match real life observations, or when the reports are organized propaganda.


Analysis By German Scientists Concerning Hurricane Causes: More Propaganda Than Science

In the wake of Category 5 hurricanes Irma and Harvey, Dr. Sebastian Lüning and Prof. Fritz Vahrenholt presented an analysis of what’s behind the hurricane activity, and of the literature, at their well-known Die kalte Sonne climate website. Their hope is to bring the hurricane discussion back to some rationality.

The German media of course have been covering the story quite intensely, and at times hysterically. The general tenor of most statements: hurricanes are not directly caused by climate change, but their power and destructiveness are increasing due to global warming.

The claim is that warmer oceans are providing the fuel to produce larger hurricanes.

As plausible as the theory may sound, Vahrenholt and Lüning decided to investigate the category 4 and 5 hurricanes and plotted their frequency over the past 100 years:

Fig. 4: The number of category 4 and 5 hurricanes between 1924 and 2016

There were quite a number of hurricanes in the 1930s and 1950s, as well as in the 2000s, but the trend has been sharply downward since 2010, despite the warming, and so considerable doubt swirls around all the claims heard in the media.

No correlation found between man and hurricanes

Vahrenholt and Lüning looked at some of the scientific literature on hurricanes. For example, a 2014 paper by Holland et al. attempted to show man’s impact on hurricanes. Unfortunately, the authors went back only to 1975, and produced the following plot:

Fig. 5: The dependency of the share of Cat.4-5 storms on modelled temperature rises (ACCI) in different oceans, green represents the Atlantic region, red is for the Indian Ocean, and blue for the Pacific. Source: Fig. 5b from Holland (2014).

Even using the data from the carefully selected 1975 to 2011 period does not produce any significant trend, Vahrenholt and Lüning note. Moreover, the two German analysts say Holland relied on too few data points from the Indian Ocean and falsely applied them to claim a “global” trend.

Using the great number of typhoons in the Pacific for the carefully chosen period yielded absolutely no correlation (R² = 0.03). Vahrenholt and Lüning add:

Adding in earlier data also leads to a collapse in the correlation for the Atlantic, as the paper sees a man-made share only starting in 1960. Here a carefully selected period was sought out and found.”

There is decadal variability in hurricane energy, and the literature shows an influence from the AMO. A paper by Kevin J.E. Walsh of the University of Melbourne tells us just how difficult it is to understand hurricane strength:

However, the Atlantic basin is noted for having significant multidecadal variability in TC (tropical cyclone – ed.) activity levels. The basin was characterized by a more active period from the mid-1870s to the late 1890s as well as the mid-1940s to the late 1960s. These periods may have had levels of activity similar to what has been observed since the mid-1990s.”

No evidence of a link

“Using the trends from the 1975…2011 period to infer a powerful anthropogenic impact of the recent powerful Atlantic events in light of what we know, borders on sheer audacity,” Lüning and Vahrenholt write. “Apparently the claimed evident relationship between man-made climate change and strengthening hurricanes is not supported.”

Hurricanes driven by trade winds

Vahrenholt and Lüning cite a new paper to explain what impacts the energy of a hurricane. Mark A. Saunders of Great Britain and the USA diligently examined observations going back to 1878 and discovered a factor that describes the energy of a hurricane very well: the strength of the northern trade winds.

Fig 6: Correlation (r, blue) and its significance (red; p<0.1 is highly significant) of hurricane energy (ACE, solid blue curve) and hurricane number (dotted blue curve) with the trade winds. Source: Fig. 3a from Saunders (2017).

Driven by temperature differences between regions

A second related factor improves the correlation further: the temperature difference between the Main Development Region (MDR), located within 10°–20°N and 85°W–20°W, and the global tropical area within 10°S to 10°N. It has long been known that hurricane development is dampened during El Niño and enhanced during La Niña; thus it has much more to do with natural oceanic variability.

Figure 8 below depicts the difference in sea surface temperature (SST) between the Main Development Region (MDR) and the tropics using ERSSTv5 observations, with 10-year smoothing applied.

Fig. 8: The black line is not the horizontal axis; rather, it is the linear trend! One observes an AMO-like pattern.
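The index plotted in Fig. 8 is conceptually simple: the difference between two regional SST averages, smoothed over a decade. A minimal sketch on invented values (a plain 10-year running mean stands in for the smoothing actually used; the numbers are not ERSSTv5 data):

```python
import random

# Invented annual SST values in degC; not ERSSTv5 data.
random.seed(2)
years = list(range(1950, 2017))
sst_mdr = [27.5 + random.gauss(0, 0.1) for _ in years]      # Main Development Region
sst_tropics = [27.3 + random.gauss(0, 0.1) for _ in years]  # global tropics, 10S-10N

# The index: MDR mean minus tropical mean, year by year...
index = [m - t for m, t in zip(sst_mdr, sst_tropics)]

# ...then a 10-year running mean to bring out the decadal (AMO-like) signal.
smoothed = [sum(index[i:i + 10]) / 10 for i in range(len(index) - 9)]
print(len(years), len(smoothed))   # 67 input years -> 58 smoothed values
```

The smoothing is what makes the decadal swings visible; the raw yearly differences are dominated by noise.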

Lüning and Vahrenholt also cite literature showing that the trade winds will not increase with climate change, but rather point to a decrease in hurricanes.

All science that seriously looks at hurricanes shows no worsening of hurricanes caused by climate change.”

And what about the thermodynamics of greater evaporation leading to more hurricane energy? A report by Friederike Otto of Oxford finds that there are many possible interactions involved in this highly complex weather phenomenon:

Dynamical factors and thermodynamic aspects of climate change can interact in complex ways and there are many examples where the circulation is as important as the thermodynamics.”

Otto also points out that the climate models are far from adequate:

But in practice this requires climate models that are able to reliably simulate the weather systems in question over and over again to assess the likelihood of its occurrence.”

Some media outlets have responsibly pointed out that the problems and destruction caused by hurricanes have much more to do with people living in hurricane vulnerable areas.

Sea level rise not a real factor in hurricane flooding

The claim that rising sea levels (10 cm since 1960) caused by global warming are a major factor in hurricane destruction also falls flat in view of the fact that hurricanes generate waves that are 6 meters high!

Big driver: SST difference between MDR and tropics

In summary, the real driver of hurricanes is the SST difference between the MDR region and the global tropics. The following graph tells us why current hurricane activity is so high.

Fig. 9: The current sea surface temperature difference between the MDR and the global tropical oceans, source: Tropical Tidbits.

People living in hurricane-vulnerable areas need to hope that the curve soon returns to zero. Here, and in the trade winds, we find the real reasons for the terrible hurricanes. Everything else is propaganda on behalf of a “good cause”.


As La Nina Looms, Warmists Skid Into Panic Mode…Global Warming Pause Set To Surpass Two Decades!

At this point last year global warming alarmists and global socialism politicians were as giddy and as optimistic as ever. Everything was falling into place as it looked as if nothing would prevent them from imposing their green regime. The Pope was on their side, global temperatures had been near record highs (thanks to an El Nino event), and Hillary Clinton would surely go on to become President of the USA.

Warmist agenda now getting torpedoed

With Clinton at the helm, the US would wholeheartedly commit to Paris and to strict decarbonization. Never did the green dream look so promising. But then came the mother of torpedoes, President Donald Trump.

And now there’s yet another torpedo about to slam into the already badly damaged warmunista ship: a rapidly approaching La Nina. In the wake of last year’s El Nino event, global temperatures had already been falling. A La Nina will only cause the globe to cool further. This is surprising because just months ago experts had predicted El Nino conditions to return.

La Nina powers in

The global warming alarmists are in sheer desperation and panic, as made evident by their hysterically shrill reactions to the recent hurricanes. The latest forecast shows a return to La Nina conditions (and a global cool-down).


The above chart shows La Nina conditions expected to persist into spring 2018. This cooling will make itself evident in satellite data with a lag of about 6 months. This means global temperatures will fall even further next year, which means the warming pause will extend beyond 20 years.

Note the intensifying La Nina conditions forecast for the rest of the year in the following NCEP chart:


This oncoming La Nina development led meteorologist Dr. Ryan Maue to comment on Twitter:

Not only is La Nina serving to cool global surface temperatures, but so are the powerful hurricanes. Yesterday at the Weatherbell Daily Update, Joe Bastardi showed the effects of hurricanes Irma and Jose on sea surface temperatures (SST).

Note the band of cool water through the Caribbean and a substantially cooled-down Gulf of Mexico. Just a week ago reports abounded on how the surface waters there had been “bathwater warm”. So quickly can weather change. True, considerable amounts of heat remain at the ocean surfaces.

Frigid winter projected for Europe

The recent winter projection for Europe issued by Meteociel below shows Europe possibly being gripped by a frigid winter. If the prognosis holds, it could be one of the coldest in years:

Meteociel/CFS prognosis from 30 August 2017,  850 hPa temperature deviation from the mean (about 1500m) in Europe for the 2017/18 winter. For Europe very icy conditions are expected (from left to right: December, January, February). Source:

Arctic sea ice rebound

The Arctic has also shown recovery over the past few years. This year’s Arctic sea ice extent for mid September is about a full 1 million sq. km above the record low set 5 years ago.

Overall Arctic sea ice has remained stable for the past 10 years, surprising global warming scientists. Source: National Snow and Ice Data Center (NSIDC).


Flashback 2007: Scientists Reveal They ‘No Longer Understand How Ozone Holes Come Into Being’

Ozone Measurement Error  

‘Shatters’ Established Theory

It was 10 years ago this month that scientists revealed an order-of-magnitude-sized error in molecular chemistry measurement that threatened to severely undermine the commonly accepted explanation for  how ozone depletion occurs.

The iconoclastic discovery fomented “much debate and uncertainty in the ozone research community.”    The mechanism that causes polar ozone destruction had, with one measurement, become more “unknown” than known.

With the stunning new evidence, a leading ozone researcher proclaimed that, “Our understanding of chloride chemistry has really been blown apart.”

But then, in the ensuing months and years after the measurement error had been exposed, there was . . . silence.

Rarely, if ever, was this discovery of a molecular rate change “substantially [ten times] lower than previously thought” brought to the public’s attention again.  Ostensibly because of a lack of appetite for admitting they may be wrong, scientists just seemed to . . . move on.

The 1980s zeitgeist that insisted we humans are the predominant cause of ozone depletion due to our ozone depleting substances emissions has been maintained for more than 3 decades now despite a growing body of contrary evidence that says variations in ozone density are predominantly determined by natural phenomena (meteorology, volcanic eruptions), not human emissions.

The ozone “hole” narrative and the widely-held perception that governmental policies determine how small or large the “hole” gets would appear to be analogous to the current climate debate and its connection to the governmental push to dramatically limit CO2 emissions.

Schiermeier, 2007

As the world marks 20 years since the introduction of the Montreal Protocol to protect the ozone layer, Nature has learned of experimental data that threaten to shatter established theories of ozone chemistry. If the data are right, scientists will have to rethink their understanding of how ozone holes are formed and how that relates to climate change.
Markus Rex, an atmosphere scientist at the Alfred Wegener Institute of Polar and Marine Research in Potsdam, Germany, did a double-take when he saw new data for the break-down rate of a crucial molecule, dichlorine peroxide (Cl2O2). The rate of photolysis (light-activated splitting) of this molecule reported by chemists at NASA’s Jet Propulsion Laboratory in Pasadena, California, was extremely low in the wavelengths available in the stratosphere — almost an order of magnitude lower than the currently accepted rate. “This must have far-reaching consequences,” Rex says. “If the measurements are correct we can basically no longer say we understand how ozone holes come into being.”  What effect the results have on projections of the speed or extent of ozone depletion remains unclear.
The rapid photolysis of Cl2O2 is a key reaction in the chemical model of ozone destruction developed 20 years ago (see graphic). If the rate is substantially [10 times] lower than previously thought, then it would not be possible to create enough aggressive chlorine radicals to explain the observed ozone losses at high latitudes, says Rex. The extent of the discrepancy became apparent only when he incorporated the new photolysis rate into a chemical model of ozone depletion. The result was a shock: at least 60% of ozone destruction at the poles seems to be due to an unknown mechanism, Rex told a meeting of stratosphere researchers in Bremen, Germany, last week.
Other groups have yet to confirm the new photolysis rate, but the conundrum is already causing much debate and uncertainty in the ozone research community. “Our understanding of chloride chemistry has really been blown apart,” says John Crowley, an ozone researcher at the Max Planck Institute of Chemistry in Mainz, Germany.
 “Until recently everything looked like it fitted nicely,” agrees Neil Harris, an atmosphere scientist who heads the European Ozone Research Coordinating Unit at the University of Cambridge, UK. “Now suddenly it’s like a plank has been pulled out of a bridge.”

Ozone ‘Hole’ Size Naturally Determined  

CFCs Ban Effects Not Detectable In Trends

NASA, 2013

NASA scientists have revealed the inner workings of the ozone hole that forms annually over Antarctica and found that declining chlorine in the stratosphere has not yet caused a recovery of the ozone hole. …. [T]wo new studies show that signs of recovery are not yet present, and that temperature and winds are still driving any annual changes in ozone hole size.
The classic metrics create the impression that the ozone hole has improved as a result of the Montreal protocol. In reality, meteorology was responsible for the increased ozone and resulting smaller hole, as ozone-depleting substances that year were still elevated. The study has been submitted to the journal of Atmospheric Chemistry and Physics.
Ozone holes with smaller areas and a larger total amount of ozone are not necessarily evidence of recovery attributable to the expected chlorine decline,” said Susan Strahan of NASA’s Goddard Space Flight Center in Greenbelt, Md.
We are still in the period where small changes in chlorine do not affect the area of the ozone hole, which is why it’s too soon to say the ozone hole is recovering,” Strahan said. “We’re going into a period of large variability and there will be bumps in the road before we can identify a clear recovery.”

Barnes et al., 2016

Trends in trace atmospheric constituents can be driven by trends in their (precursor) emissions but also by trends in meteorology. Here, we use ground-level ozone as an example to highlight the extent to which unforced, low-frequency climate variability can drive multi-decadal trends. … Ozone trends are found to respond mostly to changes in emissions of ozone precursors and unforced climate variability, with a comparatively small impact from anthropogenic climate change. Thus, attempts to attribute observed trends to regional emissions changes require consideration of internal climate variability, particularly for short record lengths and small forced trends.

Pozzoli et al., 2012

The changes in meteorology (not including stratospheric variations) and natural emissions account for 75 % of the total variability of global average surface O3 concentrations.
Regionally, annual mean surface O3 concentrations increased by 1.3 and 1.6 ppbv over Europe and North America, respectively, despite the large anthropogenic emission reductions between 1980 and 2005.

Hossaini et al., 2015

Scientists report that chemicals that are not controlled by a United Nations treaty designed to protect the Ozone Layer are contributing to ozone depletion.
In the new study, published today in Nature Geoscience, the scientists also report the atmospheric abundance of one of these ‘very short-lived substances’ (VSLS) is growing rapidly.

The Ozone ‘Hole’ Reached Record Levels In Oct., 2015

     Ivy et al., 2017

Recent research has demonstrated that the concentrations of anthropogenic halocarbons have decreased in response to the worldwide phaseout of ozone depleting substances. Yet, in 2015 the Antarctic ozone hole reached a historical record daily average size in October.
Model simulations with specified dynamics and temperatures based on a reanalysis suggested that the record size was likely due to the eruption of Calbuco, but did not allow for fully-coupled dynamical or thermal feedbacks. We present simulations of the impact of the 2015 Calbuco eruption on the stratosphere using the Whole Atmosphere Community Climate Model with interactive dynamics and temperatures. Comparisons of the interactive and specified dynamics simulations indicate that chemical ozone depletion due to volcanic aerosols played a key role in establishing the record-sized ozone hole of October 2015. The analysis of an ensemble of interactive simulations with and without volcanic aerosols suggests that the forced response to the eruption of Calbuco was an increase in the size of the ozone hole by 4.5 million km2.

Today: The Assumption That Humans Determine Ozone

Destruction Persists Despite Growing Contrary Evidence

National Geographic

Chlorofluorocarbons (CFCs), chemicals found mainly in spray aerosols heavily used by industrialized nations for much of the past 50 years, are the primary culprits in ozone layer breakdown.

Science News

In a rare bright spot for global environmental news, atmospheric scientists reported in 2016 that the ozone hole that forms annually over Antarctica is beginning to heal. Their data nail the case that the Montreal Protocol, the international treaty drawn up in 1987 to limit the use of ozone-destroying chemicals, is working.
[P]ublic engagement was key to solving the ozone problem, with people coming together to identify an issue that threatened society and develop new technologies to fix it. In that respect, the most successful environmental treaty in history holds lessons for dealing with a much bigger threat, she says — climate change.

The Skeptics’ Witch-Hunt Is On… Pope Francis Unleashes Inquisition 2.0


“It is the duty of every Catholic to persecute heretics.”

– Pope Gregory IX, 1170-1241, organizer of the Inquisition

They that approve a private opinion, call it opinion; but they that mislike it, heresy: and yet heresy signifies no more than private opinion.

– Thomas Hobbes (1588-1679), Leviathan

Now fast forward almost 800 years since Pope Gregory IX’s proclamation on heretics. Guided by global warming dogmatists such as Hans Joachim Schellnhuber of the ultra-alarmist Potsdam Institute for Climate Impact Research, Pope Francis issued a Papal encyclical on the environment and human ecology, Laudato Si’, thus making environmentalism a part of Catholic dogma – one that makes challenging climate science heresy.

Now in the wake of Hurricane Irma, Pope Francis has sharply criticized man-made climate change skeptics, implying they are “stupid”. He said on Monday during an in-flight press conference:

“Those who deny it [climate change] should go to the scientists and ask them. They speak very clearly. I am reminded of a phrase from the Old Testament, I think from the Psalm: ‘Man is stupid, he is stubborn and he does not see.’”

So says His Holiness, whose Church obstinately took 390 years to apologize for its persecution of the “heretic” Galileo. The bad weather, the Pope insists, was in fact brewed by skeptics and bad people.

Today a number of (Democrat) politicians and overzealous alarmists in the USA are hysterically calling for the criminalization of climate science skepticism and skeptic opinion. For those who have read some religious history, this may sound very familiar and recall the ugly Inquisition days of the Catholic Church – a time when “heretics” were rounded up, brutally tortured and murdered by Church higher-ups all over Europe, for 600 years.

Papal pact with population control advocate Paul Ehrlich

For many of the faithful, something is terribly amiss with this Pope. Recently he even suggested that Donald Trump was not really a pro-lifer, thus seemingly turning a blind eye to the US Democratic Party’s staunch support for Planned Parenthood abortion factories and a US health care law that forces faithful Catholics to fund abortions against their will.

Earlier this year, “population control activist Paul Ehrlich spoke at a Vatican conference on how to save the natural world despite an outcry by members of the Catholic faithful,” as reported here. Pope Francis appears to be opening the door to population control.

This is all getting monstrous, hysterical and irrational, and so it’s time for global warming skeptics and the faithful to take a firm and resolved stand against this rogue Pope.

We must never cave in to the Pope’s destructive Inquisition-like dogma and demonization of free thought. We have to remain steadfast, possibly for decades or even centuries. Those who want us to forget the past are those who long for its repeat. In line with his “stupid and stubborn” statement, Pope Francis seems to harbor a detestation for the human race.

I’m done listening to this Pope.


German Analysis: Florida Evacuation With E-Vehicles Would Mean “Mass Death On The Highways”

If Florida’s transportation were based mostly on electric vehicles, as activists demand, it would quickly come to a standstill in times of hurricanes and mass evacuations. Charging stations would be overwhelmed and millions of lives put at risk.

Good thing we have fossil fuel powered vehicles, which can run and be refueled whenever the power is out. Army National Guard load trucks as they prepare to hand out supplies to people in need. E-vehicles would sit idle and leave millions abandoned and at risk. (U.S. Navy photo from FEMA site, by Chief Mass Communication Specialist Ryan J. Courtade/Released)

E-mobility in times of hurricane would be “a nightmare”

Yesterday Michael Limburg of the Germany-based European Institute for Climate and Energy (EIKE) here posted a brainstorming thought exercise, posing the question of what the evacuation of Florida would look like if most of the cars were electric.

And to take it a step further, what would rescue services today be like if they depended on electric vehicles?

EIKE concludes that it would be “a nightmare!”

“Mass death”

EIKE cites a post by IT expert Hadmut Danisch here, who drove the point home, saying the outcome would be “mass death on the highways“. Though the debate rages over the degree to which man may be at fault for hurricanes like Irma, one thing is sure: every September, the Atlantic and the Gulf of Mexico will be at peak hurricane season, with or without man. Powerful hurricanes always have occurred, and always will.

What has emerged, Limburg writes, is this: the better the early warning system and the more mobile the people, the fewer victims you are going to see.

Cars would lose their charge, millions stranded on highway

Imagine if the environmentalists had had their way and had managed to force the US into electric cars – something that is underway now in some countries such as Norway, the UK and soon France. Germany has recently been discussing in earnest a ban on the internal combustion engine by 2030.

And now imagine with Irma approaching if the millions of citizens evacuating populated south Florida had had electric cars instead to make the 400-mile journey to get out of harm’s way. After 100 miles or so these cars would have lost their power, and charging stations quickly would have become overrun with cars waiting to make the one-hour charge-up.

Traffic would have rapidly come to a halt.

These millions of stranded people then would have been sitting ducks waiting to be blown away by nature’s fury.
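The charging-bottleneck argument above can be sketched with a simple queue calculation. The one-hour charge time comes from the text; the arrival rate and number of stalls below are purely illustrative assumptions, not figures from the EIKE post.

```python
# Back-of-envelope: how fast would a queue grow at one highway charging
# site during a mass evacuation? All numbers are illustrative assumptions.

def queue_growth_per_hour(arrivals_per_hour, chargers, charge_hours):
    """Cars added to the queue each hour at one charging site."""
    served_per_hour = chargers / charge_hours  # cars the site can finish per hour
    return arrivals_per_hour - served_per_hour

# Assumed: 300 cars/hour arriving at a site with 20 stalls,
# each needing the post's "one-hour charge-up".
growth = queue_growth_per_hour(arrivals_per_hour=300, chargers=20, charge_hours=1.0)
print(growth)  # 280.0 cars added to the queue every hour
```

Under these assumptions the queue grows by hundreds of cars per hour, which is the standstill scenario the post describes.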

Fossil fuel vehicles never need a rest

With fossil fuels, the car’s range is far greater, fueling time is just minutes and extra canisters of fuel also can be easily brought along. Power outages would not interrupt petrol stations because the gas pumps can be easily powered by portable fossil fuel-powered generators.

But with electric cars it would be a totally different story. If a hurricane hit, power lines would go down, knock out the power grid and thus would make the charging of electric vehicles impossible. Solar panels would fly off buildings as roofs are torn off. Wind parks would automatically turn off because they are not designed to operate in hurricane force winds. Many wind towers would simply buckle in the 250 km/hr wind gusts, meaning the green power supply infrastructure would of course get destroyed. In summary, solar panels and offshore wind parks in the Caribbean and Gulf of Mexico make about as much sense as raising sheep on a wolf-farm.

And after the hurricane passes, electric cars would remain immobile because the power grid would be knocked out. Recharging vehicles would be impossible. Emergency vehicles also would quickly lose their power charge and sit idle. Recovery and clean up efforts would take months, if not years. The toll on human life would be unimaginable.

This is the future of electric mobility.

Today, thanks to fossil fuels, energy is always portable, available and reliable. Fossil fueled vehicles are able to travel great distances, be on site at a moment’s notice, and be refueled quickly. Nuclear and coal power plants stand their ground, and so restoring power is easy.

Floridians have shown that even in the aftermath of the most destructive storms, recovery and a speedy return to business is a matter of days or a few weeks, all thanks to the always dependable and available fossil fuels.


12 New Papers: North Atlantic, Pacific, And Southern Oceans Are Cooling As Glaciers Thicken, Gain Mass

Graph Source Duchez et al., 2016

Contrary to expectations, climate scientists continue to report that large regions of the Earth have not been warming in recent decades.

According to Dieng et al. (2017), for example, the global oceans underwent a slowdown, a pause, or even a slight cooling trend during 2003 to 2013. This undermines expectations from climate models, which presume that the increase in radiative forcing from human CO2 emissions should substantially increase ocean temperatures.

The authors indicate that the recent trends in ocean temperatures “may just reflect a 60-year natural cycle“, the AMO (Atlantic Multidecadal Oscillation), and not follow radiative forcing trends.

Dieng et al., 2017    We investigate the global mean and regional change of sea surface and land surface temperature over 2003–2013, using a large number of different data sets, and compare with changes observed over the past few decades (starting in 1950). … While confirming cooling of eastern tropical Pacific during the last decade as reported in several recent studies, our results show that the reduced rate of change of the 2003–2013 time span is a global phenomenon. GMST short-term trends since 1950 computed over successive 11-year windows with 1-year overlap show important decadal variability that highly correlates with 11-year trends of the Atlantic Multidecadal Oscillation index. The GMST 11-year trend distribution is well fitted by a Gaussian function, confirming an unforced origin related to internal climate variability.

We evaluate the time derivative of full-depth ocean heat content to determine the planetary energy imbalance with different approaches: in situ measurements, ocean reanalysis and global sea level budget.  For 2003–2013, it amounts to 0.5 +/− 0.1 W m−2, 0.68 +/− 0.1 W m−2 and 0.65 +/− 0.1 W m−2, respectively for the three approaches.    Although the uncertainty is quite large because of considerable errors in the climate sensitivity parameter, we find no evidence of decrease in net radiative forcing in the recent years, but rather an increase compared to the previous decades.
We can note that the correlation between GMST [global mean surface temperature] trends and AMO trends is quite high. It amounts 0.88 over the whole time span. At the beginning of the record, the correlation with PDO trends is also high (equal to 0.8) but breaks down after the mid-1980s.  The GMST and AMO trends shown in Figure 6 show a low in the 1960s and high in the 1990s, suggestive of a 60-year oscillation, as reported for the global mean sea level by Chambers et al. (2012). Thus the observed temporal evolution of the GMST [global mean surface temperature] trends may just reflect a 60-year natural cycle driven by the AMO.
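The sliding-trend correlation Dieng et al. describe (least-squares trends over successive 11-year windows, then correlating the GMST trend series with the AMO trend series) can be illustrated with a minimal sketch. The synthetic series below, with an idealized 60-year AMO cycle, are assumptions for demonstration only, not the paper’s actual data.

```python
# Sketch of the sliding-trend method described in Dieng et al. (2017):
# fit a linear trend in each consecutive 11-year window, then correlate
# the GMST trend series with the AMO trend series. Synthetic data only.
import numpy as np

def sliding_trends(series, window=11):
    """Least-squares slope in each consecutive window (1-year step)."""
    t = np.arange(window)
    return np.array([np.polyfit(t, series[i:i + window], 1)[0]
                     for i in range(len(series) - window + 1)])

rng = np.random.default_rng(0)
years = np.arange(1950, 2014)
amo = np.sin(2 * np.pi * (years - 1950) / 60)            # idealized 60-yr cycle
gmst = 0.01 * (years - 1950) + 0.1 * amo \
       + 0.01 * rng.standard_normal(len(years))          # trend + cycle + noise

r = np.corrcoef(sliding_trends(gmst), sliding_trends(amo))[0, 1]
print(round(r, 2))  # high correlation between the two trend series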

Subpolar North Atlantic Cooling Rapidly Since 2005

According to Piecuch et al. (2017) there has been no net warming of the North Atlantic Ocean in the last quarter century.  The warming that occurred in the 10 years from 1994-2004 has been completely negated by an even more pronounced cooling trend since 2005.   The predominant (87%) cause of the warming was determined to be of the same natural (non-anthropogenic) origin as the subsequent cooling: advection, the movement/circulation of heat via internal processes.   In fact, human CO2 emissions are never mentioned as even contributing to the the 1994-2004 warming.

Piecuch et al., 2017    The subpolar North Atlantic (SPNA) is subject to strong decadal variability, with implications for surface climate and its predictability. In 2004–2005, SPNA decadal upper ocean and sea-surface temperature trends reversed from warming during 1994–2004 to cooling over 2005–2015. … Over the last two decades, the SPNA has undergone a pronounced climate shift. Decadal OHC and SST trends reversed sign around 2004–2005, with a strong warming seen during 1994–2004 and marked cooling observed over 2005–2015. These trend reversals were pronounced (> 0.1 °C yr−1 in magnitude) in the northeastern North Atlantic (south and west of Iceland) and in the Labrador Sea. … To identify basic processes controlling SPNA thermal variations, we diagnose the SPNA heat budget using ECCOv4. Changes in the heat content of an oceanic control volume can be caused by convergences and divergences of advective, diffusive, and surface heat fluxes within the control volume.  [Advective heat convergence] explains 87% of the total [ocean heat content] variance, the former [warming] showing similar decadal behavior to the latter [cooling], increasing over 1994–2004, and decreasing over 2005–2015. … These results demonstrate that the recent SPNA decadal trend reversal was mostly owing to advective convergences by ocean circulation … decadal variability during 1993–2015 is in largest part related to advection by horizontal gyres.

Yeager and Robson (2017) also point out that, like it did from the 1960s to 1980s, the North Atlantic “has again been cooling”, a trend which they and others expect to continue.   Sea surface temperatures are no warmer today than they were in the 1950s.

Yeager and Robson, 2017    [W]hile the late twentieth century Atlantic was dominated by NAO-driven THC [thermohaline circulation] variability, other mechanisms may dominate in other time periods. … More recently, the SPNA [sub polar North Atlantic] upper ocean has again been cooling, which is also thought to be related to a slowdown in the THC. A continued near-term cooling of the SPNA has been forecast by a number of prediction systems, with implications for pan-Atlantic climate.

The Southern Ocean Has Been Cooling Since The 1970s, Contrary To Models

Latif et al., 2017    The Southern Ocean featured some remarkable changes during the recent decades. For example, large parts of the Southern Ocean, despite rapidly rising atmospheric greenhouse gas concentrations, depicted a surface cooling since the 1970s, whereas most of the planet has warmed considerably. In contrast, climate models generally simulate Southern Ocean surface warming when driven with observed historical radiative forcing. The mechanisms behind the surface cooling and other prominent changes in the Southern Ocean sector climate during the recent decades, such as expanding sea ice extent, abyssal warming, and CO2 uptake, are still under debate. Observational coverage is sparse, and records are short but rapidly growing, making the Southern Ocean climate system one of the least explored. It is thus difficult to separate current trends from underlying decadal to centennial scale variability.

Turney et al., 2017    Occupying about 14% of the world’s surface, the Southern Ocean plays a fundamental role in ocean and atmosphere circulation, carbon cycling and Antarctic ice-sheet dynamics. … As a result of anomalies in the overlying wind, the surrounding waters are strongly influenced by variations in northward Ekman transport of cold fresh subantarctic surface water and anomalous fluxes of sensible and latent heat at the atmosphere–ocean interface. This has produced a cooling trend since 1979.

Sea Ice Has Been Expanding For The Entire Southern Hemisphere Since The 1970s

Comiso et al., 2017    The Antarctic sea ice extent has been slowly increasing contrary to expected trends due to global warming and results from coupled climate models. After a record high extent in 2012 the extent was even higher in 2014 when the magnitude exceeded 20 × 106 km2 for the first time during the satellite era. … [T]he trend in sea ice cover is strongly influenced by the trend in surface temperature [cooling]. … A case study comparing the record high in 2014 with a relatively low ice extent in 2015 also shows strong sensitivity to changes in surface temperature. The results suggest that the positive trend is a consequence of the spatial variability of global trends in surface temperature and that the ability of current climate models to forecast sea ice trend can be improved through better performance in reproducing observed surface temperatures in the Antarctic region.

The Pacific Ocean Has Also Been Cooling Since The 1970s

Li, 2017    In the Southern Ocean, the increasing trend of the total OHC slowed down and started to decrease from 1980, and it started to increase again after 1995. In the warming context over the whole period [1970-2009], the Pacific was losing heat, especially in the deep water below 1000 m and in the upper layer above 300 m, excluding the surface 20 m layer in which the OHC kept increasing through the time.

Glaciers, Ice Sheets Stable, Even Gaining Mass

Goel et al., 2017    Ice rises are a useful resource to investigate evolution and past climate of the DML coastal region. We investigate Blåskimen Island ice rise, one of the larger isle-type ice rises at the calving front of the intersection of Fimbul and Jelbart Ice Shelves, using geophysical methods. … Using the Input-Output method for a range of parameters and column setups, we conclude that Blåskimen Island has been thickening over the past nine years [2005-2014]. Thickening rates cannot be determined precisely, but ensemble results show that thickening rate averaged over the ice rise varies between 0.07 m a−1 and 0.35 m a−1 [per year]. On longer timescales, we speculate that  the summit of Blåskimen Island has been stable within several kilometers at least in the past 600 years but no longer than several millennia.
Bader et al., 2017    Rather than reflecting major changes in ice flow path over time, the provenance changes are interpreted to indicate relative stability of the East Antarctic ice sheet.
Martín-Español et al., 2017    We investigate the mass balance of East Antarctica for 2003–2013 using a Bayesian statistical framework. … We apportion mass trends to SMB and ice dynamics for the EAIS, based on two different assumptions, different remote sensing data and two RCMs. In the first experiment, the model apportions about a third of the mass trend to ice dynamics, +17 Gt/yr, and two thirds, +40 Gt yr−1 to SMB, resulting in a total mass trend for the EAIS [East Antarctic Ice Sheet] of +57 ± 20 Gt yr−1.

Bolch et al., 2017    Previous geodetic estimates of mass changes in the Karakoram revealed balanced budgets or a possible slight mass gain since ∼ 2000. Indications of longer-term stability exist but only very few mass budget analyses are available before 2000. Here, based on 1973 Hexagon KH-9, ∼ 2009 ASTER and the SRTM DTM, we show that glaciers in the Hunza River basin (central Karakoram) were on average in balance or showed slight insignificant mass loss within the period 1973–2009.

Predictions Of Future Cooling, Ice Expansion

Årthun et al., 2017    Statistical regression models show that a significant part of northern climate variability thus can be skillfully predicted up to a decade in advance based on the state of the ocean. Particularly, we predict that Norwegian air temperature will decrease over the coming years, although staying above the long-term (1981–2010) average. Winter Arctic sea ice extent will remain low but with a general increase towards 2020.
Pittard et al., 2017    We suggest the Lambert-Amery glacial system will remain stable, or gain ice mass and mitigate a portion of potential future sea level rise over the next 500 years, with a range of +3.6 to -117.5 mm GMSL-equivalent.

“Net Increase In Greenland Ice Mass…First Time This Century” Amid Northern Hemisphere Cool-Down

The Arctic and the Northern Hemisphere surprise experts with impressive snow and ice gains, decade-long stability.

Reader David at Facebook brought my attention to the fact that Greenland this year has in fact been increasing in ice mass.

The National Snow and Ice Data Center (NSIDC) was forced to admit:

Overall, however, reduced melting and heavy early springtime snowfall may result in a net increase in Greenland’s ice mass this year for the first time this century.

The 2017 melt season has been less intense than recent years, and is below average melting for the 1981-to-2010 reference period. Surface melting has been low in the southeast, and has been limited to coastal regions at low elevations. “

What follows is the chart provided by the NSIDC:

Greenland’s ice melt this season has been well below normal, the NSIDC reports, and may end up seeing a net increase this year for the first time this century. Chart: NSIDC.

Above average snow extent

Meanwhile the University of Rutgers reports that Northern Hemisphere snow extent in August, 2017, was above normal at 2.915 million square kilometers, some 118,000 square kilometers above the mean.

Surprising Arctic sea ice stability

Arctic sea ice, unquestionably in decline since satellite measurements began in 1979 – which was a peak period for sea ice cover – has shown surprising stability over the past 10 years despite all the claims “the Arctic is melting rapidly“.


Arctic sea ice extent stable since 2007. Source NSIDC.

The plot of Arctic sea ice shows that extent did not even come close to setting a new record, which many alarmist experts had predicted earlier this year. Indeed the following chart shows Arctic sea ice has already begun to increase – some 7 – 10 days earlier than normal:


Chart courtesy of Ice Age Now.

Just as we discussed the situation on Greenland above, the sea ice surrounding the Danish island has been extraordinarily high this year, thus extending a stable ice extent plateau of the past 17 years.

It’s no surprise that the ever more desperate climate alarmists have decided to chase hurricanes this year. The Arctic is not cooperating with the warming, “death spiral” scenarios any longer.


German Energy Expert Shreds Wind Power: “Everyone’s Loses With Wind Energy”!

Late last month a video of a discussion round featuring green energies was put up on Youtube.

The segment that follows below shows Dr. Detlef Ahlborn, President of the Wind Energy protest group Bundesinitiative Vernunftkraft (German Initiative for Sensible Power) telling us, without mincing words, why wind energy has been a flop in Germany.

Though the discussion round took place late 2015, it resonates just as loudly today.

On German MDR public television, the moderator asks Ahlborn just who profits from wind energy. According to Ahlborn:

The only one who profits from all the ones you mentioned is the landowner because he has a contractual right. All the others are losers.”

Ahlborn says that 80% of German wind parks are making losses. In the German state of Hesse, for example, “not a single newly installed wind park has yielded what was promised. These yields are up to 20% below what was forecast. And the biggest losers are all of us. All of us!

The problem, Ahlborn elaborates, is that 25% of the wind energy that gets produced is “waste energy”, energy that cannot be used because there is no demand for it. This waste energy ends up getting dumped onto other foreign markets, so much so that neighboring countries have implemented measures to block it out. Ahlborn then says:

The real scandal is that this power gets sold at negative prices, or below market prices and needs to be disposed of at a fee.”

The discussion round then puts up a graphic showing power demand by the German state of Thuringia (middle curve), the state’s wind power output (lower light curve), and the max. peaks of wind power (highest curve):

Chart cropped from MDR FAKT IST!

Ahlborn blasts this inefficient production of wind energy and the waste power that results, saying that wind park projects produce waste that “is a burden on the consumers, a burden on the economy, and a burden on all society – and with this they are destroying our landscape.”

Costs out of control…huge loss of prosperity

Little wonder that the director of Germany’s top economists, Christoph M. Schmidt, recently named Germany’s Energiewende (transition to renewable energies) as being among the top three programs in need of major reform in the country, saying that “the costs are way out of control” and that it will not succeed without a “huge loss in prosperity“.

Schmidt recently handed the latest annual recommendations over to Chancellor Angela Merkel.

Over the past years, led by strong personalities like Ahlborn, and leading environmentalists, resistance to wind energy in most parts of Germany has grown to formidable levels. Lately protest groups have been increasingly successful at blocking projects. Even the government has even rolled back subsidies.


Bad Luck: Monster Hurricane Irma Could Rack Up $1 Trillion In Damages!

UPDATE 4: See animation here.

UPDATE 3: Dr. Roy Spencer thinks Miami may have averted disaster.

UPDATE 2: Euro model shows Irma eye tracking along Florida west coast. The NAM model showing eye moving along OFF west coast.

UPDATE 1: Levi Cowan of Tropical Tidbits reports at Twitter that the forecast path has shifted westward, something he calls “some good news for the Miami area”. Let’s hope for more positive developments.


All the models agree that Irma will almost certainly hit Florida directly and that it would take a miracle to divert the storm away. It’s going to hit.

A number of factors could have prevented Irma from reaching the proportions and magnitudes that it has grown to, but unfortunately luck has played against Floridians and others in the region.

The steering troughs from the High out in the Atlantic and the Low plunging down across the southeast USA could have directed the storm out to sea, if only they had been positioned just a few dozen miles differently.

Shearing could have been stronger and weakened the storm, or the storm might yet track closer to Cuba, causing it to weaken some. But that is all not to be, and this time the die came up snake eyes. Instead, all factors unfolded in total favor of Irma in almost every way possible. The result: a very powerful hurricane, one we don’t see very often.

Levi Cowan at Tropical Tidbits here explained late Thursday evening the factors driving the storm, and the likely path in the days ahead. It really couldn’t be worse and more unfortunate.

The damage is going to be great. Hurricane conditions could extend up into southern Georgia. High winds and storm surges will be widespread along all coastlines on both sides of the Florida peninsula where many metropolitan areas happen to be sitting.

Map as per 2010 US Census showing cities with a population greater than 150,000, and their respective metro areas. Image by: Comayagua99 at English Wikipedia, CC BY-SA 3.0

Damage per square mile

Florida’s area is some 65,700 square miles and has a population of nearly 19 million, most of whom living in coastal areas. Property and infrastructure are naturally concentrated there and thus the state’s metropolitan areas will see tens of millions of dollars in damage per square mile, with some places possibly seeing damages in the hundreds of millions per square mile. Like Houston did.

Especially the areas along the coast, with their ports, harbors, high-rises, transportation facilities and extensive infrastructure, will see severe damage from high winds and storm surges. Rural areas of course will see less damage per square mile.

If half (conservative) of Florida’s area (32,000 sq mi) gets hit hard by surge and/or wind conditions with an average damage of $20 million per square mile (rough estimate), then we are looking at a potential of more than $600 billion in damage for Florida alone.

Now add the damage already done in the Caribbean, plus what will happen in Georgia and beyond. We are getting close to the astronomical sum of a trillion dollars.

There is still some time to get property out of harm’s way, but it’s just about run out. Residents need to focus on getting the hell out of the way and saving their lives.

Expect hysterical and irrational demands

Expect global warming alarmists to seize on the final damage tally, no matter what it turns out to be. They’ll be hysterical, and will irrationally demand that leaders implement multi-trillion dollar “climate protection” measures mostly aimed at reducing greenhouse gases, especially CO2 emissions from burning fossil fuels.

The problem with these measures, however, is that there is no evidence that they would have any impact on future hurricanes – none! Hurricanes have always occurred, and always will no matter how ecologically pious we may become.

In fact a look at the past shows that hurricane activity trends have been decreasing over the past 140 years while CO2 emissions rose:

Hurricane activity has been falling over the past 140 years. CO2 curve added by NoTricksZone (not necessarily to scale).

If climate and hurricanes are indeed related to CO2, as alarmist and activist scientists insist, then the data are telling us to emit more and not less. Of course the CO2-hurricane-activity-relationship is silly, and is no solution.

Complacency after 12 years of low activity

The best advice here would be to invest the money into better infrastructure and more sensible urban planning. That lesson was learned long ago, but obviously got forgotten along the way.

Twelve years of no major hurricane strikes and decreasing activity likely led to just a little too much complacency.


Past Sea Levels Rose 4-6 Meters Per Century, Shorelines Retreated 40 Meters Per Year…Without CO2 Flux

New Paper: Modern Sea Levels Rising  

20-30 Times Slower Than The Past

Currently, sea levels are “believed to be” rising at a rate of 1.7-1.8 millimeters/year.

This modern rate  –  just 0.17-0.18 of a meter per century has remained relatively unchanged from the overall 20th century average, and there has been no statistically significant acceleration in the sea level rise rate (just 0.0042 mm/yr-²) since 1900.

Wenzel and Schröter, 2014     “Global mean sea level change since 1900 is found to be 1.77 ± 0.38 mm year on average. … [T]he acceleration found for the global mean, +0.0042 ± 0.0092 mm year-², is not significant

In a new paper just published in the journal Climate of the Past, 7 scientists detail the long-term context for the modern rates of sea level rise.  Between 16,500 years ago and 8,200 years ago, the average rate of global sea level rise was 1.2 meters per century (12 mm/yr), which is about 700% faster than the rate achieved during the last 115 years. … Included in that rate average is the “meltwater pulse” epoch around 14,500 years ago, when sea levels rose at rates of 4 meters per century (40 mm/yr).

Cronin et al., 2017     “Rates and patterns of global sea level rise (SLR) following the last glacial maximum (LGM) are known from radiometric ages on coral reefs from Barbados, Tahiti, New Guinea, and the Indian Ocean, as well as sediment records from the Sunda Shelf and elsewhere. … Lambeck et al. (2014) estimate mean global rates during the main deglaciation phase of 16.5 to 8.2 kiloannum (ka) [16,500 to 8,200 years ago] at 12 mm yr−1 [+1.2 meters per century] with more rapid SLR [sea level rise] rates (∼ 40 mm yr−1) [+4 meters per century] during meltwater pulse 1A ∼ 14.5–14.0 ka [14,500 to 14,000 years ago].”

Other scientists recently suggested that hemispheric-scale sea levels rose by 12 to 22 meters in just 340 years 14,500 years ago, which is 3.5 to 6.5 meters per century.  This explosive sea level rise coincided with a Northern Hemisphere-wide warming event of 4 to 5 °C within a span of a few decades, but it did not coincide with a change in CO2 levels.

Ivanovic et al., 2017    “During the Last Glacial Maximum 26–19 thousand years ago (ka), a vast ice sheet stretched over North America (Clark et al., 2009). In subsequent millennia, as climate warmed and this ice sheet decayed, large volumes of meltwater flooded to the oceans (Tarasov and Peltier, 2006; Wickert, 2016). This period, known as the “last deglaciation,” included episodes of abrupt climate change, such as the Bølling warming [~14.7–14.5 ka], when Northern Hemisphere temperatures increased by 4–5°C in just a few decades (Lea et al., 2003; Buizert et al., 2014), coinciding with a 12–22 m sea level rise in less than 340 years [3.5 to 6.5 meters per century] (Meltwater Pulse 1a (MWP1a)) (Deschamps et al., 2012).”
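The per-century figure inserted in brackets above follows directly from the quoted numbers. A quick sketch, using only the 12–22 m rise and the 340-year duration from Ivanovic et al.:

```python
# 12-22 m of sea level rise in 340 years, expressed as meters per century
rise_low_m, rise_high_m, duration_yr = 12.0, 22.0, 340.0

rate_low = rise_low_m / duration_yr * 100.0    # m per century
rate_high = rise_high_m / duration_yr * 100.0  # m per century

print(round(rate_low, 1), round(rate_high, 1))  # 3.5 6.5
```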

J.F. Donoghue (2011) puts the historical magnitude of these “more than 20 times that of today” sea level rise rates into perspective.  Shorelines would necessarily have retreated by as much as 40 meters per year (75 centimeters per week) during those centuries of obscene sea level rise.

Furthermore, Donoghue specifies that during the last 20,000 years, the overall average sea level rise has been 6 mm/yr, which is more than 3 times the rate of the last century.   In other words, modern rates of sea level rise have significantly decelerated compared to the overall long-term trend.

Donoghue, 2011     “For much of the period since the last glacial maximum (LGM), 20,000 years ago, the region has seen rates of sea level rise far in excess of those experienced during the period represented by long-term tide gauges. The regional tide gauge record reveals that sea level has been rising at about 2 mm/year for the past century, while the average rate of rise since the LGM has been 6 mm/year, with some periods of abrupt rise exceeding 40 mm/year [4 meters per century]. … Sea level has at times risen at rates more than 20 times that of today, more than 40 mm/year. At such rates, the regional shorelines would have retreated by as much as 40 m/year, or more than 75 cm/week.”
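Donoghue's comparisons can be rechecked with the numbers quoted above. A minimal sketch (the 1 mm/yr ≈ 1 m/yr shoreline-retreat equivalence is Donoghue's, reflecting the low-gradient shelves he describes):

```python
lgm_avg_mm_yr = 6.0   # average rise rate since the LGM (Donoghue, 2011)
modern_mm_yr = 2.0    # regional tide gauge rate for the past century
peak_mm_yr = 40.0     # abrupt-rise episodes

# The post-LGM average is 3 times the modern regional tide gauge rate
print(lgm_avg_mm_yr / modern_mm_yr)  # 3.0

# A 40 mm/yr rise implies ~40 m/yr of shoreline retreat on these
# low-gradient shelves; per week that is:
retreat_m_yr = 40.0
print(round(retreat_m_yr * 100.0 / 52.0, 1))  # 76.9 cm/week, i.e. "more than 75"
```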

UK geologist Dominic Hodgson and colleagues have determined that sea levels rose at rates of 1.2 to 4.8 meters per century near East Antarctica about 10,000 years ago, when sea levels were 8 meters higher than they are now.

Hodgson et al., 2016     “Rapid early Holocene sea-level rise in Prydz Bay, East Antarctica … The field data show rapid increases in rates of relative sea level rise of 12–48 mm/yr between 10,473 (or 9678) and 9411 cal yr BP in the Vestfold Hills and of 8.8 mm/yr between 8882 and 8563 cal yr BP in the Larsemann Hills. … The geological data imply a regional RSL [relative sea level] high stand of c. 8 m [above present levels], which persisted between 9411 cal yr BP and 7564 cal yr BP, and was followed by a period when deglacial sea-level rise was almost exactly cancelled out by local rebound.”

And Zecchin et al. (2015) have suggested that “episodic” and “rapid” sea level rises could reach 6 meters per century (60 mm/yr) for hundreds of years at a time.   That’s about 35 times the current rate.

Zecchin et al., 2015     “The evidence presented here confirms drowned shorelines documented elsewhere at similar water depths and shows that melt-water pulses have punctuated the post-glacial relative sea-level rise with rates up to 60 mm/yr. [6 meters per century] for a few centuries.”
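The "about 35 times" comparison follows from dividing the Zecchin et al. rate by the modern rate quoted at the top of this section. A quick sketch:

```python
pulse_mm_yr = 60.0   # "episodic" rise rate from Zecchin et al. (2015)
modern_mm_yr = 1.7   # modern global rate quoted above

print(round(pulse_mm_yr / modern_mm_yr))  # 35, i.e. "about 35 times"
```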

In sum, there has been nothing unusual about the modern rates of sea level rise over the last century.

Actually, sea levels are currently rising at less than a third of the average rate (6 mm/yr) of the last 20,000 years (Donoghue, 2011).

The continuance of this long-term deceleration during an era of allegedly significant anthropogenic climate influence strongly suggests that abrupt sea level rise (and fall) occurs independently of the variations in the atmospheric CO2 concentration.