Heartland Institute Now Distributing ‘The Neglected Sun’ …Scientists Say IPCC “Grossly Incorrect”

A reader recently left a comment saying he had been having difficulty getting a copy of “The Neglected Sun”, the best-selling non-alarmist climate science book showing how man-made climate change is nowhere near as serious as the IPCC wants us to believe it is.

Order here now.

Good news! The Die kalte Sonne site here reports that The Neglected Sun, the English version which sold out a few months ago, will once again be printed and available from the Chicago-based powerhouse think tank The Heartland Institute, which has purchased the rights to the book.

It is now available at Amazon here, or at the Heartland Institute online shop for US$ 19.99. The Kindle version is available at Amazon for US$ 11.11. Shipping begins July 1, 2015.

The book was also translated into Polish and has been available since October 2014.

IPCC’s “grossly incorrect radiative forcing values”

According to authors Dr. Sebastian Lüning and Prof. Fritz Vahrenholt, the book is up to date, cites hundreds of peer-reviewed papers, and explains in simple terms why the CO2 climate sensitivity has been totally overblown and how the sun and oceans are the primary climate drivers.

They commented in an e-mail:

Detailed comparison with the palaeoclimatological development demonstrates that the climate change observed over the past 100 years is nothing new, neither qualitatively nor quantitatively. Natural climate variability is much more important than previously thought and solar activity changes and ocean cycles are some of the key drivers. It turns out that the IPCC has made major mistakes in the attribution of the 20th century warming which leads to grossly incorrect radiative forcing values in the IPCC reports.”

The two authors also point to the latest UK Met Office report which shows we may be heading into a new cold phase due to low solar activity.

NASA data are “suspicious”

The two prominent German skeptics are also distrustful of NASA GISS temperature data, claiming the temperature “corrections” are “suspicious” because “they always result in amplification of the warming trend, never the opposite. Artificial cooling of the past and artificial warming of the present-day.”

What to expect from Paris

On what we can expect from Paris later this year, the two co-authors write that there will be some sort of treaty “but likely without a lot of substance and with lots of vagueness and loopholes.”

Also pick up a copy of Climate Change: The Facts:



German Scientists: Solar Cycle 24 On Track To Be 3rd All-Time Weakest …And McCarthy Paper Points To Tame CO2 Climate Sensitivity

The Sun in May 2015, and Atlantic Waves

By Frank Bosse and Fritz Vahrenholt
[Translated, edited by P Gosselin]

Our primary “fusion reactor” remains in a weak phase in its current solar cycle, number 24 since systematic observations began in the year 1749. In May sunspot activity was below normal. The observed sunspot number (SSN) was 58.8. The mean of all previous cycles for the current 78th month into the cycle is SSN=79. Thus May saw 75% of the usual activity.
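A quick back-of-the-envelope check of the activity figures quoted above (both SSN values are from the text; the ratio comes out just under the 75% cited):

```python
# Compare May 2015's observed sunspot number (SSN) with the mean of all
# previous cycles at the same point (month 78) in the cycle.
observed_ssn = 58.8   # May 2015
mean_ssn = 79.0       # mean of all previous cycles at cycle month 78

activity_ratio = observed_ssn / mean_ssn   # roughly three quarters of normal
anomaly = observed_ssn - mean_ssn          # negative: below-normal activity
print(f"{activity_ratio:.0%} of normal activity, anomaly {anomaly:+.1f}")
```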

Figure 1: The current cycle 24 (started in December 2008) is shown in red and is compared to the mean cycle (blue) and to cycle no. 5 (black).

A pronounced lull

Figure 1 shows that current solar cycle 24 has never exceeded the mean (blue) at any point: in all 78 months since it began, SC 24 has remained below normal. This has never been observed for any previous cycle. The low solar activity since December 2008 is unique in its consistency compared with all other cycles since observations began!

Even when activity reached a maximum in October 2011 in the sun’s northern hemisphere, and in February 2014 for the southern hemisphere, it remained just below the mean value. Together with the delayed start of the cycle we now have a record 10 years of quiet solar activity.

Figure 2: The accumulated sunspot anomaly of all cycles up to the 78th solar cycle month.

Figure 2 depicts a comparison of all the cycles with respect to solar activity. So far the current cycle is in 4th place in terms of low activity. But 3rd place is well within reach: SC 7 saw high sunspot values in the last third of its cycle, so the chances are good that the total activity of SC 24 will end up quieter than the cycles of the Dalton Minimum.

Atlantic waves…

…are really high when it’s stormy. In early May, off the coast of Portugal, one of the co-authors of this article came to realize this aboard a 14-meter sailboat. But the Atlantic also created other kinds of waves in the past month. A team of scientists led by Gerard D. McCarthy of the University of Southampton searched for internal North Atlantic variability, see www.nature.com/nature/journal.html. They determined that the Atlantic Multidecadal Oscillation (AMO) not only involves ups and downs in sea surface temperature (SST) in the extratropical Atlantic region, but that these temperature variations lead to changes in sea level (SSH) along the east coast of the USA. The pattern appears as follows:

Figure 3: The “circulation series” shown in blue. In the paper the SSH variation is determined by comparing the sea level south of and north of Cape Hatteras. The AMO is black. Source: Figure 3 of the cited McCarthy publication.

The relatively long time series of tide measurements along the East Coast is thus a proxy for the ocean heat content (OHC) of the North Atlantic. Direct measurement of the OHC since the 1950s entails large uncertainty, but since 2004 it has become much more precise thanks to the submerged ARGO measurement buoys and the RAPID network.

What implications does this study have? First of all, the existence of a natural Atlantic Multidecadal Oscillation is confirmed, and not only as a variation in sea surface temperature (SST), as it was previously defined. It is now clear that the AMO is a large-scale North Atlantic water-mass circulation pattern: an independent, internal natural variability of our climate system, not merely a reflection of global temperature.

Already in January 2013 we pointed to falling North Atlantic ocean heat content (OHC) since 2007. What follows is the data plot:

 Figure 4: The ocean heat content (OHC) of the extratropical North Atlantic since 1979. Source: Climate4you.

The paper and its accompanying press release explain that the current decline in OHC signals a high probability that the North Atlantic will cool for more than 10 years. The AMO’s impact on northern hemisphere temperatures was major in the past, as the following plot shows:

Figure 5: The AMO (green) compared to temperature changes of the Northern Hemisphere (red).

If the AMO exists as an internal variability, as the McCarthy paper tells us, then that could imply that 0.5°C of the warming seen in the northern hemisphere since 1975 was due to the AMO and that the remaining 0.5°C of warming was due to greenhouse gases and other factors, such as varying solar activity.

For estimating the climate sensitivity of greenhouse gases, this has far-reaching implications: up to now we could not completely exclude aerosols as the cause of the cooling between 1945 and 1975, but that now appears increasingly improbable. Indeed it is becoming more evident that the cooling was due to the weakening AMO during that period (see Figure 3).

If aerosols indeed have a smaller cooling effect than previously assumed, then the climate sensitivity with respect to greenhouse gases must be lower. Since 1975 the increase for the northern hemisphere was not 0.26°C per decade, but rather only 0.13°C per decade, which is nearly identical to the southern hemisphere. We have often discussed this 50:50 split here … and once again we are confirmed.
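The trend arithmetic in this paragraph can be sketched as follows (the trend value is the one quoted in the text; the four-decade span is an illustrative assumption):

```python
# Split the northern-hemisphere warming trend since 1975 into an AMO
# half and a greenhouse-gas-and-other-factors half, per the text.
observed_trend = 0.26   # deg C per decade, NH since 1975 (from the text)
amo_share = 0.5         # the 50:50 attribution discussed above

ghg_trend = observed_trend * (1 - amo_share)
print(ghg_trend)        # 0.13 deg C per decade

# Over roughly four decades (1975-2015, an assumption) this adds up to
# about 1 deg C, i.e. ~0.5 deg C from the AMO and ~0.5 deg C from the rest.
total_warming = observed_trend * 4
print(total_warming)
```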

Surprise! Orbiting Carbon Observatory Shows Models Have Little Resemblance To Reality

By Ed Caryl

On 1 July 2014, NASA launched OCO-2, the second attempt at orbiting a global carbon dioxide observatory. In December, the first global map was released.


Compare this map with a frame from a previously released video showing a model of the expected CO2 distribution for roughly the same time of year (1 November 2006).


Note the differences. There is much more CO2 coming from the tropical rainforests than the model predicts, and there is a sink, where CO2 is taken up, over Russia that the model does not have.

Since January, there has been no public disclosure of any further maps from OCO-2 on the OCO-2 website, except for a slide (their figure 5, shown below) from a webinar presentation that took place in February. This covers the period from late November to late December.


Compare this with a corresponding frame from the model video.


The reality measured by OCO-2 hardly resembles the model. The map released by NASA also seems to cut off the CO2 sinks in the North Atlantic and Pacific. This may be explained by the low sun angle in the Northern Hemisphere in December. NASA is surely learning a lot from OCO-2, but the findings may not be “politically correct”. We must await further data releases.

Again! 2nd Baltic Sea Report, Hans Von Storch, Shows Medieval Warm Period 0.5°C Warmer Than Today!

How many times must a hockey stick be broken, before alarmists stop wetting their beds? … The answer my friend, is blowing in the wind.

Second climate status report on the Baltic Sea Region: Medieval Warm Period was Half A Degree Warmer Than Today
By Dr. Sebastian Lüning and Prof. Fritz Vahrenholt
[Translated, edited by P Gosselin]

In mid-May 2015 the second Climate Status Report on the Baltic Sea region was released. It was coordinated by the Helmholtz Center in Geesthacht, Germany. In a press release the institute explained:

The Second Assessment of Climate Change for the Baltic Sea Basin (BACC II), a recently published report, serves as a revision and expansion of the 2008 edition of the BACC book. ‘The current publication for the Baltic Sea area is a regional variant on the global report published by the Intergovernmental Panel on Climate Change (IPCC),’ says Prof. Hans von Storch, Director of the Institute of Coastal Research at the Helmholtz-Zentrum Geesthacht and initiator of the report. The comprehensive scientific survey includes work from 141 scientists from twelve countries. The project team was coordinated by the International Baltic Earth Secretariat at the Helmholtz-Zentrum Geesthacht and consists of meteorologists, hydrologists, oceanographers and biologists.

Warming continues

The current study takes into consideration observed climate changes for approximately the last two hundred years as well as possible changes that might occur by the year 2100. These projections are obtained from computer models. Warming air temperature in the Baltic Sea region has already been verified based on measurements, but the increase is seasonally and regionally different. The most drastic recorded increase in warming to have occurred in the northern Baltic Sea region was 1.5 degrees Celsius between 1871 and 2011 during the spring seasons. This number is well above the global warming estimates of up to one degree Celsius documented in the last IPCC report.”

The folks in Geesthacht indeed forgot to mention a small detail in the press release, as you will soon see. The first two chapters of the report deal mainly with the climate development of the last 12,000 years and the last 1000 years:

Pages 25-49: Climate Change During the Holocene (Past 12,000 Years)
Irena Borzenkova, Eduardo Zorita, Olga Borisova, Laimdota Kalniņa, Dalia Kisielienė… Download PDF (1004KB)

Pages 51-65: The Historical Time Frame (Past 1000 Years)
Tadeusz Niedźwiedź, Rüdiger Glaser, Daniel Hansson, Samuli Helama, Vladimir Klimenko… Download PDF (912KB)

Obviously the Baltic Sea study goes far beyond the claimed 200 years. So out of curiosity, we examined the first two chapters. In the abstract of the 12,000-year chapter we discovered something interesting (emphasis added):

The Holocene climate history showed three stages of natural climate oscillations in the Baltic Sea region: short-term cold episodes related to deglaciation during a stable positive temperature trend (11,000–8000 cal year BP); a warm and stable climate with air temperature 1.0–3.5 °C above modern levels (8000–4500 cal year BP), a decreasing temperature trend; and increased climatic instability (last 5000–4500 years). The climatic variation during the Late-glacial and Holocene is reflected in the changing lake levels and vegetation, and in the formation of a complex hydrographical network that set the stage for the Medieval Warm Period and the Little Ice Age of the past millennium.”

The pre-industrial climate of the Baltic Sea region was anything but stable. According to the study, during the period 8000–4500 years before present it was about 1 to 3.5 degrees Celsius warmer than it is today. This corresponds to the so-called “mid-Holocene climate optimum”, a warm period that is practically unknown to the public and not very well-liked by the media. Suddenly the press release’s subheading “Warming continues” takes on a completely new meaning: it is getting warmer – but nowhere near as warm as it was 8000–4500 years before present.

At the end of the abstract the attention shifts to the Medieval Warm Period, which is covered in the subsequent chapter by Tadeusz Niedźwiedź and colleagues. In the text describing the last 1000 years we find the well-known climate cycle that the IPCC tried to discard: the Medieval Warm Period, the Little Ice Age, the Modern Warm Period. The chapter states:

According to the scientific literature, there are four climatic periods of the past millennium: the Medieval Warm Period (MWP 900-1350), the Transitional Period (TP 1350-1550), the Little Ice Age (LIA 1550-1850), and the Contemporary Warm Period (CW after 1850).”

Just how warm was it during the Medieval Warm Period in the Baltic Sea region? The authors do not shy away from this inconvenient question either (emphasis added):

Recent investigations of Fennoscandia by Ljungqvist (2010) showed that the MWP [Medieval Warm Period] occurred between 800 and 1300. At that time, warm-season (May-September) temperatures exceeded the contemporary warming of the end of twentieth century by about +0.5°C. The start of the warming was noted between the ninth and tenth centuries, and the peak temperature appeared at the beginning of the second half of the twelfth century. In a winter temperature simulation over the Baltic Sea region (Schimanke et al. 2012) during that time anomalies reached their highest value of +0.8°C for the MWP.”

The text above is a clear statement. The Baltic Sea region was 0.5°C warmer 1000 years ago.

Somehow, nobody wanted this important finding mentioned in the press release. How could it have been warmer 1000 years ago than it is today, at a time when the atmospheric CO2 concentration was extraordinarily low?

Björn Stevens’ New Paper In Nature Has Kevin Trenberth Trembling…Lindzen’s Iris Sees Vindication



Change in the climate debate? Climate models start converging towards reality
By Frank Bosse and Prof. Fritz Vahrenholt
[Translated, condensed by P. Gosselin]

On many occasions we have brought up the discrepancies between observed global data and the values of the latest IPCC climate models (CMIP5) – see here and here. The model values simply diverged well above the observations. A chart by Roy Spencer made this clear:

Figure 1: The deviation of model values from satellite observations up to 2013, Source: Roy Spencer.

Recently Thorsten Mauritsen and Björn Stevens, a cloud and aerosol modeling expert at the Max Planck Institute for Meteorology in Hamburg, examined the matter. In a very recent paper appearing in Nature Geoscience (hereinafter MS15) titled “Missing iris effect as a possible cause of muted hydrological change and high climate sensitivity in models“, they took stock of the situation (a somewhat difficult-to-understand press release on this can be read here).

1) The models apparently overreact to the greenhouse gas effect. Recent papers based on observations have also found significantly lower values for the forcing. The models put the long-term climate sensitivity (ECS) after a greenhouse gas increase at 3.3°C of warming on average, while the observations indicate a value of about 2°C.

2) The global hydrological cycle is not correctly accounted for in the models (it is too weak).

There’s also another phenomenon: According to the models, the troposphere in the tropics was supposed to warm up much more than at ground level. But this is not being observed, and we’ve written about this in an earlier post “Houston, we have a problem: We can’t find the hot spot“.

The authors searched for an explanation for these discrepancies: perhaps a cooling effect was missing in the models. They found one in an old hypothesis from Richard Lindzen and his colleagues from 2001: the earth has an “iris”, a negative feedback through which more heat escapes to space when the climate warms than when it is cool. This mechanism, according to Lindzen, acts in the tropics and the subtropics. It is illustrated in the paper as follows:


Figure 2: At higher temperatures there are more thunderstorms over the ocean and the area without high level clouds (dry and clear) expands further and thus allows more heat to radiate off into space (strong OLR) than when temperatures are lower, i.e. when the iris is smaller. Source: Figure 1 from MS15.

This “self-regulation” of the earth’s temperatures proposed by Lindzen of course was rejected by “mainstream” climate science. Chambers et al. wrote in the Journal of Climate 2002:

As a result, the strength of the feedback effect is reduced by a factor of 10 or more. Contrary to the initial Iris hypothesis, most of the definitions tested in this paper result in a small positive feedback.”

Kevin Trenberth and colleagues rejected the iris-theory in the Geophysical Research Letters in 2010:

…and their [Lindzen et al.] use of a limited tropical domain is especially problematic. Moreover their results do not stand up to independent testing.”

The authors of the latest Nature Geoscience article were obviously not impressed by Trenberth’s claim, and made the effort to integrate Lindzen’s iris effect into an advanced model. The result is somewhat surprising.

The Tropical Hot Spot

Figure 3: The temperatures in the atmosphere according to the models, Source: Figure S7 from MS15.

Figure 3 shows the tropical temperature trends between 7 and 14 km altitude in the model without the iris (left) and with the iris (right). The modeled hot spot, pronounced at about 10 km and above, has disappeared, which is very much like what the observations have been telling us:

Figure 4: The hot spot in the models without the iris versus the observations. Source: “Climate4you“. A pressure of 300 hPa corresponds to an altitude of approx. 8 km, 200 hPa to approx. 10 km above sea level. The measured trends are shown in dark red and blue; the modeled trend (model mean without iris) is dotted red.

The impact of the iris effect on ECS

Figure 5: Crossplot of the ECS against the negative iris effect as temperature rises. The red point is the ECS of the model without the iris. The blue points are ECS values for various iris strengths. Source: Figure 3a of MS15.

The iris shifts the ECS of the model used from 2.81°C to 2.21°C, about 22% less sensitivity to warming by greenhouse gases. The discrepancy in the hydrological cycle (basically the increase in precipitation due to warming) could also be resolved by taking the iris into account.
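In a simple zero-dimensional energy-balance framing, ECS = F_2x / λ, where F_2x ≈ 3.7 W/m² is the forcing from a CO2 doubling and λ is the net feedback parameter. The iris result can be restated as a shift in λ; the ECS values are those quoted above, while F_2x is a conventional value not taken from MS15:

```python
# Restate the MS15 iris result as a change in the net feedback parameter
# lambda, using the relation ECS = F_2x / lambda.
F_2X = 3.7                # W/m^2 per CO2 doubling (conventional value, assumed)
ecs_without_iris = 2.81   # deg C (from the text)
ecs_with_iris = 2.21      # deg C (from the text)

lam_without = F_2X / ecs_without_iris
lam_with = F_2X / ecs_with_iris
print(f"lambda without iris: {lam_without:.2f} W/m^2/K")
print(f"lambda with iris:    {lam_with:.2f} W/m^2/K")
# The iris adds a negative (stabilizing) feedback of this size:
print(f"added feedback: {lam_with - lam_without:.2f} W/m^2/K")
```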

Even if we have at times given the impression here that we distrust models, we point out that models can be very useful, precisely in climatology, when they are fed the right information. We cannot experiment with the atmosphere, so we have to rely on computers. It is essential that the models accurately reflect reality and do not produce a fictitious world that supplies catastrophe scenarios.

Climate models are extremely complex, and thus it is all the more critical that their programming be clean and that all the possible physical factors be correctly taken into account. The latest paper shows that modeling has taken a step in the right direction. It shows 22% less climate sensitivity due to the iris component.

Not everyone was happy to hear this news. Kevin Trenberth was quoted as saying (translated from the German):

The paper is poorly written and misleading. It wasn’t necessary to blow the iris horn.”

Trenberth seemed almost infuriated:

The authors even wrote it in the damn title.”

What fear has suddenly gripped the “climate establishment”? Could it be that among policymakers the word is out that the climate catastrophe has been called off?

Björn Stevens’ research results may go down as being the turning point in the history of climate science debate.


Science Under Siege: Max Planck Institute Study Shows Climate Models Severely Overstate Warming

Hamburg-Based Max-Planck-Institute for Meteorology: Aerosols Cool Less Than Previously Thought

By Sebastian Lüning, Fritz Vahrenholt
[Translated, edited by P Gosselin]

In our book “The Neglected Sun” we repeatedly questioned the cooling effect assigned to aerosols in the climate models. Aerosols are tiny dust particles and droplets that scatter sunlight and thus, as a rule, cool the earth. But by how much? In Chapter 5 of our book we wrote:

According to the IPCC, the cooling effect of aerosols offsets about two thirds of the power of CO2. In the IPCC’s view, aerosols reduce the warming generated by all greenhouse gases by 45 percent. But the uncertainty is large – it could be 15 percent, or even 85%, because we have only modest to low level of scientific “understanding of the relationships.”

Today very few are aware that the climate models generate far more warming than has actually been observed over the last 100 years. The IPCC strategy: all the surplus heat is cancelled out by aerosols until the models “fit”. This cooling joker is thus badly needed in order to maintain CO2’s high climate sensitivity.

In March 2015 we saw some progress in the aerosol discussion. One of the authors of the latest IPCC report claimed that the range of uncertainty concerning the effect of aerosols on climate had been greatly reduced thanks to new research findings, and in the meantime there’s been a lot of talk that the cooling potential of aerosols indeed had been significantly exaggerated in the past. The real cooling value is actually at the lower limits of the range assumed up to now by the IPCC.

The most important and boldest claims come from Bjorn Stevens, one of the three directors at the Hamburg-based Max Planck Institute for Meteorology (MPIM). His paper appeared in the Journal of Climate. What follows is the paper’s abstract:

Rethinking the lower bound on aerosol radiative forcing
Based on research showing that in the case of a strong aerosol forcing, this forcing establishes itself early in the historical record, a simple model is constructed to explore the implications of a strongly negative aerosol forcing on the early (pre 1950) part of the instrumental record. This model, which contains terms representing both aerosol-radiation and aerosol-cloud interactions well represents the known time history of aerosol radiative forcing, as well as the effect of the natural state on the strength of aerosol forcing. Model parameters, randomly drawn to represent uncertainty in understanding, demonstrates that a forcing more negative than −1.0 W m−2 is implausible, as it implies that none of the approximately 0.3 K temperature rise between 1850 and 1950 can be attributed to northern-hemispheric forcing. The individual terms of the model are interpreted in light of comprehensive modeling, constraints from observations, and physical understanding, to provide further support for the less negative ( −1.0 W m−2 ) lower bound. These findings suggest that aerosol radiative forcing is less negative and more certain than is commonly believed.

In general one should be careful not to overuse the word “sensational”, but here it is most fitting. Surprisingly, the German media have been completely silent on this. A Google news search reveals that not a single article has been written about the paper. Undesirable news that the media prefer not to make public?

The implications of the paper were immediately recognized within the scientific community. On March 19, 2015, Nic Lewis explained the paper’s far-reaching implications at Steve McIntyre’s Climate Audit and Judith Curry’s Climate Etc.: the climate sensitivity is further constrained, and most likely lies near the lower limit of the IPCC’s given range. Lewis’s calculations using the new Stevens value yield a most probable mean value for CO2 climate sensitivity (indeed for the long-term “ECS”) of 1.45°C of warming for each doubling of CO2. The new total range suggested by Lewis runs from 0.9 to 1.65°C per doubling of CO2, far below the IPCC’s latest range of 1.5 to 4.5°C per doubling.
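Lewis’s approach is the standard observational energy-budget method: ECS = F_2x × ΔT / (ΔF − ΔN), with ΔT the observed warming, ΔF the total forcing and ΔN the planetary heat uptake. The inputs below are illustrative placeholders, not Lewis’s actual numbers; they only show why a less-negative aerosol forcing (larger ΔF) pulls the estimate down:

```python
# Sketch of the observational energy-budget estimate of ECS.
# All inputs here are illustrative placeholders, NOT Lewis's actual values.
F_2X = 3.71   # W/m^2 forcing per CO2 doubling (conventional value)
dT = 0.75     # deg C observed warming (illustrative)
dN = 0.50     # W/m^2 planetary heat uptake (illustrative)

def energy_budget_ecs(dF):
    """ECS implied by the observed warming for a given total forcing dF."""
    return F_2X * dT / (dF - dN)

# Strongly negative aerosol forcing cancels more GHG forcing -> smaller dF
# -> higher implied ECS. Weaker aerosol cooling (Stevens) -> larger dF -> lower ECS.
print(energy_budget_ecs(1.9))
print(energy_budget_ecs(2.4))
```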

Figure 1: Range of CO2 climate sensitivity according to calculations by Nic Lewis using the latest Stevens 2015 values. Source.

Bjorn Stevens was fully aware of the avalanche of reactions this would unleash. It is going to take a while before his IPCC colleagues get over their indigestion and allow the new findings to flow into their modeling work. Until that happens, it is best to avoid any media storm. The MPIM intentionally did not issue a press release to announce the paper. As the English-language media busily discussed the logical consequences of the paper, the MPIM in Hamburg eventually found it necessary to put out a statement. On April 2, 2015, Stevens stated that his paper only addressed aerosols and was not an appropriate basis for speculation about CO2 sensitivity. With that he buys himself a little public peace – for the time being. However, the scientific community will not be able to dodge the consequences of the paper over the mid to long term.


CO2 Emissions And Ocean Flux: Long-Term CO2 Increase Due To Emissions, Not Ocean Temperature

If you take the annual CO2 atmospheric content, and differentiate it, that is calculate the year-to-year change, then you get a plot that looks a lot like the ocean temperature. This has led many people to think that the ocean is the source of the additional CO2. This is not the case.

The additional CO2 comes from our emissions, about half of which remain in the atmosphere. The year-to-year “noise” comes from year-to-year changes in ocean temperature riding on top of the change from our emissions. Changes in our emissions are somewhat filtered by the time constant of CO2 absorption by the biosphere. Here is what happens.

Many of you will be familiar with the concept of “half-life” from radioactive decay. For a radioactive element, half of the radioactivity decays in a certain period of time, then half of the remainder in the next period, and so on until the radioactivity can no longer be detected. The same principle applies to absorption: half of a compound is absorbed in a certain period, then half of the remainder in the next period, and so on. Here is the curve for a pulse of CO2 absorbed into the biosphere.

CO2 Fraction Absorbed

Figure 1 is the absorption curve for a single pulse of CO2 emitted into the atmosphere. The half-life of that pulse is a bit over 8 months. It is undetectable after about six years.
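The pulse decay in Figure 1 follows from the half-life alone. A minimal sketch using the article’s 8-month figure (note that the 1/3-per-year series quoted below actually corresponds to a slightly shorter half-life, about 7.6 months):

```python
# Remaining fraction of a single CO2 pulse, given a fixed absorption half-life.
HALF_LIFE_MONTHS = 8.0   # "a bit over 8 months", per the article

def remaining_fraction(months):
    return 0.5 ** (months / HALF_LIFE_MONTHS)

for years in (1, 2, 4, 6):
    print(f"after {years} year(s): {remaining_fraction(12 * years):.2%} remains")
# By year 6 less than 0.2% of the pulse remains -- effectively undetectable.
```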

Imagine that you have just taken a deep breath, held it until the maximum CO2 has been exchanged, then exhaled. Half of the CO2 in that puff will be gone from the atmosphere in 8 months, 33% will remain in a year, only 11% will be around in 2 years, and so on. The series is actually 1/3, 1/9, 1/27, 1/81… Now add up all breathing for many years, or all the fossil fuel emissions.

Summation Half-Lives

In Figure 2 above the top gold trace is the summation of all the individual annual half-life traces. For instance, year one is the sum of the remainders from year -4 to year zero. It is the sum of 1/243 + 1/81 + 1/27 + 1/9 + 1/3 = very close to 1/2. The CO2 fraction that we observe is close to 0.5. If emissions completely ceased in year six, the extra CO2 added to the atmosphere would be nearly zero in year 10.
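The summation in Figure 2 is just a geometric series: if one third of each year’s emissions pulse survives into the next year, the accumulated remainder converges to Σ(1/3)^k = 1/2 exactly, which is why an airborne fraction of about one half drops out of this model:

```python
from fractions import Fraction

# Sum the survivors of the last five annual pulses, each retaining one
# third of its CO2 per year: 1/3 + 1/9 + 1/27 + 1/81 + 1/243.
retention = Fraction(1, 3)
partial_sum = sum(retention**k for k in range(1, 6))
print(partial_sum)    # 121/243, very close to 1/2

# The infinite geometric series converges to exactly one half:
steady_state = retention / (1 - retention)
print(steady_state)   # 1/2
```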

Annual Emissions and Remainder

Figure 3 is a plot of annual fossil fuel emissions and the amount of those emissions that annually remain in the atmosphere. The remainder plot has been corrected for half-life.

Remaining Fraction Compared

Figure 4 is a plot of the fraction of emissions that remain in the atmosphere, the ocean temperature anomaly (from UAH satellite data), and the change (delta) in emissions, with the data corrected for half-life using the fraction change data. (The half-life has been decreasing over time by 2.8% per decade.) The left scale is for both the remaining fraction and the temperature anomaly. The right scale applies to the delta emissions (the annual change in added emissions), scaled to match the fraction change. On a year-to-year basis the ocean temperature changes overwhelm the emission changes, but not the emissions themselves.

SST Change CO2 Change

Figure 5 is a plot of change in SST, CO2, and annual emissions added since 1980, i.e. the annual delta (differential) of all three. The Mt. Pinatubo cooling and the 1998 and 2010 El Niño warmings are clearly visible in the CO2 data. It looks as if the CO2 increase were due to ocean temperature, but this is an illusion. The annually added emissions are much larger than the change in ocean-temperature-driven CO2 flux.


Figure 6 is a scatter plot of CO2 and SST anomaly with a linear trend applied. The trend is 17,239 million metric tonnes of CO2 emitted per degree C of SST change. Over the period shown, the long-term change in temperature is about 0.3°C, which would be equivalent to about 5 billion tonnes of CO2. But the increase in CO2 over that time period was 483 billion tonnes, about 100 times that amount. The long-term CO2 increase is due to emissions, not ocean temperature. Temperature drives only the short-term changes.
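The “about 100 times” claim can be checked directly with the numbers quoted above (the ratio comes out near 93, i.e. roughly 100-fold):

```python
# Compare the CO2 attributable to the long-term SST change with the total
# CO2 increase, using the figures from the text (million metric tonnes).
slope = 17_239             # Mt CO2 per deg C of SST change (Figure 6 trend)
long_term_sst = 0.3        # deg C long-term change
total_increase = 483_000   # Mt CO2 added over the period

from_temperature = slope * long_term_sst
ratio = total_increase / from_temperature
print(f"~{from_temperature:,.0f} Mt from temperature vs {total_increase:,} Mt total")
print(f"the emissions-driven increase is ~{ratio:.0f} times larger")
```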

About half of fossil carbon emissions appear to be responsible for the atmospheric CO2 rise, and that fraction is decreasing. The year-to-year changes in the CO2 rise are mostly due to ocean temperature changes, but those changes should be considered weather.


CO2 Emissions Have Been Flat For Four Years. What Does This Mean For The Future?

The CO2 concentration in the atmosphere has been steadily increasing since regular measurements began at the Mauna Loa Observatory. This increase is partially driven by fossil fuel use but the year to year rate of increase is driven by ocean temperature. This was discussed in October 2012 here.

The International Energy Agency (IEA) tracks fossil fuel use and has reported here that emissions due to fossil fuels have flatlined for the last two years. Actually, they have nearly flatlined for the last four years according to the IEA’s own figures. The increase from 2011 to 2012 was less than 0.5%, and from 2012 to 2013 they decreased by 0.03%.
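Year-over-year percentage changes of the kind quoted are computed like this; the emissions values below are illustrative placeholders, not the actual IEA series:

```python
# Year-over-year percentage change in fossil fuel CO2 emissions.
# These values are illustrative placeholders, NOT the actual IEA figures.
emissions_gt = {2011: 31.3, 2012: 31.4, 2013: 31.39, 2014: 31.39}

years = sorted(emissions_gt)
for prev, cur in zip(years, years[1:]):
    change = 100 * (emissions_gt[cur] - emissions_gt[prev]) / emissions_gt[prev]
    print(f"{prev} -> {cur}: {change:+.2f}%")
```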

Total Emissions

Figure 1 is a plot of annual world fossil fuel CO2 emissions since 1980 from IEA.

In that time, CO2 in the atmosphere has grown from 338 parts per million (ppm) to 398 ppm. In the next figure, that quantity has been converted to metric tonnage.

Accumulated Emissions and CO2

Figure 2 is the accumulated emissions compared to accumulated CO2. 

Note that only about half the emissions have stayed in the atmosphere. The remainder has been absorbed somewhere else.

Annual emissions, uptake, & CO2

Figure 3 is a plot of annual Carbon Dioxide emissions, the annual uptake by the biosphere, and the resulting atmospheric CO2 concentration, with projections of each into the future.

Global Greening

Figure 4. Estimated changes in vegetative cover due to CO2 fertilization between 1982 and 2010 (Donohue et al., 2013 GRL). For a discussion of this image and other similar images see Roy Spencer here.

The “somewhere else” is the biosphere, the “greening” of global photosynthetic life, along with absorption by the oceans. Each year, on average, those sinks take up 251.35 million extra metric tonnes of CO2 as the biosphere pulls things back into balance. That sounds like a lot, but keep in mind that the atmosphere contains more than 3 trillion tonnes of CO2, land-based vegetation about 4 trillion tonnes, the surface ocean something like 5 trillion tonnes, and the deep ocean 150 trillion tonnes. The total annual flux from atmosphere to the biosphere and back is about 400 billion tonnes. So that annual difference is only about 0.5%. (I am using the American counting system: 10⁶ is a million, 10⁹ a billion, 10¹² a trillion. I’m also using the weight of CO2, not just the carbon atoms.)

The uptake decreases with increasing temperature. The downward spike in uptake in 1998 was due to the El Niño of that year. The increase in uptake in 1992 and 1993 was due to the Mount Pinatubo cooling. The uptake also increases with increasing CO2. On average, the uptake is increasing. Over the last 35 years, it has increased by 251.35 million extra tonnes per year.

If this increase holds, and the global fossil fuel emissions remain constant at the same level as the last four years, the uptake will equal the emissions late in this century, in about 2083. At that time, CO2 in the atmosphere will reach a maximum at about 475 ppm and begin to decrease. In my humble opinion, this projection is pessimistic. If emissions are reduced below the current level, the date and level of the maximum will be earlier and lower. If ocean temperatures decrease, the same thing will happen. If warming occurs, and/or emissions increase, the date of the maximum will be pushed toward 2100, and the maximum level will be slightly higher. Ocean temperature is a positive feedback on CO2 increase and a negative feedback on CO2 uptake. A cooler ocean takes up more CO2, a warmer ocean less.
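The crossover date can be sketched numerically. The 251.35 million tonnes/yr uptake trend comes from the text; the absolute emission and uptake levels below are assumed round numbers, and with them the crossover lands within a couple of years of the 2083 figure:

```python
# Crossover sketch: constant emissions vs. linearly increasing uptake.
emissions_gt = 33.0           # assumed constant annual emissions, Gt CO2
uptake_gt = emissions_gt / 2  # uptake currently absorbs about half of emissions
uptake_growth_gt = 0.25135    # uptake trend from the text, Gt CO2 per year

year = 2015
while uptake_gt < emissions_gt:
    uptake_gt += uptake_growth_gt
    year += 1
print(f"Uptake matches emissions around {year}")
```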

The current policies in most developed countries are toward lowering emissions. To date, these policies have been successful in holding emissions at the current level for the last four years. If governments allow the development of nuclear power, other renewables increase in a non-destructive fashion, and if natural gas continues to replace coal-fired power plants, carbon dioxide emissions will likely fall below the current level. This will pull the CO2 peak to an earlier date and a lower level. That date is where emissions cross the uptake trend line. Here is my optimistic view: If emissions fall back to the level of the year 2000 by 2050, looking at Figure 3, one can imagine that the CO2 level will peak at about 450 ppm near that year and fall after that.

What will this do to global warming? The slight increase in CO2 induced warming might just offset the coming sun-induced cooling. Perhaps we will have a century of constant temperature.

Solar Impulse 2 Flight-Around-The-World “Without A Drop Of Fuel” In Fact Will Burn Tens Of Thousands of Liters!


The sun-powered Solar Impulse 2 aircraft is to circumnavigate the globe “without a drop of fuel”. However, it will in fact rely on support planes burning thousands of litres of fuel. Photo credit: Brussels Airport, Creative Commons Attribution-Share Alike 2.0 Generic license.

There’s been a fair amount of hype surrounding the Swiss Solar Impulse 2 project, an attempt to fly around the world in a purely solar-powered aircraft, “without using a drop of [fossil] fuel“. It is being billed as a landmark flight, signifying a milestone in green aviation. However, nothing could be further from the truth.

Hat-tip: Reader Konrad.

The fixed-wing aircraft departed Abu Dhabi on March 9 and has since landed in India. From there it will continue to China, Hawaii, Phoenix, New York, Morocco before finally coming full circle back to Abu Dhabi sometime in August, 2015 – “without emitting any climate gases”. Full planned route here.

The pilots Bertrand Piccard and André Borschberg will alternate as the craft makes a series of stops along its journey. The plane is able to carry only a single pilot and no passengers. The aim: “We want to show what’s possible with innovative technologies,” Piccard boasted.

The 2200-kg pioneering aircraft has a wingspan comparable to that of an Airbus A340. According to Wikipedia, lithium polymer batteries store the collected solar energy and power 10 hp (7.5 kW) motors with twin-bladed propellers. The upper wings carry 11,628 photovoltaic cells. The major design constraint is the capacity of the lithium polymer batteries. See plane specs here.

Of course the entire flight is supposed to be done “solely” using renewable energy from the sun, without a single drop of aircraft fuel. But when one examines the flight more closely, it turns out the mission in fact involves a huge fossil fuel carbon footprint.

According to an audio report by SRF Swiss Radio and Television the Solar Impulse 2 mission involves the substitute pilot, a technical ground crew “of dozens of people” and tonnes of equipment and logistical supplies that have to be flown behind using conventional charter flights. The “fossil fuel-free” Solar Impulse 2 journey is in fact being made possible only with the use of tens of thousands of litres of aviation fuel. This is a fact that is being almost entirely ignored by the media.

The SRF reporter tells listeners:

The fact is that the entire group, a team of several dozen men and women, has to fly behind in charter planes. This naturally is the less sustainable aspect of the entire project, but it just isn’t possible any other way. This involves one cargo plane for transporting all the equipment, and a small passenger plane on which the entire group travels to the destinations.”

A promotion video here shows how the aircraft was transported from Europe to its start point in Abu Dhabi earlier this year: With a Boeing 747!


German Physical Chemistry Scientist On Nature Article Of Proof Of CO2 Forcing: “Measurements Show Exact Opposite”

A recent publication in Nature purported it had finally detected the radiative forcing of increasing atmospheric CO2.

German physical chemist Dr. Siegfried Dittrich slams the media’s assertions of proof that CO2 is guilty of the warming, calling them faulty and noting that they were passed on uncritically by German news weekly FOCUS ONLINE. Here’s the translation:

Climate Warming From Carbon Dioxide?
By Dr. Siegfried Dittrich, on the DPA German News Agency release appearing at Focus-Online 27 February 2015: “Climate warming through carbon dioxide: The proof: CO2 is indeed guilty of the greenhouse effect“.

‘The real guilt by CO2 for the greenhouse gas effect is finally proven.’ This was the subheading of a DPA release appearing at FOCUS Online on 27 February.

Later in the text it is written: ‘For the first time we are seeing the enhancement of the greenhouse effect in nature’, and at the Hamburg-based Max Planck Institute for Meteorology it was gleefully added that finally also the magnitude of the anthropogenic impact has become visible.

It all goes back to the latest surface radiation measurements recently published in an essay in Nature (details here and here). However no one seems to have noticed that the measurements actually showed the exact opposite of what is claimed to have been proven above, namely nothing other than what serious climate critics have always been saying about anthropogenic greenhouse effect.

The figure for the increase in CO2-dependent back radiation given by Nature, 0.2 watt/m2 per decade, is in reality nothing more than a trifle. Why would the earth be shocked when 1367 watts per square meter strike the surface at noon along the equator? The ever-changing deviations from this so-called solar constant mean value are in fact considerably greater than the above given 0.2 watts/m2.

According to the IPCC, the surface radiative forcing increase in the event of a doubling of atmospheric CO2 concentration is exactly 3.7 Watt/m2, a figure that has been independently confirmed on multiple occasions. Over the last decade the atmospheric CO2 concentration increased some 20 parts per million. Currently it stands at about 400 ppm. Here any undergraduate student is able to compute that the resulting surface radiative forcing increase is approximately 0.2 watt/m2, which has been confirmed by the above mentioned measurements.
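That back-of-envelope can be reproduced with the standard logarithmic forcing approximation; the 380 → 400 ppm decadal step is taken from the text, and the result comes out near the cited value (about 0.27 watt/m2 with this formula):

```python
import math

# dF = 3.7 W/m^2 * ln(C/C0) / ln(2): the standard logarithmic approximation
# for CO2 forcing, scaled to the 3.7 W/m^2 per doubling cited above.
F_2X = 3.7                 # forcing per CO2 doubling, W/m^2
c0, c1 = 380.0, 400.0      # ppm, roughly one decade apart per the text

delta_f = F_2X * math.log(c1 / c0) / math.log(2)
print(f"Decadal forcing increase ≈ {delta_f:.2f} W/m^2")
```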

Also the resulting global temperature increase can be computed using one of the IPCC equations, which also can be derived from the Stefan-Boltzmann radiation law.
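One way to carry out that computation is the no-feedback sensitivity from the Stefan-Boltzmann law; this sketch assumes a 288 K mean surface temperature (not stated in the text) and applies no cloud or overlap corrections, landing between the 0.03°C and 0.06°C figures discussed below:

```python
# No-feedback temperature response: dT = dF / (4 * sigma * T^3),
# from differentiating the Stefan-Boltzmann law F = sigma * T^4.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
T = 288.0          # assumed mean surface temperature, K
dF = 0.2           # decadal forcing increase from the text, W/m^2

dT = dF / (4 * SIGMA * T**3)
print(f"No-feedback warming ≈ {dT:.3f} K per decade")
```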

In Nature it is expressly remarked that the measured difference in surface radiative forcing of 0.2 watt/m2 applies solely to cloud-free zones on earth. With an average 40% cloud cover and a 30% overlap between the present water vapor and CO2 absorption spectra, the above calculated temperature value is reduced from 0.06°C to 0.03°C. In reality we are talking about an effect that is barely measurable, one with no dramatic impact even when combined with the fictional water vapor amplification; incidentally, this reduces the basis of the superfluous ‘Energiewende’ to absurdity. It is more than regrettable that FOCUS uncritically passed on these misinterpretations. A correction should be made immediately.

Dr. rer.nat. Siegfried Dittrich

Solar Cycle Weakening…And: German Analysis Shows Climate Models Do Overestimate CO2 Forcing

By Frank Bosse and Fritz Vahrenholt

In February the sun was very quiet. The observed sunspot number (SSN) was only 44.8, just 53% of the mean value for this month of a solar cycle, calculated from the systematic observations of all earlier cycles.

Figure 1: Solar activity of the current Cycle No. 24 in red, the mean value for all previously observed cycles is shown in blue, and the up to now similar Cycle No. 1 in black.

It has now been 75 months since cycle No. 24 began in December, 2008. Overall this cycle has been only 53% of the mean value in activity. About 22 years ago (in November 1992) Solar Cycle No. 22 was also in its 75th month, and back then solar activity was 139% of normal value. The current drop in solar activity is certainly quite impressive. This is clear when one compares all the previous cycles:

Figure 2: Comparison of all solar cycles. The plotted values are the differences of the accumulated monthly values from mean (blue in Figure 1).

The solar polar magnetic fields have become somewhat more pronounced compared to the month before (see our Figure 2 in “Die Sonne im Januar 2015 und atlantische Prognosen“), and thus the sunspot maximum of the current cycle is definitely history. It is highly probable that over the next years we will see a slow tapering off in sunspot activity; weak cycles such as the current one often run longer than average. Thus the next minimum, defined by the appearance of the first sunspots of the new Cycle 25, may not occur until after the year 2020. The magnetic field of its sunspots will then be opposite to what we are currently observing in Cycle 24.

The radiative forcing of CMIP5 models cannot be validated?

A recent paper by Marotzke and Forster (M/F) is under intense discussion at climateaudit.org, with more than 800 comments. Nic Lewis raised the question: is M/F’s method for evaluating the trends infected by circularity?

There is not only a discussion about the methods, but also about the main conclusion: “The claim that climate models systematically overestimate the response to radiative forcing from increasing greenhouse gas concentrations therefore seems to be unfounded.”

Is natural variability really preventing us from separating the better models of the CMIP5 ensemble from the not-so-good ones?

Here I present an approach to answering that question.

Step 1

I investigated the ability of the 42 “series” runs of the “Willis model sheet” (thanks to Willis Eschenbach for the work of bringing 42 anonymous CMIP5 models into “series”!) to replicate the least-squares linear trends from 1900 to 2014 (annual global data; 2014 is the constant end-date of these running trends). For each start year from 1900 to 1995, I calculated the difference between the HadCRUT4 (observed) trend ending in 2014 and the trend of every “series”, also ending in 2014.

The sum of the squared residuals for 1900 to 1995 for every “series”:

Figure 3: The sum of the squared residuals for the running trends with constant end in 2014 from 1900,1901 and so on up to 1995 for every “Series” in “Willis sheet”. On the x-axis: the series 1…42.
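Step 1 can be sketched in code. The 42 series below are random stand-ins, since the actual “Willis sheet” data is not reproduced here; only the structure (running least-squares trends with a fixed 2014 end-year, and summed squared trend residuals against observations) follows the description above:

```python
import numpy as np

# Sketch of Step 1 with synthetic data: running least-squares trends with
# a constant end-year (2014), compared between "observations" and each of
# 42 stand-in model series via a sum of squared trend residuals.
rng = np.random.default_rng(0)
years = np.arange(1900, 2015)                                  # 1900..2014
obs = 0.007 * (years - 1900) + rng.normal(0, 0.1, years.size)  # HadCRUT4 stand-in
models = obs + rng.normal(0, 0.1, (42, years.size))            # 42 stand-in series

def trend(y, t):
    """Least-squares slope of y against t (degrees C per year)."""
    return np.polyfit(t, y, 1)[0]

ssr = np.zeros(len(models))
for start in range(1900, 1996):        # start years 1900..1995, end fixed at 2014
    mask = years >= start
    obs_trend = trend(obs[mask], years[mask])
    for i, m in enumerate(models):
        ssr[i] += (trend(m[mask], years[mask]) - obs_trend) ** 2

print("Residual sums range:", ssr.min(), "to", ssr.max())
```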

Step 2

We repeat the procedure described in Step 1, but this time with the trends ending in 2004, ten years before the end-date used above:

Figure 4: The sum of the squared residuals for the running trends with constant end in 2004 from 1900, 1901 and so on up to 1995 for every “series” in the “Willis sheet”. On the x-axis: the series 1…42. The ordinate scale is the same as in Figure 3.

Here one sees that the errors for the trends ending in 2004 (Figure 4) are on average much smaller than those for the trends ending in 2014 (Figure 3). That is no wonder, as the parameters of most models were “set” for the period up to 2005. Thus the trends of the models up to 2004 also agree well with observations:

Figure 5: The trends of the model mean (Mod. Mean, red) in °C/ year since 1900, 1901 etc. up to 1985 with the constant end-year 2004 compared to observations (black).

Obviously the setting of the model parameters no longer “holds” beyond that period, as the errors for trends ending in 2014 rise rapidly.

Step 3

We calculate, for every single series, the quotient of the error for the 2014 trends divided by the error for the 2004 trends (see Figure 4) and make a two-dimensional check:

Figure 6: The single series as plotted points. The coordinates are determined by the trend error up to 2014 (x-axis) and the ratio of the errors 2014/2004 (y-axis). The red rectangle marks the “boundaries”: the “good“ series are inside, the “bad“ ones outside.

The borders are represented by the standard deviations of both series.

The y-axis in Figure 6 above is the quotient of the trend-estimation errors to 2014 (see Figure 3) divided by the trend-estimation errors to 2004 (see Figure 4), with a standard deviation of 3.08; the x-axis is the accuracy of each series for the running trends with constant end-year 2014 (see Figure 3), with a standard deviation of 0.0038. The big differences for many series (up to a factor of 11) between the trend errors to 2004 and the trend errors to 2014 are impressive, aren’t they? The stability of series with such great differences is in question; that is why they are “bad”.
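The screening can be sketched as follows; the error values are random placeholders, and the mean-plus-one-standard-deviation boundary is my reading of the criterion described above:

```python
import numpy as np

# Sketch of the Step 3 two-dimensional screening: a series counts as "good"
# when both its 2014 trend error and its 2014/2004 error ratio lie inside
# a boundary set at mean + one standard deviation (an interpretation of
# the article's criterion). All error values here are random placeholders.
rng = np.random.default_rng(1)
err_2014 = np.abs(rng.normal(0.005, 0.0038, 42))   # trend errors, end-year 2014
err_2004 = np.abs(rng.normal(0.004, 0.0020, 42))   # trend errors, end-year 2004
ratio = err_2014 / err_2004                        # error inflation after tuning era

good = ((err_2014 < err_2014.mean() + err_2014.std())
        & (ratio < ratio.mean() + ratio.std()))
print(f"{good.sum()} 'good' series, {(~good).sum()} 'bad' series")
```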

Step 4

Now comes the most interesting part: from the 42 runs of different series, I selected the “good” ones, which lie within the borders of the red rectangle in Figure 6, and calculated their average. The same procedure was done with all the “bad” ones.

Figure 7: The selected “good” series (see step 1-3), the series mean of all 42 series, the “bad” ones and the observations for rolling trends with constant end-year 2014 in K per annum.

The “good” (blue) series track the observations remarkably better than the model mean (red), while the “bad” (green) series show the worst performance.

Up to this point we did not know which model was behind which “series” in the “Willis sheet”. Thanks to help from Willis Eschenbach and Nic Lewis, we have since learned the assignments and the properties of the models behind the “series”, including their sensitivity to GHG forcing. The mean transient climate response (TCR), the expression for the calculated greenhouse gas effect, is approximately 1.6 for the “good“ models; the model mean (all models) is 1.8, and the “bad” model mean is 1.96.

As one observes in Figure 7, the selection of the “good” models “improves” the convergence towards the observations. For this a TCR of approximately 1.3 is assumed; compare with our blog post “Wie empfindlich ist unser Klima gegenüber der Erwärmung durch Treibhausgase? (How sensitive is our climate with respect to warming from greenhouse gases)“.


The model mean overestimates the effect of radiative forcing on global temperature through 2014. The objectively better models have a lower mean TCR; the “bad” models have a higher one. Many models are perhaps “over-tuned” for the trends up to 2005, and the result is a dramatic loss in forecasting quality beyond the tuning period. Are Marotzke and Forster wrong? Will we ever hear them admit it? There are reasons for doubt.

The Sydney Morning Herald’s Peter Hannam Grossly Deceives His Readers Using Massively Doctored Photo

What follows is photo-shopping taken to an extreme.

It’s little wonder few people believe anything the mainstream dailies write when it comes to climate change. Time and again they’ve been exposed to be unloading barges of BS onto their readers.

Reader Jim sent an e-mail bringing attention to probably the most amateur photo-shopping work on behalf of global warming propaganda I’ve seen in a long time, all used by eco-journalist Peter Hannam of the Sydney Morning Herald in a piece about the coal-fired Liddell Power Station in Hunter Valley, NSW.

Not only is the photo totally manipulated with the aim of deceiving readers, but Hannam’s facts are just as misleading as the photo-shopped power plant image itself:

SMH propaganda2

Glaring photo-shopped image by Jonathan Carroll gets used by the SMH to create the impression that the Liddell coal power plant is causing disease at epidemic proportions. Original photo: Jonathan Carroll.

Note how the steam emitted by the power plant’s cooling towers is a sinister black. Since when is water vapor black? Hannam obviously is unable to distinguish between the smokestack and the cooling towers. The scant emissions from the single smokestack in the center of the image show just how clean coal power plants have become.

The photo also tells the many readers residing in the northern hemisphere that skies down under have very weird colors.

Hannam reports on an activist group of doctors who, using a dubious formula, claim to have succeeded in putting a figure on the health damage the power plant causes: $600 million annually.

Hannam writes:

The Coal and Health in the Hunter report by the Climate and Health Alliance estimates that burning coal for electricity in the valley alone produces health damage in the order of $600 million annually from the resulting air pollution, including the release of small particles.”

However, as reader Jim points out, a quick search of health statistics from Hunter New England Health (which covers the Hunter Valley) shows that Hannam’s claims are dubious at best. What follows is a chart showing the causes of death for all respiratory ailments:

NSW Health chart

Source: Health Statistics New South Wales

The above chart shows that deaths due to respiratory ailments are primarily caused by lung cancer (smoking) and Chronic Obstructive Pulmonary Disease (COPD) which is in part related to pollution. Deaths from COPD, which to some extent are linked to the emissions from coal power plants, have been declining over the last 15 years.

Deaths due to asthma, a disease that could certainly be exacerbated by pollution, show no trend.

Why would any reader believe anything Hannam writes? Using doctored images and dubious statistical methods, he is obviously attempting to fabricate a health crisis that does not exist.


BP 2035 Outlook Foresees Only 8% Renewable Energy By 2035! No End In Sight For Fossil Fuel Growth!

One of the biggest miscalculations that the global warming alarmists have made is claiming that global CO2 emissions must reach their peak by 2020 and then begin falling rapidly. If they don’t, there will be no chance of reaching the 2°C maximum warming target. Planetary catastrophe will ensue, the alarmists claim.

British energy behemoth BP has just released its BP Energy Outlook 2035, and it states in no uncertain terms that there is no chance of CO2 emissions beginning their decline by 2035, let alone 2020!

Good news: global GDP to double!

The BP report states, “By 2035, the world’s population is projected to reach 8.7 billion, which means an additional 1.6 billion people will need energy,” and the globe’s “GDP is expected to more than double“.

That’s good news for humanity. More people enjoying the one-time gift of life and doing so in greater comfort. But that’s going to require energy, of course.

India 3rd largest economy in 2035

The BP report projects that India will go from being a third world country to being the world’s third largest economy.

That has major implications for the world’s energy market. The BP writes (my emphasis):

Primary energy consumption increases by 37% between 2013 and 2035, with growth averaging 1.4% p.a.. Virtually all (96%) of the projected growth is in the non-OECD, with energy consumption growing at 2.2% p.a.. OECD energy consumption, by contrast, grows at just 0.1% p.a. over the whole period and is actually falling from 2030.”
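BP’s figures are internally consistent; compounding 1.4% p.a. over the 22 years from 2013 to 2035 yields roughly the quoted 37% total growth:

```python
# Compound-growth check of BP's figures: 1.4% p.a. from 2013 to 2035.
rate = 0.014
years = 2035 - 2013
total_growth = (1 + rate) ** years - 1
print(f"{rate:.1%} p.a. over {years} years ≈ {total_growth:.0%} total growth")
```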

That’s strong growth, and today’s renewable energy technology will have no chance of economically meeting that kind of demand. Wind and solar are just too unreliable, and their storage is still a long way from being feasible. This is glaringly obvious in the BP report.

No end in sight for fossil fuel growth

The BP report features the following chart showing the breakdown of primary fuel consumption by 2035.

BP_2035 Energy

Source: BP.

The above figure foresees massive expansion of the traditional carbon based fossil fuels, especially oil and gas, with modest growth in coal consumption. That means global CO2 emissions will continue growing strongly, which would mean bad news if the CO2 greenhouse theory were correct. But so far, despite the massive rise in global CO2 emissions since the year 2000, global temperatures have not risen at all, and global warming scientists are now under extreme pressure to revise downwards their once lofty warming projections.

Emissions well above path recommended by scientists!

The future development of CO2 emissions bodes extremely ill for global warming alarmists. The BP Report writes on page 85: “Global CO2 emissions from energy use grow by 25% (1% p.a.) over the Outlook. Emissions remain well above the path recommended by scientists, illustrated by the IEA’s “450 Scenario”. In 2035, CO2 emissions are 18 billion tonnes above the IEA’s 450 Scenario.”

CO2 theory rapidly approaching its Waterloo

CO2 emissions growth clearly is not going to be curbed anytime soon, and temperatures really will have to start climbing in earnest if the AGW theory is to survive. (Un)fortunately there are no signs that is going to happen in the next 10-20 years.

Only 8% renewable energy by 2035

Page 14 of the BP Report shows strong growth in renewable energy, but it will still amount to only about 8% of global energy supply. That’s light years away from the UN’s 50% target! Obviously, aside from a few token countries, no one is taking renewable energies seriously. Their impracticality is their major obstacle.

On page 17 BP states that “coal remains the dominant fuel, accounting for more than a third of the inputs to power generation.”

Planet awash in energy

The report shows a planet that is awash in energy and also projects strong growth in “new energy forms” such as shale and oil sands (p. 20) which “are thought to be abundant”. On page 95 the report states (my emphasis):

North America’s oil and natural gas supply outlook has been revised higher yet again (14%) due to the continued evolving expectations for shale gas and tight oil.”

The BP report summarizes on page 97:

Our Outlook shows more growth in non-OECD energy demand than the IEA NP; it also shows more growth for fossil fuels, especially for coal. This probably reflects differing views on the outlook for rapidly industrializing economies, in particular on the speed with which they can move to a less energy-intensive growth path.”

Read: BP Energy Outlook 2035.


Urban Warming – There is More To It Than Just A Heat Island

The descriptions of urban warming dwell on the heating of the air by the local infrastructure. There is more to it than simple conduction to the air mass from warmed surfaces.

In the far infrared, where the peak radiation wavelength is determined by the temperature, much of the energy from warmed surfaces is absorbed by the greenhouse gases in the atmosphere within a meter or so. These gases then re-radiate to nearby surfaces or to gases farther away, and heat the air by colliding with other air molecules. There are also windows in the infrared spectrum that let warm surfaces radiate directly to other surfaces. Thus, for a temperature measuring instrument, the temperature measured is a combination of the air temperature conducted to the thermometer by the air flowing past it and IR heating from nearby surfaces. This is why in the polar regions, when measuring very low temperatures, a person approaching the thermometer will raise the temperature reading.

This second source of increased temperature causes “urban warming” even where the location is strictly rural. A measuring station at an isolated research station or farm can have “urban warming” when the thermometer is in close proximity to just one heated building.


Figure 1 is a visible and FLIR IR image of the MMTS station at the Perry, Oklahoma Volunteer Fire Department. The image is from an article by Anthony Watts here, used by permission.

Painting the MMTS white only reduces direct heating by sunlight at visible wavelengths. In the long-wavelength IR, any paint, whether black, white, or any other color, has about the same emissivity, more than 0.9, and will absorb IR equally well. In Figure 1, the west-facing uninsulated door is very warm compared to the north-facing wall; it is being heated both by the sun and by the building interior. The MMTS is slightly warmer (perhaps 2 or 3 degrees) than the mounting pipe. The pipe is unpainted and somewhat shiny, with a lower emissivity, reflecting more of the IR. Thus it appears black, while the MMTS is a warmer dark purple.
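The size of this radiant loading can be sketched with the Stefan-Boltzmann law; the temperatures and the view factor below are illustrative assumptions, not measurements from the Perry station:

```python
# Net IR flux from a sun-warmed surface onto a nearby sensor, per unit
# area, via the Stefan-Boltzmann law. A view factor of 1 (sensor fully
# facing the warm surface) is assumed for simplicity.
SIGMA = 5.670e-8           # Stefan-Boltzmann constant, W/(m^2 K^4)
EMISSIVITY = 0.9           # typical far-IR emissivity of paint, any color

t_ambient = 273.15 + 10.0  # ambient surroundings, assumed 10 C
t_door = 273.15 + 35.0     # sun-warmed uninsulated door, assumed 35 C

net_flux = EMISSIVITY * SIGMA * (t_door**4 - t_ambient**4)
print(f"Extra IR loading on the sensor ≈ {net_flux:.0f} W/m^2")
```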

Pierre has posted on a German study of the temperature shifts that come with the installation of electronic thermometers here. This shift is due to the different ways a glass thermometer in a wooden shelter and a compact modern MMTS respond to IR in the vicinity. There is also the need for cabling, which leads to a distance bias toward the nearest building.

Every weather station should be checked for IR emissions in the “view-shed”, the surrounding surfaces and buildings. This should be done at several hours of the day, to catch sunlight warming all the surfaces, and internal building heating variations.

German Analysis: “Current Warm Period Is No Anthropogenic Product” – Major Natural Cycles Show No Signs Of Warming!

Climate cycles and their extrapolation into the future

By Dr. Dietrich E. Koelle
(Translated/edited by P Gosselin)

As the reconstruction of the climate’s past development from proxy data shows, there is a series of temperature cycles that appear to be unknown to, or ignored by, many climate scientists. Among these are the larger climate cycles of 150 million to 180 million years (see Part 1 and Part 2), but also the shorter and, for us, more important following cycles:

1000 years (900–1100): Suess cycle, ±0.65°C
230 years (230–250): de Vries cycle, ±0.30°C
65 years (60–65): ocean cycles, ±0.25°C

In principle these cycles are sinusoidal in behavior, as depicted in Figure 1. Bob Tisdale has also shown how the temperature increase of the 65-year cycle from 1975 to 1998 led to the assumption that it was due to CO2 emissions, simply because the two happened to run parallel. This has been naively extended all the way to the year 2100 and forms the basis for the climate models and the invention of the so-called “climate catastrophe”.

Figure 1:  Sine wave characteristic of the 60/65-year ocean cycle (Source: Bob Tisdale at WUWT).

In this analysis we will attempt to see how the temperature development could unfold over the next 700 years, assuming of course that the mentioned climate cycles of the past continue into the future. This should not be (mis)understood as a forecast of the future climate. Up to now there is only the IPCC forecast that the global temperature will rise by 2 to 5°C by the year 2100, based solely on the expected CO2 increase. However, that theory has failed to work over the last 18 years because the various natural climate factors and cycles were never considered, or were not allowed to be considered, in the climate models. Included among these factors are the mean cloud cover (albedo) and the resulting effective solar insolation (watts per square meter) at the earth’s surface, or the sea surface, which is decisive for the temperature development.

Figure 2 below depicts the 1000-year cycle and the 230-year cycle, which have been reconstructed from historical proxy data. They stem from a combination of results from various publications in the field of paleoclimatology over recent years. The diagram of the last 3200 years distinctly shows a 1000-year cycle, the last 2000 years of which are confirmed by historical documents. In fact this cycle goes back all the way to the end of the last ice age, i.e. some 9000 years. The reason for the cycle is still unknown today, yet its existence is undisputed.

The current warm period is no “anthropogenic product”, rather it is the natural result of a repeating 1000-year cycle that goes back far into the past. Today’s warm period does not even reach the temperatures seen in the past warm periods, which at times were 1 to 2°C higher. Moreover it is important to note that during both of the past temperature maxima of 1000 and 2000 years ago, the CO2 values were at 280 ppm while today they are at 400 ppm. This indicates that the earlier warmer periods likely were related to natural solar activity and not to a rise in CO2 because there was no CO2 rise during those warm phases.

Figure 2: Global temperature over the last 3200 years shows a distinct 1000-year cycle along with the 230-year cycle.

Of historical significance is the fact that over the course of human history warm periods were always times of economic and cultural prosperity. The cooler periods brought serious problems, leading to starvation and huge waves of human migration in Europe. Here it becomes undeniably clear that alarmist claims that “the earth has a fever”, made by politicians such as Al Gore, are patently preposterous.

The “ideal” 1000-year cycle is modulated by the 230-year cycle, which in turn is modulated by the 65-year oceanic cycle depicted in Figure 1. Added to these cycles are various typically non-cyclical events such as ENSO, volcanic eruptions, etc. Figure 3 shows the temperature curve for the last 165 years along with the 230-year cycle and the effect of the 65-year ocean cycle. The current temperature values fluctuate by plus/minus 0.2°C due to the effects of ENSO, sunspot activity, volcanic eruptions, etc.

Figure 3:  The 230-year cycle over the last 165 years has been superimposed by a 65-year cycle as well as by other effects like the irregular ENSO events and large volcanic eruptions.

The temperature rise of 0.6°C during the 1975-1998 period, which triggered all the current climate hysteria, was of the same magnitude as the previous increase that occurred in the 1910 to 1940 period, which in turn had nothing to do with CO2, because back then the concentration in the atmosphere rose by only some 10 ppm (from 297 to 308 ppm). The temperature increase of 1.5°C over the last 150 to 250 years is likewise nothing “out of the ordinary” or “dangerous”, as we are often told in the media. Instead it is only the natural recovery from the Little Ice Age (LIA) that gripped the planet from 1400 to 1750. The LIA not only led to the Thames River and Baltic Sea freezing over, but also resulted in severe hunger in Europe and caused a mass migration to America.

The figures also show that all three climate cycles reached their maximum shortly after the end of the last millennium. With that in mind, we actually should have expected even higher temperatures than those seen in previous warm periods. Here perhaps the fact that the global temperature has seen a negative overall trend since the Holocene Maximum plays a role. That means that the global temperature has fallen by 2°C over the past 8000 years.

Based on historical climate fact, it is possible to extend the trend into the future to form a possible climate scenario. Figure 4 depicts the extrapolation of the 1000-year and 230-year cycle along with the generally expected trend. Added to this are the fluctuations of the 65-year ocean cycles, the impacts of the ENSO-events, sunspot cycles and volcanic eruptions, which result in additional fluctuations of a few tenths of a degree – just as they have in the past.

Figure 4:  Extrapolating the 1000-year and 230-year cycles 700 years into the future.

Figure 4 shows the real global temperature development of the past 1000 years and its theoretical continuation over the next 700 years. This is not a forecast; rather it is the extended possible course of the overall temperature trend, which over the mid-term could see a drop of approx. 0.3°C in the next 100 years and a 2°C drop in global temperature in 350 years – which would mean conditions just like those seen in the Little Ice Age from 1450 to 1700. In about 1000 years the 1000-year cycle will again take on its warm phase and temperatures like those of today can be expected.

Over the next 50 years no temperature increase is expected, but rather a slight decrease. In the decades before and after the year 2300 a powerful temperature drop could occur, because both the 230-year cycle and the 1000-year cycle would be dropping rapidly in parallel.
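The superposition of cycles described above can be sketched numerically. The following toy model is purely illustrative – the amplitudes and phases are assumptions chosen so that all three cycles peak near AD 2000, as the text describes, and it is not the authors' actual reconstruction:

```python
import numpy as np

# Toy superposition of the three cycles discussed in the text.
# Amplitudes (degC) and peak years are illustrative assumptions.
years = np.arange(1000, 2716)  # AD 1000 .. AD 2715

def cycle(t, period, amplitude, peak_year):
    """A sinusoidal climate cycle of the given period, peaking at peak_year."""
    return amplitude * np.sin(2 * np.pi * (t - (peak_year - period / 4)) / period)

anomaly = (cycle(years, 1000, 0.6, 2000)    # 1000-year cycle
           + cycle(years, 230, 0.2, 2000)   # 230-year cycle
           + cycle(years, 65, 0.1, 2000))   # 65-year ocean cycle

# The toy curve declines after its combined peak near AD 2000
print(round(anomaly[years == 2100][0] - anomaly[years == 2000][0], 2))
```

Such a curve only illustrates how superimposed cycles can produce the stepped cooling sketched in Figure 4; it carries no predictive weight of its own.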




Note from the Die kalte Sonne editors: The main point of this post is to provide an analysis of natural cycles and their logical extension into the future. Unaccounted for in the projection shown in Figure 4 is the climate impact of CO2, whose role in today's climate is hotly disputed. In our book “The Neglected Sun” we presented two CO2 climate sensitivity scenarios: 1.0°C and 1.5°C of warming for a CO2 doubling. Current studies have corrected the original IPCC value of 3°C strongly downwards (see our article “Studies from 2014 provide hope: warming effect of CO2 is considerably over-estimated. Official correction is imminent“). It will be exciting to watch how research on climate sensitivity develops over the coming years.


German Scientist Calls For Founding And Funding Of Independent Climate Research Institute To Counter Alarmist Climate Claims

Geologist Dr. Sebastian Lüning believes it’s high time for the skeptic side to respond more forcefully to the often hyper-exaggerated claims launched by the government funded global warming alarmists and is calling for the founding of an independent Germany-based climate research initiative.


Dr. Lüning co-authored, together with Professor Fritz Vahrenholt, the leading skeptic book Die kalte Sonne, which was subsequently translated into English as The Neglected Sun. Their book provides overwhelming scientific evidence that governing natural factors like the sun and oceans have driven climate change throughout history.

Lüning’s idea is dubbed the KlimaForschungsInitiative (KFI) – Climate Research Initiative – which he feels is necessary because the criticism of the apocalyptic end-of-world visions rests on only a few courageous scientists and citizens who have recognized the “faulty development” in climate science. The debate is completely lopsided, Lüning writes.

These climate realists as a rule receive no financial support for their work. To the contrary, it is highly risky to challenge the IPCC line, because positions outside the political mainstream are punished by scientific and societal marginalization. Inconvenient criticism of the climate alarmism line is undesirable. The German Federal Ministry of Environment even published a blacklist of [German and American] climate realists. It’s a real career blocker for scientists at the state-supported research institutes. A University of Graz music professor even suggested the death penalty for people who do not toe the IPCC line.”

Under such a hostile and intolerant climate, who on earth would want to raise a finger?

Climate of political intimidation and fear

Lüning says the pressure is in fact so strong that even western industry is visibly intimidated: “The fear of upsetting those with political power by presenting inconvenient facts is too great.” Rather than rocking the boat, industry has opted to play along – and to ship jobs overseas instead, or to shut down entire parts of the company, as is the case with German power giant E.On, Lüning says.

Recently the media reported that lawyers are gearing up to sue the major fossil fuel companies for causing extreme weather, like tropical storm Haiyan in the Philippines. Such claims, Lüning writes, are fully based on junk science, which he describes as resembling superstition and Medieval witch-hunting.

Lüning comments that the world seems to have gone a bit hysterical. The Philippine tropical storm is a good example. From a scientific view it can be excluded that Haiyan resulted from climate change, a view that is supported by the scientific literature. Yet there are plenty of demands being made for industry to pay for the damage.

Too often, Lüning writes, false scientific arguments and outright tricks are allowed to go insufficiently challenged. The German geologist says it is essential that an independent climate research initiative be founded by qualified climate-related scientists who are skeptical of the alarmist scenarios, in order to respond adequately to the “wild climate claims” with factual and scientific arguments.

Lüning writes that the main activities of an independent Climate Research Initiative would be to determine:

1) The real value of CO2 climate sensitivity.

2) The real role of ocean cycles for the 1977-1998 warming phase.

3) If the correlation between solar activity and the temperature development over the last 10,000 years is just a coincidence, as the climate models like to suggest.

4) If extreme weather events are part of natural variability.

The German maverick geologist writes that more climate-historical scientific studies are needed here in order to better document natural climate variability over the past decades, centuries, and millennia. Important: “Which trends and cycles are really detectable and could they be useful for making climate forecasts?”

Lüning envisions a climate research initiative supported by private individuals and businesses who are truly interested in finding out what really drives the climate. Lüning proposes six main responsibilities:

1) Identifying the open, disputed climate issues.

2) Targeted support of research projects, publication of results in peer-reviewed journals.

3) Systematic evaluation of existing climate literature on natural variability and compilation of results.

4) Intensive communication with institutes and media concerning the results. Internet communication with the public.

5) Participation in German and international scientific conferences and workshops.

6) Training seminars for non-scientists, advising.

Lüning is convinced that what is required for a sustainable and rational debate is a “structured cooperation with an independent team of experts with a solid financial foundation” in order to address what the real fears are and which scenarios are unrealistic.

Parties interested in working for or supporting an independent German climate research initiative should contact Sebastian.Luening@kaltesonne.de or Sebastian.Luning@gmx.net.


German Analysis: Spreading Alarmism Over Mere Hundredths Of A Degree Is “Complete Hyperbolism”

Screaming bloody murder over nothing? Keep in mind that RSS recently released the satellite measured global temperature for 2014 and found it is not even close to a new record. Three days ago one of Germany’s leading climate science sites Science Skeptical issued the following comment.

Global Temperature Record 2014?
By Michael Krueger
(Translated, edited by P Gosselin)

A new temperature record for Germany has been announced by the DWD German Weather Service. At 10.3°C, 2014 was the warmest year measured since 1881. Here are the facts.

2014 was the warmest calendar year in Germany since 1881, but the warmest 12-month period occurred from July 2006 to June 2007, with a mean of 11.3°C. The annual mean for 2006/2007 was therefore 1°C over the current annual mean.


Chart depicts Germany’s temperature since 1761. The rose-colored line is the annual mean temperature; the dark red line depicts the 5-year smoothing.

Moreover since 2000 Germany’s temperature has barely risen – in contradiction to atmospheric CO2 concentrations.

How does the global temperature appear?

There are different datasets available for the global temperature. I’m selecting the most alarmist, which comes from NASA. The gray shading shows the monthly mean values and the red curve is the smoothed annual mean (over 12 months).


Since 1880 the global temperature has risen about 0.8°C, i.e. not even a full degree. Since 1998 (a powerful El-Nino-year) there’s been practically no rise. What follows is a blow-up for the recent period.


In 2007 and 2010 it was just as warm as in 2014, or even warmer. We’re talking about 1/100 °C, which is deep inside the range of uncertainty. Yet the concentrations of atmospheric CO2 continued their steady rise. Based on these data, spreading climate alarmism is complete hyperbolism.

Very likely in the days ahead NASA will be announcing a global temperature record that in reality never was.


Analysis Shows Claim That “CO2 Concentration Is Highest In 600,000 Years” Is Highly Dubious At Best

“CO2 Has Never Been This High In 600,000 Years!”… FALSE!
By Ed Caryl

One item on the list of catastrophes claimed by the CAGW climatists is that, at nearly 400 ppm, the CO2 concentration is higher than it has been in hundreds of thousands of years. The number quoted is flexible, sometimes 600,000, sometimes 800,000.

It is true that in the ice core figures, CO2 measures from 180 to 200 ppm during the coldest periods and peaks at around 300 ppm during the interglacial periods. But it is well known that the ice core measurement resolution is a few hundred years for recent times and spreads to a few thousand years for the most ancient measurements. Thus the ice core measurements can’t show short periods of high atmospheric CO2.
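The resolution argument can be made concrete with a small sketch (the numbers here are hypothetical, not actual ice-core data): a brief 100-year CO2 spike all but vanishes once the record is smoothed the way multi-century ice-core resolution effectively smooths it.

```python
import numpy as np

# A 280 ppm baseline with a hypothetical 100-year spike to 400 ppm
years = np.arange(5000)              # model years
co2 = np.full(years.size, 280.0)     # ppm, baseline
co2[2000:2100] = 400.0               # brief 100-year spike

# Viewed through a ~500-year running mean, roughly mimicking the
# smoothing imposed by slow firn-bubble closure in the ice
window = 500
smoothed = np.convolve(co2, np.ones(window) / window, mode="same")

print(co2.max(), round(smoothed.max(), 1))
```

With these numbers the 400 ppm spike survives only as a bump of roughly 304 ppm in the smoothed record, which is why stomata proxies can register spikes that ice cores cannot.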

This was demonstrated in my last article on the brief spike of CO2 in the Younger Dryas period 12,800 years ago from the paper by Steinthorsdottir et al.


Figure 1

Figure 1 above is a plot of CO2 and Greenland temperature during the Younger Dryas. The purple diamond marks the time of the nano-diamond (ND) event as seen in the Greenland ice cores. The horizontal time error bars on the peak CO2 data bring the ND event within the time period of the CO2 peak. The stomata index is carbon-14 dated, which has a time error of ±150 years.


Figure 2 is a plot of temperature versus CO2 concentration from the Figure 1 data. The trend line shows that the relationship is negative; a temperature rise of 1°C occurs when CO2 falls by 2.5 ppm over the 2000-year period covered by the stomata proxy data. The R2 value is very low, indicating that the trend is statistically indistinguishable from zero. It is apparent that the brief high CO2 concentration did not cause any warming, as it occurred when the temperature was approaching the lowest recorded by the ice core data.
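The slope and R² quoted here are the two numbers an ordinary least-squares fit of CO2 against temperature reports. A minimal sketch with synthetic stand-in values (the paper's actual proxy numbers are not reproduced here):

```python
import numpy as np

# Synthetic stand-in data with a weak negative CO2-temperature link
# of about -2.5 ppm per degC, as described in the text.
rng = np.random.default_rng(0)
temp = rng.uniform(-10.0, -5.0, 50)                  # degC, proxy temperatures
co2 = 250.0 - 2.5 * temp + rng.normal(0, 5.0, 50)    # ppm, with proxy noise

slope, intercept = np.polyfit(temp, co2, 1)   # OLS trend line
r = np.corrcoef(temp, co2)[0, 1]              # correlation coefficient

print(round(slope, 2), round(r**2, 3))  # negative slope, modest R^2
```

With real proxy data, the slope and R² computed this way are exactly what Figure 2's trend line summarizes.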


Figure 3 is a plot of the Dome Concordia CO2 measurement over the last 22,000 years. The ND event time is marked by the red dot.

The ice core data can be seen to get smoother as it gets older. Only the stomata data shows the 400+ ppm peak at the ND event. The ice core data cannot show a brief spike in CO2, because at Dome C, the snow (firn) to ice transition takes 100 years or more to close the tiny bubbles that sample the atmosphere.

The ND event was probably caused by a kilometer-size comet that came into the atmosphere over what is now Canada. It likely came in at a shallow angle, like the Chelyabinsk object in February 2013. It is thought to have exploded over the Laurentide ice sheet, with some pieces impacting in what is now Quebec, New Brunswick and Nova Scotia, and others continuing on to impact as far away as the Pacific Ocean. The intense thermal flash ignited all the forests of North America, leaving a soot layer laced with impact-generated particles and raising the CO2 level to more than 400 ppm as seen in southern Sweden in the leaves of the following year.

There is another stomata study, covering eastern Canada. The trees in eastern Canada were burned away, so the stomata data from one location, Pine Ridge Pond, shows a lower peak, and another, Splann Pond, no peak at all. The trees furnishing the leaf stomata needed to re-grow, which took 20 to 40 years or longer, depending on the number of viable seeds in the ground and local conditions. During that time, CO2 was falling back to normal levels.


 Figure 4 is from McElwain, J.C., Mayle, F.E. and Beerling, D.J. 2002. Stomatal evidence for a decline in atmospheric CO2 concentration during the Younger Dryas stadial: a comparison with Antarctic ice core records. J. Quaternary Sci., Vol. 17, pp. 21–29. ISSN 0267-8179.

This data is from Pine Ridge Pond in New Brunswick. The top line is a summer temperature proxy from sub-fossil chironomid remains (midges). The numbers 1 and 2 mark the peaks of the Bölling and Allerød warm oscillations. The lower traces are the stomata proxies with upper and lower 95% bounds. The number 2 here marks the ND event CO2 peak. The CO2 data point at 1 does not appear in the Figure 1 data from Sweden. The total time for the CO2 level to fall back from 400 ppm to 200 ppm appears to be 100 years or less.

Both Figures 1 and 4 show temperature slowly rising during the Younger Dryas as CO2 concentration is slowly falling.

Figure 1 has a ±150-year error in the carbon 14 age data. The spike in CO2 does not line up with the ND event. In Figure 5, a 150-year correction is applied to line up these dates.


Figure 5 is a plot of the southern Sweden stomata data shifted 150 years to the right to align with the ND event.

Figure 6 below is the corresponding XY plot of Figure 5. In the Swedish data there is no stomata data on the temperature rise out of the YD period between 11,500 and 11,750 years ago. There appears to be no relationship between CO2 and temperature. The trend is very close to zero with extremely low R and R2 values.


There should be a CO2 increase as temperature rises and the oceans begin to out-gas dissolved CO2. We see this increase in the longer Canadian stomata data in Figure 4.


Figure 7 is an XY plot from the Figure 4 data. Here we see that CO2 rises at about 4 to 8 ppm for each degree of summer temperature rise at the Canadian latitudes. There appears to be a small delay of up to 150 years between temperature rise and CO2 rise as CO2 peaks always appear after temperature peaks by about this amount in both stomata data sets.

From these two papers we learn the following:
– A large sudden rise in CO2 decays away in 100 years or less.

– A large sudden rise in CO2 does not cause a rise in temperature.

– A large rise in temperature causes CO2 to rise, not the other way around. All the rises in CO2, including in modern times, came after temperature increases.

– The delay between temperature rise and CO2 rise is somewhere between zero and 150 years.

We can see that delay in modern times. The rise in temperature after the Little Ice Age began in the late 19th century, accelerating after 1910. The rise in CO2 began about the time Keeling started measuring it in 1959, accelerating after that, a delay of about 50 years.
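The zero-to-150-year lag described above is the kind of quantity one would estimate by sliding one series against the other and picking the offset with maximal correlation. A toy sketch with synthetic sinusoidal data (not the stomata record):

```python
import numpy as np

# Toy series: a 1000-year temperature cycle sampled every 10 years,
# and a CO2 series that lags it by 100 years, plus noise.
rng = np.random.default_rng(1)
t = np.arange(0, 2000, 10)
temp = np.sin(2 * np.pi * t / 1000)
co2 = np.roll(temp, 10) + rng.normal(0, 0.05, t.size)  # 10 steps = 100 yr

def best_lag(x, y, max_steps=30):
    """Offset of y behind x (in samples) that maximizes correlation."""
    return max(range(max_steps),
               key=lambda k: np.corrcoef(x[:x.size - k], y[k:])[0, 1])

print(best_lag(temp, co2) * 10)  # recovered lag in years
```

On real, unevenly dated proxies the same idea applies, but the ±150-year carbon-14 dating error limits how sharply the lag can be pinned down.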

Is the rise in CO2 all due to temperature rise? No. It is a combination of temperature change and increased fossil carbon emissions. Will those emissions cause temperature to rise further? No. The large rise at the ND event caused NO temperature rise in either of the data sets above.

For further information on the Nano-Diamond event see:

www.jstor.org/stable/10.1086/677046 Nanodiamond-Rich Layer across Three Continents Consistent with Major Cosmic Impact at 12,800 Cal BP. Charles R. Kinzie, et al. 2014




2014 Sees Record Harvests Worldwide…Demolishing Gloomy Myth Global Warming Would Lead To Acute Crop Failures

It’s early November and now is a good time to look at some of this year’s global crop harvest results. Let’s recall that global warming models projected poor harvests and hunger in the future due to droughts (and floods).

But that is hardly the case…at least certainly for this year. And recall how Joe Bastardi last spring projected a “Garden of Eden” harvest for the US Great Plains. Looks like he was right. The story is similar many places worldwide, and not just the US.

10-foot corn

For example Bloomberg here reports of a record US corn harvest in 2014, writing:

From Ohio to Nebraska, thousands of field inspections this week during the Pro Farmer Midwest Crop Tour show corn output in the U.S., the world’s top producer, will be 0.4 percent above the government’s estimate. Months of timely rains and mild weather created ideal growing conditions, leaving ears with more kernels than normal on 10-foot (3-meter) corn stalks and more seed pods on dark, green soy plants.”

All-time high of 3.631 billion bushels of soybean

Bloomberg also writes here that the US production of soybean “will jump 10 percent this year to an all-time high of 3.631 billion bushels, and inventories before the 2015 harvest will be double a year earlier.”

In Europe the story is similar. Last May the online marktkompass here already wrote of record wheat harvests:

In all regions of Central and Eastern Europe the weather for growth was close to being optimal and the yield potential has drastically improved.”

“All-time records” in Europe

In Germany’s agricultural state of Mecklenburg-Western Pomerania, corn and barley reached record harvests. The online bauwesta reports that both winter and summer barley harvests set all-time records. Overall across Europe, the Crop Site reports this year’s cereal harvest “has generally been strong in Europe and Ukraine“.

Doom and gloom media silent on bumper crop yields

Moreover, numerous analysts report of falling grain and commodity prices. All of this, of course, is great news for consumers and a planet that still has close to a billion people who do not get enough to eat. Yet the good news is generally not getting reported by the doom-and-gloom obsessed media.

“Bounty of wheat, barley and oats”

In almost every European country one looks at, one finds record bumper crops this year. The usually gloom-obsessed UK Guardian also reported in September on UK 2014 harvests:

Long sunny spells after a mild winter and early spring delivers a bounty of wheat, barley and oats. […] 2014 could be the biggest yield ever for wheat when the final data is released in October.”

If climate change is supposed to be resulting in poor harvests, higher food prices and acute hunger for the poor, as many experts have warned incessantly, then the opposite must mean that climate change is not happening at all, or that it is having a profoundly beneficial effect for man instead.

Glut of apples

The Guardian also reports of bumper apple harvests and that “growers still face losses due to glut of apples and supermarket price wars.” The Guardian adds, “A cold winter gave the trees a good rest, then plenty of rain – especially in August – helped plump up the fruit, and then a dry September allowed the picking to get started early.”

If anything, all the bumper crops are leading to only one single food crisis: the rock bottom prices farmers are getting for their crops!

“bumper world harvest this year”

thompsonslimited.com here reports that the bumper-crop, low-price crisis has not spared Canada either, for almost everything from apples to zucchini. It writes that “world commodity prices are worryingly low for arable farmers following a bumper world harvest this year.”

Russia “awash in grain”

www.martellcropprojections.com here writes that Russia “is awash in grain from a bumper harvest in the growing season just ended.  The 2014 grain harvest increased to 105 million tonnes threatening to break a record.”

The Crop Site also reports of record rice production in Bangladesh, and bumper maize harvests in Pakistan. Even Scotland’s 2014 cereal harvest “is estimated to be the largest in 20 years, with favourable conditions expected to produce more than three million tonnes of cereals.”

So, if you are not moping about all the good news on this year’s global harvest, and failed predictions of catastrophe, and wish instead to celebrate the good news with glasses of cheer, the wine-searcher here reports that France is “looking forward to a bigger and better wine harvest“. Indeed all the natural ingredients needed for fermenting or brewing your favorite spirit appear to be in bountiful supply this year.

Visions of Ehrlichian-style widespread crop failures and mass starvations postponed yet again. And they show absolutely no signs of ever materializing any time soon.

In fact one could easily argue that the world is better fed today than at any time in human history. We can in part thank higher CO2 concentrations and warmer climate for that.

Data Contradict Warming Hypothesis: Relative Emissivity Is Not Declining As IPCC Models Predicted!

An Empirical Review of Recent Trends in the Greenhouse Effect

By Robin Pittwood, Kiwi Thinker


The core of the human caused global warming proposition is that an increasing level of greenhouse gases acts to reduce heat loss from the planet making the atmosphere here warmer. The amount of warming anticipated by the IPCC models is from about one to several degrees C for a doubling of CO2 concentration.

But a conundrum has arisen lately: while CO2 has continued to rise significantly, the temperature has not. There has been no global warming since about 1997. Scientists on both sides of the debate have noticed this and have offered something like 55 explanations as to why this could be so. Some of those explanations lock into the dogma built into the IPCC models, taking for certain that the greenhouse effect is increasing; but because there is no atmospheric temperature rise, they then have to explain that the retained heat is somewhere else.

Is the greenhouse effect occurring as the IPCC models propose?

This study analysed two important factors directly associated with the greenhouse effect, atmospheric temperature and outgoing radiation, and finds that outgoing radiation has not declined. The missing heat has gone back to space as usual. But more importantly, the (lack of a) trend observed in an empirical derivation of the Stefan-Boltzmann relative emissivity factor directly contradicts the greenhouse theory built into the IPCC models.


Regular readers at any of the main climate change blogs will be aware that since about 1997 there has been nearly no global temperature rise. And they will know too, that this is despite atmospheric CO2 concentration continuing to rise. To date there are some 55 ideas to explain this slowdown in global warming. Some of the ‘explanations’ presume the so-called ‘greenhouse effect’ must still be increasing as the IPCC models calculate; it’s just that the heat has been hidden elsewhere, maybe deep in the ocean.

This study, based on 34 years of satellite data for outgoing long-wave infrared radiation (OLWIR) and temperature, demonstrates otherwise.

I used three data sets, OLWIR from NOAA, and the average of both UAH and RSS for global temperature.

I obtained monthly average OLWIR (W/m2) for each 2.5 degree latitude by 2.5 degree longitude area of the globe. After converting the netCDF files to Excel, I scaled each 2.5° x 2.5° area’s OLWIR to account for the varying size of its area, resulting in a global average OLWIR. (There was some missing data from mid-1994 to early 1995, which I filled in by linear interpolation.) The resulting annual average OLWIR is shown in the graph below for the years 1979 to 2012. A linear regression fit shows a generally increasing trend in OLWIR over this period.
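The area scaling described here amounts to weighting each grid cell by the cosine of its latitude, since cells shrink toward the poles. A minimal sketch with a dummy uniform field (the grid layout and variable names are assumptions, not NOAA's):

```python
import numpy as np

# Cell-center coordinates of a 2.5 x 2.5 degree global grid
lats = np.arange(-88.75, 90.0, 2.5)   # 72 latitude bands
lons = np.arange(1.25, 360.0, 2.5)    # 144 longitude columns

olwir = np.full((lats.size, lons.size), 240.0)  # W/m2, dummy uniform field

# Weight each latitude band by cos(lat), proportional to cell area
weights = np.cos(np.deg2rad(lats))
weights /= weights.sum()

# Zonal mean first, then area-weighted sum over latitude bands
global_mean = (olwir.mean(axis=1) * weights).sum()
print(round(global_mean, 1))  # a uniform 240 W/m2 field averages to 240.0
```

Without the cos(lat) weighting, the small polar cells would be counted as heavily as the large tropical ones and bias the global average.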


The temperature data is also plotted on the graph below. A linear regression fit shows a generally increasing trend for the years 1979 to 2012.

The relationship between temperature and emitted radiation follows a universal law of physics: the Stefan-Boltzmann law states that the emitted radiation is the product of the fourth power of absolute temperature and an emissivity factor. A reduction in the emissivity factor means less outgoing radiation for a given temperature; that would indicate a stronger greenhouse effect. An increase in the emissivity factor means more outgoing radiation for a given temperature; that would indicate a more transparent atmosphere. The study derived earth’s emissivity factor for each of the 34 years and displays the results below.

Using an average global temperature of 287 Kelvin added to the temperature anomaly, the relative emissivity has been derived for each year using the formula:

j / (k*T^4)

where j is OLWIR, k is the Stefan Boltzmann constant, and T is the temperature.

If the greenhouse effect were increasing, relative emissivity should be declining. A quick look at the graphs clearly shows this is not the case.
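For concreteness, the derivation just described can be written out in a few lines. The OLWIR and anomaly values below are illustrative placeholders, not the study's actual data:

```python
import numpy as np

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def relative_emissivity(olwir, temp_anomaly, base_temp=287.0):
    """epsilon = j / (sigma * T^4), with T = 287 K + anomaly, as in the text."""
    T = base_temp + np.asarray(temp_anomaly, dtype=float)
    return np.asarray(olwir, dtype=float) / (SIGMA * T**4)

# Illustrative case: OLWIR rising in step with T^4 keeps the relative
# emissivity flat, which is the behaviour the study reports.
anoms = np.array([0.0, 0.2, 0.4])
olwir = 235.0 * ((287.0 + anoms) / 287.0) ** 4  # j scaling with T^4
eps = relative_emissivity(olwir, anoms)
print(np.round(eps, 4))  # essentially constant
```

Deriving ε this way for each of the 34 years is what produces the flat trend shown in the graphs; a declining ε, by contrast, would have signalled the strengthening greenhouse effect the models expect.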


Our planet’s relative emissivity has been flat-lining, despite increasing CO2 concentration over the study period. The derived emissivity factor, being basically constant, directly contradicts all of the IPCC models. No increased greenhouse effect is observed.


The two primary findings of this empirical study are:

    • Outgoing radiation has not declined over this period as expected by IPCC models. The missing heat has gone back to space – as usual and as per the Stefan-Boltzmann law – via OLWIR, and,
    • The increasing greenhouse effect expected by IPCC models has not shown itself. There has been no increased greenhouse effect over this period. [A closer inspection of the relative emissivity trend shows the atmosphere is even becoming a little more transparent – though little should be made of this given the variability of the data].


The core of the human caused global warming proposition is that an increasing level of greenhouse gases acts to reduce heat loss from the planet making the atmosphere here warmer. But is the greenhouse effect occurring as the IPCC models propose? This study analysed two important factors directly associated with the greenhouse effect, atmospheric temperature and outgoing radiation and finds that outgoing radiation has not declined. The missing heat has gone back to space as usual.

But more importantly the (lack of a) trend observed in an empirical derivation of the Stefan Boltzmann relative emissivity factor directly contradicts the greenhouse theory built into the IPCC models.

The original post on this study may be found here.

Data Table: available in the original post linked above.