Germany CO2 Reduction Fails Again For 9th Year Running! …Merkel Exposed As Fake Climate Warrior

Sometimes you have to wonder which is the bigger fraud: Germany’s claim that its cars are clean, or its claim of being a leader in climate protection. Both, it turns out, are very fake, even downright fraudulent.

While German Chancellor Angela Merkel and German activists like to go around scolding Donald Trump for his “irresponsible” stance on “greenhouse” gas emissions, it is coming to light that Germany’s climate posturing is indeed a total swindle.

While the USA’s greenhouse gas emissions have declined impressively over the past decade, Germany’s have gone nowhere.

Flat for 9 years

And now Cleanenergywire.org here reports that Germany this year (2017) will again fail to reduce its CO2-equivalent emissions, for the 9th year running. Ironically, one of the reasons cited for this year by Cleanenergywire.org is “cold weather” (again).

Germany’s CO2-equivalent greenhouse gas emissions in metric tonnes since 2009. This year (2017) CO2-equivalent emissions are expected to come in slightly above those seen in 2016. Data: German Environment Ministry.

The reality is that Trump and America have nothing to learn from the Green-preaching Germans, except how to deceive and mislead the public. Cleanenergywire.org writes that the higher energy demand is “triggered by economic growth and colder weather“.

Cleanenergywire.org cites the AG Energiebilanzen (AGEB), which said in a press release: “From January until September, energy demand was 1.9 percent higher than in the same period last year.” The AGEB thus expects that “energy-related CO2 emissions will rise slightly in 2017”.

“A disaster”

Merkel’s glaring failure, however, did not prevent her from taking a swipe at Trump, the Handelsblatt reports. Unfortunately for Merkel, there is no disguising Germany’s failure to meet its own self-imposed targets. The Environment Ministry says the 2017 emissions figures are “a disaster for Germany’s international reputation as a climate leader.”

CO2 emissions reduction is pie in the sky, another hoax

Whatever gets decided in Bonn will be pure, meaningless self-deception. The fact remains that China, India and the rest of the developing world are going to continue boosting their fossil fuel energy consumption, and CO2 emissions are going to keep rising for quite a while. The recent OPEC report makes that very clear.

Some advice for Merkel: forget the CO2 reductions. Cutting Germany’s puny 2% global share would theoretically lead to a temperature reduction of one or two hundredths of a degree Celsius, meaning some 100 trillion euros per °C. It’s pure economic insanity. Toss the idea into the dustbin for good.
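The back-of-envelope arithmetic behind a figure like that can be sketched as follows. Only the 2% emissions share comes from the text above; the attributable-warming and program-cost inputs are purely illustrative assumptions, chosen here just to show how an order of magnitude like 100 trillion euros per °C falls out.

```python
# Back-of-envelope sketch of the cost-per-degree arithmetic.
# Only the 2% share is from the text; the other inputs are
# illustrative assumptions, not measured values.

german_share = 0.02           # Germany's share of global CO2 emissions (~2%)
attributable_warming_c = 1.0  # assumed warming attributable to all CO2, in deg C

# Warming theoretically avoided if Germany cut its emissions to zero:
avoided_warming_c = german_share * attributable_warming_c

program_cost_eur = 2.0e12     # assumed total cost of Germany's CO2 reduction effort

cost_per_degree_eur = program_cost_eur / avoided_warming_c
print(f"Avoided warming: {avoided_warming_c:.2f} deg C")
print(f"Implied cost: {cost_per_degree_eur / 1e12:.0f} trillion EUR per deg C")
```

With those assumed inputs, the avoided warming works out to 0.02 °C and the implied price tag to 100 trillion euros per degree; different cost or sensitivity assumptions would scale the result proportionally.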

 

Scientists Affirm: ‘No, The Arctic Is Not Melting’ … ‘Nothing Has Changed Since 1900’

Global Warming Theory ‘Completely Disconnected From the Observations’


Extensive analysis of temperature trends in the Arctic reveals that there has been no detectable long-term change since the beginning of the 20th century, and thus predictions of a sea-ice-free Arctic in the coming decades due to dramatically rising temperatures are not rooted in observation.


Butina, 2015  

IS THE ARCTIC MELTING? THEORY VS. OBSERVATIONS

Abstract

[T]he Arctic Circle is the most extreme place on our planet, where seasonal changes can range from +35.0°C in July to -65.0°C in February; […] on average 75% of the year is spent below the melting point of water [and] on average the Arctic will be covered by ice/snow for the same proportion of time, i.e., 75% or 9 months of the year.
The same seasonal extreme variations in air temperatures are also observed in ice cover variations in the Arctic, where the winter’s ice cover can be between 14 and 16 million km2, while during summer the area covered can vary between 4 and 8 million km2. Based on observations dating back to 1900, it can be concluded that it is physically impossible for the Arctic to be ice/snow free in the foreseeable future, since the air temperatures were as cold in 2013 as they were in 1900.
Since ice cannot melt below 0.0°C, all these observations point towards the Arctic remaining ice-covered for the next 100 years. It must also follow that any theory predicting imminent melting of the Arctic ice cap cannot be based on thermometer-recorded data and, therefore, must be wrong and will merely be an artefact of using the term temperature where there is no true association with the calibrated thermometer, the instrument used to measure temperature in all physical, medical and engineering sciences.

Conclusion

So, what are the hard facts about Arctic that are based on the observations made by calibrated thermometers at 20 stations across the Arctic Circle and which conclusions can be made based on those observations?
1. Temperatures in the Arctic between 1900 and the present day are a long distance below 0.0°C for at least 9 months per year and can be as low as -64.0°C
2. It is impossible to separate the youngest from the oldest years using thermometer-based daily or monthly Tmax/Tmin data
3. The total ranges observed in daily Tmax/Tmin data can be as high as 100.0°C and as low as 75.0°C making the Arctic Circle the most variable and extreme area on our planet therefore making any accurate forecasting of future temperature patterns and trends impossible
4. The switches between the extreme hot to extreme cold temperatures are very frequent and very unpredictable and can occur within the same month, same year or between two consecutive years
5. The large observed ice gain/loss variations are pre-determined by the large observed variations in air temperatures
6. Since the air temperatures are chaotic in nature it must follow that the extent of the ice cover has to be chaotic as well and, since we cannot predict future events of a chaotic system, we cannot predict future trends of either air temperatures or ice cover patterns
Based on the facts above only one conclusion can be made in reference to the putative melting of the Arctic: historical thermometer-based data tell us that between 1900 and 2014 Arctic temperatures were for 75% of the time consistently a long distance below 0.0°C; the ice cover in the winter months is still consistently more than 14,000,000 km2 and, therefore, it is physically impossible for the Arctic to be already melting since nothing has changed from 1900 to the present day. The only sensible forecast for the future would be to expect the same extreme events to continue until thermometer-based evidence tells us otherwise.
Let me conclude this paper by answering the question asked in the first part of the title by a categorical No, the Arctic is not melting. As long as temperatures remain the same as they have been for the last 100 years the Arctic will remain frozen in the long winter months and partly melt during very short summer months.
The answer to the second question is that the theory of global warming is completely disconnected from the observations since their definition of temperature is based on some theoretical number that has nothing to do with the temperature that is measured by calibrated thermometer and, most importantly, used as an international standard by the scientific community. Since the theory is clearly wrong about forecasting the temperature patterns in the Arctic, all other predictions made by the theory must be wrong too.


New Paper: No Greenland Temperature Or Sea Ice Changes Since 1600 Either


Kryk et al., 2017     

“Our study aims to investigate the oceanographic changes in SW Greenland over the past four centuries (1600-2010) based on a high-resolution diatom record using both qualitative and quantitative methods.  July SST during the last 400 years varied only slightly from a minimum of 2.9 to a maximum of 4.7 °C and a total average of 4°C. 4°C is a typical surface water temperature in SW Greenland during summer. … The average April SIC was low (c. 13%) [during the 20th century], however a strong peak of 56.5% was recorded at 1965. This peak was accompanied by a clear drop in salinity (33.2 PSU).”


Greenland Ice Sheet In Balance: Melt Added Just 1.5 cm (0.6 Inch) To Sea Levels Since 1900


Fettweis et al., 2017

“[T]he integrated contribution of the GrIS [Greenland Ice Sheet] SMB [surface mass balance] anomalies over 1900–2010 is a sea level rise of about 15 ± 5 mm [1.5 cm], with a null contribution from the 1940s to the 2000s.”

Like The Arctic, Antarctica Has Not Warmed In The Last Century Either


Stenni et al., 2017

“[N]o continent-scale warming of Antarctic temperature is evident in the last century.”

Antarctica’s Ice Sheet Has Been Gaining Mass Since 1800


Thomas et al., 2017

“Our results show that SMB [surface mass balance] for the total Antarctic Ice Sheet (including ice shelves) has increased at a rate of 7 ± 0.13 Gt decade−1 since 1800 AD…”

Antarctica’s Mass Gains Have Reduced Sea Levels By ~0.04 mm Per Decade Since 1900


Thomas et al., 2017

“…representing a net reduction in sea level of 0.02 mm decade−1 since 1800 and ∼ 0.04 mm decade−1 since 1900 AD.  The largest contribution is from the Antarctic Peninsula (∼ 75 %) where the annual average SMB during the most recent decade (2001–2010) is 123 ± 44 Gt yr−1 higher than the annual average during the first decade of the 19th century.”

In sum, there is nothing thermally unusual occurring today in either the Arctic or Antarctic, precluding the clear detection of an anthropogenic temperature or ice-melt signal in the polar regions.

German Scientists Call Recent Sea Level Rise Claims “Fijigate”, …Hyped Up To Generate Money

Dr. Sebastian Lüning and Prof. Fritz Vahrenholt show that sea level rise at the Fiji Islands is being hyped up in order to generate money.
====================================================

Fijigate

Dr. Sebastian Lüning and Prof. Fritz Vahrenholt
(Translated/edited by P. Gosselin)

The COP23 climate conference in Bonn had originally been planned to take place in the Fiji Islands. But in order to comfortably accommodate the approximately 25,000 representatives(!) from every country in the world, it was decided to hold it in Bonn instead.

Spiegel reported on how the islands are becoming victims. At the start of the article, author Axel Bojanowski referred to the rise in sea level and linked to an NOAA website. Yet throughout the rest of the article there was no mention of climate change submerging the islands.

Bojanowski was completely correct to emphasize that the most important reasons for the erosion of the islands lie solely with the island inhabitants themselves. Uncontrolled deforestation reduces stability and resistance to the sea. Even people who sail in the area report 3-meter waves even in the absence of storms in the region.

But let’s get back to sea level rise in the area. The NOAA website also offers the possibility to download the data, and that is what we did.

Figure 1: Sea level rise at the Fiji Islands, 1990-2011. Data: NOAA. The linear trend is 6 mm per year.

The available NOAA data go back (with some gaps) to 1972, but the station was moved in March 1989, which led to an upward jump of about 10 cm. Thus we look only at the period from 1990 to the end of 2011, where unfortunately the data series ends. However, we supplement the data for the area with satellite measurements.
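The 6 mm per year figure in Figure 1 is just an ordinary least-squares trend fitted to the monthly tide-gauge series. A minimal sketch of that fit follows; the data here are synthetic placeholders (a 6 mm/year rise plus an annual cycle), since the NOAA series itself is not reproduced in this post.

```python
import math

# Minimal sketch: ordinary least-squares trend of a monthly sea level series.
# The data below are synthetic placeholders; the real NOAA tide-gauge
# series for Fiji (1990-2011) would be substituted.

def linear_trend(times, values):
    """Return the least-squares slope and intercept of values ~ a*t + b."""
    n = len(times)
    t_mean = sum(times) / n
    v_mean = sum(values) / n
    cov = sum((t - t_mean) * (v - v_mean) for t, v in zip(times, values))
    var = sum((t - t_mean) ** 2 for t in times)
    slope = cov / var
    return slope, v_mean - slope * t_mean

# Synthetic monthly series, 1990 through the end of 2011 (264 months):
years = [1990 + m / 12 for m in range(264)]
level_mm = [6.0 * (t - 1990) + 20.0 * math.sin(2 * math.pi * t) for t in years]

slope, _ = linear_trend(years, level_mm)
print(f"Linear trend: {slope:.1f} mm/year")
```

Note that a seasonal cycle and gaps in the record nudge the fitted slope slightly away from the underlying rate, which is one reason short tide-gauge segments should be read with caution.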

Figure 2: Sea Surface Height (SSH) at the Fiji Islands from satellite measurement. Source here.

After a peak in 2012, the level went down by about 10 cm by mid-2017. This is very much related to natural variations, in sync with El Niños (low levels) and La Niñas (high levels).

So what remains of the climate change horror stories in connection with the Fiji Islands? In the article, a 40-year-old woman tells of her youth (i.e. around 1990), when she viewed the water as her friend, and how today (2017) she regards it as an enemy. But just what should a rise of approximately 8 cm over 27 years (and not the often-cited 17 cm, which was generated by the powerful 2011/12 La Niña) amount to against waves of 3 meters?

The contribution to erosion coming from climate change is certainly hardly noticeable to the residents. For PR work, however, it works great for shaking down money.

Renowned Sea Level Expert: “NO TRACES OF A PRESENT RISE IN SEA LEVEL; On The Contrary: Full Stability”

A new paper by renowned Swedish sea level expert Prof. Axel Mörner, published in the International Journal of Earth & Environmental Sciences, dumps lots of cold water on the premise that today’s sea level rise is caused by man and is unusual.

Mörner’s paper looks back at the last 500 years of sea level rise and shows that natural variables are the major drivers, and not man-made CO2-driven global warming.

Previously, no study in the Fiji Islands had been devoted to the sea level changes of the past 500 years, and so no serious prediction could be made. What was needed was a good understanding of sea level changes today and in past centuries. Mörner’s study helps fill that gap and answer questions concerning today’s sea level rise.

The Swedish scientist summarizes in the paper’s abstract that there is a total absence of data supporting the notion of a present sea level rise; on the contrary all available facts indicate present sea level stability.

Source: Mörner, Int J Earth Environ Sci 2017, 2: 137, https://doi.org/10.15344/2456-351X/2017/137

Sea level changes over the past 500 years at the Yasawa Islands, Fiji, show that sea level was +70 cm high in the 16th and 17th centuries, -50 cm low in the 18th century, and that stability (with some oscillations) prevailed in the 19th, 20th and early 21st centuries.

This, Mörner writes, is almost identical to the sea level change documented in the Maldives, Bangladesh and Goa (India), and thus would point to a mutual driving force.

The pattern is the same at other locations:

Source: Mörner, Int J Earth Environ Sci 2017, 2: 137, https://doi.org/10.15344/2456-351X/2017/137

The paper also states that the recorded sea level changes are anti-correlated with the major changes in climate during the last 600 years. Therefore, Mörner concludes that glacial eustasy cannot be the driving force.

The explanation behind the sea level changes, Mörner believes, seems to be rotational eustasy, with speeding-up phases during Grand Solar Minima forcing ocean water masses to the equatorial region, and slowing-down phases during Grand Solar Maxima forcing ocean water masses from the equator towards the poles.

The paper summarizes:

“This means there are no traces of a present rise in sea level; on the contrary: full stability.”

About the author:

Nils-Axel (”Niklas”) Mörner took his Ph.D. in Quaternary Geology at Stockholm University in 1969. He was head of the institute of Paleogeophysics & Geodynamics (P&G) at Stockholm University from 1991 up to his retirement in 2005. He has written many hundreds of research papers and several books, has presented more than 500 papers at major international conferences, and has undertaken field studies in 59 different countries. The P&G institute became an international center for global sea level change, paleoclimate, paleoseismics, neotectonics, paleomagnetism, Earth rotation, planetary-solar-terrestrial interaction, etc. Among his books: Earth Rheology, Isostasy and Eustasy (Wiley, 1984), Climate Change on a Yearly to Millennial Basis (Reidel, 1984), Paleoseismicity of Sweden: a novel paradigm (P&G-print, 2003), The Greatest Lie Ever Told (P&G-print, 2007), The Tsunami Threat: Research & Technology (InTech, 2011), Geochronology: Methods and Case Studies (InTech, 2014), and Planetary Influence on the Sun and the Earth, and a Modern Book-Burning (Nova, 2015).

 

New AGU Presentation: ‘No Increase In Earth’s Surface Temperature From Increase In CO2’

Source: American Geophysical Union Fall Meeting

At next month’s American Geophysical Union (AGU) Fall Meeting in New Orleans (US), an independent researcher named Trevor Underwood will be presenting an equation-rich analysis that thoughtfully undermines the perspective that increases in CO2 concentrations are a fundamental variable affecting climate.

Instead, Underwood argues that the absorption band where CO2 emissivity could have an effect is likely already saturated, precluding the capacity of increased CO2 concentrations to produce atmospheric warming.

He also advances the position that solar irradiance changes can explain modern temperature variations, which is consistent with other recent analyses.

It seems that more and more of these papers questioning the “consensus” view on the efficacy of CO2 within the greenhouse effect are being considered in scientific circles. Several previous examples are listed below.

The volume of contrarian analyses would seem to suggest that the climate’s specific sensitivity to CO2 concentration changes is not yet settled.

And so the debate rages on.


•   Another New Paper Dismantles The CO2 Greenhouse Effect ‘Thought Experiment’
•   New Paper: CO2 Has ‘Negligible’ Influence On Earth’s Temperature
•   3 Chemists Conclude CO2 Greenhouse Effect Is ‘Unreal’, Violates Laws Of Physics, Thermodynamics
•   Ph.D. Physicist Uses Empirical Data To Assert CO2 Greenhouse Theory A ‘Phantasm’ To Be ‘Neglected’
•   Swiss Physicist Concludes IPCC Assumptions ‘Violate Reality’…CO2 A ‘Very Weak Greenhouse Gas’
•   Recent CO2 Climate Sensitivity Estimates Continue Trending Towards Zero
•   A Swelling Volume Of Scientific Papers Now Forecasting Global Cooling In The Coming Decades
•   Russian Scientists Dismiss CO2 Forcing, Predict Decades Of Cooling, Connect Cosmic Ray Flux To Climate
•   2 New Papers: Models ‘Severely Flawed’, Temp Changes Largely Natural, CO2 Influence ‘Half’ Of IPCC Claims
•   Leading Heat Transfer Physicists/Geologists Assert The Impact Of CO2 Emissions On Climate Is ‘Negligible’
•   New Atmospheric Sciences Textbook: Climate Sensitivity Just 0.4°C For CO2 Doubling
•  U of Canberra Expert: Doubling Atmospheric CO2 Would Increase ‘Heating By Less Than 0.01°C’
•   Uncertainties, Errors In Radiative Forcing Estimates 10 – 100 Times Larger Than Entire Radiative Effect Of Increasing CO2
•   New Paper Documents Imperceptible CO2 Influence On The Greenhouse Effect Since 1992

Underwood, 2017

No Increase in Earth’s Surface Temperature From Increase in Carbon Dioxide

A critical look at these different in situ measures of the Earth’s surface temperature identified a divergence between land and marine surface temperatures, with land surface air temperatures showing a significant and increasing rate of warming of around 0.5°C between 1880 and 1981, and 0.7°C between 1982 and 2010, whilst marine air temperatures show little if any change between 1880 and 2010 (Underwood (1) 2017). Recent academic literature is also beginning to question the accuracy of the adjusted in situ data (Kent et al. 2017).
In order for an increase in carbon dioxide or other greenhouse gas concentration in the atmosphere to result in an increase in the surface temperature of the Earth, it must be able to increase the absorption of infrared radiation emitted from the surface. This would result in an increase in the absorption factor, f. However, as seen above f is currently around 0.9444. Absorption of infrared radiation by molecules of greenhouse gases, involves increasing the internal energy of the molecule by changing the quantum state of the molecules, which can only occur at particular wavelengths, known as absorption bands.
These absorption bands can be extended by what is referred to as pressure broadening (Strong and Plass 1950; Kaplan 1952), but when all of the emitted infrared radiation within these absorption bands has been absorbed by greenhouse gas molecules in the atmosphere, no further absorption of the terrestrial radiation is possible. The radiation with wavelengths falling outside of the absorption bands passes through the atmosphere and escapes into space.
Absorption of solar radiation in the stratosphere is almost 100% efficient in the ultraviolet due to electronic transitions of oxygen (O2) and ozone (O3), and a significant amount of solar radiation is absorbed by water vapor (H2O) in the lower atmosphere. It is primarily the visible radiation that is absorbed at the Earth’s surface. In the infrared, absorption is again almost 100% efficient because of the greenhouse gases, but there is a window between 8 and 13 µm, near the peak of terrestrial emission, where the atmosphere is only a weak absorber except for a strong ozone feature at 9.6 µm. This atmospheric window allows direct escape of radiation from the surface of the Earth to space and is of importance in determining the temperature of the Earth’s surface (Jacob 1999).
Additional leakage could occur if the greenhouse gas concentration in the atmosphere were insufficient to absorb all of the infrared radiation in the absorption bands emitted by the Earth’s surface, but due to the extent of the atmosphere and its known unsaturated state, it is more likely that the current leakage corresponds to radiation in the part of the infrared spectrum that does not fall in the greenhouse gas absorption and emission bands, referred to as the “infrared window”. As a consequence, even in the case where there is leakage of infrared radiation from the Earth’s surface directly into space, as long as the atmosphere is able to absorb all of the upwelling infrared radiation in the greenhouse gas absorption bands, neither the amount of this leakage nor the amount of the absorption will depend on concentration of greenhouse gases in the atmosphere. From the emission spectra (a) and absorption percentages (b) in the diagram above (Fig. 2.6, Yang 2016), where the 255°K blackbody curve represents the terrestrial radiation, it appears that at the current surface temperature and absorption factor of 0.9444 all of the radiation within the emission bands is fully absorbed, and that the remaining 5.56 percent of the infrared emission represents radiation with wavelengths within the atmospheric window. If this is true, there can be no further increase in f [absorption], and no increase in the surface temperature with an increase in carbon dioxide.

Increase In Solar Forcing Explains Recent Warming

The difference between the minima showed an increase of 0.2812 W/m² for VIRGO, 0.4701 W/m² for ACRIM, and 0.2650 W/m² for ACRIM + TIM over the 11.6-year solar cycle 23 beginning in May 1996 and ending in January 2008; or 0.24 W/m² per decade for VIRGO, 0.40 W/m² per decade for ACRIM, and 0.23 W/m² per decade for ACRIM + TIM (pmodwrc website 2016).
These decadal increases in TSI [Total Solar Irradiance] from ACRIM, SARR, VIRGO and ACRIM + TIM are sufficient to explain the whole of the increase in surface temperature estimated from in situ data during the last 100 years. They compare with the six published model-based estimates of forcing examined in Schwartz (2012), which showed forcing by incremental greenhouse gases and aerosols over the twentieth century ranging between 0.11 and 0.21 W/m² per decade.
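The quoted per-decade rates follow from simple scaling of the minimum-to-minimum differences by the cycle length. A sketch, taking the cycle as the roughly 11.67 years between May 1996 and January 2008 (which reproduces the quoted figures):

```python
# Sketch: converting the TSI differences between solar-cycle minima quoted
# above into per-decade rates. Cycle length taken as May 1996 to January
# 2008, about 11.67 years.

cycle_years = 11.67

deltas_w_m2 = {          # increase between minima, W/m^2 (from the text)
    "VIRGO": 0.2812,
    "ACRIM": 0.4701,
    "ACRIM+TIM": 0.2650,
}

per_decade = {name: d / cycle_years * 10.0 for name, d in deltas_w_m2.items()}
for name, rate in per_decade.items():
    print(f"{name}: {rate:.2f} W/m^2 per decade")
```

Those rates are what the passage then compares against the 0.11 to 0.21 W/m² per decade of model-based greenhouse gas and aerosol forcing from Schwartz (2012).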

Summary

Solution of the Greenhouse Effect equations based on a more realistic atmospheric model that includes absorption of solar radiation by the atmosphere, thermals and evaporation, and an examination of the fraction of terrestrial infrared radiation absorbed whilst passing through the atmosphere, suggests that the contribution of greenhouse gases to the surface temperature is close to its upper limit. Any further contribution would depend on an increase in the infrared absorption factor of the atmosphere from its current level of around 0.9444, which seems unlikely. As this appears to correspond to total absorption of all black body infrared emission from the Earth’s surface at wavelengths at which there are greenhouse gas absorption bands, including for water vapor, it seems likely that we are close to the thermodynamic limit of greenhouse warming for the current luminosity of the sun, and that any further increase in carbon dioxide concentration in the atmosphere will have little or no effect on the surface temperature of the Earth. Questions about the reliability of in situ measurements of surface temperatures also raise questions about current estimates of global warming. Moreover, recent evidence from satellite measurements of solar irradiance indicates that any recent warming could be due to increasing solar irradiance.

A conference paper with a similar conclusion regarding the emissive/warming limitations of increased CO2 concentrations was presented by a molecular physicist, Dr. N. Doustimotlagh, at the World Conference On Climate Change in October, 2016.


Doustimotlagh and Mirzaee, 2016

So because of the limited values of electromagnetic waves that come from Earth and limitation of  absorption of greenhouse gasses, the greenhouse effect of greenhouse gasses should be limited.  In other words, after absorbing of all the IR waves that come from Earth by greenhouse gasses in the atmosphere, there are no IR waves to cause greenhouse effect. It means that “two things that cause greenhouse effect are greenhouse gasses and IR waves in absorption spectrum of these gasses, so if greenhouse gasses increases but there are no IR waves, it is natural that there is no greenhouse effect”.
If the concentration of CO2 in atmosphere increases until absorbs all the values of  electromagnetic waves that are absorbable for CO2, additional values of CO2 should not have greenhouse effect.

While Record Cold Grips North America, Meteorologists Forecasting Severe 2017/18 Winter For US / Europe

Over recent days we’ve been hearing about record snowfall in Montana, record low temperatures in Minnesota and Ontario, New York City “blowing away” a 103-year-old record, and vicious cold gripping Lebanon, PA. Moreover, Arctic sea ice and Greenland ice have surprised climatologists with a comeback over the past year.

UPDATE: And now the UK braces for 3 weeks of unusual cold.

What is more, NOAA has just officially announced the return of La Niña, after an El Niño had been forecast earlier this year instead.

“Gangbusters cold” in the works

And lately we’ve been hearing a number of meteorologists warn of a harsh coming winter for both Europe and the USA.

In his Friday Daily Update, veteran meteorologist Joe Bastardi warns that “fierce cold” is on the table for this winter in the United States. Bastardi says he does not believe the US climate models at all, and thinks this winter could even be as harsh as the notoriously cold 2013 winter.

Bastardi believes the month of December will in fact turn out to be the opposite of what was projected by the US weather models, which foresaw a warm December. See his latest Saturday Summary at Weatherbell.

In his Friday Daily Update, Bastardi says the cold will be “gangbusters” and that there is a real chance of a “December to remember”. All in all, the Pennsylvania State University meteorologist sees an excellent ski season in the works for the USA.

-40°C for Christmas In Central Europe?

Over in Europe, there has been a growing number of warnings of a “winter of the century“. For example, the German nachrichten.de here reports Christmas temperatures of -40°C!

Naturally such forecasts need to be viewed with great skepticism as they are highly speculative. Long-term prediction methods may indicate which way a winter is tending, but I wouldn’t believe any forecast 6 to 8 weeks out.

Nachrichten.de cites information from the Augsburg Meteorological Institute and the Hamburg Weather Warning station. The Federal Office for Weather Observations advises citizens “to prepare for an ice cold Christmas.”

“Snowmageddon”?

The express.co.uk warns of a “SNOWMAGEDDON”, thanks to a La Niña bringing a “Big Freeze”. The express.co.uk writes that “winter weather 2017 is set to be the harshest for years“.

“For the winter of 2017/2018 the PDO will bring a winter of deadly blizzards and killer freezes as a perfect storm of catastrophic weather systems gather. Analytical meteorologist Tyler Sotock suggested America was facing Snowmaggedon. And spokesman for another YouTube weather channel Hurricane and Winter Tracker warned of chaos.”

While wild speculation swirls, one thing seems certain: just as Joe Bastardi shows in his Saturday Summary, the US forecasts of a warm November/December period across Canada and the US East are looking more flawed than ever.

 

German Media Report: Power Grids In Distress…Highly Unstable Due To Wind And Sun Power!

Recently, German SAT1 television broadcast a documentary on the state of the increasingly green European and German power grid: “How secure are our power grids?” Due to the volatile and unpredictable supply of wind and solar energy, the grid has become far more unstable, the documentary warns. The news is not good:

Chances of German power grid blackout increasingly on the rise, experts warn.

At best, consumers are getting a far lousier product at a much higher price.

At the 17-minute mark, Bernd Benser of GridLab-Berlin tells viewers that while grid operator Tennet had to intervene only 3 times in 2002 to avert grid instability, last year the number was “over 1000” times, or “three times daily”.

These intervention actions, known as redispatching, cost consumers about a billion euros last year alone, says Benser. The SAT1 voice-over warns that more power transmission lines are urgently needed if the Energiewende is to avoid “becoming a sinking ship“. However, over the years citizens’ acceptance has swung from a generally warm welcome to ferocious opposition. Politicians need to start noting that green energies have overstayed their welcome.

Major grid instability

And as wind and solar power capacity gets added to the grid without expanding transmission capability to offset the ever more wild fluctuations, grid operators are now constantly scrambling to keep the grid from spiraling out of control. At the 21-minute mark, Klaus Kaschnitz of the operations management of Austrian Power Grid remarks:

These fluctuations in the system that we now see have increased dramatically and are ultimately a product of weather events.”

The fluctuations are having a profound impact, Kaschnitz explains. It is especially difficult to hold the grid at its 50 Hz frequency, which makes preventing a grid collapse harder and the powering of modern industrial systems highly challenging. At the 25-minute mark, the report then switches focus to the grids’ vulnerability to hackers.

Swiss daily: “danger of a blackout rising”

The Swiss online Basler Zeitung (BaZ) here also reports on major power grid woes in Switzerland, warning “the danger of a blackout is rising” and that power grid operator Swissgrid “must intervene increasingly more often in the power grid“.

According to the BaZ, in 2011 Swissgrid had to intervene only twice over the entire year. But since then the grid has become far more unstable, and at the current rate it will be necessary to intervene 400 times in 2017!

In summary, the green energies have produced two outcomes for citizens: 1) a supply that is far more unstable, and 2) power that is far more expensive. In a nutshell: far less quality for a lot more money.

That’s the expected result whenever you have the wrong people (activists and politicians) deciding how to run complex technical systems.

 

OPEC Projects Global Fossil Fuel Demand To Keep Rising! …Obliterating Dreams Of Carbon Emissions Reductions

While global warming alarmists continue to fantasize about crude oil use being drastically reduced starting as early as next year, OPEC sees it totally differently. The German online center-left weekly Die Zeit reports: “OPEC anticipates growing oil demand until 2040.”

This poses a huge dilemma for the activists and alarmists who are urgently pressing to transform society, based on the fear and belief that the globe will warm rapidly if we don’t act now.

The other fear is that rising oil consumption over the next 25 years, accompanied by stalling global temperatures, will forever expose climate science as a ruse.

Still decades away from peak oil consumption

Die Zeit writes that OPEC is sure the planet is still years away from peak oil demand because cars will continue to be powered mostly by petroleum even beyond 2040. Clearly, a number of leading energy experts believe electric cars will remain a pipe dream for quite some time.

Chart: OPEC

Oil consumption will climb almost 20%

First, OPEC sees the world population growing from 7.6 billion today to 9.2 billion by 2040 and global GDP growing by a whopping 126%. That's all going to require lots of reliable and efficient energy. The good news is that overall energy efficiency will increase, as only 96% more energy will be needed to power the 126% GDP boost.
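The efficiency claim follows directly from the two growth figures. A quick check of the implied change in energy intensity (our own arithmetic on OPEC's numbers, not a figure from the report):

```python
# OPEC projections cited above: GDP +126% and energy demand +96% by 2040
gdp_factor = 2.26     # 2040 GDP relative to today
energy_factor = 1.96  # 2040 energy demand relative to today

# Implied change in energy intensity (energy consumed per unit of GDP)
intensity_change = energy_factor / gdp_factor - 1
print(f"energy intensity change: {intensity_change:+.1%}")  # about -13%
```

In other words, the world would get roughly 13% more economic output per unit of energy, which is what "overall energy efficiency will increase" amounts to.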

Most of the energy increase will come from India and China. Moreover, the global average standard of living will also grow strongly as the per capita energy use will be much higher. OPEC projects that much of the added energy demand will be supplied by natural gas (see chart above). Part of the increased demand will be met by coal and oil.

OPEC also foresees continued domination by the internal combustion engine for passenger cars, even in 2040:

Chart: OPEC

Electric mobility still decades away

OPEC also projects there will be more than 2 billion automobiles on the world’s roads by 2040, almost double today’s number. Over 80% of these will continue to be powered by internal combustion engines. The electric vehicle age appears to be something for the late 21st century. Visions of a near zero carbon society within the next 30 years are more fantasy than reality.

Not surprisingly, most of the increased energy demand will come from Asia and Africa. The richer, developed OECD countries will see little growth in energy demand.

The OPEC video summarizes by saying that it is “committed to reducing energy poverty“. That target of course would make a huge contribution to alleviating the overall misery that still plagues many poor countries. Oil, coal and gas will continue to play the leading role.

Of course, jet-setting climate activists and alarmists would like to deny poor countries access to affordable and reliable energy.

Good luck with that.

 

New Paper: Most Modern Warming, Including For Recent Decades, Is Due To Solar Forcing, Not CO2

‘Two-Thirds Of Climate Warming’

Since 1750 Due To ‘Solar Causes’

Dr. Alan D. Smith, Geoscientist

Though advocates of the dangerous anthropogenic global warming (AGW) narrative may not welcome the news, evidence that modern day global warming has largely been driven by natural factors – especially solar activity – continues to pile up.

Much of the debate about the Sun’s role in climate change is centered around reconstructions of solar activity that span the last 400 years, which now include satellite data from the late 1970s to present.

To buttress the claim that solar forcing has effectively played almost no role in surface temperature changes since the mid-20th century, the IPCC has shown a preference for modeled reconstructions of solar activity (i.e., the PMOD) that show a stable or decreasing trend since the 1970s.  Why?  Because if the modeled results can depict steady or decreasing solar activity during the last few decades of the 20th century – just as surface temperatures were rising – then attributing the post-1970s warming trend to human activity becomes that much easier.

The trouble is, satellite observations using ACRIM  data (which have been affirmed to be accurate by other satellite data sets and are rooted in observation, not modeled expectations) indicate that solar activity did not decline after the 1970s, but actually rose quite substantially.  It wasn’t until the early 2000s that solar activity began to decline, corresponding with the denouement of the Modern Grand Maximum.


ACRIM Composite Is ‘Data Driven’, While The PMOD Composite Is ‘Model Driven’


Willson, 2014

• Comparison of the results from the ACRIM3, SORCE/TIM and SOHO/VIRGO satellite experiments demonstrate the near identical detection of TSI variability on all sub-annual temporal and amplitude scales during the TIM mission.   A solar magnetic activity area proxy [developed in 2013] for TSI has been used to demonstrate that the ACRIM TSI composite and its +0.037 %/decade TSI trend during solar cycles 21–23 [1980s-2000s] is the most likely correct representation of the extant satellite TSI database.
• The occurrence of this trend during the last decades of the 20th century supports a more robust contribution of TSI variation to detected global temperature increase during this period than predicted by current climate models.
• One of the most perplexing issues in the 35 year satellite TSI database is the disagreement among TSI composite time series in decadal trending. The ACRIM and PMOD TSI composite time series use the ERB and ERBE results, respectively, to bridge the ACRIM Gap. Decadal trending during solar cycles 21–23 is significant for the ACRIM composite but not for the PMOD. A new [2013] TSI-specific proxy database has been compiled that appears to resolve the issue in favor of the ACRIM composite and trend. The resolution of this issue is important for application of the TSI database in research of climate change and solar physics.
• The ACRIM TSI composite is data driven. It uses ACRIM1, ACRIM2, ACRIM3 and Nimbus7/ERB satellite results published by the experiments’ science teams and the highest cadence and quality ACRIM Gap database, the Nimbus7/ERB, to bridge the ACRIM Gap.
The PMOD TSI composite, using results from the Nimbus7/ERB, SMM/ACRIM1, UARS/ACRIM2 and SOHO/VIRGO experiments, is model driven. It conforms TSI results to a solar-proxy model by modifying published ERB and ACRIM results and choosing the sparse, less precise ERBS/ERBE results as the basis for bridging the ACRIM Gap (Frohlich and Lean 1998).
• The Earth’s climate regime is determined by the total solar irradiance (TSI) and its interactions with the Earth’s atmosphere, oceans and landmasses. Evidence from 35 years of satellite TSI monitoring and solar activity data has established a paradigm of direct relationship between TSI and solar magnetic activity. (Willson et al. 1981; Willson and Hudson 1991; Willson 1997, 1984; Frohlich and Lean 1998; Scafetta and Willson 2009; Kopp and Lean 2011a, 2011b)  This paradigm, together with the satellite record of TSI and proxies of historical climate and solar variability, support the connection between variations of TSI and the Earth’s climate.   The upward trend during solar cycles 21–23 coincides with the sustained rise in the global mean temperature anomaly during the last two decades of the 20th century.

Assessment Of The Sun’s Climate Role Largely Depends On The TSI Model Adopted


Van Geel and Ziegler, 2013

• [T]he IPCC neglects strong paleo-climatologic evidence for the high sensitivity of the climate system to changes in solar activity. This high climate sensitivity is not alone due to variations in total solar irradiance-related direct solar forcing, but also due to additional, so-called indirect solar forcings. These include solar-related chemical-based UV irradiance-related variations in stratospheric temperatures and galactic cosmic ray-related changes in cloud cover and surface temperatures, as well as ocean oscillations, such as the Pacific Decadal Oscillation and the North Atlantic Oscillation, that significantly affect the climate.
• [T]he cyclical temperature increase of the 20th century coincided with the buildup and culmination of the Grand Solar Maximum that commenced in 1924 and ended in 2008.
• Since TSI estimates based on proxies are relatively poorly constrained, they vary considerably between authors, such as Wang et al. (2005) and Hoyt and Schatten (1997). There is also considerable disagreement in the interpretation of satellite-derived TSI data between the ACRIM and PMOD groups (Willson and Mordvinov, 2003; Fröhlich, 2009). Assessment of the Sun’s role in climate change depends largely on which model is adopted for the evolution of TSI during the last 100 years (Scafetta and West, 2007; Scafetta, 2009; Scafetta, 2013).
• The ACRIM TSI satellite composite shows that during the last 30 years TSI averaged at 1361 Wm-2, varied during solar cycles 21 to 23 by about 0.9 Wm-2, had increased by 0.45 Wm-2 during cycle 21 to 22 [1980s to 2000s] to decline again during cycle 23 and the current cycle 24 (Scafetta and Willson, 2009).
• By contrast, the PMOD TSI satellite composite suggests for the last 30 years an average TSI of 1366, varying between 1365.2 and 1367.0 Wm-2 that declined steadily since 1980 by 0.3 Wm-2.

Total Solar Irradiance Increased By 3 W m-2 Between 1900 And 2000


Van Geel and Ziegler, 2013 (continued)

• On centennial and longer time scales, differences between TSI estimates become increasingly larger. Wang et al. (2005) and Kopp and Lean (2011) estimate that between 1900 and 1960 TSI increased by about 0.5 Wm-2 and thereafter remained essentially stable, whilst Hoyt and Schatten (1997), combined with the ACRIM data, suggest that TSI increased between 1900 and 2000 by about 3 Wm-2 and was subject to major fluctuations in 1950-1980 (Scafetta, 2013; Scafetta, 2007).
• Similarly, it is variably estimated that during the Maunder Solar Minimum (1645-1715) of the Little Ice Age TSI may have been only 1.25 Wm-2 lower than at present (Wang et al., 2005; Haigh, 2003; Gray et al., 2010; Krivova et al., 2010) or by as much as 6 ± 3 Wm-2 lower than at present (Shapiro et al., 2010; Hoyt and Schatten, 1997), reflecting a TSI increase ranging between 0.09% and 0.5%, respectively.

Graph Source: Soon et al., 2015

After Removing Instrumental ‘Adjustments’, Urban Bias, Temperatures Follow Solar Activity


The combined Hadley Centre and Climatic Research Unit (HadCRUT) data set — which is featured in the Intergovernmental Panel on Climate Change (IPCC) reports — underwent a revision from version 3 to version 4 in March of 2012.  This was about a year before the latest IPCC report was to be released (2013).  At the time (early 2012), it was quite inconvenient to the paradigm that HadCRUT3 was highlighting a slight global cooling trend between 1998 and 2012, as shown in the graph below (using HadCRUT3 and HadCRUT4 raw data from WoodForTrees).  So, by changing versions, and by adjusting the data, the slight cooling was changed to a slight warming trend.

Source: WoodForTrees
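How sensitive such short trends are to small data revisions is easy to demonstrate. The sketch below uses synthetic anomaly values (illustrative only, not actual HadCRUT data) to show that nudging a few late-period values by a few hundredths of a degree can flip the sign of a 15-year least-squares trend:

```python
import numpy as np

years = np.arange(1998, 2013)  # a 15-year window, 1998-2012

# Synthetic anomaly series (illustrative only, not actual HadCRUT values):
# a nearly flat series with a faint cooling tilt...
anoms_v3 = 0.4 + np.linspace(0.01, -0.01, years.size)
# ...and the same series with the last five years nudged up by 0.03 degC
anoms_v4 = anoms_v3.copy()
anoms_v4[-5:] += 0.03

# Ordinary least-squares slopes (degC per year)
slope_v3 = np.polyfit(years, anoms_v3, 1)[0]
slope_v4 = np.polyfit(years, anoms_v4, 1)[0]
print(slope_v3 < 0 < slope_v4)  # True: the tiny adjustment flips the trend's sign
```

The point is not that HadCRUT was adjusted by exactly these amounts, but that over so short a window the sign of a fitted trend can hinge on hundredths of a degree.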

As recently as 1990, it was widely accepted that the global temperature trend, as reported by NASA (Hansen and Lebedeff, 1987), showed a “0.5°C rise between 1880 and 1950.”

Pirazzoli, 1990

This 0.5°C rise in global temperatures between 1880-1950 (and 0.6°C between 1880 and 1940) can clearly be seen in the NASA GISS graph from 1987:

Schneider, S. H. 1989. The greenhouse effect: Science and policy. Science 243: 771-81.

Today, it is no longer acceptable for the NASA global temperature data set to graphically depict a strong warming trend during the first half of the 20th century.  This is because anthropogenic CO2 emissions were flat and negligible relative to today during this abrupt warming period.

So as to eliminate the inconvenience of a non-anthropogenic warming trend in modern times, NASA has now removed all or nearly all of the 0.5°C of warming between 1880 and 1950.

NASA GISS graph

 

Soon et al., 2015   

[B]etween 65-80% of the apparent warming trend over the 1961-2000 period for the Beijing and Wuhan station records was probably due to increasing urban heat islands.  [T]he temperature trends increase from +0.025°C/decade (fully rural) to … +0.119°C/decade (fully urban). … If we assume that the fully rural stations are unaffected by urbanization bias, while the other subsets are, then we can estimate the extent of urbanization bias in the “all stations” trends by subtracting the fully rural trends. This gives us an estimate of +0.094°C/decade urbanization bias over the 1951-1990 period [+0.38°C of additional non-climatic warmth] – similar to Wang & Ge (2012)’s +0.09°C/decade estimate.
We have constructed a new estimate of Northern Hemisphere surface air temperature trends derived from mostly rural stations – thereby minimizing the problems introduced to previous estimates by urbanization bias.  
• Similar to previous estimates, our composite implies warming trends during the periods 1880s-1940s and 1980s-2000s. However, this new estimate implies a more pronounced cooling trend during the 1950s-1970s. As a result, the relative warmth of the mid-20th century warm period [1930s-1950s] is comparable to the recent [1980s-2000s] warm period – a different conclusion to previous estimates. Although our new composite implies different trends from previous estimates, we note that it is compatible with Northern Hemisphere temperature trends derived from (a) sea surface temperatures; (b) glacier length records; (c) tree ring widths.
• However, the recent multi model means of the CMIP5 Global Climate Model hindcasts failed to adequately reproduce the temperature trends implied by our composite, even when they included both “anthropogenic and natural forcings”. One reason why the hindcasts might have failed to accurately reproduce the temperature trends is that the solar forcings they used all implied relatively little solar variability. However, in this paper, we carried out a detailed review of the debate over solar variability, and revealed that considerable uncertainty remains over exactly how the Total Solar Irradiance has varied since the 19th century.
• When we compared our new composite to one of the high solar variability reconstructions of Total Solar Irradiance which was not considered by the CMIP5 hindcasts (i.e., the Hoyt & Schatten reconstruction), we found a remarkably close fit. If the Hoyt & Schatten reconstruction and our new Northern Hemisphere temperature trend estimates are accurate, then it seems that most of the temperature trends since at least 1881 can be explained in terms of solar variability, with atmospheric greenhouse gas concentrations providing at most a minor contribution.
• This contradicts the claim by the latest Intergovernmental Panel on Climate Change (IPCC) reports that most of the temperature trends since the 1950s are due to changes in atmospheric greenhouse gas concentrations (Bindoff et al., 2013).


New Paper: Since 1750, About 0.8°C – 0.9°C Of CET Increase Is Due To Solar Forcing


Smith, 2017

Yearly mean temperatures in the CET [Central England Temperature] record show an increase in temperature of approximately 1.3°C from the end of the 17th Century to the end of the 20th Century/beginning of 21st Century.  …  Subtle differences in timing between the warming/cooling phases between the Central England record and the other localities may reflect local climate variation, but the similarity in events between continents suggests the CET [Central England Temperature] record is recording global temperature patterns.
Records of sunspot numbers began in 1610 such that detailed estimates of solar variation for the years covered by the CET record can be made without resort to the use of proxy data. Reconstructions of TSI [e.g. 16-18] differ in magnitude (Table 1), but there is agreement in form with 4 peaks and 4 to 6 troughs occurring over the time-scale of the CET record (Fig. 4). These are: a minimum in TSI associated with the Maunder Sunspot Minimum in the latter half of the 17th Century; a peak, possibly bi-modal approaching modern TSI values during the 18th Century; a well-defined trough corresponding with the Dalton Sunspot Minimum between 1800-1820; a poorly defined TSI peak in the mid 19th Century; a reduction in TSI during the late 19th Century; increasing TSI during the early 20th Century; a decrease in TSI from around 1950-1975; and a second phase of TSI increase in the late 20th Century [1980s-2000s]. There is good correspondence with TSI throughout the CET record, with warm events correlating with high TSI and cool phases correlating with plateaus or decreases in TSI.
However, for temperature increases from the beginning of the Industrial Revolution (Maunder Minimum and Dalton Minimum to end of 20th Century), high TSI models can account for only 63-67% of the temperature increase. This would suggest that one third of Global Warming/Climate Change can be attributed to AGW. … Approximately two-thirds [0.8°C to 0.9°C] of climate warming since the mid-late 18th Century [1.3°C] can be attributed to solar causes, suggesting warming due to anthropogenic causes over the last two centuries is 0.4 to 0.5°C.


All Over The Globe, Trends In Solar Forcing Correlate With Temperature Changes


Christiansen and Ljungqvist (2012)


Stoffel et al., 2015


Schneider et al., 2015  and  Wilson et al., 2016


Kim et al., 2017


Yamanouchi, 2011


Box et al., 2009


Southern Hemisphere


Schneider et al., 2006


De Jong et al., 2016

“[T]he period just before AD 1950 was substantially warmer than more recent decades.”


de Jong et al., 2013


Zinke et al., 2016


Turney et al., 2017


Elbert et al., 2013


“[I]n the framework of empiric [observable] models, the estimate of the solar activity contribution in the variation in the air global temperature in the 20th century is about 70%.” – Kovalenko and Zherebtsov, 2014

New Quebec Study: Cold Kills …0.7% Increase Of Heart Failure For Each 1°C Drop!

I thought the following paper was interesting.
No, lead-author Prof. Pierre Gosselin is not me from NTZ. But he very likely is a descendant of the same family line. The first Gosselin (Gabriel) left Normandy, France, and landed in Quebec City way back in 1653. A devout Catholic, Gabriel earnestly founded what became the population of Gosselins across North America and beyond over the next 364 years.
=============================================

Effects of climate and fine particulate matter on hospitalizations and deaths for heart failure in elderly: A population-based cohort study

In a recent study a team of scientists led by Prof. Pierre Gosselin assessed 112,793 people aged 65 years and older who had been diagnosed with heart failure in Quebec between 2001 and 2011. Over an average of 635 days, the researchers measured the mean temperature, relative humidity, atmospheric pressure and air pollutants in the surrounding environment and studied the data to see if there was any relationship.

Their results: for each 1°C decrease in the daily mean temperature of the previous 3 and 7 days, the risk of heart failure events increased by about 0.7%. In other words, a drop of 10°C in the average temperature over 7 days, which is common in the province of Quebec because of seasonal variations, is associated with an approximately 7% increase in the risk of hospitalization or death from heart failure among elderly patients diagnosed with the disease.
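The scaling in the paragraph above can be sketched in a few lines. Compounding the 0.7% hazard per degree multiplicatively is our own simplifying assumption (the study itself simply reports roughly 7% for a 10°C drop), but it lands very close to the quoted figure:

```python
# Per-degree hazard increase reported in the study
risk_per_degree = 0.007  # +0.7% per 1 degC drop in mean temperature

# Relative risk for a 10 degC drop, compounding each degree multiplicatively
# (our simplifying assumption; the study reports ~7% for a 10 degC drop)
drop_c = 10
relative_risk = (1 + risk_per_degree) ** drop_c
print(f"risk increase for a {drop_c} degC drop: {relative_risk - 1:.1%}")
```

Whether one compounds or simply multiplies 0.7% by 10, the answer rounds to the same "about 7%" the authors state.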

The paper’s abstract:

We measured the lag effects of temperature, relative humidity, atmospheric pressure and fine particulate matter (PM2.5) on hospitalizations and deaths for HF in elderly diagnosed with this disease on a 10-year period in the province of Quebec, Canada.

Our population-based cohort study included 112,793 elderly diagnosed with HF between 2001 and 2011. Time dependent Cox regression models approximated with pooled logistic regressions were used to evaluate the 3- and 7-day lag effects of daily temperature, relative humidity, atmospheric pressure and PM2.5 exposure on HF morbidity and mortality controlling for several individual and contextual covariates.

Overall, 18,309 elderly were hospitalized and 4297 died for the main cause of HF. We observed an increased risk of hospitalizations and deaths for HF with a decrease in the average temperature of the 3 and 7 days before the event. An increase in atmospheric pressure in the previous 7 days was also associated with a higher risk of having a HF negative outcome, but no effect was observed in the 3-day lag model. No association was found with relative humidity and with PM2.5 regardless of the lag period.

Lag effects of temperature and other meteorological parameters on HF events were limited but present. Nonetheless, preventive measures should be issued for elderly diagnosed with HF considering the burden and the expensive costs associated with the management of this disease.

Lower risk of death in summer

The authors also found:

"The results showed a higher risk of hospitalization or death in the winter period of the year (October to April) compared to the summer period (May to September)."

German-Spanish Wind Energy Giant To Lay Off 6000 Workers, Citing “Changing Market Conditions”

The online German business daily Handelsblatt here writes that European wind energy company Siemens Gamesa will eliminate 6000 jobs.

That means the German-Spanish company will shed more than a fifth of its 26,000 workers. This is the latest bad news slamming the green energy industry in Germany and Europe. Over the years Germany has seen almost every major solar panel and equipment manufacturer become insolvent. Spain too has been hit hard by renewable energy insolvencies.

Once ballyhooed as the sector of the future, the German solar and wind energy industries have taken huge hits. The country's last remaining major solar manufacturer, Bonn-based Solarworld, earlier this year announced it would file for bankruptcy. Solarworld's demise was the last in a spectacular series of solar manufacturer bankruptcies that swept across Germany in recent years, with names like Solon, Solar Millennium and Q-Cells going under.

Now the bloodbath is expanding to the wind industry, a branch of green energy that looked far more feasible in Germany than solar energy did.

The announcement by Siemens-Gamesa coincides with the COP 23 climate conference now taking place in Bonn, which is calling for more wind and solar energy at a time when the industry is collapsing at full speed in Germany.

According to Siemens-Gamesa Board Chairman Markus Tacke: “Our business result is still not at the level where we would like to see it.”

Last year Spanish Gamesa and German Siemens combined their wind power operations to form one of the world’s largest producers of wind turbines.

Handelsblatt writes that the Siemens subsidiary is reacting to "changing market conditions" and that the move will impact 6 countries.

The company also expects to see its revenue for the coming fiscal year fall to 9 billion euros from almost 11 billion.

The Handelsblatt also reports that “no improvement is foreseen in the new fiscal year“.

Other links in English:
www.expatica.com/de/news/country-news/Germany-layoffs
http://uk.businessinsider.com/r-siemens-gamesa-to-6000-jobs=T

 

New Atmospheric Sciences Textbook: Climate Sensitivity Just 0.4°C For CO2 Doubling

CO2 Contribution:

0.15°C Since 1959

 – Dr. B. Smirnov, Microphysics of Atmospheric Phenomena


Purveyors of the viewpoint that rising CO2 emissions pose a grave threat to the planet via dangerous global-scale warming presuppose that the surface temperatures of the Earth are highly sensitive to parts per million (ppm) variations in atmospheric CO2 concentrations.

And yet the accumulation of scientific publications documenting a far less consequential role for CO2 in the climate system just keeps growing and growing — especially in recent years.

An incomplete compilation of at least 75 scientific publications now documents a very low climate sensitivity to CO2 concentration changes. In sum, doubling modern era CO2 concentrations from 280 ppm to 560 ppm may only raise surface temperatures by tenths of a degree – if that.

75 Scientific Papers Affirm Very Low Sensitivity To CO2

Expanding upon a 2016 scientific paper published in Europhysics Letters, Dr. Boris M. Smirnov, an atomic physicist, uses his field expertise in authoring another textbook entitled Microphysics of Atmospheric Phenomena.  The volume is one of 20 physics books Smirnov has published over the last two decades.

In chapter 10, Smirnov asserts that infrared emission from water vapor dwarfs the atmospheric contribution from CO2 within the greenhouse effect, as CO2 only “contributes in small portions“.

In fact, Smirnov finds that doubling the modern era CO2 concentration will only result in a temperature increase of 0.4°C.

He further calculates that the increase in CO2 concentration from 1959 (316 ppm) to the present (402 ppm) has only contributed 0.15°C to surface temperatures.  This means, of course, that the bulk of the temperature changes that have occurred in the last 55 to 60 years are not of human origin.
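Smirnov's 0.15°C figure is consistent with simple logarithmic scaling of his 0.4°C-per-doubling sensitivity. A quick back-of-envelope check (our own arithmetic, not a calculation taken from the textbook):

```python
import math

# Smirnov's stated climate sensitivity: 0.4 degC per doubling of CO2
sensitivity_per_doubling = 0.4  # degC

# CO2 concentrations cited in the text
c_1959, c_present = 316.0, 402.0  # ppm

# Logarithmic forcing: warming scales with log2 of the concentration ratio
delta_t = sensitivity_per_doubling * math.log2(c_present / c_1959)
print(f"CO2 contribution since 1959: {delta_t:.2f} degC")  # ~0.14 degC
```

The result, about 0.14°C, matches the roughly 0.15°C Smirnov attributes to the CO2 rise since 1959.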

Steadily and gradually, the “consensus” position that says the climate is highly sensitive to variations in anthropogenic CO2 emissions continues to unravel.


Smirnov, 2017

It is shown that infrared emission of the atmosphere is determined mostly by atmospheric water. One can separate the flux of outgoing infrared radiation of the atmosphere from that towards the Earth. The fluxes due to rotation-vibration transitions of atmospheric  CO2  molecules are evaluated. Doubling of the concentration of  CO2 molecules in the atmosphere that is expected over 130 years leads to an increase of the average Earth temperature by (0.4±0.2) K mostly due to the flux towards the Earth if other atmospheric parameters are not varied.


Smirnov, 2016

[W]e take into account that CO2 molecules give a small contribution to the heat Earth balance and, therefore, one can use the altitude distribution of the temperature for the standard atmosphere model [1], and a variation of the CO2 concentration does not influence this distribution.  …  [I]njection of CO2 molecules into the atmosphere leads to a decrease of the outgoing radiation flux that causes a decrease of the average Earth temperature. But this decrease is below 0.1K that is the accuracy of determination of this value.  Thus, the presence of carbon dioxide in the atmosphere decreases the outgoing atmospheric radiative flux that leads to a decrease of the Earth temperature by approximately (1.8 ± 0.1) K. The change of the average temperature at the double of the concentration of atmospheric CO2 molecules is determined by the transition at 667cm−1 only and is lower than 0.1K.
In particular, doubling of the concentration of CO2 molecules compared to the contemporary content increases the global Earth temperature by ΔT = 0.4 ± 0.2K. … From this we have that the average temperature variation ΔT = 0.8 ◦C from 1880 up to now according to NASA data may be attained by the variation of the water concentration by 200ppm or Δu/u ≈ 0.07, Δu = 0.2. Note that according to formula (2) the variation of an accumulated concentration of CO2 molecules from 1959 (from 316ppm up to 402ppm) leads to the temperature variation ΔT = 0.15°C. One can see that the absorption of a water molecule in infrared spectrum is stronger than that of the CO2 molecule because of their structures, and the injection of water molecules in the atmosphere influences its heat balance more strongly than the injection of CO2 molecules.

Computational Software Expert Blasts National Climate Assessment Report: “Wildly Fraudulent”…”Scientific Garbage”

Tony Heller

Expert software engineer and climate science blogger/critic Tony Heller just posted a video commenting on the newly released National Climate Assessment (NCA) report:

“Utter garbage”

Heller starts by pointing out that “there’s a huge number of problems” with the report and that the content contradicts itself, even on the same page.

Heller says that the claim that the number of record hot days is on the rise “is simply not true” and he wonders how peer review “allows this sort of utter garbage to get through“.  Heller makes it clear that this report belongs in the garbage can of science.

Wildly fraudulent

Heller methodically explains how the NCA report used faulty computational methods to make it appear as if more record hot days have been occurring, when in fact the trend has been the opposite.

“The National Climate Assessment graph is wildly fraudulent. Not only have record maximum temperatures plummeted in the US, but record minimum temperatures have plummeted too. The US climate is getting milder, not more extreme.”

 Record minimum daily temperatures (blue) and record daily maximum temperatures (red) have been falling “tremendously”, thus contradicting NCA report claims. Chart: Tony Heller here.

The computational methodology used by the NCA report authors is so dubious that Heller even goes on to wonder how “this sort of scientific garbage” ever got through peer-review.

Consistent poor quality

Heller then singles out one of the report's authors, known Texas Tech University climate activist/alarmist Katharine Hayhoe, reminding viewers that he has examined her work many times in the past and that the NCA report's poor quality is consistent with what he has seen from Hayhoe before.

Heller makes his frustration with government-funded climate science clear: "Government funded climate science is the biggest fraud in history."

That statement by Heller could be disputed, however, as the history of government science fraud is long and tragic, and has cost tens of millions of lives in the last century alone. Other examples of government science fraud include eugenics, racial science, Lysenkoism, and the carbohydrate-promoting nutritional guidelines of the late 20th century, to name a few.

History indeed tells us to be very careful and skeptical about government-funded science.

One for the dustbin

President Trump should send this report to a thorough review by a panel of independent critical scientists.

 

Expert Climate Scientists Conclude From Historical Trends That Anthropogenic Factors Are Overweighted In Models

What follows is another paper to add to the list of 400 peer-reviewed papers published this year which show claims surrounding man-made global warming are in fact hyped up.
==================================

By Dr. Sebastian Lüning and Prof. Fritz Vahrenholt
(German text translated/edited by P Gosselin)

On July 4, an article by Maxim Ogurtsov, Markus Lindholm and Risto Jalkanen appeared in the journal Atmospheric and Climate Sciences, addressing an important issue. The warming of the last 150 years, following the Little Ice Age, is often readily viewed as being 100% anthropogenic in cause. Of course this makes little sense, because the Little Ice Age was the coldest period of the last 10,000 years and was caused by natural factors such as weak solar phases and volcanic eruptions.

“IPCC makes a large mistake”

And when the sun again strengthens and volcanic eruptions remain at low levels, the earth warms up again. That there is an anthropogenic warming component of course should not be disputed. The warming of the last 150 years is due to a mix of natural and anthropogenic causes. Here the IPCC makes a large mistake when it assumes that the warming has been 100% anthropogenic.

In the study, Ogurtsov and his fellow scientists compare the modern warming to the temperature history of the last 10,000 years. They find that only a few similar warming phases occurred when the full warming of the past 135 years is considered. If, however, only half of that warming was natural, then similar warming phases occur on average every few centuries. The authors conclude that the natural component must be weighted much more heavily in the models, and that, conversely, the role of anthropogenic factors must be reduced correspondingly so that a more accurate picture is attained when compared to the real temperature development.

The Abstract:

On the Possible Contribution of Natural Climatic Fluctuations to the Global Warming of the Last 135 Years
A number of numerical experiments with artificial random signals (the second order autoregressive processes), which have important statistical properties similar to that of the observed instrumental temperature (1850-2015), were carried out. The results show that in frame of the selected mathematical model the return period of climatic events, analogous to the current global warming (linear increase of temperature for 0.95˚C during the last 135 years) is 2849-5180 years (one event per 2849-5180 years). This means that global warming (GW) of the last 135 years can unlikely be fully explained by inherent oscillations of the climatic system. It was found however, that natural fluctuations of climate may appreciably contribute to the GW. The return period of climatic episodes with 0.5˚C warming during the 135 years (half of the observed GW) was less than 500 years. The result testifies that the role of external factors (emission of greenhouse gases, solar activity etc.) in the GW could be less than often presumed.”
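The kind of Monte Carlo experiment the abstract describes can be sketched in a few lines: simulate many second-order autoregressive (AR(2)) windows, fit a linear trend to each, and count how often the trend reaches a given threshold. The parameters below are illustrative placeholders, not the coefficients Ogurtsov et al. fitted to the 1850-2015 instrumental record:

```python
import numpy as np

def trend_exceedance_prob(a1, a2, sigma, n_years=135, threshold=0.5,
                          n_trials=2000, seed=0):
    """Fraction of simulated AR(2) windows whose fitted linear trend
    amounts to at least `threshold` degC over `n_years`."""
    rng = np.random.default_rng(seed)
    t_axis = np.arange(n_years)
    hits = 0
    for _ in range(n_trials):
        x = np.zeros(n_years)
        for t in range(2, n_years):
            # Second-order autoregressive recurrence with Gaussian innovations
            x[t] = a1 * x[t - 1] + a2 * x[t - 2] + sigma * rng.standard_normal()
        slope = np.polyfit(t_axis, x, 1)[0]
        if slope * n_years >= threshold:
            hits += 1
    return hits / n_trials

# Illustrative AR(2) parameters (hypothetical; the paper fits its own
# coefficients to the 1850-2015 instrumental record)
p = trend_exceedance_prob(a1=0.6, a2=0.2, sigma=0.1)
print("exceedance probability:", p)
# A return period follows as roughly n_years / p whenever p > 0
```

The paper's reported return periods (2849-5180 years for the full 0.95°C warming, under 500 years for a 0.5°C warming) come from exactly this kind of exceedance count, just with AR(2) coefficients matched to the observed temperature record.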

 

Academia Stunned As Science Anti-Free Speech Neurosis Flares…”Eminent Scientists” Sued Over Dissident Paper!

At his Environmental Progress site here, Michael Shellenberger writes that Stanford University Professor Mark Z. Jacobson is suing a “prestigious team of scientists for Debunking 100% renewables.”

This reminds us of the 17 or so attorneys general who sought to use the US Racketeer Influenced and Corrupt Organizations Act (RICO) to go after and silence climate science skeptics, accusing them of perpetrating fraud for daring to question the flakey science underpinning the manmade global warming theory.

Shellenberger wrote that professor Mark Z. Jacobson had “filed a lawsuit, demanding $10 million in damages, against the peer-reviewed scientific journal Proceedings of the National Academy of Sciences (PNAS) and a group of eminent scientists (Clack et al.) for their study showing that Jacobson made improper assumptions in order to claim that he had demonstrated U.S. energy could be provided exclusively by renewable energy, primarily wind, water, and solar.”


Cropped from Jacobson’s complaint. Source Environmental Progress.  

“Speechless”…climate-energy debate needs to get “a direction of sanity”

Climatologist Prof. Judith Curry at her Climate Etc site wrote that she was “speechless”, adding:

In many ways, this is much worse than any of Michael Mann’s lawsuits alleging defamation of character [link] — Jacobson’s lawsuit seeks to settle a genuine scientific disagreement in the courts.”

However, Prof. Curry does not see “a good ending” for Jacobson: “There will undoubtedly be a countersuit and he stands to lose a lot of money (not just his lawsuit)”.

Many observers feel this is just the latest attempt to shut down free speech within the realm of science — a highly worrisome development that reminds us of brutal totalitarian regimes.

Curry notes: “Possibly, there will be sufficient backlash against this that will steer the overall climate-energy debate back towards a direction of sanity.”

This can only be hoped for.

Blogger Shellenberger calls the move by Jacobson “unprecedented”, adding:

Scientific disagreements must be decided not in court but rather through the scientific process. We urge Stanford University, Stanford Alumni, and everyone who loves science and free speech to denounce this lawsuit.”

This of course would be expected from almost any higher education institution in any other free-speech-abiding, democratic state. But with flakey, culturally neurotic California, nothing can be ruled out.

 

Deconstruction Of The Critical YouTube Response To Our 400+ ‘Skeptical’ Papers Compilation

Below is a commentary addressing the YouTube response to the late October Breitbart headline that claimed the 400 papers (now 415) compiled here at NoTricksZone say that “Global Warming Is A Myth“.

While the headline at Breitbart was presumably assembled for the express purpose of attracting readership (mission accomplished, if so), it will be explicitly stated here that this compilation certainly does not assert that “Global Warming Is A Myth”.  It isn’t.  Large regions of the Earth have undergone a warming trend in the last century, rising out of the depths of the Little Ice Age.

It is also true that these papers are not claimed to literally “debunk” any positions currently held by those who advocate for the main “consensus” positions related to anthropogenic global warming.    That particular d-word was used in another headline.  Instead of using such ambitious and affirmative language, the nuanced words used to describe what this list is proposed to accomplish were carefully chosen so as not to assert it does more (or less) than actually claimed.

What the papers and graphs in this compilation actually do is support many of the main skeptical positions which question climate alarm.   Namely, they support the position(s):

N(1) natural mechanisms play well more than the negligible role claimed by the IPCC in the net changes of the climate system, which include temperature variations, precipitation patterns, weather events, etc., and the influence of increased CO2 concentrations on climatic changes is less pronounced than currently imagined;

N(2) the warming/sea levels/glacier and sea ice retreat/hurricane and drought intensities…experienced during the modern era are neither unprecedented nor remarkable, nor do they fall outside the range of natural variability, as clearly shown in the first 100 graphs (from 2017) in this volume;

N(3) the computer climate models are not reliable or consistently accurate, and projections of future climate states are little more than speculation as the uncertainty and error ranges are enormous in a non-linear climate system; and

N(4) current emissions-mitigation policies, especially related to the advocacy for renewables, are often ineffective and even harmful to the environment, whereas elevated CO2 and a warmer climate provide unheralded benefits to the biosphere (i.e., a greener planet and enhanced crop yields).

In sharp contrast to the above, the corresponding “consensus” positions that these papers do not support are:

A(1) Close to or over 100% (110%) of the warming since 1950 has been caused by increases in anthropogenic CO2 emissions, leaving natural attribution at something close to 0%;

RealClimate.org: “The best estimate of the warming due to anthropogenic forcings (ANT) is the orange bar (noting the 1𝛔 uncertainties). Reading off the graph, it is 0.7±0.2ºC (5-95%) with the observed warming 0.65±0.06 (5-95%). The attribution then follows as having a mean of ~110%, with a 5-95% range of 80–130%. This easily justifies the IPCC claims of having a mean near 100%, and a very low likelihood of the attribution being less than 50% (p < 0.0001!).”
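The quoted attribution arithmetic is easy to reproduce. Dividing the central estimates read off the graph gives a ratio close to the quoted mean (the ~110% figure folds in the stated uncertainties; the bare ratio of the means is a shade lower):

```python
# Central estimates quoted above from RealClimate's reading of the graph.
ant = 0.70    # deg C, best-estimate anthropogenic warming (ANT)
obs = 0.65    # deg C, observed warming
attribution = ant / obs
print(f"attribution from mean estimates: {attribution:.0%}")  # ~108%
```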

A(2) Modern warming, glacier and sea ice recession, sea level rise, drought and hurricane intensities…are all occurring at unprecedentedly high and rapid rates, and the effects are globally synchronous (not just regional)…and thus dangerous consequences to the global biosphere and human civilizations loom in the near future as a consequence of anthropogenic influences;

A(3) The climate models are reliable and accurate, and the scientific understanding of the effects of both natural forcing factors (solar activity, clouds, water vapor, etc.) and CO2 concentration changes on climate is “settled enough“, which means that “the time for debate has ended“;

A(4) The proposed solutions to mitigate the dangerous consequences described in A(2) – namely, wind and solar expansion – are safe, effective, and environmentally friendly.

The 400+ papers compiled so far support the N(1)-N(4) positions, and they undermine or at least do not support the “consensus” A(1)-A(4) positions.  The papers do not do more than that.   Unreasonable expectations that these papers should do more than support skeptical positions and undermine “consensus” positions to “count” are rooted in straw man argumentation.

Specifically, claiming that a scientific paper must assert that CO2 does not play a major role in climate to be characterized as supporting a skeptical position in N(1)-N(4) is disingenuous at best.   The opposite wouldn’t ever stand, of course.   Let’s say an author of a scientific paper did not explicitly state that she disagrees that natural factors play a significant role in modern climate change.  Would that mean we could say the paper affirms that climate changes are significantly natural?  Of course not.  And yet this very same non-sequitur is employed here with regularity when disingenuously arguing that these papers do not do what they claim to do – especially since what they are claimed to do has not been accurately characterized.

As an aside, if we were to look at the papers that Cook et al. (2013) used to concoct the 97% “consensus” document we would find that Cook and his colleagues actually classified papers (and magazine articles) about cooking stoves in Brazil, phone surveys, asthma-related ER visits in Montreal, TV coverage . . . as scientific papers “endorsing” the position that all or nearly all of the global warming occurring since 1950 has been human-caused (the “consensus” statement).  Of course, none of the papers identified in the link below that were categorized as “endorsing” the clearly defined anthropogenic/post-1950 “consensus” statement actually used those specific words.   And yet they were curiously counted anyway.

http://www.joseduarte.com/blog/cooking-stove-use-housing-associations-white-males-and-the-97
“The Cook et al. (2013) 97% paper included a bunch of psychology studies, marketing papers, and surveys of the general public as scientific endorsement of anthropogenic climate change.”

With that lengthy (but necessary) introduction, I will now take the time to carefully construct a response to the YouTube video critique of the 400+ papers list as authored by potholer54, who I shall hereafter refer to as PH54 for lack of a better title.


1. After thoroughly criticizing James Delingpole’s analysis and style for the first few minutes, PH54 digs in and correctly suggests that the NoTricksZone headline and emboldened first paragraph are more “nuanced” than Breitbart‘s.   He attempts to summarize what the 400+ papers represent by claiming they are meant to cast doubt on the conclusion that CO2 is a major driver of climate change — and no more.  Of course, as described above, there is far more to it than that, but soundbites are to be expected in short videos like this.


2. PH54 then, for reasons that are not clear, returns to using the Breitbart interface instead of using the NoTricksZone article and paper reference lists — which have far more detail and may include graphs that correspond to the paper.   Perhaps the reasons why will become apparent.


3. PH54 spends some time providing visuals of electric heaters warming an indoor room.   CO2 and the Sun are assumed to be just like two equally powerful heaters.   The Sun drives climate when the CO2 is stable, which it was during much of the Holocene.   Low solar activity causes cooling and high solar activity causes warming.   And in modern times, PH54 asserts, the Sun has been “turned down” just as the CO2 heater has been turned up.   So, during the modern era, CO2 drives climate.  The Sun used to drive warming and climate changes, but it no longer does.


4. Li et al. 2017 is the first paper directly discussed.  PH54 identifies what he calls the key words in the paper: Late Holocene.  He writes that the paper only addresses the last 2200 years, and it does not address the impacts of solar activity on modern climate.  He notes that solar forcing is not even mentioned in the title.   (CO2 isn’t either.)

PH54 then starts in on his main theme (as introduced in 2. above).  Yes, the Sun drives climate in the Late Holocene, and not CO2.   How do we know this?   Because CO2 was stable during the last 10,000 years – coasting between 250 ppm and 280 ppm.   So PH54 agrees, apparently, that the cold temperatures occurring during centennial-scale solar minima (Maunder, Dalton) would allow us to conclude that the Sun was a main driver of climate during those periods.  Likewise, the Medieval Maximum, a period of high solar activity, led to warm temperatures during the Medieval Warm Period, or Medieval Climate Anomaly (as it is preferred).

But it’s at that point – the Late Holocene – where the Sun apparently stops driving temperatures.   Why?  Because the CO2 heating machine took over.


5. But let’s get back to the Li et al. (2017) paper.  Now, because PH54 used the Breitbart article for a reference instead of the more detailed NoTricksZone visuals, he apparently missed the graphs shown below that appeared in the paper.  The top graph (red trend line) is a solar activity reconstruction for the last millennium.   Notice the sharp uptick in solar activity during the modern era.  This is referred to as the Modern Grand Maximum, with levels of solar activity exceeding those occurring during the Medieval Warm Period.  Now notice the bottom graph (gold).   It’s a graph of Northern Hemisphere temperatures.   Interestingly, there appears to be a very close correlation between solar activity and hemispheric temperature, including during the 20th century, when CO2 is said to have been the temperature driver.

Vieira et al., 2011


6. Another aspect of this Li et al. (2017) paper that was ostensibly missed by PH54 (apparently because he chose to use Breitbart‘s one- or two-sentence summary rather than NoTricksZone’s much more detailed summary complete with graphics) is the commentary about the impact of CO2 forcing relative to solar forcing.  The authors conclude that CO2 may play a role in “partly affecting climate variability” in North China, but that the overall long-term control on temperature is “solar-dominated.”

High volumes of greenhouse gases such as CO2 and CH4 during the recent warming periods, may also play a role in partly affecting the climatic variability in NC, superimposing on the overall solar-dominated long-term control (e.g., Wanner et al., 2008; Tan et al., 2011; Kobashi et al., 2013; Chen et al., 2015a,b).”

7. The Li et al. (2017) paper also contains a graph of North China that shows the modern temperatures (which have been flat since about 1950) are no warmer now than they were during Medieval times.  And, like the Northern Hemisphere in general, they appear to follow the general pattern of solar activity.  In other words, this paper supports both N(1) and N(2), and it does not support A(1) and A(2).   That’s why it was included on the list.


8.  Then, continuing the Holocene-only theme introduced in 2. and 4. above, PH54 addresses the second paper on the Breitbart list (again ignoring what is shown on NoTricksZone), Yndestad and Solheim (2017).  He again claims these scientists were only talking about the Holocene in their paper, not the modern period.  He even includes a visual of the abstract with red underlining over the years 1000 AD and 1700 AD.  Had PH54 decided to read the rest of the paper – or even look at the NoTricksZone summary – he would have seen that the authors clearly referred to the modern period (multiple times), and they even referenced the coming solar minimum of the coming decades.  They especially made note of the millennial-scale uniqueness of the very high solar activity of the 1940 to 2000 period, referring to it as a rare event with levels exceeding all but the grand maximum events of 4,000 and 8,000 years ago.

Deterministic models based on the stationary periods confirm the results through a close relation to known long solar minima since 1000 A.D. and suggest a modern maximum period from 1940 to 2015.  Studies that employ cosmogenic isotope data and sunspot data indicate that we are currently leaving a grand activity maximum, which began in approximately 1940 and is now declining (Usoskin et al., 2003; Solanki et al., 2004; Abreu et al., 2008). Because grand maxima  and minima occur on centennial or millennial timescales, they can only be investigated using proxy data, i.e., solar activity reconstructed from 10Be and 14C time-calibrated data. The conclusion is that the activity level of the Modern Maximum (1940–2000) is a relatively rare event, with the previous similarly high levels of solar activity observed 4 and 8 millennia ago (Usoskin et al., 2003).   Periods with few sunspots are associated with low solar activity and cold climate periods. Periods with many sunspots are associated with high solar activity and warm climate periods.”

9.  Here is the solar activity reconstruction featured prominently in the Yndestad and Solheim paper (and in NoTricksZone):

Notice how well solar activity correlates with reconstructions of Northern Hemisphere temperature (Stoffel et al., 2015):


Schneider et al., 2015 also show an oscillation (warming-cooling-warming) in Northern Hemisphere temperatures for the 20th century.

Using 126 tree ring records, Xing et al. (2016) reconstruct Northern Hemisphere temperatures that follow trends in solar activity, as shown below.

The Extratropical Northern Hemisphere Temperature Reconstruction during the Last Millennium Based on a Novel Method (Xing et al., 2016)


10. Ignoring the Yndestad and Solheim TSI graph from the paper itself, which instead shows a net +3 W m-2 increase in solar forcing between 1900 and 2000, PH54 produced a graph of declining sunspot numbers that did not appear in the Yndestad and Solheim paper.  Why not use the Yndestad and Solheim reconstruction?  Probably because it showed the opposite of his graph of declining solar activity: that we have experienced a Modern Grand Maximum of solar activity, +3 W m-2 of forcing, from the beginning of the 20th century through to about 2000.  It’s rather odd that the author of a video “exposing” that a paper doesn’t say what is claimed would decline to read the paper itself (which references the modern period), or that he would avoid using the graph provided in the paper or by NoTricksZone.   Instead, PH54 chose to comment using a preferred graph of solar activity that supports his own viewpoints…and a summary provided by Breitbart.


11.  PH54 concludes: “[Yndestad and] Solheim doesn’t debunk the theory that CO2 is a major driver of climate.  It’s quite consistent with it.”

This is odd.  The authors don’t comment on CO2 as the “major driver of climate” in their paper.


12.  Then, after commenting on just those two papers (which were selected by James Delingpole), both of which suggest that solar activity has indeed contributed to modern climate in a significant way, PH54 states: “You get the point.  The highlighted papers just looking at past warming…when CO2 levels were stable.”

This is false.  While it’s true that many of the papers on the list do indeed only refer to past climates in asserting that solar activity drove centennial-scale temperature changes, there are also many that refer to the significant influence of solar activity on the modern climate, including the first two discussed by PH54.


13. PH54: “You can check the [Delingpole] list yourself.  It’s not that hard.  All you have to do is look at Delingpole’s summary.”

Delingpole only provided a handful of the examples from the papers.   The full list of 110 solar-influence papers, with more complete summaries and temperature graphs, is found on the NoTricksZone list.   It’s interesting that PH54 accuses Delingpole of not reading the papers himself, or of relying on others to do the reading and summarizing for him…and then he goes ahead and relies on Delingpole for summaries of what the papers say.


14. PH54: Delingpole writes “Modern climate in phase with natural variability.  But the two papers he cites are talking about precipitation.”

Interestingly, PH54 has ostensibly decided that precipitation patterns are not sufficient to count as climate.  Apparently climate is about temperature, and temperature only.    Drought periodicity isn’t indicative of climate.   Decadal-scale flood events aren’t about climate.   Variability in the East Asian Monsoon and its connection to ENSO events doesn’t qualify as climate.   In ice cores, precipitation levels are often used as a proxy for temperature, with warmer/cooler temperatures corresponding to more/less precipitation.  How odd to take this stance.


15. PH54: “Neither paper [chosen by Delingpole] is talking about global temperature.”

Of course these two papers weren’t talking about global temperature.  The two papers selected from the compilation on natural variability were addressing regional rainfall patterns and their robust connections to solar activity.  Nor was it ever claimed that these two papers were talking about global temperature.   According to “consensus” science, though, drought and flood events and precipitation in general are expected to undergo significant shifts…due to changes in CO2 concentrations.

Miralles et al., 2014     “The hydrological cycle is expected to intensify in response to global warming. Yet, little unequivocal evidence of such an acceleration has been found on a global scale.”

These two papers, which do not support the A(2) “consensus” position, instead support the N(2) position that there is nothing unusual about the modern climate (precipitation) relative to past periods, when CO2 concentrations were much lower.


16. PH54: “During a period of La Nina, the Pacific ocean sucks in heat from the atmosphere, and during El Nino, it spits it back out.”

This is actually an incorrect way of putting it.  The heat for ENSO events isn’t sourced from the atmosphere.  The heat source is the ocean itself, and the ocean is heated by the Sun.  The atmosphere contains just 1% of the Earth system’s heat.  Therefore, the heat flux sequence is almost always from ocean to atmosphere, and not the other way around.  The heat redistribution during ENSO events is between the deeper waters and the ocean surface, not between the atmosphere and the ocean.


17. PH54: “None of these papers [Delingpole selected] suggest CO2 is not a major driver of global temperature.”

This is the same non-sequitur referred to in the introduction.   Unless a paper expressly states that CO2 is not a major climate driver, it does not count as a paper supporting a skeptical position on climate alarm.  This does not follow.


18. PH54: “Further down the list, the papers get more bizarre.  Papers about bats being harmed by wind turbines and blade disposal.   I’m struggling to see how any of these papers are casting doubt on CO2’s role in global warming.”

The non-sequitur, repeated.   But this comment appears most disingenuous, as PH54 should probably understand that these particular papers addressing the harm to the environment and the ineffectiveness of renewables-promoting policies were not selected from the literature to cast doubt on CO2’s role in climate change.   Instead – and one would assume that most readers would understand this – these papers were selected because they support the position that the “consensus”-endorsed response to the perspective that humans are the dominant cause of dangerous global warming is to promote wind and solar energies, and that these may be neither effective nor environmentally friendly.

All PH54 needed to do was look at the introduction to the NoTricksZone article that addressed what these papers were designed to do, or to support.  These papers have nothing to do with CO2’s role in global warming.  They shouldn’t be expected to.

Current emissions-mitigation policies, especially related to the advocacy for renewables, are often costly, ineffective, and perhaps even harmful to the environment.  On the other hand, elevated CO2 and a warmer climate provide unheralded benefits to the biosphere (i.e., a greener planet and enhanced crop yields).”

19. Addressing the Tejedor et al. (2017) paper, PH54 once again wrongly claims that the paper only addresses the past climate and makes no reference to the current one.  Had he read the entire paper, or even the summary provided by NoTricksZone, he would (should) have noticed (since it is highlighted in bold red) that the paper does, in fact, mention the high solar activity of the last few decades.  It also mentions that high solar activity is associated with periods of high temperatures, such as the warming occurring during 1986-2012.

“Reconstructed long-term temperature variations match reasonably well with solar irradiance changes since warm and cold phases correspond with high and low solar activity, respectively. … The main driver of the large-scale character of the warm and cold episodes may be changes in the solar activity. The beginning of the reconstruction starts with the end of the Spörer minimum. The Maunder minimum, from 1645 to 1715 (Luterbacher et al., 2001) seems to be consistent with a cold period from 1645 to 1706. In addition, the Dalton minimum from 1796 to 1830 is detected for the period 1810 to 1838. However, a considerably cold period from 1778 to 1798 is not in agreement with a decrease in the solar activity. Four warm periods – 1626–1637, 1800–1809, 1845–1859, and 1986–2012 – have been identified to correspond to increased solar activity.”

Then, after asserting the paper fails to address the modern era, PH54 highlights (using red underlining) the mention of “anthropogenic forces” in the paper.  Curiously, he claims that this particular sentence, as it reads, asserts that CO2 is a “major driver” of climate change.

The study area [is] a potentially vulnerable region to anthropogenic climatic changes by anthropogenic forces, i.e., increasing concentrations of greenhouse gases.”

But if this one rather mild sentence from the paper supports the position that CO2 is a major driver of climatic changes (and perhaps it does), then surely one can agree that the statement asserting the “main driver” of “warm periods” may be increases in solar activity (and the 1986-2012 period is specifically referred to in the paper as a warm period that “corresponds to increased solar activity”) could also be interpreted as support for the position that the Sun has more than a negligible role in modern temperatures for the region.


20.  The NoTricksZone compilation contains two graphs from the Tejedor et al. (2017) paper, both of which would appear to support the N(1) position that the modern climate has been impacted by the high solar activity (notice how the warming and cooling events match up rather fittingly with solar activity)…

…and that there is nothing unusual or unprecedented about modern temperatures in the Iberian region to suggest that they have fallen outside the range of natural variability, an N(2) position.


21. Interestingly, the Rydval et al. (2017) paper contains several graphs from the Northern Hemisphere, all of which correspond quite well to the changes in solar activity, including the Medieval Maximum and Modern Grand Maximum.   They generally show no net warming since the middle of the 20th century due to a severe cooling period between the 1940s and 1960s (wiping out much of the early 20th century warming), which is consistent with the pattern of solar activity shown earlier.


22. The strongest part of the video response is the section citing 4 or 5 other papers that, in addition to concluding that the Sun plays a role in climate changes, also conclude that CO2 concentrations play an important role too.  Some of the identified papers even say that CO2 plays a larger role than natural factors do.   While this may appear to fully destroy the position that these papers “debunk” global warming as a myth — a claim which has not been made here — these statements still do not seem to assert that the only, 100% cause of climate changes since 1950 is anthropogenic CO2 emissions.  Indeed, even as they claim a significant role for CO2, they do not dismiss natural factors as having any role in climate changes.  This is, after all, the “consensus” position as espoused by the IPCC.   To support the consensus, then, there should be effectively no role for natural factors in climate change after 1950.  Papers that allow natural factors to at least contribute some to the climatic changes would therefore not be supporting the “consensus”.   That is why papers like these may still be included, despite the apparent inconsistency.


23. Williams et al. (2017) assert that temperatures warmed more and warmed faster (1.1°C, 0.008°C per year) from the 1660s to the 1800s — when CO2 did not change — than they did during the 1860 to 2007 period (0.8°C, 0.005°C per year).  This would not be consistent with the perspective that CO2 emissions are a more dominant climate forcing factor than the non-CO2 factors eliciting temperature changes during the 17th to 19th centuries.   It undermines the A(2) position, and it would be consistent with N(2).  Also, while Williams et al. (2017) do state that global temperatures “are exceeding estimates of natural variability”, this is not remotely the same thing as concluding that natural factors do not play a role in climate changes after 1950.  Indeed, the opposite is said: natural factors are included as factors playing a role in climate changes “for the last 342 years.”

Reconstructed SSTs significantly warmed 1.1 ± 0.30°C … from 1660s to 1800 (rate of change: 0.008 ± 0.002°C/year), followed by a significant cooling of 0.8 ± 0.04°C …  until 1840 (rate of change: 0.02 ± 0.001°C/year), then a significant warming of 0.8 ± 0.16°C from 1860 until the end of reconstruction in 2007 (rate of change: 0.005 ± 0.001°C/year).”
“[T]hese data suggest a complex combination of solar irradiance, volcanic activity, internal ocean dynamics and external anthropogenic forcing explain the variability in Aleutian SSTs for the past 342 years.”
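The quoted rates are easy to sanity-check. Taking approximate period lengths read from the quoted intervals (e.g. treating "1660s to 1800" as roughly 140 years, an assumption on our part), a quick calculation reproduces them:

```python
# Period lengths are approximate readings of the quoted intervals
# from Williams et al. (2017); the rates match the paper's figures.
rate_warm_1 = 1.1 / 140   # 1660s to 1800, ~140 years
rate_cool   = 0.8 / 40    # 1800 to 1840, 40 years
rate_warm_2 = 0.8 / 147   # 1860 to 2007, 147 years
print(round(rate_warm_1, 3), round(rate_cool, 3), round(rate_warm_2, 3))
# -> 0.008 0.02 0.005 (deg C per year)
```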

24. Zawiska et al. (2017) write that human emissions of CO2 are “considered” to be “the most important factor” in modern climate change.  They do not conclude that CO2 emissions are effectively the only factor.   Furthermore, they conclude that profound temperature changes for the region occurred far more abruptly between 1800 and 1875 than they have since, when the temperature changes have been slower and largely flat for the past 100 years (despite rising CO2 emissions).  The abrupt warming event — 4.3°C within 75 years — for the region was said to be forced by increased solar activity and the NAO.  Again, this would appear to support the significant role of natural factors in climate changes, and less so the anthropogenic influence, thus supporting both N(1) and N(2).   In the graph, notice how closely temperatures correspond to increases and decreases in solar activity.

“The temperature reconstruction from Lake Atnsjøen indicates that recent and ongoing climate warming began already in 1800 CE following the LIA. Temperatures increased very fast, from 8.5 to 12.8 °C during the first 75 years [1800-1875], but in the 20th century the increase became less pronounced.”
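The "4.3°C within 75 years" figure follows directly from the reconstruction values quoted above:

```python
# Values quoted from the Zawiska et al. (2017) passage above.
rise = round(12.8 - 8.5, 1)   # deg C warming at Lake Atnsjoen, 1800-1875
rate = rise / 75              # deg C per year over the 75-year interval
print(rise, round(rate, 3))   # -> 4.3 0.057
```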


25.  Abrantes et al. (2017) refer to the Modern Grand Maximum (1940-2000) of very high solar activity and, like the other authors above, suggest solar activity is a driver of cooling and warming events.

“The coldest SSTs are detected between 1350 and 1850 CE, on Iberia during the well-known Little Ice Age (LIA) (Bradley and Jones, 1993), with the most intense cooling episodes related with other solar minima events, and major volcanic forcing and separated by intervals of relative warmth (e.g. (Crowley and Unterman, 2013; Solanki et al., 2004; Steinhilber et al., 2012; Turner et al., 2016; Usoskin et al., 2011). During the 20th century, the southern records show unusually large decadal scale SST oscillations in the context of the last 2 millennia, in particular after the mid 1970’s, within the Great Solar Maximum (1940 – 2000 (Usoskin et al., 2011))”

It would not appear that Abrantes et al. (2017) are dismissing solar activity as having any role at all in climate changes, which is what the “consensus” asserts.

Also, this paper provides multiple reconstructions from the region and for the entire Northern Hemisphere that would support N(2), and would not support A(2), as they show that modern temperatures do not fall outside the range of natural variability.   All three graphs below even show a cooling trend beginning in the late 20th century, which would appear to be inconsistent with the perspective that CO2 changes are driving climate synchronously on a global scale.   Most of the modern warming is shown to have occurred during the first half of the 20th century, when CO2 emissions were but a fraction of what they were after 1950.  Again, this would not be consistent with the perspective that CO2 is driving up post-1950 temperatures at an unprecedented rate.

 

26.  Wang et al. (2017) characterize the impact of GHGs on the regional signal over the last 1000 years as a “reasonable speculation”.   “Reasonable speculation” that millennial-scale changes may have been affected by GHGs would not appear to be a ringing endorsement.  Furthermore, millennial-scale changes would appear to be distinct from the changes after 1950, as human emissions could not have been driving climate on that timescale.    The authors also agree that their findings are consistent with the work of Dr. Scafetta, a scientist who has taken the position that the Sun has played a major role in climate changes, including during the modern era.

“The driving forces of climate change were investigated and the results showed two independent degrees of freedom — a 3.36-year cycle and a 22.6-year cycle, which seem to be connected to the El Niño–Southern Oscillation cycle and the Hale sunspot cycle, respectively. … Solar variability has been shown to be a major driver of climate in central Europe during the past two millennia using Δ14C records. Furthermore, this result is essentially in good agreement with the findings of Scafetta (e.g. refs 17, 18, 19), who found that the climate system was mostly characterized by a specific set of oscillations and these oscillations (61, 115, 130 and 983 years) appeared to be synchronous with major astronomical oscillations (solar system, solar activity and long solar/lunar tidal cycles).”

27. Other than natural vs. anthropogenic attribution for warming temperatures, PH54 does not address any of the other positions detailed in the 400+ papers compilation…other than to poke fun at the bat extinction and turbine waste issues.  The papers asserting the inadequacies of the models go unaddressed, as do the papers on the cryosphere, cloud variations and surface solar radiation.  The much higher temperatures and sea levels of years past, when CO2 concentrations were in the “safe” range, would appear to be a topic with some cogency when discussing global-scale warming.   None of the ~140  papers on the 2nd list were even touched on.

Apparently it is believed that all that is needed to “debunk” a compilation such as this is to point out that only a handful of the papers (Smirnov, 2017; Hertzberg et al., 2017; Kramm et al., 2017; Nikolov and Zeller, 2017; Harde, 2017; Lightfoot and Mamer, 2017; Blaauw, 2017; Allmendinger, 2017; Abbot and Marohasy, 2017) explicitly denounce anthropogenic CO2 as a main climate driver.

I disagree…for reasons that are nuanced.


04.11.17 Update

A Response To potholer54’s Response


A commenter here has brought to my attention that potholer54 (for brevity, I will continue using PH54) has replied to my response to his rather underwhelming YouTube critique of our list of 400+ papers (which will likely reach 500 by year’s end).

Apparently PH54 is still under the impression that (a) he has correctly interpreted what the papers say about CO2 emissions as the main climate driver (even though many don’t even mention CO2 or anthropogenic influences), and (b) he doesn’t have to address what the bulk of the papers and graphs are supporting: that there is nothing unusual or remarkable about modern-day climate changes (temperatures, glacier melt, sea level rise, weather events, precipitation patterns, etc.), and thus that many of the modeled expectations indicating there should be a clearly recognizable anthropogenic signal in the above parameters have been thoroughly undermined. I’ll address both points here.

PH54 writes: “Richards shows quite neatly in his rebuttal that they [the papers] don’t even say what Richards himself claims they say.”

Obviously I will need to again address PH54’s claim that he has correctly interpreted what the papers say. We’ll start with his claim that Yndestad and Solheim (2017) referred only to the Holocene in their paper. (In the video, he underlines in red the years 1000 A.D. and 1700 A.D. found in the abstract so as to support this contention.) PH54 claims these scientists do not refer to the current era. To support this claim, PH54 decided to omit or ignore the part of the abstract where it says:

“Deterministic models based on the stationary periods confirm the results through a close relation to known long solar minima since 1000 A.D. and suggest a modern maximum period from 1940 to 2015. The model computes a new Dalton-type sunspot minimum from approximately 2025 to 2050 and a new Dalton-type period TSI minimum from approximately 2040 to 2065.”

So the authors not only refer to the 20th and 21st centuries in the paper’s abstract, they forecast a solar minimum and cooler temperatures for the coming decades. Contrary to what PH54 falsely contends, nowhere in the paper are references made to CO2 or anthropogenic influences on climate. Furthermore, one of the scientists’ main points is that the very high solar activity we have enjoyed in recent decades is only now declining. The Modern Grand Maximum did not begin to decline in the 1950s, as claimed, but only after the year 2000. In fact, the year 2000 is characterized as the peak of the current grand maximum.

[T]he activity level of the Modern Maximum (1940–2000) is a relatively rare event, with the previous similarly high levels of solar activity observed 4 and 8 millennia ago (Usoskin et al., 2003). … A cold period was also observed during the time of the Dalton minimum. The Maunder and the Dalton minima are associated with less solar activity and colder climate periods. …  Studies that employ cosmogenic isotope data and sunspot data indicate that we are currently leaving a grand activity maximum, which began in approximately 1940 and is now declining (Usoskin et al., 2003; Solanki et al., 2004; Abreu et al., 2008).  … A visual inspection of the TSI wavelet spectrum reveals the dominant periods in the TSI data series in the time window between 1700 and 2013. The long wavelet period has a maximum in 1760, 1840, 1930, and 2000, with a mean gap of approximately 80 years”

Not only this, but contrary to PH54’s claims that Yndestad and Solheim do not connect solar activity to modern-day temperatures, these scientists indicate that the Hoyt-Schatten/Scafetta-Wilson TSI reconstruction shows a strong correlation with Northern Hemisphere temperatures extending from 1880 to 2013, citing the work of Soon et al. (2015) in affirming “a strong solar influence on the temperature of the Northern Hemisphere”.

“Periods with few sunspots are associated with low solar activity and cold climate periods. Periods with many sunspots are associated with high solar activity and warm climate periods. … The Hoyt-Schatten irradiance model has been calibrated and extended with the newest version of ACRIM TSI observations (e.g. Scafetta and Willson, 2014, Fig. 16); it is employed in this analysis. In the following section, this reconstruction is referred to as TSI HS. A mostly rural Northern Hemisphere composite temperature series 1880–2013 shows strong correlation with the TSI-HS reconstruction, which indicates a strong solar influence on the temperature of the Northern Hemisphere (Soon et al., 2015).”

It should be noted that Soon et al. (2015) found a very strong correlation between solar activity and Northern Hemisphere rural temperatures since 1880 – as the rural instrumental data are less affected by artificial or non-climatic warming.

“Finally, we compare our new composite to one of the solar variability datasets not considered by the CMIP5 climate models, i.e., Scafetta & Willson, 2014’s update to the Hoyt & Schatten, 1993 dataset. A strong correlation is found between these two datasets, implying that solar variability has been the dominant influence on Northern Hemisphere temperature trends since at least 1881.”


PH54 writes:

“The first paper Richards cites in his rebuttal – a paper by Li et al. – is itself a good example of this.  Citing that paper, Richards concludes: “There appears to be a very close correlation between the solar activity and the hemispheric temperature, including during the 20th century, when CO2 is said to have been the temperature driver.  “There APPEARS to be” means this is what Richards thinks – it is not the conclusion of the paper.”


PH54 also once again claims that Li et al. (2017) support his claim that CO2 emissions from humans are what have driven temperature increases for the region since about 1950, and that solar activity has not driven the temperature increase.  But, as mentioned above, the authors cite a graph of temperatures for North China that, yet again, does not show any net warming since about 1940.   While this is somewhat problematic in supporting the contention that CO2 has driven the warming trend there, the authors also point out that the increase in greenhouse gases may “partly” play a role in the variability during “the recent warming periods”, but that partial role in influencing variability is only superimposed on the solar-dominated control.   Once again, this would not appear to support PH54’s claim that Li et al. (2017) endorse the position that CO2 increases are the main, close-to-100% control on temperature changes for the region during the “recent warming periods”.

“[H]igh volumes of greenhouse gases such as CO2 and CH4 during the recent warming periods, may also play a role in partly affecting the climatic variability in NC, superimposing on the overall solar-dominated long-term control (e.g., Wanner et al., 2008; Tan et al., 2011; Kobashi et al., 2013; Chen et al., 2015a,b).”
“[T]he 103, 50, and 22 year periods for TANN [annual temperatures] correlate well with the 100, 50, 23 and 22 year cycles for the solar activity observed in various solar parameters (e.g., Wilson et al., 1996; Li et al., 1996; Chowdhury et al., 2009; Zhang et al., 2014), therefore implying an in-phase relationship between the climatic oscillation in NC [North China] and solar activity.”

PH54: “The reason Richards got it wrong is that he was trying to discern solar fluctuations over the last 50 years in a graph that spans 1,000 years, so the curves over that narrow period are very small and indistinct. Perhaps Richards wasn’t wearing his glasses.”

As shown in the initial response above, several other reconstructions of Northern Hemisphere temperature show strong consistency with trends in solar activity. The criticism that solar fluctuations cannot be discerned would not appear to be valid here.

Stoffel et al., 2015

Christiansen and Ljungqvist (2012)


But beyond these graphs, the main problem with PH54’s analysis in general is that he thinks that by directly referencing a grand total of 9 papers in his video he has comprehensively assessed and correctly interpreted the entire volume of 400+ papers, and that they all affirm his presupposition that in about 1950 CO2 emissions became the dominant cause of climate change, while the natural factors that used to be the dominant causes figuratively took a back seat. That’s why he curiously writes, without obviously having read even close to “many” of these papers, that:

PH54: “Many of the 400 papers explicitly ENDORSE the conclusion that CO2 is a powerful greenhouse gas.”

Of course, this claim is an assumption, not rooted in actual analysis of the 400+ papers.  It’s also an example of how the definition of what is necessary to affirm an endorsement of the “consensus” changes mercurially to fit the presupposition.  Now all that’s needed to affirm that these papers are invalid as evidence supporting a skeptical position on climate alarm is that an author need only write (explicitly) that CO2 is a “powerful greenhouse gas.”   PH54 claims that there are “many” such papers here…after having “analyzed” about 9 of them.  Of course, whether or not CO2 is a “powerful greenhouse gas” — interestingly, water vapor is also a “powerful greenhouse gas” — is not even the question being affirmed or questioned here.  It’s just moving the pea.


The question is this: To what extent, or how much, are trends in weather extremes, surface temperatures, ocean heat content, glacier melt, sea level rise, floods, droughts, etc., influenced by parts per million (0.000001) changes in atmospheric CO2 concentrations and human emissions vs. the extent to which identifiable trends vary naturally, or without anthropogenic influence?

The “consensus” position, as endorsed by the IPCC, is that close to 100% of the climate change that has occurred since 1950 is human caused.

If that’s the case, then human-caused climate change doesn’t look much different than the one produced by natural variations.  The changes in modern climate indices are so negligible, so trivial, that finding an anthropogenic signal amid the noise of natural variability is quite difficult.   That’s what most of these papers say.  And that’s exactly what PH54 writes nothing about in his “critique” of our list.

And yet PH54 apparently thinks that all he must do is claim that “many” of the papers support (actually, “explicitly ENDORSE”) the position that CO2 is a “powerful greenhouse gas”, and, just like that, the 400+ papers will . . . go away.

No, that’s just not how it works.

The following are just a tiny fraction of the papers from 2017 supporting a skeptical position on climate alarm . . . that PH54 never even bothered to read.


350 Non-Hockey Stick Graphs, With The First ~120 From 2017

• There has been no detectable long-term acceleration in sea level rise (Parker and Ollier, 2017), and sea levels may only be rising between 0.25 and 1.04 mm/year.

“[L]ocal sea-level forecasts should be based on proven local sea-level data. Their naïve averaging of all the tide gauges included in the PSMSL surveys showed ‘‘relative’’ trends of about + 1.04 mm/year (570 tide gauges of any length). By only considering the 100 tide gauges with more than 80 years of recording, the average trend was only + 0.25 mm/year [2.5 centimeters per century].”
“… does not support the notion of rapidly changing mass of ice in Greenland and Antarctica”; there is a “loud divergence between sea level reality” and “the climate models [that] predict an accelerated sea-level rise driven by the anthropogenic CO2 emission.”
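For readers who want to check the arithmetic, the conversion from the quoted mm/year trends to centimeters per century is trivial; here is a minimal sketch (the gauge counts and trend values are simply those quoted above — nothing is recomputed from the PSMSL data):

```python
# Convert a linear sea-level trend from mm/year to cm/century.
# 1 century = 100 years and 10 mm = 1 cm, so the factor is 100 / 10 = 10.
def mm_per_year_to_cm_per_century(trend_mm_per_year):
    return round(trend_mm_per_year * 10, 2)

# Trend values quoted from Parker and Ollier (2017):
print(mm_per_year_to_cm_per_century(1.04))  # 10.4 (570 tide gauges of any length)
print(mm_per_year_to_cm_per_century(0.25))  # 2.5  (100 gauges with >80 years of data)
```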

• The melting of the Greenland ice sheet that could possibly be attributed to humans is still “too small to be detected” (Haine, 2016; Orsi et al., 2017).

“Notably, the three studies [Jackson et al., 2016; Böning et al., 2016; Robson et al., 2016] report an absence of anthropogenic effects on the AMOC, at least so far: the directly observed AMOC weakening since 2004 is not consistent with the hypothesis that anthropogenic aerosols have affected North Atlantic ocean temperatures. The midlatitude North Atlantic temperature changes since 2005 have greater magnitude and opposite sign (cooling) than those attributed to ocean uptake of anthropogenic heat. The anthropogenic melt from the Greenland ice sheet is still too small to be detected.”
“The recent warming trend in North Greenland … We find that δ18O [temperature/climate proxy] has been increasing over the past 30 years, and that the decade 1996-2005 is the second highest decade in the 287-year record. The highest δ18O [temperature/climate proxy] values were found in 1928, which is also an extreme year in GISP2 and NGRIP ice cores, and in a coastal South Greenland composite (Vinther et al., 2006; Masson-Delmotte et al., 2015), but the decadal average (1926-1935) is not statistically different from the decade (2002-2011). … The surface warming trend has been principally attributed to sea ice retreat and associated heat fluxes from the ocean (Serreze et al., 2009; Screen and Simmonds, 2010a, b), to a negative trend in the North Atlantic Oscillation (NAO) since 1990, increasing warm air advection on the West Coast of Greenland and Eastern Canada (Hanna et al., 2012; Fettweis et al., 2013; Ding et al., 2014), and to an increase in the Greenland Blocking Index [Hanna et al., 2013]. These latter mechanisms could be dominated by natural variability rather than forced response to the anthropogenic increase in greenhouse gases (Fettweis et al., 2013; Screen et al., 2014).”

• The Greenland ice sheet (GIS) has been melting so slowly and so negligibly in recent decades that the entire ice sheet’s total contribution to global sea level rise was a mere 0.39 of a centimeter (0.17 to 0.61 cm) between 1993 and 2010 (Leeson et al, 2017).
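To put that figure in rate terms (illustrative arithmetic only — the rate is not a number taken from Leeson et al.), 0.39 cm spread over the 1993–2010 window works out to roughly two-tenths of a millimeter per year:

```python
# Average annual sea-level contribution rate implied by the quoted GIS total.
total_rise_cm = 0.39   # Leeson et al. (2017), total contribution 1993-2010
years = 2010 - 1993    # 17 years
rate_mm_per_year = total_rise_cm * 10 / years  # 10 mm per cm
print(round(rate_mm_per_year, 2))  # 0.23
```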

• The Western Antarctic Peninsula has been rapidly cooling since 1999 (-0.47°C per decade), reversing the previous warming trend and leading to “a shift to surface mass gains of the peripheral glacier” (Oliva et al., 2017).

• Since 1800, the Surface Mass Balance for the entire Antarctic Ice Sheet has increased.  (Thomas et al., 2017).

“Our study suggests an overall increase in SMB [surface mass balance] across the grounded Antarctic ice sheet of ~ 44 GT since 1800 AD, with the largest (area-weighted) contribution from the Antarctic Peninsula (AP).”

• There has been no continent-scale warming trend for Antarctica since CO2 emissions began rising. (Stenni et al., 2017)

“[N]o continent-scale warming of Antarctic temperature is evident in the last century.”

• According to Fettweis et al. (2017), the Greenland ice sheet contributed a grand total of 1.5 centimeters of sea level rise between the years 1900 and 2010, with most of that contribution coming prior to 1940 (since there was no contribution at all between 1940 and 2000). The ice sheet gained mass between 1961 and 1990, or during the same period of time that CO2 emissions were skyrocketing.

“The period 1961–1990 has been considered as a period when the total mass balance of the Greenland ice sheet was stable (Rignot and Kanagaratnam, 2006) and near zero. However, at the last century scale, all MAR reconstructions suggest that SMB [surface mass balance] was particularly positive during this period [1961-1990] (SMB was most positive from the 1970s to the middle of the 1990s), suggesting that mass gain may well have occurred during this period, in agreement with results from Colgan et al. (2015). … Finally, with respect to the 1961–1990 period, the integrated contribution of the GrIS SMB anomalies over 1900–2010 is a sea level rise of about 15 ± 5 mm [1.5 cm], with a null contribution from the 1940s to the 2000s

• Greenland is currently about 3 degrees C colder than it was just a few thousand years ago (Kobashi et al., 2017).


“Greenland temperature reached the Holocene thermal maximum with the warmest decades occurring during the Holocene (2.9 ± 1.4 °C warmer than the recent decades [1988-2015]) at 7960 ± 30 years B.P.”

• The Greenland ice sheet has been cooling (slightly) since 2005 (Kobashi et al., 2017).

“For the most recent 10 years (2005 to 2015), apart from the anomalously warm year of 2010, mean annual temperatures at the Summit exhibit a slightly decreasing trend in accordance with northern North Atlantic-wide cooling.”

• The Southern Ocean has been cooling since 1979 (Turney et al., 2017; Kusahara et al., 2017).

“Occupying about 14% of the world’s surface, the Southern Ocean plays a fundamental role in ocean and atmosphere circulation, carbon cycling and Antarctic ice-sheet dynamics. … As a result of anomalies in the overlying wind, the surrounding waters are strongly influenced by variations in northward Ekman transport of cold fresh subantarctic surface water and anomalous fluxes of sensible and latent heat at the atmosphere–ocean interface. This has produced a cooling trend since 1979.”

“Concomitant with this positive trend in Antarctic sea ice, sea surface temperatures (SSTs) over the Southern Ocean south of approximately 45°S have cooled over this period [since 1979].”

• Sea ice for the entire Southern Hemisphere has been growing, defying climate models (Comiso et al., 2017).

“The Antarctic sea ice extent has been slowly increasing contrary to expected trends due to global warming and results from coupled climate models.”

• Sea ice for the Northern Hemisphere has undergone an oscillation in the last 80 years, consistent with temperature trends for all of the Arctic (Connolly et al., 2017; Hanhijärvi et al., 2013).

“According to this new dataset, the recent period of Arctic sea ice retreat since the 1970s followed a period of sea ice growth after the mid 1940s, which in turn followed a period of sea ice retreat after the 1910s. Our reconstructions agree with previous studies that have noted a general decrease in Arctic sea ice extent (for all four seasons) since the start of the satellite era (1979). However, the timing of the start of the satellite era is unfortunate in that it coincided with the end of several decades during which Arctic sea ice extent was generally increasing. This late-1970s reversal in sea ice trends was not captured by the hindcasts of the recent CMIP5 climate models used for the latest IPCC reports, which suggests that current climate models are still quite poor at modelling past sea ice trends.”

HadCRUT4 Data – Graph Source: climate4you

• The North Atlantic has been rapidly cooling since 2005 (Piecuch et al., 2017).

“The subpolar North Atlantic (SPNA) is subject to strong decadal variability, with implications for surface climate and its predictability. In 2004–2005, SPNA decadal upper ocean and sea-surface temperature trends reversed from warming during 1994–2004 to cooling over 2005–2015.”

Storm, Price-Collapse “Expose Madness Of Energiewende” …Thousands Of Turbines To Be Dismantled As Subsidies Expire

The online Die Welt here reports that storm “Herwart”, which swept across Germany late last month with wind gusts of up to 140 kilometers per hour, led to a wholesale electricity “price collapse” and thus “exposed the madness of the Energiewende”.

“Negative prices”

As storm Herwart raged, wind parks across Germany flooded electricity grids with power that was not needed, forcing electricity prices on the exchange deep into negative territory within just minutes. In a nutshell: grid operators were forced to pay to get rid of the surplus power. But that payment won’t go to consumers, as Die Welt writes:

“The consumers get no benefit from this. For them it will even be more expensive.”

This is because grid operators forced to pay large buyers to accept power that no one needs or wants incur added costs, which of course get passed on to regular German consumers. Germans are already saddled with among the highest electricity rates in the world. This is despite the preposterous suggestions by some media outlets that German consumers could even get paid for the disposal of waste power.

Market forces disabled

Die Welt presents a chart depicting wholesale power prices. It shows the price falling to -52.11 euros per megawatt hour. Moreover, the chart shows that these extreme negative prices have become more frequent over the past 18 months. Die Welt comments:

‘Herwart’ shows in a sobering manner the astounding design deficiency of the German Energiewende [transition to green energies]. “

Die Welt blames Germany’s Energiewende and the green energy feed-in act, which “systematically disable market forces”.
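The scale of the problem is easy to illustrate. The sketch below uses the -52.11 euros/MWh low point from the Die Welt chart; the surplus volume is a made-up example figure for illustration, not a number from the article:

```python
# At a negative wholesale price, "selling" surplus wind power means
# paying the buyer to take it off the grid.
price_eur_per_mwh = -52.11   # low point shown in the Die Welt chart
surplus_mwh = 1_000          # hypothetical surplus volume (illustrative only)

cost_to_offload_eur = -price_eur_per_mwh * surplus_mwh
print(round(cost_to_offload_eur, 2))  # euros paid to dispose of the surplus
```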

Thousands Of Older Wind Turbines To Be Dismantled

On another note, the Kiel, Germany-based Kieler Nachrichten (KN) reports that “thousands of wind turbines will supposedly be dismantled over the coming decade because the state subsidies will run out”. And according to the Berlin-based specialty company Energy Brainpool, these turbines will not be replaced if energy prices do not increase.

Subsidies running out

Economists at the Helmholtz Center for Environmental Research also expect a considerable number of older turbines to be decommissioned. The 2020s could be the decade in which Germany starts to see the end of the wind energy madness. It all depends on the price of electricity beginning in 2021. Wind turbines were originally guaranteed fixed feed-in rates for a period of 20 years. Now that these early systems are approaching that lifetime and the feed-in tariffs expire, it is questionable whether they will continue to operate.

Too expensive to keep in operation

Many old turbines are likely to be put out of commission because they require great amounts of costly maintenance and repairs, and so will likely not be profitable to operate. The KN writes:

The current wholesale electricity price of 3 euro-cents per kilowatt-hour will not be enough to keep the turbines in operation…”.

The KN cites the Bundesverband Windenergie (German Association for Wind Energy) which estimates that some 14,000 MW of installed capacity face being shut down by 2023. “That would be more than a quarter of the currently installed onshore wind energy capacity getting removed.”
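As a quick sanity check on that “more than a quarter” claim (a back-of-envelope sketch; the ~50 GW installed-capacity figure is my assumption for Germany’s onshore fleet around 2017, not a number from the KN article):

```python
at_risk_mw = 14_000            # Bundesverband Windenergie estimate, by 2023
installed_onshore_mw = 50_000  # assumed total onshore capacity (~50 GW; assumption)

share = at_risk_mw / installed_onshore_mw
print(round(share * 100))  # percent of onshore capacity facing shutdown
```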

A monument to an industrial folly

The question that remains is what will happen to these thousands of shut down turbines. Will they be abandoned and thus leave the country’s idyllic landscape a mass junkyard – a monument to one of the greatest industrial follies man has ever witnessed?


Again And Again: Experts And New Findings Show No Link Between European Storm Activity And CO2

DWD: For the past there are no clear evaluations showing a change in the strength or frequency of storms over Germany

By Dr. Sebastian Lüning and Prof. Fritz Vahrenholt
(German text translated/edited by P. Gosselin)

This month two major North Sea storms have hit Europe rather severely, and not surprisingly the usual climate ambulance chasers were out in force to try to pin the blame on man’s activity, and in doing so ignored the climate history that provides us with the proper perspective. We look at some analyses of past German storm activity.

Two years ago Uwe Kirsche of the German DWD national Weather Service warned RP-Online against jumping to conclusions:

“Germany appears to be plagued by storms. Over the past ten years there’s been on average one hurricane-force storm each year. Are these natural events occurring more often than they did in the past? Will storms accompany us on a daily basis during certain months?

‘That’s a difficult topic,’ answered Uwe Kirsche of the German DWD Weather Service. ‘For the past there are not such evaluations showing a change in the strength or frequency of storms over Germany,’ he clarified. While temperature curves and rainfall amounts are well documented over decades, the DWD must hold back when it comes to storms.”

At ScienceSkepticalBlog, Michael Krüger reported in 2014:

“Storm activity over the North and Baltic Seas (storm index at the North and Baltic Sea / geostrophic wind speeds since 1880) has not been increasing, but rather has been falling off since 1880. A temporary peak was reached around 1990 and since then activity has fallen further.”

Here Krüger shows two storm index curves, but without citing a source. Researchers at the Institute for Coastal Research at the Helmholtz Center in Geesthacht have tracked storms in Germany and neighboring countries for a long time, and they too have not been able to detect any worrisome trends. At the online shz.de in 2014 we read:

Storm ‘Christian’ was no offspring of climate change
[…] Together with colleagues of the German Weather Service and the Danish Meteorological Institute, coastal researchers in Geesthacht evaluated the data on storm ‘Christian’ and other intense storms. Von Storch experienced the October 28, 2013, storm up close: while attempting to visit his home island of Föhr, he became stranded in Dagebüll. He and his colleagues found fluctuations in storm intensity over many decades. ‘Detectable is a reduction in storm intensity from the 1880s until the mid 1960s, followed by a rise until the mid 1990s,’ said von Storch. Since the 1990s, activity has fallen off once again. ‘Unlike heat waves, these fluctuations can be attributed solely to natural variability,’ explained the scientist. […]”

Hans von Storch also was interviewed in der Zeit in 2015 (behind a paywall):

‘Sometimes it just gets more active…’
Waiting for the intense storm: Meteorologist Hans von Storch knows what extreme weather events have to do with our everyday climate.”

Noteworthy is also an article appearing in proplanta: in January, 2015:

Climate experts warn against rushing to blame storms and flooding on climate change
‘Single events cannot be linked to climate change,’ said Florian Imbery, climate expert at the German DWD Weather Service in Offenbach, to the German Press Agency on Monday. Reliable statements can be made only when we compare 30-year intervals. We can determine changes in temperature relatively well. With precipitation it’s already more difficult, and with storms it’s practically impossible. The difference: ‘Temperature is a stable magnitude; precipitation and wind are highly variable in terms of space and time.’ For Imbery it is relatively clear that it is getting warmer: ‘We are getting heat periods more often.’ But that is the only significant change in climate – for wind and rain there are only ‘signs’.”

Read at proplanta

Four recent papers find no trend

Now let’s look at long-term observations. Bierstedt et al. (2016) looked at the variability in daily wind speed over northern Europe for the past 1000 years in computer simulations. The results are easily summarized: Every model shows something different. The study ended up with a large contradiction and the finding that the models are still unable to simulate wind and storms. That’s a pity. Abstract:

Variability of daily winter wind speed distribution over Northern Europe during the past millennium in regional and global climate simulations
We analyse the variability of the probability distribution of daily wind speed in wintertime over Northern and Central Europe in a series of global and regional climate simulations covering the last centuries, and in reanalysis products covering approximately the last 60 years. The focus of the study lies on identifying the link of the variations in the wind speed distribution to the regional near-surface temperature, to the meridional temperature gradient and to the North Atlantic Oscillation. Our main result is that the link between the daily wind distribution and the regional climate drivers is strongly model dependent. The global models tend to behave similarly, although they show some discrepancies. The two regional models also tend to behave similarly to each other, but surprisingly the results derived from each regional model strongly deviates from the results derived from its driving global model. In addition, considering multi-centennial timescales, we find in two global simulations a long-term tendency for the probability distribution of daily wind speed to widen through the last centuries. The cause for this widening is likely the effect of the deforestation prescribed in these simulations. We conclude that no clear systematic relationship between the mean temperature, the temperature gradient and/or the North Atlantic Oscillation, with the daily wind speed statistics can be inferred from these simulations. The understanding of past and future changes in the distribution of wind speeds, and thus of wind speed extremes, will require a detailed analysis of the representation of the interaction between large-scale and small-scale dynamics.”

Another study by Bett et al. 2017 examined wind in Europe over the past 142 years, apparently based on homogenized data. The scientists were not able to find any really significant long-term trend, but were able to see systematic fluctuations on decadal scales, likely in connection with ocean cycles. Abstract:

Using the Twentieth Century Reanalysis to assess climate variability for the European wind industry
We characterise the long-term variability of European near-surface wind speeds using 142 years of data from the Twentieth Century Reanalysis (20CR), and consider the potential of such long-baseline climate data sets for wind energy applications. The low resolution of the 20CR would severely restrict its use on its own for wind farm site-screening. We therefore perform a simple statistical calibration to link it to the higher-resolution ERA-Interim data set (ERAI), such that the adjusted 20CR data has the same wind speed distribution at each location as ERAI during their common period. Using this corrected 20CR data set, wind speeds and variability are characterised in terms of the long-term mean, standard deviation and corresponding trends. Many regions of interest show extremely weak trends on century timescales, but contain large multidecadal variability. Since reanalyses such as ERAI are often used to provide the background climatology for wind farm site assessments, but contain only a few decades of data, our results can be used as a way of incorporating decadal-scale wind climate variability into such studies, allowing investment risks for wind farms to be reduced.”

Next is a paper by Rangel-Buitrago et al. 2016 in the Journal of Coastal Research. The authors examined wave and storm data from a buoy off the coast of South Wales. Near the end of the 20th century they observed a high level of storm activity, which then subsided during the early part of the 21st century. The researchers found clear relationships with ocean cycles, especially the Arctic Oscillation and the North Atlantic Oscillation. Abstract:

Wave Climate, Storminess, and Northern Hemisphere Teleconnection Patterns Influences: The Outer Bristol Channel, South Wales, U.K.
This paper investigates potential climate-change impacts on the Outer Bristol Channel (Wales, U.K.) by analysing a 15-year wave-buoy dataset (1998–2013) to characterise wave climate and storms. The research showed that the increasing storminess experienced during the latter half of the 20th century did not, as expected, continue into the first decades of the 21st century; however, the wave climate showed clear cyclic variation in average monthly significant wave height (Hs), with low values occurring between May and August (Hs < 1.4 m, Hsmax < 6 m) and a minimum in August (Hs = 1.3 m, Hsmax = 5.2 m). Monthly mean wave power was 27.4 kW m⁻¹, with a maximum of 951 kW m⁻¹ during December. A total of 267 storm events were recorded during the assessment period. Storm-severity distribution presented a log-normal trend, with weak and moderate events making up 73% of the record (125 and 69 events, respectively); significant (18%), severe (4%), and extreme (6%) storms resulting in 73 events that are more destructive made up the remainder of the record. Fifty-five percent of the monthly averaged wave variations, wave power, and storminess indices are linked to several teleconnection patterns, the most relevant being the Arctic Oscillation, with 23.45%, the North Atlantic Oscillation, with 20.65%, and the East Atlantic with 10.9%. This kind of characterization is essential for design considerations to any proposed developments within the Bristol Channel that affect the coastal zone, e.g., the proposed design of the Swansea Bay Tidal Lagoon, which is capable of generating over 542,000 MWh yr⁻¹ of renewable energy.”

Also see CO2Science.

Finally we go to Krakow, where Bielec-Bakowska & Piotrowicz 2013 analyzed the storm history of the last 110 years. Summary: there is no detectable trend. Abstract:

Long-term occurrence, variability and tracks of deep cyclones over Krakow (Central Europe) during the period 1900–2010
This article discusses patterns in the long-term and seasonal occurrence of deep cyclones over Krakow. This study analysed the frequency of occurrence of air pressure values equal to or lower than the 1st percentile (equivalent to ≤995.3 hPa) of all air pressure values recorded at 12:00 UTC over a period of 110 years (1900/1901–2009/2010). Special attention was devoted to the tracks of deep cyclones. No distinct changes were found in the frequency of occurrence of deep cyclones during the study period. Overall the frequency peaked in December, but in recent years there has been an increase in frequency towards the end of winter and beginning of spring. A similar general lack of noticeable change in the number of days with deep cyclones can also be found in specific tracks. There were minor increases in the frequency of occurrence of cyclones from the Norwegian Sea (T1), the Atlantic (T3), Bay of Biscay (T6) and the Mediterranean (T7) after 1950. The study also found confirmation of the theory that cyclone tracks had shortened at their northeastern extremities.”
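The study's "deep cyclone" criterion, a pressure reading at or below the 1st percentile of all 12:00 UTC values, is straightforward to reproduce. The pressure series below is a synthetic stand-in, not the actual Krakow record:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic stand-in for ~110 years of daily 12:00 UTC pressure readings (hPa)
pressure = rng.normal(1013.0, 8.0, 110 * 365)

threshold = np.percentile(pressure, 1)   # study's actual cutoff was <= 995.3 hPa
deep_days = pressure <= threshold        # boolean mask of "deep cyclone" days
frequency = deep_days.mean()             # ~1% of all days, by construction
print(round(threshold, 1), round(frequency, 4))
```

By definition of the percentile threshold, roughly 1% of days qualify regardless of the underlying distribution; what the study then tracks is how those qualifying days are distributed in time.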

Also read the article on the paper at The Hockeyschtick.

Reality vs. Theory: Scientists Affirm ‘Recent Lack Of Any Detectable Acceleration’ In Sea Level Rise

Scientists: ‘Loud Divergence Between Sea Level Reality And Climate Change Theory’

Global Sea Level ‘Acceleration’ Just 0.002 mm/year²


According to peer-reviewed, “consensus” climate science, anthropogenic CO2 emissions are the cause of Arctic sea ice decline. In fact, peer-reviewed, “consensus” climate science indicates the causal relationship is so direct and so linear that it can be said with confidence that we humans melt one square foot of sea ice for every 75 miles we travel in a gasoline-powered vehicle.

The modeled results are even more alarming for the polar ice sheets.   Like Arctic sea ice, the peer-reviewed, “consensus” climate science says that there is a direct, causal relationship between the magnitude of our CO2 emissions and the magnitude of polar ice sheet melt.   Therefore, by driving our vehicles and heating our homes we are catastrophically melting the Antarctic and Greenland ice sheets to such a degree that our CO2 emissions will likely cause sea levels to rise 10 feet during the next 50 years (by 2065).

Ten feet is the equivalent of about 3.05 meters of sea level rise by 2065.

So, according to peer-reviewed, “consensus” climate science, the catastrophic melting of the polar ice sheets will produce 0.61 of a meter of sea level rise per decade (61 mm/year) over the course of the next 50 years (now 48, since the projection dates from a 2015 paper). To achieve this, a more-than-order-of-magnitude acceleration in the rate of sea level rise will need to begin immediately.
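The arithmetic behind the 61 mm/year figure can be checked directly. The ~3 mm/year satellite-era rate used for comparison is an assumption added here for illustration only:

```python
FEET_TO_M = 0.3048

rise_mm = 10 * FEET_TO_M * 1000   # the 10-foot projection, in millimeters
years = 50                        # 2015 to 2065
required_rate = rise_mm / years   # mm/year needed to hit the projection
print(round(required_rate, 1))    # 61.0

satellite_era_rate = 3.0          # assumed current rate, mm/year (illustrative)
print(round(required_rate / satellite_era_rate, 1))   # 20.3, i.e. ~20x
```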

The trouble is that physics and observed reality do not support “mainstream” climate science models predicated on anthropogenic CO2 emissions as the principal driver of ice sheet melt and sea level rise. For example:

1. East Antarctica, which comprises two-thirds of the continent, has been gaining mass since 2003 (Martín-Español et al., 2017).

2. The Western Antarctic Peninsula has been “rapidly cooling since 1999 (−0.47°C per decade), reversing the previous warming trend and leading to a shift to surface mass gains of the peripheral glacier” (Oliva et al., 2017).

3. The Greenland ice sheet (GIS) has been melting so slowly and so negligibly in recent decades that the entire ice sheet’s total contribution to global sea level rise was a mere 0.39 of a centimeter (0.17 to 0.61 cm) between 1993 and 2010 (Leeson et al., 2017). That’s a sea level rise contribution of about 0.23 mm/year since the 1990s, which is a canyon-sized divergence from the 61 mm/year that adherents of peer-reviewed, “consensus” climate science have projected for the coming decades.
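The 0.23 mm/year figure follows directly from the Leeson et al. numbers:

```python
contribution_mm = 0.39 * 10          # central estimate of 0.39 cm, in mm
years = 2010 - 1993                  # the 17-year observation window
gis_rate = contribution_mm / years   # Greenland's sea level contribution, mm/year
print(round(gis_rate, 2))            # 0.23

projected_rate = 61.0                # mm/year implied by the 10-ft-by-2065 scenario
print(round(projected_rate / gis_rate))   # 266, the divergence factor
```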

And now Australian scientists have published a new paper in the journal Earth Systems and Environment that “does not support the notion of rapidly changing mass of ice in Greenland and Antarctica“.  The paper highlights the “loud divergence between sea level reality” and “the climate models [that] predict an accelerated sea-level rise driven by the anthropogenic CO2 emission“.

In fact, the key finding from the paper is that long-term observations from tide gauges reveal a “recent lack of any detectable acceleration in the rate of sea-level rise“.    The modern rate of sea level rise acceleration – 0.002 mm/year² – is so negligible it falls well below the threshold of measurement accuracy.
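To put that acceleration in perspective, the constant-acceleration formula s = ½at² gives the extra rise beyond a purely linear trend. This is a minimal kinematic sketch, not a calculation from the paper itself:

```python
def extra_rise_mm(accel_mm_per_yr2, years):
    """Extra rise above a constant linear trend: s = 0.5 * a * t**2."""
    return 0.5 * accel_mm_per_yr2 * years**2

central = extra_rise_mm(0.002, 100)          # central estimate over a century
upper = extra_rise_mm(0.002 + 0.003, 100)    # upper end of the uncertainty band
print(round(central, 1), round(upper, 1))    # 10.0 25.0 (millimeters)
```

That is, 0.002 mm/year² sustained for a century adds only about 1 cm (at most ~2.5 cm) beyond what the linear trend alone would deliver.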

The lack of a detectable global-scale sea level rise acceleration recorded in tide gauge measurements isn’t a novel finding.  In recent years, dozens of other scientists have bravely come forward to challenge “consensus” modeling that implicates anthropogenic CO2 emissions as the preeminent cause of ice sheet melt and sea level rise.

Perhaps at some point “consensus”-based climate science will jettison its focus on models and projections of perilous future climate states directly caused by anthropogenic CO2 emissions and instead embrace the observational evidence that may undermine the alarm.

Until then, we will likely need to continue learning about how many millimeters we humans raise sea levels for each kilometer we drive in our fossil-fuel-powered vehicles.   Because that’s how “consensus” climate science works.


Parker and Ollier, 2017

[L]ocal sea-level forecasts should be based on proven local sea-level data. Their naïve averaging of all the tide gauges included in the PSMSL surveys showed ‘‘relative’’ trends of about + 1.04 mm/year (570 tide gauges of any length). By only considering the 100 tide gauges with more than 80 years of recording, the average trend was only + 0.25 mm/year [2.5 centimeters per century]. This naïve averaging has been stable in recent decades, and it shows that the sea levels are slowly rising but not significantly accelerating. They conclude that if the sea levels are only oscillating about constant trends everywhere, then the local patterns may be used for local coastal planning without any need to use purely speculative global trends based on emission scenarios.

The loud divergence between sea-level reality and climate change theory—the climate models predict an accelerated sea-level rise driven by the anthropogenic CO2 emission—has been also evidenced in other works such as Boretti (2012a, b), Boretti and Watson (2012), Douglas (1992), Douglas and Peltier (2002), Fasullo et al. (2016), Jevrejeva et al. (2006), Holgate (2007), Houston and Dean (2011), Mörner (2010a, b, 2016), Mörner and Parker (2013), Scafetta (2014), Wenzel and Schröter (2010) and Wunsch et al. (2007) reporting on the recent lack of any detectable acceleration in the rate of sea-level rise. The minimum length requirement of 50–60 years to produce a realistic sea-level rate of rise is also discussed in other works such as Baart et al. (2012), Douglas (1995, 1997), Gervais (2016), Jevrejeva et al. (2008), Knudsen et al. (2011), Scafetta (2013a, b), Wenzel and Schröter (2014) and Woodworth (2011).

[T]he information from the tide gauges of the USA and the rest of the world when considered globally and over time windows of not less than 80 years […] does not support the notion of rapidly changing mass of ice in Greenland and Antarctica as claimed by Davis and Vinogradova (2017). The sea levels have been oscillating about a nearly perfectly linear trend since the start of the twentieth century with no sign of acceleration. There are only different phases of some oscillations moving from one location to another that do not represent any global acceleration.

The global sea-level acceleration is therefore in the order of + 0.002  ± 0.003 mm/year², i.e. + 2 ÷ 3 μm/year², well below the accuracy of the estimation. This means that the sea levels may rise in the twenty-first century only a few centimeters more than what they rose during the twentieth century. This is by no means alarming.

The information from the tide gauges of the USA does not support any claim of rapidly changing ice mass in Greenland and Antarctica. The data only suggest the sea levels have been oscillating about the same trend line during the last century and this century.
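Parker and Ollier's point about record length, that short tide-gauge records give unreliable trend estimates, can be illustrated with synthetic gauges. All counts, rates and noise levels below are illustrative assumptions, not PSMSL data:

```python
import numpy as np

rng = np.random.default_rng(2)

def gauge_trend(years, true_rate, noise=30.0):
    """Fit a linear trend (mm/year) to one synthetic tide-gauge record."""
    t = np.arange(years)
    level = true_rate * t + rng.normal(0, noise, years)
    slope, _intercept = np.polyfit(t, level, 1)
    return slope

# 570 gauges of arbitrary length, all sampling the same true 1.0 mm/yr rise
lengths = rng.integers(20, 120, 570)
trends = np.array([gauge_trend(int(n), 1.0) for n in lengths])
print(round(trends.mean(), 2))     # naive average over all gauges

# Restricting to long records (> 80 years) tightens the spread of estimates
long_trends = trends[lengths > 80]
print(round(trends.std(), 2), round(long_trends.std(), 2))
```

The slope variance of a least-squares trend shrinks roughly with the cube of record length, which is why the 50–60 year minimum discussed in the quoted literature matters.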


Other New Supporting Papers Indicating No Anthropogenic Sea Level Rise Signal


Watson, 2017

The analysis in this paper is based on a recently developed analytical package titled ‘‘msltrend,’’ specifically designed to enhance estimates of trend, real-time velocity, and acceleration in the relative mean sea-level signal derived from long annual average ocean water level time series. Key findings are that at the 95% confidence level, no consistent or compelling evidence (yet) exists that recent rates of rise are higher or abnormal in the context of the historical records available across Europe, nor is there any evidence that geocentric rates of rise are above the global average. It is likely a further 20 years of data will distinguish whether recent increases are evidence of the onset of climate change–induced acceleration.


Munshi, 2017

Detrended correlation analysis of a global sea level reconstruction 1807-2010 does not show that changes in the rate of sea level rise are related to the rate of fossil fuel emissions at any of the nine time scales tried. The result is checked against the measured data from sixteen locations in the Pacific and Atlantic regions of the Northern Hemisphere. No evidence could be found that observed changes in the rate of sea level rise are unnatural phenomena that can be attributed to fossil fuel emissions. These results are inconsistent with the proposition that the rate of sea level rise can be moderated by reducing emissions. It is noted that correlation is a necessary but not sufficient condition for a causal relationship between emissions and acceleration of sea level rise.
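Munshi's method, detrended correlation analysis, removes the long-term trend from both series before correlating them, so that a spurious correlation between two independently trending series disappears. A minimal sketch with synthetic series:

```python
import numpy as np

def detrended_corr(x, y):
    """Correlate the residuals of two series after removing linear trends."""
    t = np.arange(len(x))
    rx = x - np.polyval(np.polyfit(t, x, 1), t)
    ry = y - np.polyval(np.polyfit(t, y, 1), t)
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(3)
t = np.arange(200)
# Two series sharing upward trends but with independent year-to-year variability
x = 0.5 * t + rng.normal(0, 5, 200)
y = 0.3 * t + rng.normal(0, 5, 200)

print(round(np.corrcoef(x, y)[0, 1], 2))   # high: the trends dominate
print(round(detrended_corr(x, y), 2))      # near zero: no shared variability
```

As the quoted abstract notes, even a surviving detrended correlation would be necessary but not sufficient for causation.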


Hansen et al., 2016

Together with a general sea-level rise of 1.18 mm/y, the sum of these five sea-level oscillations constitutes a reconstructed or theoretical sea-level curve of the eastern North Sea to the central Baltic Sea … which correlates very well with the observed sea-level changes of the 160-year period (1849–2009), from which 26 long tide gauge time series are available from the eastern North Sea to the central Baltic Sea.  Such identification of oscillators and general trends over 160 years would be of great importance for distinguishing long-term, natural developments from possible, more recent anthropogenic sea-level changes. However, we found that a possible candidate for such anthropogenic development, i.e. the large sea-level rise after 1970, is completely contained by the found small residuals, long-term oscillators, and general trend. Thus, we found that there is (yet) no observable sea-level effect of anthropogenic global warming in the world’s best recorded region.
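The Hansen et al. decomposition, a linear trend plus a set of oscillations fitted to long tide-gauge series, can be sketched as an ordinary least-squares fit. The single 74-year period, amplitude and noise level here are illustrative assumptions, not the paper's actual oscillators:

```python
import numpy as np

rng = np.random.default_rng(4)
years = np.arange(1849, 2010)   # the 160-year window of the quoted study
# Synthetic record: 1.18 mm/yr trend plus one assumed ~74-year oscillation
truth = 1.18 * (years - 1849) + 20.0 * np.sin(2 * np.pi * years / 74.0)
obs = truth + rng.normal(0, 5, years.size)

# Least-squares fit of trend + oscillation of known period
period = 74.0
A = np.column_stack([
    years - 1849,                        # linear trend component
    np.sin(2 * np.pi * years / period),  # oscillation, sine phase
    np.cos(2 * np.pi * years / period),  # oscillation, cosine phase
    np.ones(years.size),                 # constant offset
])
coef, *_ = np.linalg.lstsq(A, obs, rcond=None)
print(round(coef[0], 2))   # recovered trend, close to the 1.18 mm/yr put in
```

Fitting the oscillators jointly with the trend is what lets long-period natural cycles be separated from any residual that might be attributed to recent forcing.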


Palanisamy, 2016

Building up on the relationship between thermocline and sea level in the tropical region, we show that most of the observed sea level spatial trend pattern in the tropical Pacific can be explained by the wind driven vertical thermocline movement. By performing detection and attribution study on sea level spatial trend patterns in the tropical Pacific and attempting to eliminate signal corresponding to the main internal climate mode, we further show that the remaining residual sea level trend pattern does not correspond to externally forced anthropogenic sea level signal. In addition, we also suggest that satellite altimetry measurement may not still be accurate enough to detect the anthropogenic signal in the 20-year tropical Pacific sea level trends.


Hadi Bordbar et al., 2016

Here we address the question as to whether the recent decadal trends in the tropical Pacific atmosphere-ocean system are within the range of internal variability, as simulated in long unforced integrations of global climate models. We show that the recent trends are still within the range of long-term internal decadal variability.


Global Sea Levels Actually Rising About 1 mm/yr… Not 3+ mm/yr


McAneney et al., 2017

Global averaged sea-level rise is estimated at about 1.7 ± 0.2 mm year⁻¹ (Rhein et al. 2013), however, this global average rise ignores any local land movements. Church et al. (2006) and J. A. Church (2016; personal communication) suggest a long-term average rate of relative (ocean relative to land) sea-level rise of 1.3 mm year⁻¹. …The data show no consistent trend in the frequency of flooding over the 122-year [1892-2013] duration of observations despite persistent warming of air temperatures characterized in other studies. On the other hand, flood frequencies are strongly influenced by ENSO phases with many more floods of any height occurring in La Niña years. … In terms of flood heights, a marginal statistically significant upward trend is observed over the entire sequence of measurements. However, once the data have been adjusted for average sea-level rise of 1.3 mm year⁻¹ over the entire length of the record, no statistical significance remains, either for the entire record, or for the shortened series based on higher quality data. The analysis of the uncorrected data shows how the choice of starting points in a time series can lead to quite different conclusions about trends in the data, even if the statistical analysis is consistent. … In short, we have been unable to detect any influence of global warming at this tropical location on either the frequency, or the height of major flooding other than that due to its influence on sea-level rise.
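The adjustment step McAneney et al. describe, subtracting the 1.3 mm/year relative sea-level rise from flood heights before testing for a trend, can be sketched as follows. The data and noise level are synthetic and illustrative only:

```python
import numpy as np

rng = np.random.default_rng(5)
years = np.arange(1892, 2014)   # the 122-year observation window
slr_rate = 1.3                  # mm/year relative sea-level rise (Church et al. 2006)

# Synthetic flood heights (mm above a datum): SLR trend + natural scatter
heights = slr_rate * (years - years[0]) + rng.normal(0, 100, years.size)

raw_slope = np.polyfit(years, heights, 1)[0]           # trend in uncorrected data
adjusted = heights - slr_rate * (years - years[0])     # remove known SLR component
adj_slope = np.polyfit(years, adjusted, 1)[0]          # residual trend
print(round(raw_slope, 2), round(adj_slope, 2))
```

Because the fit is linear, subtracting the known sea-level-rise component reduces the estimated slope by exactly 1.3 mm/year; whatever trend survives in `adj_slope` is what would need a cause beyond sea-level rise.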


Zerbini et al., 2017

Our study focuses on the time series of Alicante, in Spain, Marseille, in France, Genoa, Marina di Ravenna (formerly Porto Corsini), Venice and Trieste, in Italy. After removing the vertical land motions in Venice and Marina di Ravenna, and the inverted barometer effect at all the sites, the linear long period sea-level rates were estimated. The results are in excellent agreement ranging between + 1.2 and + 1.3 mm/year for the overall period from the last decades of the 19th century till 2012. The associated errors, computed by accounting for serial autocorrelation, are of the order of 0.2–0.3 mm/year for all stations, except Alicante, for which the error turns out to be 0.5 mm/year. … Our estimated rates for the northern Mediterranean, a relatively small regional sea, are slightly lower than the global mean rate, + 1.7 ± 0.2 mm/year, recently published in the IPCC AR5 (Intergovernmental Panel on Climate Change 5th Assessment Report) (Church et al., 2013), but close enough, if uncertainties are taken into account. It is known that Mediterranean stations had always had lower trends than the global-average ones. Our regional results, however, are in close agreement with the global mean rate, + 1.2 mm/year, published by Hay et al. (2015) which is currently being discussed by the oceanographic community.


Mörner, 2017

Global tide gauge data sets may vary between +1.7 mm/yr to +0.25 mm/yr depending upon the choice of stations. At numerous individual sites, available tide gauges show variability around a stable zero level. Coastal morphology is a sharp tool in defining ongoing changes in sea level. A general stability has been defined in sites like the Maldives, Goa, Bangladesh and Fiji. In contrast to all those observations, satellite altimetry claim there is a global mean rise in sea level of about 3.0 mm/yr. In this paper, it is claimed that the satellite altimetry values have been “manipulated”.

In this situation, it is recommended that we return to the observational facts, which provides global sea level records varying between ±0.0 and +1.0 mm/yr; i.e. values that pose no problems in coastal protection.

New Working Paper: “Advent Of Computer Modelling Has Corrupted Climate Science”

A new working paper here by Dr. Anthonie Bastiaan Ruighaver concludes that climate science has been corrupted by computer models and that it is time to get back to how science is supposed to function.

The working paper is titled: “The Power of Falsification, Developing a Greenhouse Gas Theory”. What follows are some excerpts that I’ve emphasized.

Unfortunately the advent of computer modelling has corrupted climate science into believing models are now the main source of knowledge, even though it’s not uncommon for models to have systemic deficiencies [Santer, et al. 2017]. Theories always had a bad press [Rabinovich, et al. 2012], but many scientists seem to be confused about the difference between a model and a theory [Hug, H., 2000]. Both are descriptions of a phenomenon, but in a theory that description is formulated to enable derivation of simple testable hypotheses. If you call something a theory, but there are no hypotheses, you are doing science no favour. Neither should you call a simple statement, that can only be tested by developing a theory, a hypothesis: “CO2 causes Global Warming” is not a useful scientific hypothesis. The lack of falsification in climate science since the advent of computer modelling basically has turned Climate Science into what Sir Karl Popper called a pseudoscience [Popper, K. 2014]. But even a pseudoscience can “happen to stumble on the truth” as Sir Karl Popper stated. Is this paper [Nikolov, et al. 2017] denying that greenhouse gases have any influence on global temperature closer to the truth? The problem is we won’t know what is likely to be close to the truth when authors refuse to formulate it as a theory, with hypotheses other people can try to falsify.

It started by investigating the belief that CO2 caused global warming, just to find that all “evidence” for this belief found in scientific literature is based on simulation! Having worked in a simulation group early in his academic career [Brok, et al. 1983], the author was involved in validating many models that turned out to have little in common with reality. What is the value of a belief based on models with systemic deficiencies attempting to estimate CO2 sensitivity, based on the assumption CO2 causes Global Warming? They don’t even take into account CO2’s role in greening our earth and the influence that greening has on our climate. The author’s falsification of beliefs culminated in an attempt to falsify the new belief that “CO2 does not cause Global Warming”, by attempting to formulate a simple theory based on suggestions that greenhouse gas radiative re-emissions are influenced by diffusion [Barrett, J., 1995].”

The conclusion reads:

In this paper we have examined the culture of Climate Science in relation to its Basis of Truth and Rationale. We have argued that the reluctance to falsify knowledge by developing theories instead of computer models has had a negative impact. To illustrate that trying to falsify a theory will enrich science, we have developed a simple theory on how CO2 influences  heat transfer and the radiative balance both in the lower layer and the top layer of our atmosphere. The experiments needed to falsify the hypotheses suggested by this simple theory will provide new empirical evidence that without the formulation of this theory would likely not have been collected. Hence, the author argues that it is time to change the culture of Climate Science back to Sir Karl Popper’s vision of how science should function. Let’s start developing theories again and encourage the falsification of their hypotheses. Let’s try to provide a basis of truth and rationale by trying to falsify this new theory predicting more CO2 will cool our earth!”

Read entire working paper here.