European Commission Reflection Paper Puts “Climate Change” Near Bottom Of List Of Concerns!

If one were to rate the returns on the investments governments worldwide have made in promoting concern over a potential problem, then the huge investment in climate change fear would by now definitely have to be rated as “junk” quality.

Never has so much seen so little return.

Hundreds of billions have been invested so far with the aim of generating mass fear, and by now we would think the global public should be in a state of panic. That’s the least one would expect from such a massive investment in fostering fear.

But it turns out that climate change remains very low on the list of concerns that citizens have.

List of concerns for European citizens

Earlier this year, the European Commission presented a White Paper on the future of Europe. A series of reflection papers covering key topics for the future of the European Union with 27 Member States has since been published.

The European Commission Reflection Paper on the Future of European Defence (2025) is the fourth in that series. It outlines the main trends and challenges that will shape the future of our security and defence and, on this basis, sets out options in three different scenarios for moving towards a Security and Defence Union. While not mutually exclusive, these scenarios are underpinned by different levels of ambition for the EU in doing things together in security and defence.

Strengthening the protection and security of European citizens is part of the Juncker Commission priorities.

The Reflection Paper contains Figure 3, which happens to list what European citizens are most concerned about:

Figure 3 of the Reflection Paper on the Future of European Defence. Source: Eurobarometer

Climate, environment dead last

Not surprisingly, Europeans are most concerned about immigration and terrorism, followed by economic issues. In the last two places we find climate change and the environment, respectively. These two issues have run dead last over the past 5 years.

In fact, citizens are even more concerned about the EU’s (waning) influence in the world. Obviously people are collectively not buying into the climate alarmism hoax.

U of Canberra Expert: Doubling Atmospheric CO2 Would Increase “Heating By Less Than 0.01 K”

Recently Kenneth Richard posted a flurry of papers showing that the CO2 climate sensitivity estimate has been trending sharply downward over the years, which means CO2’s claimed effect on warming has been highly exaggerated.

Now another opinion has come to light, further supporting the notion that the recent rise in CO2 is in fact having very little impact on our climate. Software engineer Dr. Dai Davies has experimental and theoretical (quantum mechanics) experience in gas-phase spectroscopy, and he believes a doubling of CO2 would have “no significant role” in atmospheric thermodynamics.

Photo right: U of Canberra Dr. Dai Davies

Davies posted online a review paper on CO2 and climate sensitivity: Atmospheric Radiative Heat Transfer in Context.

The abstract:

It is said that radiative gasses (RGs, or greenhouse gasses) trap heat radiated from the Earth’s surface causing its temperature to rise by 33 K above the theoretical temperature with no atmosphere. The word ‘trap’ is misleading. RGs delay the radiative transmission of heat from surface to space. I estimate this delay and conclude that its average impact on atmospheric temperatures, the Radiative Delay Effect (RDE), is in the order of 0.14 [0.1 to 1] K. This result is then placed in the broader context of atmospheric thermodynamics where it complements recent work on the air-surface interaction. The combination leaves no significant role for carbon dioxide.

Davies also believes that increased atmospheric CO2 “has been highly beneficial to the biosphere”, and that a doubling would be as well.

His summary:

The IPCC climate consensus view of radiative dynamics is that the sun heats the Earth’s surface. The surface sheds heat through radiation and other processes. Around 88% of that radiation is trapped by RGs in the atmosphere, heating it by 33 K. They radiate much of that heat back to the surface. Surface cooling is impeded and its temperature rises. Carbon dioxide in the atmosphere reduces the gap in the water vapour absorption spectrum that allows the 12% of surface radiation to escape directly to space, so further decreasing surface heat loss. This view assumes strong positive feedbacks.

It has been claimed that these could cause runaway heating. A distinct alternate view, a total paradigm shift, is that the sun heats the surface during the day. The surface sheds heat through radiation and other processes. Around 88% of this radiation is delayed by RGs in the atmosphere, heating it by less than 1 K. Doubling CO2 in the atmosphere would increase this heating by less than 0.01 K. Meanwhile, at the surface, the intrinsic atmospheric radiation generated by molecular collisions, along with direct thermal conduction, allow the atmosphere to act as a thermal buffer reducing the daily surface temperature range and in doing so cause the surface temperature to rise by 60 K or more. This surface heating mechanism is near saturation and is in no way prone to runaway heating. The results reported here support and quantify the latter view – one in which carbon dioxide plays an insignificant role.

Scientific “debacle” needs to end rapidly

Back in the 1970s Dr. Davies spent years in experimental and theoretical work (QM calculations) in gas-phase molecular spectroscopy for an MSc degree, and more than five years in environmental research. His work on leaded fuel showed that both sides of that debate were significantly wrong, and helped break a deadlock.

Dr. Davies wrote in an e-mail that it is his hope that both sides of the debate will “follow the science so we can move more rapidly to an end of this debacle”.

 

Ph.D. Thermal Engineer Claims Supernova Theory Explains Global Warming, Extinction Events, Ice Ages

Do Supernova Events Cause

Extreme Climate Changes?

“Global warming will not be reduced by reducing man made CO2 emissions” — Dr. William Sokeland

In recent years, mass die-offs of large animals – like the sudden deaths of 211,000 endangered antelopes within a matter of weeks – have been described as “mysterious” and remain largely unexplained.

Determining the cause of the descents into ice ages and of the abrupt warmings that spawned the interglacial periods has remained controversial for many decades.

Dr. William Sokeland, a heat transfer expert and thermal engineer from the University of Florida, has published a paper in the Journal of Earth Science and Engineering that proposes rapid ice melt events and ice age terminations, extreme weather events leading to mass die-offs, and even modern global warming can be traced to (or at least correlate well with) supernova impact events.

The perspectives and conclusions of researchers who claim to have found strong correlations that could explain such wide-ranging geological phenomena as the causes of glacials/interglacials, modern temperatures, and mysterious large animal die-offs should at least be considered…while maintaining a healthy level of skepticism, of course.

Discovery – if that is indeed what is occurring here – is worth a look.


Sokeland, 2017

Scientists generally state that debris from supernova does not impact our planet.  They have no concept that incoming particles from exploding stars are focused by our sun’s gravity and the magnetic fields of the sun and earth.
[M]any harmful effects are possible in the Supernova and Nova Impact Theory, SNIT, including extreme changes of the climate.

Supernova Impacts and Solar Activity, Global Warming Correlation

The scattering of solar energy due to the small particles of supernova debris is also reflected in TSI data as shown in Fig. 3. The timing of impact for supernova debris streams allows the identification of the times and duration time periods for supernova debris streams impacting our planet. Fig. 3 indicates the duration of a single supernova debris stream flowing past our planet is at least 50 years and at times more than 100 years.

Fig. 3 shows an excellent correspondence between sunspot minimums, irradiance depressions, and supernova impact times. The six smaller dips in TSI generated by nova WZ Sagittae in the red portion of the TSI curve of Fig. 3 beginning with the Dalton minimum indicate we have been impacted by six different debris streams from the nova. The last one was in the 1965 to 1970 time region and it is the debris stream of Nova WZ Sagittae that started our current global warming episode near 1966.

Supernova Impacts and Ice Ages, Ice Sheet Melts, and Warm Period Correlations

Incoming supernova debris streams cause warming and melting ice caps that produce increased sea levels.  The increase in sea levels that correlates with supernova impact times is shown in Table 5. 
Termination of the last ice age results due to melting of numerous supernova impacts that correlate time of impact by changing sea level and geothermal energy released for 2,800 years from the exit crater of Dr. J. Kennet’s nano-diamond meteor theory and part of the process involves Dr. O’Keefe’s tektite theory. Correlation of Dr. Frezzotti’s ice melt Antarctica data with supernova impact times over the past 800 years establishes the Global Warming model in conjunction with the November 2016 Antarctic sea ice melt.
Supernova 393 debris impacted earth near 857 AD and started the Medieval Warming Period. When the warm part of the supernova oscillation or cycle stopped and the cooling occurred, the Little Ice Age began near 1250 AD. Supernova 393 also caused the decline of the Mayan Empire near 900 AD. Supernova 393 is proposed to have caused a gamma ray attack upon earth 1,200 years ago.
Two supernovas, G299 and G296.7-0.9, impacted the earth to produce first the Roman warming period shown in Fig. 4 with the normal cooling and then a third unknown supernova created some warming with a lot of cooling dropping temperature to a minimum near 1,100 years ago (900 AD). This cold period produced the Dark Ages. Then SN 393 occurred causing more warming than cooling, but the end result was the Little Ice Age. The Dark Ages and the Little Ice Age were very disastrous periods of time for our planet’s human populations. It should be concluded that the increase in CO2 caused by supernovas 1006 and 1054 that is currently being observed is a boon to mankind and will protect us from the coming cold phase that will be caused by these currently impacting supernovas.
Consider the Minoan Warming of Fig. 4. The incoming carbon from supernova G29.6+0.1 causes the warm up as shown by the increased Greenland ice core temperatures.

Supernova Impacts and Timings of Megafauna Extinctions and Civilization Collapses

Noted megafauna extinctions in the past 50,000 years are correlated with the times when the debris of supernova explosions impact earth. The time of extinction should be near the time of impact of a supernova debris stream. The time of impact is derived from the time the light of the supernova explosion was seen on earth by adding a correction for the fact that the debris from the explosion moves slower than the speed of light and is shown in the second equation. The severity of the extinction will depend on the distance of the supernova from our planet, the type supernova that indicates the power of the explosion and the surroundings of the supernova when it explodes.  In general, most major disturbances of earth’s biosphere can be attributed to the explosion of supernovas.
Due to the scattering of light for small particles, the sunspots will tend to disappear when a hollow sphere of small particles enters our solar system between the sun and the earth. Other signs of the presence of the small particles are the increase of animal die offs for birds, bees, and fish and a decrease in TSI (total solar irradiance).
Recent outstanding examples of animal and human die offs due to the incoming debris were the Saiga antelope in Asia in May of 2014 and people dying in India in May of 2015-2016. These die offs were caused by SN 1006 and would have been called megafauna extinctions if the populations were restricted to small island land areas. The deaths from the destructive hollow spheres of supernovas 1054 and 1006 will be minimal in the beginning but will increase in intensity as the years of higher particle mass and densities are approached. Since these supernovas are over 7,000 light-years away from our planet, the effects should not be as severe as the extinctions listed in Table 2 that were due to supernova remnants that were closer to our planet.
Supernova G32.0-4.9 impact time of 4,530 ya corresponds to the fall of Egypt’s fourth dynasty in 2494 BC. It is reported that Ancient Europeans vanished 4,500 years ago. Could supernova debris actually destroy the structure of an empire and change DNA in Europe? An impact time of 4,210 years ago matches the 4.2 Kiloyear Event.
Supernova W50 with an impact time of 17,600 ya and a declination +4 appears to have caused rapid melting of the Patagonian ice sheet 17,500 years ago and corresponds to the last glacial maximum of 18,000 years ago.
Supernova G31.9+0.0 impact time of 8,092 ya produces another correlated woolly mammoth extinction event at Lake Hill on St Paul Island in the Bering Sea 7,600 years ago. The climate change produced by this supernova caused these mammoth to die due to lack of fresh water or drought.
Supernova W51C provides the impact time of 8,130 years ago and this date coincides with the end of the 8.2 Kiloyear Event.
The W50 meteor at 12,800 ya matches the beginning of deglaciation in Antarctica 12,500 years ago. Supernova Vela has a range of impact times shown in Table 1 and Fig. 7 suggests the change of temperature date of 11,700 years ago should be used. Vela’s thermal impact in the northern hemisphere was large because it is the second closest supernova to our planet.
Supernova G82.2+5.3 in Table 3 with an impact time of 5,903 ya produced the 5.9 Kiloyear Event and it is so close in time to the Piora Oscillation that the two different events due to different supernovas are often considered the same event.
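The arrival-time correction described in the excerpts above is simple to illustrate. Below is a minimal sketch, assuming a constant debris speed expressed as a fraction of the speed of light; the paper's actual "second equation" is not reproduced here, and the numbers used are placeholders, not values from the paper:

```python
# Sketch of the arrival-time correction described above: light from a
# supernova arrives first; slower-moving debris arrives later. If the
# explosion is d light-years away and the debris travels at a fraction
# f of the speed of light, the debris lags the light by d*(1/f - 1) years.
def debris_lag_years(distance_ly: float, speed_fraction: float) -> float:
    """Extra travel time of debris relative to light, in years."""
    return distance_ly * (1.0 / speed_fraction - 1.0)

# Hypothetical example: a remnant 7,000 light-years away (the distance
# scale mentioned for SN 1006/1054), with debris at 90% of light speed,
# would arrive roughly 780 years after the light was seen on Earth.
print(debris_lag_years(7000, 0.9))  # ~777.8 years
```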

The SNIT [Supernova and Nova Impact Theory] Model vs. Climate Models

Any model that claims to know the energy source for global warming must predict the past effects like Antarctic melts in Fig. 3a. Then the model can be successfully used to predict global warming effects in the future. If the proposed model cannot predict past global warming events from previously recorded independent data, the model is useless.
The SNIT model shows unusual and distinct conditions for beginning and ending ice ages. To start an ice age, a close supernova explosion like SN Monogem Ring must produce an extreme amount of iron on earth’s surface. To end an ice age a meteor from a supernova explosion must penetrate earth’s mantle and release geothermal energy over a long period of time to melt the ice.
Applying Occam’s razor, supernova debris impact is the simplest method that explains all these extinction and biosphere disturbance events because the only assumption is all debris streams travel at the same velocity from the remnant to our planet.

What Can We Do?

The debris streams of supernovas 1006 and 1054 have already begun to destroy life on earth. When President Obama received a rough draft of this work, he issued an executive order to NASA stating, “Space weather has the potential to simultaneously affect and disrupt health and safety across entire continents”.
Since supernovas 1054 and 1006 are currently incoming, the planet’s average temperatures should continue to increase – global warming. Global warming will not be reduced by reducing man made CO2 emissions, and in reality the only defense is to move to a cooler hemisphere, harvest CO2 from the atmosphere, or stop the incoming particles.

Veteran Meteorologist Slams German Media For Poor Warnings As Storms Reveal Reporting Incompetence

First, on Twitter here, high-profile Swiss meteorologist Jörg Kachelmann presented a video on how he thinks German public television failed to adequately warn the public before North Sea storm Xavier barreled through northern Germany on October 5.

Swiss meteorologist Jörg Kachelmann says he believes German media (image above) inadequately informed the public of the danger of storm ‘Xavier’. Seven Germans lost their lives to a storm that was “nothing out of the ordinary”. Image: ZDF German Public Television.

Media warnings of storm were inadequate

As a result, 7 people were killed by falling limbs and trees. Some of these deaths could have been prevented had the media issued stronger warnings of the storm’s danger before it hit, believes Kachelmann, who in his Twitter video pointed out that Xavier was not an unusual storm by any means and that there should not have been so many deaths.

Kachelmann said:

Why it happened has a bit to do with the media and what they could do. That’s the big difference to the USA when you look at the reporting there concerning hurricanes or even tornadoes, where the main reporting does not come after everything has happened — after all the deaths, injuries and everything lying around — but before, where people are helped, told what to do, and accompanied during this time by reporters out there in the storm with microphones, which here is ridiculed. But it helps.”

Kachelmann says the media here could have done this too to make the danger clear to people: “They could have done this here too. After all, the storm did not come as a surprise.”

The media hype comes afterwards, when it’s too late

As to why the German press did so little to warn the people of the storm, Kachelmann can only speculate: Maybe they were just “infinitely lazy“. Kachelmann thinks that had the media given stronger warnings, some of the lives would have been spared.

The average observer could say that German public media seems to have a habit of underrating storms before they hit, and then exaggerating them after they leave. For example, on Monday earlier this week – after Ophelia had already hit Ireland – North German NDR radio presented Ophelia as something that no one in Ireland “could recall ever happening“.

Certainly a bit of hyperbole here.

Here one could successfully argue that this is an exaggeration and that the German media are simply too lazy to look back into the archives, or just aren’t interested in presenting accurate reports. Joe Bastardi at Weatherbell on Tuesday highlighted Ophelia in his Daily Update and showed that Ophelia “was not as bad as Debbie in 1961”.

Track of Hurricane Debbie in 1961, which was worse than Ophelia. Source: cropped from Weatherbell Saturday Summary.

Maybe the pre-storm downplaying and post-storm hyping even serves the German media: by neglecting to warn people beforehand, they get to blare out bigger, more spectacular headlines of death and destruction after the storm passes.

Of course no one seriously thinks it’s intentional on the German media’s part, yet the bottom line is that the German public is getting distorted reporting both before and after the storm. They deserve far better for their exorbitant mandatory public television and radio fees.

Ophelia not unprecedented

And in his Saturday Summary of October 14, the veteran meteorologist blasted the hurricane hysteria coming from the usual US activists. He showed how Hurricane Faith in 1966 remained a hurricane far north of Ireland, and didn’t peter out until it reached the North Pole! There’s nothing unusual about Ophelia. Bastardi added:

I feel very strongly about these people who are using these storms […] for their agenda, and so what I’m doing here is that I’m letting you know that I’m showing you beforehand that there is visible evidence that this has happened before.”

 

Surprise: Defying Models, Antarctic Sea Ice Extent 100 Years Ago Similar To Today

By Dr. Sebastian Lüning and Prof. Fritz Vahrenholt

(German text edited/translated by P Gosselin)

Satellite measurements of Antarctic sea ice do not go back even 40 years. That’s not very much, especially when we consider that many natural climate cycles have periods of 60 years and more.

Luckily we have the field of climate reconstruction. Using historical documents and sediment cores, the development of ice cover can be estimated. In November 2016, Tom Edinburgh and Jonathan Day examined ship logbooks from the time of the Antarctic explorers and published their findings on ice extent in The Cryosphere:

Estimating the extent of Antarctic summer sea ice during the Heroic Age of Antarctic Exploration
In stark contrast to the sharp decline in Arctic sea ice, there has been a steady increase in ice extent around Antarctica during the last three decades, especially in the Weddell and Ross seas. In general, climate models do not capture this trend and a lack of information about sea ice coverage in the pre-satellite period limits our ability to quantify the sensitivity of sea ice to climate change and robustly validate climate models. However, evidence of the presence and nature of sea ice was often recorded during early Antarctic exploration, though these sources have not previously been explored or exploited until now. We have analysed observations of the summer sea ice edge from the ship logbooks of explorers such as Robert Falcon Scott, Ernest Shackleton and their contemporaries during the Heroic Age of Antarctic Exploration (1897–1917), and in this study we compare these to satellite observations from the period 1989–2014, offering insight into the ice conditions of this period, from direct observations, for the first time. This comparison shows that the summer sea ice edge was between 1.0 and 1.7° further north in the Weddell Sea during this period but that ice conditions were surprisingly comparable to the present day in other sectors.”

The surprising result: with respect to sea ice extent, things looked similar 100 years ago to what we have today, with the exception of the Weddell Sea. A study by Hobbs et al. 2016 also looked back at the last century, in this case using geoscientific sea ice reconstructions. Once again the strong discrepancies between the real ice development and the model simulations were criticized:

Century-scale perspectives on observed and simulated Southern Ocean sea ice trends from proxy reconstructions
Since 1979 when continuous satellite observations began, Southern Ocean sea ice cover has increased, whilst global coupled climate models simulate a decrease over the same period. It is uncertain whether the observed trends are anthropogenically forced or due to internal variability, or whether the apparent discrepancy between models and observations can be explained by internal variability. The shortness of the satellite record is one source of this uncertainty, and a possible solution is to use proxy reconstructions, which extend the analysis period but at the expense of higher observational uncertainty. In this work, we evaluate the utility for change detection of 20th century Southern Ocean sea ice proxies. We find that there are reliable proxies for the East Antarctic, Amundsen, Bellingshausen and Weddell sectors in late winter, and for the Weddell Sea in late autumn. Models and reconstructions agree that sea ice extent in the East Antarctic, Amundsen and Bellingshausen Seas has decreased since the early 1970s, consistent with an anthropogenic response. However, the decrease is small compared to internal variability, and the change is not robustly detectable. We also find that optimal fingerprinting filters out much of the uncertainty in proxy reconstructions. The Ross Sea is a confounding factor, with a significant increase in sea ice since 1979 that is not captured by climate models; however, existing proxy reconstructions of this region are not yet sufficiently reliable for formal change detection.”

A paper published by Thomas & Abram 2016 even looked back 300 years and showed that the sea ice increase from 1979-2016 has been part of a long-term growth trend over the 20th century:

Ice core reconstruction of sea ice change in the Amundsen-Ross Seas since 1702 A.D.
Antarctic sea ice has been increasing in recent decades, but with strong regional differences in the expression of sea ice change. Declining sea ice in the Bellingshausen Sea since 1979 (the satellite era) has been linked to the observed warming on the Antarctic Peninsula, while the Ross Sea sector has seen a marked increase in sea ice during this period. Here we present a 308 year record of methanesulphonic acid from coastal West Antarctica, representing sea ice conditions in the Amundsen-Ross Sea. We demonstrate that the recent increase in sea ice in this region is part of a longer trend, with an estimated ~1° northward expansion in winter sea ice extent (SIE) during the twentieth century and a total expansion of ~1.3° since 1702. The greatest reconstructed SIE occurred during the mid-1990s, with five of the past 30 years considered exceptional in the context of the past three centuries.”

Recent CO2 Climate Sensitivity Estimates Continue Trending Towards Zero

Updated: The Shrinking

CO2 Climate Sensitivity

A recently highlighted paper by atmospheric scientists Scafetta et al. (2017) featured a graph (above) documenting post-2000 trends in the published estimates of the Earth’s climate sensitivity to a doubling of CO2 concentrations (from 280 parts per million to 560 ppm).

The trajectories of the published estimates of transient climate response (TCR, the average temperature response centered around the time of CO2 doubling) and equilibrium climate sensitivity (ECS, the temperature response upon reaching an equilibrium state after doubling) are shown to be declining from an average of about 3°C earlier in the century to below 2°C, edging towards 1°C in the more recent years.

This visual evidence would appear to indicate that past climate model determinations of very high climate sensitivity (4°C, 5°C, 6°C and up) have increasingly been found to be in error. The anthropogenic influence on the Earth’s surface temperature has likely been significantly exaggerated.
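To relate such per-doubling sensitivity figures to a temperature change for an arbitrary concentration rise, one can use the standard first-order logarithmic relation for CO2 forcing, ΔF ≈ 5.35 ln(C/C0) W/m². A minimal sketch using that textbook approximation (not taken from any of the papers discussed here):

```python
import math

# Standard first-order approximation: radiative forcing from CO2 grows
# logarithmically with concentration, dF = 5.35 * ln(C/C0) W/m^2.
# A doubling therefore yields 5.35 * ln(2) ~ 3.7 W/m^2.
def co2_forcing(c_ppm: float, c0_ppm: float = 280.0) -> float:
    return 5.35 * math.log(c_ppm / c0_ppm)

def warming(sensitivity_per_doubling: float,
            c_ppm: float, c0_ppm: float = 280.0) -> float:
    """Scale a per-doubling sensitivity (TCR or ECS, in deg C) to an
    arbitrary concentration change via the number of doublings."""
    doublings = math.log2(c_ppm / c0_ppm)
    return sensitivity_per_doubling * doublings

# 280 -> 560 ppm is exactly one doubling, so the projected warming
# equals the sensitivity itself: 1.0, 2.0 or 3.0 deg C below.
for s in (1.0, 2.0, 3.0):
    print(s, warming(s, 560.0))
```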


Scafetta et al., 2017   Since 2000 there has been a systematic tendency to find lower climate sensitivity values. The most recent studies suggest a transient climate response (TCR) of about 1.0 °C, an ECS less than 2.0 °C and an effective climate sensitivity (EfCS) in the neighborhood of 1.0 °C.”

Thus, all evidences suggest that the IPCC GCMs at least increase twofold or even triple the real anthropogenic warming. The GHG theory might even require a deep re-examination.”


An Update On The Gradually Declining Climate Sensitivity

The graph shown in Scafetta et al. (2017) ends in 2014, which means that papers published in the last 3 years are not included. Also, several other published climate sensitivity papers from the last decade were excluded from the analysis, possibly because they did not include and/or specify TCR and/or ECS estimates in isolation, but instead just used a generic doubled-CO2 climate sensitivity value (shown in purple here).

Below is a new, updated graph that (1) includes some of the previously unidentified papers and (2) adds the 10 – 12 climate sensitivity papers published in the last 3 years.  Notice, again, that the trend found in published papers has continued downwards, gradually heading towards zero.  The reference list for the over 20 additional papers used for the updated analysis is also included below.

For a more comprehensive list of over 60 papers with very low (<1°C) climate sensitivity estimates, see here.
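The downward tendency described above is, in effect, a negative least-squares slope over (publication year, sensitivity estimate) pairs. A minimal sketch of how such a trend can be computed from digitized graph data; the function takes the points as input, and no actual values from the graphs above are hard-coded:

```python
import numpy as np

# Least-squares trend through published (year, sensitivity) estimates.
# Feed it the digitized points from a graph like the one above; a
# negative slope means the estimates are declining over time.
def sensitivity_trend(years, estimates):
    slope, intercept = np.polyfit(years, estimates, 1)
    return slope, intercept  # slope in deg C per year

# Usage (hypothetical points, for illustration only):
# slope, _ = sensitivity_trend([2001, 2008, 2015], [3.0, 2.0, 1.5])
```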



Reinhart, 2017 (<0.24°C)

Our results permit to conclude that CO2 is a very weak greenhouse gas and cannot be accepted as the main driver of climate change. … The assumption of a constant temperature and black body radiation definitely violates reality and even the principles of thermodynamics. … [W]e conclude that the temperature increases predicted by the IPCC AR5 lack robust scientific justification. … A doubling [to 800 ppm] of the present level of CO2 [400 ppm] results in  [temperature change] < 0.24 K. … [T]he scientific community must look for causes of climate change that can be solidly based on physics and chemistry. … The observed temperature increase since pre-industrial times is close to an order of magnitude higher than that attributable to CO2.

Abbot and Marohasy, 2017  (0.6°C equilibrium)

The largest deviation between the ANN [artificial neural network] projections and measured temperatures for six geographically distinct regions was approximately 0.2 °C, and from this an Equilibrium Climate Sensitivity (ECS) of approximately 0.6 °C [for a doubling of CO2 from 280 ppm to 560 ppm plus feedbacks] was estimated. This is considerably less than estimates from the General Circulation Models (GCMs) used by the Intergovernmental Panel on Climate Change (IPCC), and similar to estimates from spectroscopic methods.
The proxy measurements suggest New Zealand’s climate has fluctuated within a band of approximately 2°C since at least 900 AD, as shown in Figure 2. The warming of nearly 1°C since 1940 falls within this band. The discrepancy between the orange and blue lines in recent decades, as shown in Figure 3, suggests that the anthropogenic contribution to this warming could be in the order of approximately 0.2°C. [80% of the warming since 1940 may be due to natural factors].

Harde, 2016 (0.7°C equilibrium)

Including solar and cloud effects as well as all relevant feedback processes our simulations give an equilibrium climate sensitivity of CS = 0.7 °C (temperature increase at doubled CO2) and a solar sensitivity of SS = 0.17 °C (at 0.1 % increase of the total solar irradiance). Then CO2 contributes 40 % and the Sun 60 % to global warming over the last century.

Bates, 2016  (~1°C)

Estimates of 2xCO2 equilibrium climate sensitivity (EqCS) derive from running global climate models (GCMs) to equilibrium. Estimates of effective climate sensitivity (EfCS) are the corresponding quantities obtained using transient GCM output or observations. The EfCS approach uses an accompanying energy balance model (EBM), the zero-dimensional model (ZDM) being standard. GCM values of EqCS and EfCS vary widely [IPCC range: (1.5, 4.5)°C] and have failed to converge over the past 35 years. Recently, attempts have been made to refine the EfCS approach by using two-zone (tropical/extratropical) EBMs. When applied using satellite radiation data, these give low and tightly-constrained EfCS values, in the neighbourhood of 1°C. … The central conclusion of this study is that to disregard the low values of effective climate sensitivity (≈1°C) given by observations on the grounds that they do not agree with the larger values of equilibrium, or effective, climate sensitivity given by GCMs, while the GCMs themselves do not properly represent the observed value of the tropical radiative response coefficient, is a standpoint that needs to be reconsidered.

Evans, 2016 (<0.5°C equilibrium)

The conventional basic climate model applies “basic physics” to climate, estimating sensitivity to CO2. However, it has two serious architectural errors. It only allows feedbacks in response to surface warming, so it omits the driver-specific feedbacks. It treats extra-absorbed sunlight, which heats the surface and increases outgoing long-wave radiation (OLR), the same as extra CO2, which reduces OLR from carbon dioxide in the upper atmosphere but does not increase the total OLR. The rerouting feedback is proposed. An increasing CO2 concentration warms the upper troposphere, heating the water vapor emissions layer and some cloud tops, which emit more OLR and descend to lower and warmer altitudes. This feedback resolves the nonobservation of the “hotspot.” An alternative model is developed, whose architecture fixes the errors. By summing the (surface) warmings due to climate drivers, rather than their forcings, it allows driver-specific forcings and allows a separate CO2 response (the conventional model applies the same response, the solar response, to all forcings). It also applies a radiation balance, estimating OLR from properties of the emission layers. Fitting the climate data to the alternative model, we find that the equilibrium climate sensitivity is most likely less than 0.5°C, increasing CO2 most likely caused less than 20% of the global warming from the 1970s, and the CO2 response is less than one-third as strong as the solar response. The conventional model overestimates the potency of CO2 because it applies the strong solar response instead of the weak CO2 response to the CO2 forcing.

Gervais, 2016 [full]  (<0.6°C transient)

Conclusion: Dangerous anthropogenic warming is questioned (i) upon recognition of the large amplitude of the natural 60–year cyclic component and (ii) upon revision downwards of the transient climate response consistent with latest tendencies shown in Fig. 1, here found to be at most 0.6 °C once the natural component has been removed, consistent with latest infrared studies (Harde, 2014). Anthropogenic warming well below the potentially dangerous range were reported in older and recent studies (Idso, 1998; Miskolczi, 2007; Paltridge et al., 2009; Gerlich and Tscheuschner, 2009; Lindzen and Choi, 2009, 2011; Spencer and Braswell, 2010; Clark, 2010; Kramm and Dlugi, 2011; Lewis and Curry, 2014; Skeie et al., 2014; Lewis, 2015; Volokin and ReLlez, 2015). On inspection of a risk of anthropogenic warming thus toned down, a change of paradigm which highlights a benefit for mankind related to the increase of plant feeding and crops yields by enhanced CO2 photosynthesis is suggested.

Marvel et al., 2016 (1.8°C transient, 3.0°C equilibrium)

Assuming that all forcings have the same transient efficacy as greenhouse gases, and following a previous study, the best estimate (median) for TCR is 1.3°C. However, scaling each forcing by our estimates of transient efficacy (determined from either iRF or ERF), we obtain a best estimate for TCR of 1.8°C. This scaling simultaneously considers both forcing and ocean heat uptake efficacy. Other estimates of TCR which differ slightly due to choices of base period and uncertainty estimates and the aerosol forcing used, are similarly revised upward when using calculated efficacies.  We apply the same reasoning to estimates of ECS. Using an estimate of the rate of recent heat uptake Q = 0.65 ± 0.27 W m-2, we find, assuming all equilibrium efficacies are unity, a best estimate of ECS = 2.0°C, comparable to the previous result of 1.9°C.  However, as with TCR, accounting for differences in equilibrium forcing efficacy revises the estimate upward; our new best estimate (using efficacies derived from the iRF) is 2.9°C. If efficacies are instead calculated from the ERF, the best estimate of ECS is 3.0°C. As for TCR, alternate estimates of ECS are revised upward when efficacies are taken into account.

Soon, Connolly, and Connolly, 2015 [full] (0.44°C)

Nonetheless, let us ignore the negative relationship with greenhouse gas (GHG) radiative forcing, and assume the carbon dioxide (CO2) relationship is valid. If atmospheric carbon dioxide concentrations have risen by ~110 ppmv since 1881 (i.e., 290→400 ppmv), this would imply that carbon dioxide (CO2) is responsible for a warming of at most 0.0011 × 110 = 0.12°C over the 1881-2014 period, where 0.0011 is the slope of the line in Figure 29(a). We can use this relationship to calculate the so-called “climate sensitivity” to carbon dioxide, i.e., the temperature response to a doubling of atmospheric carbon dioxide. According to this model, if atmospheric carbon dioxide concentrations were to increase by ~400 ppmv, this would contribute to at most 0.0011 × 400 = 0.44°C warming. That is, the climate sensitivity to atmospheric carbon dioxide is at most 0.44°C.
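The arithmetic in the passage above is easy to verify. A minimal sketch using only the numbers quoted (a slope of 0.0011 °C per ppmv, taken from the passage itself):

```python
# Linear model from the quoted passage: warming = slope * ppm_increase,
# with slope = 0.0011 deg C per ppmv (Figure 29(a), as quoted above).
SLOPE = 0.0011  # deg C per ppmv

print(SLOPE * 110)  # 1881-2014 rise of ~110 ppmv -> ~0.12 deg C
print(SLOPE * 400)  # a further ~400 ppmv increase  -> ~0.44 deg C
```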

Lewis and Curry, 2015 (1.33°C  transient, 1.64°C  equilibrium)

Energy budget estimates of equilibrium climate sensitivity (ECS) and transient climate response (TCR) are derived using the comprehensive 1750–2011 time series and the uncertainty ranges for forcing components provided in the Intergovernmental Panel on Climate Change Fifth Assessment Working Group I Report, along with its estimates of heat accumulation in the climate system. The resulting estimates are less dependent on global climate models and allow more realistically for forcing uncertainties than similar estimates based on forcings diagnosed from simulations by such models. Base and final periods are selected that have well matched volcanic activity and influence from internal variability. Using 1859–1882 for the base period and 1995–2011 for the final period, thus avoiding major volcanic activity, median estimates are derived for ECS of 1.64 K and for TCR of 1.33 K.

Johansson et al., 2015 (2.5°C  equilibrium)

A key uncertainty in projecting future climate change is the magnitude of equilibrium climate sensitivity (ECS), that is, the eventual increase in global annual average surface temperature in response to a doubling of atmospheric CO2 concentration. The lower bound of the likely range for ECS given in the IPCC Fifth Assessment Report was revised downwards to 1.5 °C, from 2 °C in its previous report, mainly as an effect of considering observations over the warming hiatus—the period of slowdown of global average temperature increase since the early 2000s. Here we analyse how estimates of ECS change as observations accumulate over time and estimate the contribution of potential causes to the hiatus. We find that including observations over the hiatus reduces the most likely value for ECS from 2.8 °C to 2.5 °C, but that the lower bound of the 90% range remains stable around 2 °C. We also find that the hiatus is primarily attributable to El Niño/Southern Oscillation-related variability and reduced solar forcing.

Kissin, 2015 (~0.6°C)

[A] doubling the CO2 concentration in the Earth’s atmosphere would lead to an increase of the surface temperature by about +0.5 to 0.7 °C, hardly an effect calling for immediate drastic changes in the planet’s energy policies. An increase in the absolute air humidity caused by doubling the CO2 concentration and the resulting decrease of the outgoing IR flux would produce a relatively small additional effect due to a strong overlap of IR spectral bands of CO2 and H2O, the two compounds primarily responsible for the greenhouse properties of the atmosphere.

Kimoto, 2015  [full] (~0.16°C)

The central dogma is critically evaluated in the anthropogenic global warming (AGW) theory of the IPCC, claiming the Planck response is 1.2K when CO2 is doubled. The first basis of it is one dimensional model studies with the fixed lapse rate assumption of 6.5K/km. It is failed from the lack of the parameter sensitivity analysis of the lapse rate for CO2 doubling. The second basis is the Planck response calculation by Cess in 1976 having a mathematical error. Therefore, the AGW theory is collapsed along with the canonical climate sensitivity of 3K utilizing the radiative forcing of 3.7W/m2 for CO2 doubling. The surface climate sensitivity is 0.14 – 0.17 K in this study with the surface radiative forcing of 1.1 W/m2.

Ollila, 2014 (~0.6°C equilibrium)

According to this study the commonly applied radiative forcing (RF) value of 3.7 Wm-2 for CO2 concentration of 560 ppm includes water feedback. The same value without water feedback is 2.16 Wm-2, which is 41.6 % smaller. Spectral analyses show that the contribution of CO2 in the greenhouse (GH) phenomenon is about 11 % and water’s strength in the present climate in comparison to CO2 is 15.2. The author has analyzed the value of the climate sensitivity (CS) and the climate sensitivity parameter (λ) using three different calculation bases. These methods include energy balance calculations, infrared radiation absorption in the atmosphere, and the changes in outgoing longwave radiation at the top of the atmosphere. According to the analyzed results, the equilibrium CS (ECS) is at maximum 0.6 °C and the best estimate of λ is 0.268 K/(Wm-2) without any feedback mechanisms.

Loehle, 2014  (1.1°C  transient, 2.0°C  equilibrium)

Estimated sensitivity is 1.093 °C (transient) and 1.99 °C (equilibrium).  Empirical study sensitivity estimates fall below those based on GCMs.

Skeie et al., 2014  (1.8°C  equilibrium)

Equilibrium climate sensitivity (ECS) is constrained based on observed near-surface temperature change, changes in ocean heat content (OHC) and detailed radiative forcing (RF) time series from pre-industrial times to 2010 for all main anthropogenic and natural forcing mechanism. The RF time series are linked to the observations of OHC and temperature change through an energy balance model (EBM) and a stochastic model, using a Bayesian approach to estimate the ECS and other unknown parameters from the data. For the net anthropogenic RF the posterior mean in 2010 is 2.0 Wm−2, with a 90% credible interval (C.I.) of 1.3 to 2.8 Wm−2, excluding present-day total aerosol effects (direct + indirect) stronger than −1.7 Wm−2. The posterior mean of the ECS is 1.8 °C, with 90% C.I. ranging from 0.9 to 3.2 °C, which is tighter than most previously published estimates.

Scafetta, 2013 (1.5°C)

A quasi 60-year natural oscillation simultaneously explains the 1850–1880, 1910–1940 and 1970–2000 warming periods, the 1880–1910 and 1940–1970 cooling periods and the post 2000 GST plateau. This hypothesis implies that about 50% of the ~ 0.5 °C global surface warming observed from 1970 to 2000 was due to natural oscillations of the climate system, not to anthropogenic forcing as modeled by the CMIP3 and CMIP5 GCMs. Consequently, the climate sensitivity to CO2 doubling should be reduced by half, for example from the 2.0–4.5 °C range (as claimed by the IPCC, 2007) to 1.0–2.3 °C with a likely median of ~ 1.5 °C instead of ~ 3.0 °C.

Asten, 2012 (1.1°C)

Climate sensitivity estimated from the latter is 1.1 ± 0.4 °C (66% confidence) compared with the IPCC central value of 3 °C. The post Eocene-Oligocene transition (33.4 Ma) value of 1.1 °C obtained here is lower than those published from Holocene and Pleistocene glaciation-related temperature data (800 Kya to present) but is of similar order to sensitivity estimates published from satellite observations of tropospheric and sea-surface temperature variations. The value of 1.1 °C is grossly different from estimates up to 9 °C published from paleo-temperature studies of Pliocene (3 to 4 Mya) age sediments. 

Lindzen and Choi, 2011 (0.7°C)

As a result, the climate sensitivity for a doubling of CO2 is estimated to be 0.7K (with the confidence interval 0.5K – 1.3K at 99% levels). This observational result shows that model sensitivities indicated by the IPCC AR4 are likely greater than the possibilities estimated from the observations.

Florides and Christodoulides, 2009 (~0.02°C)

A very recent development on the greenhouse phenomenon is a validated adiabatic model, based on laws of physics, forecasting a maximum temperature-increase of 0.01–0.03 °C for a value doubling the present concentration of atmospheric CO2.

Gray, 2009 (~0.4°C)

CO2 increases without positive water vapor feedback could only have been responsible for about  0.1 – 0.2 °C of the 0.6-0.7°C global mean surface temperature warming that has been observed since the early 20th  century.  Assuming a doubling of CO2 by the late 21st  century (assuming no  positive water vapor feedback), we should likely expect to see no more than about 0.3-0.5°C global surface warming and certainly not the 2-5°C warming that has been projected by the GCMs [global circulation models].

Chylek et al., 2007 (~0.39°C)

Consequently, both increasing atmospheric concentration of greenhouse gases and decreasing loading of atmospheric aerosols are major contributors to the top-of atmosphere radiative forcing. We find that the climate sensitivity is reduced by at least a factor of 2 when direct and indirect effects of decreasing aerosols are included, compared to the case where the radiative forcing is ascribed only to increases in atmospheric concentrations of carbon dioxide. We find the empirical climate sensitivity to be between 0.29 and 0.48 K/Wm-2 when aerosol direct and indirect radiative forcing is included.

Agung Volcano On The Verge Of Blowing…Major Eruption Would Have Impact On Earth’s Climate

Volcano Agung in Bali is showing worrisome signs of a major eruption, writes German climate blogger Schneefan here. The highest level of activity, with multiple tremor episodes, was just recorded. You can monitor Agung via live cam and live seismogram.

The 3000-meter-tall Agung has been at the highest warning level, 4, since September 21.

Schneefan writes that the lava rise has started and that “an eruption can be expected at any time“.

So far some 140,000 people have been evacuated from the area of hazard, which extends up to 12 km from the volcano. Schneefan writes:

Yesterday ground activity by far exceeded the previous high level. Quakes have become more frequent and stronger, which indicates a stronger magma flow (see green in the histogram). Since October 13 there has been for the first time nonharmonic trembling (tremor), which can be seen in red at the top of the last two bars of the histogram.”

The colors of the columns in the bar chart stand, from bottom to top, for perceptible earthquakes (blue), low earthquakes (green) and surface quakes (orange). Just recently red appeared, signifying nonharmonic tremors. The seismogram below shows what are at times longer-period quakes, meaning magma is flowing violently in the volcano. Source: https://magma.vsi.esdm.go.id/.

Since yesterday the seismogram for AGUNG has been showing powerful rumbling (red).

The seismogram of AGUNG shows powerful tremors (level RED). The seismogram is updated every 3 minutes. Source: Seismogramm

Because Agung is located near the equator, a major eruption with ash flying up into the stratosphere would have short-term climatic impacts that could last a few years.

Agung last erupted in 1963 with an explosivity index of VEI 5, sending a plume of ash some 25 km into the atmosphere and leading to a cooling of 0.5°C. The eruption of Pinatubo in the Philippines in 1991 likewise led to a global cooling of 0.5°C.

 

 

Industry Group Warns German “Electricity Prices To Rise Significantly”, Fueled By Green Energies!

One thing is clear: Germans were fooled and deceived by politicians and activists into thinking that the transition to renewable energies would not cost much, and that it would reduce pollutants, create a clean environment, improve the climate and create many jobs.

None of these have come true.

Electricity prices have skyrocketed, the landscape is being industrialized, and Germany has not reduced its greenhouse gas emissions in more than 7 years. Moreover, the climate is still the same. Now Germany’s industrial base is eroding.

Today we will look at the first point: cost. Yesterday the online industry journal Deutsche Mittelstand Nachrichten (Midsize Company News) here carried the headline:

“Association of Energy: Electricity Prices To Rise Significantly”

So the bad news continues, and it will further adversely impact consumers, small businesses and the all-important Mittelstand.

And because the Mittelstand employs some 70% of all workers in Germany, most of them highly skilled and well-paid, the news is bad indeed. The Mittelstand is already facing crisis on a number of fronts. First is the lack of skilled workers on the labor market. Second, many of these companies are now due to be handed down to the next generation, but there are no successors. In fact, Chinese companies have been busily snapping up these firms along with their patents and technical expertise.

Now, thirdly, come the extreme energy prices (and volatile supply) – thanks to Germany’s mad and poorly thought-out rush into utopian green energies.

Rising feed-in costs

The main factor driving the higher prices remains the EEG green energy feed-in act, the site reports. Association head Christian Otto told German daily Bild that the feed-in surcharge will rise to 7 cents per kWh (currently 6.88 cents).

Three of the four power transmission grid operators have already announced an increase in the grid fees.”

The four German grid operators are Amprion, TransnetBW, Tennet and 50Hertz. Only East Germany-based 50Hertz does not plan to raise the fees for the time being.

Uncompetitive

The higher feed-in surcharges push German electricity prices to 30 cents a kWh, almost three times more expensive than power in the USA, for example. Little wonder that some are now calling to “make Germany great again”.
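A rough back-of-the-envelope check of that comparison; the US rate and the annual household consumption below are illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope comparison of household electricity costs.
GERMAN_PRICE = 0.30   # EUR per kWh, from the article
US_PRICE     = 0.11   # ~USD per kWh, assumed typical US residential rate
ANNUAL_KWH   = 3500   # assumed consumption of a typical household

print(GERMAN_PRICE * ANNUAL_KWH)  # ~1050 EUR per year
print(US_PRICE * ANNUAL_KWH)      # ~385 USD per year
print(GERMAN_PRICE / US_PRICE)    # ~2.7x, ignoring the EUR/USD exchange rate
```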

Moreover, citing analysts, the site reports that heating oil as well has become 11 percent more expensive, and warns more increases lie ahead as winter approaches.

Grid instability adding to costs

The Deutsche Mittelstand Nachrichten site cited the head of the BDEW energy association, Stefan Kapferer, who blasted the “constantly more frequent and expensive interventions that are needed to keep the grid stable due to the fluctuating feed-in of renewable energies“.

 

Climate Science Like “Jehovah’s Witness” Religion…Realists Slam Weather Alarmists On German Television!

In the wake of fall storm ‘Xavier’, which struck Germany and claimed 7 lives, one of Germany’s most popular TV talk shows, Maischberger on ARD German public television, recently featured climate change in a discussion round bearing the title: “Xavier and the weather extremes: has our climate reached the tipping point?”

Cologne and Berlin under water! Backdrop on set for ARD German television discussion round on climate change. Image cropped from Maischberger here.

The discussion round included, among others, Prof Hans-Joachim Schellnhuber, high profile Swiss meteorologist Jörg Kachelmann, who by the way is a warmist, and Swiss science journalist Alex Reichmuth.

Not surprisingly, talk show moderator Sandra Maischberger introduced the show with dramatic scenes of a climate in collapse, and then asked the round whether the recent storms Xavier and now Ophelia are unprecedented. So dramatic in fact were the images of Maischberger’s intro that even German daily Die Welt here commented that “ARD had allowed itself to be inspired a bit by Hollywood”.

When asked by Maischberger about the recent storms, Kachelmann immediately dumped cold water on the notion that they were unusual, noting that storm Xavier seemed worse because it hit in October, when trees are still full of foliage and thus offer far more wind resistance and fall more often. Overall, Xavier was just a normal fall storm, Kachelmann told the audience.

“Storms don’t come with a label”

When asked if this year’s heavy rains were due to global warming, Kachelmann responded:

“We don’t know case by case because all the thunderstorms and storms don’t come with a label stating: ‘I’m here only because of you, or your actions, to say it correctly’. The problem is that we don’t know.”

No detectable increase in storm frequency

Kachelmann went on to explain that experts evaluated the data from the German Weather Service and concluded there has been “no increase in frequency in these events“. He added:

Also with tropical storms, looking at it globally, from the data of the American weather agencies, up to now we see no increase in frequency.”

A few seconds later he responded to Maischberger’s inquiry about the “monster hurricanes” hitting the Caribbean and USA:

Yes, this is an active season. But it is not a record season. It is not anything that has not happened before. When we look back at the last 50 or 60 years, we see no trend.”

During the course of the talk show, moderator Maischberger globetrotted across the entire planet, it seemed, going from one weather disaster to the next. She pressed Kachelmann about the fires in northern California. Here too the Swiss veteran meteorologist dismissed the notion that they are unusual, reminding the audience that California is a dry state and that these things have always happened before:

When you look at the archives, we see no increase in frequency.”

Kachelmann also reminded viewers that anyone can make a list of 100 disasters in any year.

Like Jehovah’s Witnesses

At about the 16-minute mark, Maischberger turned to Swiss journalist Alex Reichmuth, who like Schellnhuber studied physics and mathematics. Journalist Reichmuth, however, is far more critical of climate science, and told the audience that it is more a religion than a science and that it all reminded him of the Jehovah’s Witnesses sect.

This is about a religious conversion – ride your bicycle more and we’ll be redeemed.”

Reichmuth reminded the audience that even the IPCC stated that there is no clear trend regarding weather extremes.

From 97% to 99%?

Reichmuth then slammed Schellnhuber for his outlandish predictions of the future, telling the “renowned” Potsdam professor that he “has clearly deviated from the scientific approach“. Just a minute earlier Schellnhuber had seemed to claim that 99% of the scientists agree with him.

Surprisingly, guest Dorothee Bär of the conservative CSU party said she doubted that man was solely responsible for the 1°C warming of the past century, and that the economy and the well-being of citizens had to be placed at the forefront of any energy policy. But later in the show the CSU politician hopped onto the politically correct “we have to do something” bandwagon – as did Kachelmann.

At the 34-minute mark, the talk switched to Trump’s backing out of the Paris Accord and whether the fight against climate change would hurt the economy. Most of the discussion was filled with Marxist-brand utopian platitudes, with few in the round grasping the technical implications of green energies. For example, Schellnhuber suddenly posed as an expert and leading authority on transportation technology, agriculture and economics, and gave the impression that storage systems are all ready to go!

 

2 New Papers: Models ‘Severely Flawed’, Temp Changes Largely Natural, CO2 Influence ‘Half’ Of IPCC Claims

Atmospheric Scientists Slam Fundamentals

of the Anthropogenic Global Warming Theory

Scafetta et al., 2017    Natural climate variability, part 1: Observations versus the modeled predictions

[T]he AGWT [Anthropogenic Global Warming Theory] was globally advocated by the IPCC in 2001 because it appeared to be supported by the ‘infamous’ Hockey Stick temperature reconstructions by Mann et al. [10]* and by specific computer climate models mainly based on radiative forcings [4,11]. Those temperature reconstructions claimed that only a very modest change in the Northern Hemispheric climate had occurred during the pre-industrial times from A.D. 1000 to 1900, while an abrupt warming did occur just in the last century. Energy balance and general circulation climate models (GCM) were used to interpret the Hockey Stick climatic pattern as due mostly to anthropogenic greenhouse gas emissions such as CO2 because of coal and oil fuel consumption, which has been accelerating since the beginning of the 20th century [11].

However, since 2005 novel Northern Hemisphere proxy temperature reconstructions were published revealing the existence of a large millennial oscillation that contradicts the Hockey Stick temperature pattern.

* see reference list


Figure sources: Wilson et al., 2016; Abrantes et al., 2017

The new findings were consistent with alternative climatic and solar activity records showing that a quasi-millennial oscillation occurred throughout the entire Holocene for the last 10,000 years [16, 17].

The severe discrepancy between observations and modeled predictions found during the 1922-1941 and 2000-2016 periods further confirms, according to the criteria proposed by the AGWT advocates themselves, that the current climate models have significantly exaggerated the anthropogenic greenhouse warming effect.

In 2009 AGWT advocates acknowledged that: “Near-zero and even negative trends are common for intervals of a decade or less in the simulations, due to the model’s internal climate variability. The simulations rule out (at the 95% level) zero trends for intervals of 15 year or more, suggesting that an observed absence of warming of this duration is needed to create a discrepancy with the expected present-day warming rate” [24]. Thus, according to the AGWT advocates’ own criteria, a divergence between observations and climate models occurring at the bi-decadal scale would provide strong convincing evidences that the GCMs used to support the AGWT are severely flawed.

In conclusion, the temperature records clearly manifest several fluctuations from the inter-annual scale to the multidecadal one. Detailed spectral analyses have determined the likely existence of harmonics at about 9.1, 10.5, 20 and 60- year periods [7, 8, 9]. By contrast, the CMIP5 GCMs simulations used by the IPCC (2013) to advocate the AGWT show a quite monotonic accelerating warming since 1860, which is at most temporarily interrupted by volcano eruptions and only slightly modulated by aerosol emissions. Thus, the models are not able to reproduce the natural variability observed in the climate system and should not be trusted for future energy planning [33].

It has been suggested that non-radiative physical processes connected with solar activity and the “resonant” orbital motions of the moon and the planets can cast light on the otherwise incomprehensible temperature fluctuations [34, 35]. In fact, the magnetic activity of the sun and, probably, also the planetary motions modulate both the solar wind and the flux of the cosmic rays and interstellar dust on the earth with the result of a modulation of the clouds coverage.
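For readers curious about the mechanics, the kind of spectral analysis described above, searching a temperature series for periodicities near 9.1, 10.5, 20 and 60 years, can be sketched in a few lines of Python. This is a generic periodogram illustration with a synthetic input series, not Scafetta’s actual method or data:

import numpy as np
from scipy.signal import periodogram

def strongest_periods(series, dt=1.0):
    """Rank the periods present in an evenly sampled series by power."""
    freqs, power = periodogram(series - series.mean(), fs=1.0 / dt)
    mask = freqs > 0                        # discard the zero-frequency bin
    order = np.argsort(power[mask])[::-1]   # strongest spectral peaks first
    return (1.0 / freqs[mask])[order]

# Synthetic 160-year annual series with built-in 60- and 9.1-year cycles:
t = np.arange(160)
demo = (0.2 * np.sin(2 * np.pi * t / 60.0)
        + 0.1 * np.sin(2 * np.pi * t / 9.1)
        + 0.05 * np.random.randn(t.size))
print(strongest_periods(demo)[:4])   # the planted periods should rank high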


Scafetta et al., 2017  Natural climate variability, part 2: Observations versus the modeled predictions

Several studies based on general circulation model (GCM) simulations of the Earth’s climate concluded that the 20th century climate warming and its future development depend almost completely on anthropogenic activities. Humans have been responsible of emitting in the atmosphere large amount of greenhouse gases (GHG) such as CO2 throughout the combustion of fossil fuels. This paradigm is known as the Anthropogenic Global Warming Theory (AGWT).

[S]ince 2001 AGWT was actually supported by the belief that the “hockey stick” proxy temperature reconstructions, which claim that an unprecedented warming occurred since 1900 in the Northern Hemisphere, were reliable [2,5] and could be considered an indirect validation of the available climate models supporting the AGWT [6]. However, since 2005 novel proxy temperature reconstructions questioned the reliability of such hockey stick trends by demonstrating the existence of a large millennial climatic oscillation [7-10]. This natural climatic variability is confirmed by historical inferences [11] and by climate proxy reconstructions spanning the entire Holocene [12, 13]. A millennial climatic oscillation would suggest that a significant percentage of the warming observed since 1850 could simply be a recovery from the Little Ice Age of the 14th – 18th centuries and that throughout the 20th century the climate naturally returned to a warm phase as it happened during the Roman and the Medieval warm periods [9, 11, 14-16].

We … critically analyze the year 2015-2016, which has been famed as the hottest year on record. We show that this anomaly is simply due to a strong El-Niño event that has induced a sudden increase of the global surface temperature by 0.6 °C. This event is unrelated to anthropogenic emissions. In fact, an even stronger El-Niño event occurred in 1878 when the sudden increase of the global surface temperature was 0.8 °C.

Herein, the authors have studied the post 2000 standstill global temperature records. It has been shown that once the ENSO signature is removed from the data, the serious divergence between the observations and the CMIP5 GCM projections becomes evident. Note that Medhaug et al. [28] claim that the models agree with the post 2000 temperature trend. However, these authors did not remove the ENSO signal and used annual mean temperature records up to 2015 that camouflage the real nature of the 2015-2016 ENSO peak.

Moreover, a semi-empirical model first proposed in 2011 based on a specific set of natural oscillations suggested by astronomical considerations plus a 50% reduced climatic effect of the radiative forcing, which includes the anthropogenic forcing, performs quite better in forecasting subsequent climate changes. Thus, the GCMs used to promote the AGWT have been also outperformed [by a natural oscillation/astronomical/anthropogenic “semi-empirical” model][15]. This result is indeed consistent with recent findings. In fact, although the equilibrium climate sensitivity (ECS) to CO2 doubling of the GCMs vary widely around a 3.0°C mean [3,4], recent studies have pointed out that those values are too high.

Since 2000 there has been a systematic tendency to find lower climate sensitivity values. The most recent studies suggest a transient climate response (TCR) of about 1.0 °C, an ECS less than 2.0 °C [20] and an effective climate sensitivity (EfCS) in the neighborhood of 1.0 °C [29].

Thus, all evidences suggest that the IPCC GCMs at least increase twofold or even triple the real anthropogenic warming. The GHG theory might even require a deep re-examination [30].
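The structure of the “semi-empirical” model described above is simple to sketch under the stated assumptions: harmonics at fixed, astronomically suggested periods, plus an anthropogenic component held at 50% of the GCM-projected signal. The following is a schematic reconstruction, not Scafetta’s actual code; years, temps and gcm_anthropogenic are placeholder inputs:

import numpy as np

PERIODS = (9.1, 10.5, 20.0, 60.0)    # years, as suggested in the papers

def harmonic_design(years):
    """Design matrix: a constant plus cosine/sine pairs per fixed period."""
    cols = [np.ones_like(years)]
    for p in PERIODS:
        cols.append(np.cos(2 * np.pi * years / p))
        cols.append(np.sin(2 * np.pi * years / p))
    return np.column_stack(cols)

def fit_semi_empirical(years, temps, gcm_anthropogenic):
    """Hold the anthropogenic part at 50% of the GCM signal and fit the
    natural harmonics to the remainder by ordinary least squares."""
    residual = temps - 0.5 * gcm_anthropogenic
    coef, *_ = np.linalg.lstsq(harmonic_design(years), residual, rcond=None)
    return harmonic_design(years) @ coef + 0.5 * gcm_anthropogenic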

Central Europe Mean September Temperature Shows No Significant Rise Over Past 70 Years!

Germany Temperatures Baffle: September Mean Shows Hardly Any Warming In 70 Years

By Josef Kowatsch and Dr. Sebastian Lüning
(Translated and edited by P Gosselin)

Temperatures are rising and rising and rising. That’s what we read in the daily newspapers in any case, and that’s what some television professors, activists and climate scientists are telling us. Strangely, though, the temperature curves themselves are rarely ever shown. Why is this so? One example is the September mean temperature for Germany, which we use here to illustrate this peculiar gap in media coverage.

Here we use the official DWD German Weather Service data. When we look at the past 100 years we see a very modest warming of just a few tenths of a degree (Fig. 1). This is no surprise as we find ourselves in the warming phase since the Little Ice Age, the coldest phase of the last 10,000 years. It would have been terrible had the climate stayed at this non-representative low level.

Figure 1: Chart depicting Germany September mean temperature over the past 100 years. Data source: DWD.

It is easy to see the long cycles in the temperature curve. Above we see a cold phase between 1920 and 1930, followed by a warm period during the Nazi era, and then a long-term cold dip.

Beginning in 1985, September began to warm up again before reaching a plateau that took hold just before the year 2000 and at which we currently find ourselves. Based on the past development one could speculate that we are headed towards a slight cooling.

Now let’s look at the period from the end of WWII until today, more than 70 years, the time of the last temperature plateau until today. Immediately we see that we are far from worrisome climate warming (Fig. 2):

Figure 2: Chart depicting Germany September mean temperature over the past 70 years. Data source: DWD.

Finally we take a look at the past 13 years (Fig. 3), i.e. the development since 2004. Again there has not been any significant warming. In fact there’s been some cooling. Anything but a climate catastrophe.

Figure 3: Chart of September mean temperatures in Germany over the past 13 years. Data source: DWD.
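Readers who want to verify such trend statements themselves can do so directly, since the DWD publishes its German monthly means as open data. A minimal sketch of the significance test follows; the file name and two-column layout (year;temp) are assumptions to be adapted to the actual download:

import numpy as np
from scipy import stats

data = np.genfromtxt("dwd_september_germany.csv", delimiter=";", names=True)
years, temps = data["year"], data["temp"]
last70 = years >= years.max() - 69          # the most recent 70 Septembers

res = stats.linregress(years[last70], temps[last70])
half_width = 1.96 * res.stderr              # approximate 95% interval
print(f"trend: {10 * res.slope:+.3f} +/- {10 * half_width:.3f} C/decade")
# If the interval straddles zero, the trend is not statistically significant.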

Getting back to the primary question of why the German media are not showing the real German temperature curve: obviously the real facts are just too inconvenient. Part of the public could even lose its faith in the much-preached climate catastrophe and end up sharply criticizing the harsh sacrifices now being made because of the climate fear instilled by policymakers.

It’s high time for the issue to be made transparent and to push back against the activism. What’s needed is a new environmental protection ethic, one which addresses all the problems.

The excessive focus on the climate question is no longer sustainable and is even counterproductive. Other, more important problems that can be solved in the short term deserve greater attention: clean water, clean air and fairly distributed, clean food would be a common ethical goal for mankind to strive for. The fear-mongering climate protection issue is a repeat of the earlier business model of sin and the sale of indulgences.

 

Media Baffled…”WHERE Have All The Cyclones Gone?”…Pacific Near “Quietest Season On Record”!

Dr. Ryan Maue here reports on Twitter that although the Atlantic hurricane season “is going gangbusters”, the Pacific is in fact seeing “one of quietest Typhoon seasons on record”.

Last month, amid the aftermath of Harvey and Irma, the public heard a long stream of hysterical media reports claiming that the tropical storms were sure signs of man-made climate change.

Yet, according to Dr. Maue, the globe has seen significantly below average cyclone activity, despite the near record hurricane activity observed in the Atlantic this season.

Chart above shows cyclone activity globally being well below normal in a year awash with media hurricane hysteria. Status: October 10, 2017. Source: http://wx.graphics/tropical/

Though the North Atlantic is running at 240% of normal, the entire northern hemisphere is near normal at 98%. Astonishing is the fact that Southern Hemisphere cyclonic activity is at a near record-breaking low of 47%.

Globally the figure is a mere 86%. This is an embarrassment and highly baffling to the media and climate alarmists, who have recently been giving false impressions of “unprecedented” storm activity this year.

“WHERE have all the cyclones gone?”

Even the Australian news site www.news.com.au here asks: “WHERE have all the cyclones gone?”

Scientists are puzzled as to how global warming is having the opposite effect on storms from what is often claimed.

Dr. Maue’s following chart shows that the overall hurricane trend has been downward over the past quarter century:

Figure: Global Hurricane Frequency (all & major) — 12-month running sums. The top time series is the number of global tropical cyclones that reached at least hurricane-force (maximum lifetime wind speed exceeds 64-knots). The bottom time series is the number of global tropical cyclones that reached major hurricane strength (96-knots+). Adapted from Maue (2011) GRL. Source: http://wx.graphics/tropical/.

Record-low Southern Hemisphere

In his last chart at the above website, Maue shows how the southern hemisphere has been trending down to a near record low.

In fact, the abstract of a recent peer-reviewed paper appearing in Geophysical Research Letters confirms the trends, writing (emphasis added):

In the pentad since 2006, Northern Hemisphere and global tropical cyclone ACE has decreased dramatically to the lowest levels since the late 1970s. Additionally, the frequency of tropical cyclones has reached a historical low. Here evidence is presented demonstrating that considerable variability in tropical cyclone ACE is associated with the evolution of the character of observed large-scale climate mechanisms including the El Nino Southern Oscillation and Pacific Decadal Oscillation.”
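For context, the ACE index quoted in that abstract has a simple definition: the square of a storm’s maximum sustained wind (in knots) is summed every six hours while the system is at tropical-storm strength or stronger, and the total is scaled by 10^-4. A minimal sketch with an invented example storm:

def ace(six_hourly_max_winds_kt):
    """Accumulated cyclone energy of one storm from 6-hourly winds (knots)."""
    return 1e-4 * sum(v * v for v in six_hourly_max_winds_kt if v >= 35)

# A short-lived hurricane peaking at 80 knots:
print(ace([35, 45, 65, 80, 70, 50, 30]))   # 2.1275; the 30-kt fix is excluded

Basin and global totals are then just the sum of this quantity over all storms in the period.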

Reckless media neglect

This is information and data that alarmist climate scientists like Dr. Michael E. Mann or media such as the AP’s Seth Borenstein apparently recklessly neglected to examine before making hysterical statements to the public.

 

New Paper: Temperature Increase From Doubling CO2 Is ‘Insignificant Compared to Natural Variability’

Engineering Prof. Questions Temperature

Record, Models, CO2 Climate Sensitivity 

Photo California Baptist University

 Pontius, 2017  

Sustainable Infrastructure:

Climate Changes and Carbon Dioxide

Temperatures Record ‘Unreliable’, ‘Arbitrarily Adjusted’, And Of ‘Poor Data Quality’

Temperature measurement stations have been installed at various locations across the globe. The number of temperature monitoring stations is decreasing and many areas across the globe do not have any temperature monitoring stations. Consequently, average surface temperature is an unreliable metric for assessing global temperature trends.

Computer models are used to analyze data sets. In science and engineering (and this paper) the term “data” refers to actual physical measurement at a point in time and space. In some temperature data sets, however, computer simulated values have been added in or data may have been arbitrarily adjusted long after the physical measurement was taken. Such practices undermine the credibility of the data set.   Computer generated values are estimates, projections, or simulations and are of a different quality than physical measurements. Physical measurements represent a physical quantity whereas computer simulations represent numerical calculation.

The HADCRU, GISTEMP, and NOAA surface temperature archives rely on the same underlying input data and therefore are not independent data sets. Limitations of the GHCN affect all data sets. Sampling discontinuities, urbanization and land use changes have decreased the quality of GHCN data over time. Differences in data processing methods between research teams do not compensate for poor underlying data quality inherent in the GHCN data. A similar situation exists with historical Sea Surface Temperature (SST) data sets which are derived primarily from the International Comprehensive Ocean-Atmosphere Data Set (ICODADS).

Climate Models ‘Unreliable For Long-Term Climate Prediction’

Computer simulations involve mathematical models implemented on a computer imitating one or more natural processes. Models are based on general theories and fundamental principles, idealizations, approximations, mathematical concepts, metaphors, analogies, facts, and empirical data (Peterson, 2006, Meehl et al., 2012). Judgments and arbitrary choices must be made in model construction to apply fundamental laws to describe turbulent fluid flow. The large size and complexity of the atmosphere prohibit the direct application of general theory.

In general, ensemble model forecasts have been found unreliable for long-term climate prediction (Green and Armstrong, 2007; Mihailović et al., 2014).

The forecasts in the [IPCC] Report were not the outcome of scientific procedures. In effect, they were the opinions of scientists transformed by mathematics and obscured by complex writing. Research on forecasting has shown that experts’ predictions are not useful in situations involving uncertainly and complexity. We have been unable to identify any scientific forecasts of global warming. Claims that the Earth will get warmer have no more credence than saying that it will get colder.”  –  Green and Armstrong, 2007.
“This analysis, set into context of the climate modeling, points out the fact that there exists set of domains where the environmental interface temperature cannot be calculated by the physics of currently designed climate models.” – Mihailović et al., 2014

Climate Sensitivity To Changing CO2 Concentrations

Global

The global atmospheric system is dynamic and is constantly in a state of change and adjustment. The sun is the primary climate change driving force.  

Using a Climate Sensitivity best estimate of 2°C, the increase in [global] temperature resulting from a doubling of atmospheric CO2 is estimated at approximately 0.009°C/yr which is insignificant compared to natural variability.

CO2 is a non-toxic trace gas constituting approximately 0.04% of the earth’s atmosphere. The global atmospheric concentration of CO2 increased from a pre-industrial value of about 280 ppmv to 379 ppmv in 2005. The average CO2 concentration at the monitoring station at Mauna Loa, Hawaii for May 2017 is 409.65 ppmv. A rising concentration of atmospheric CO2 will contribute to warming of the Earth’s atmosphere. The physics of CO2 in the atmosphere is very different than the physics of the heating effect occurring in a physical “greenhouse” for growing plants. The term “greenhouse effect” is commonly used to refer to the warming of the earth from “greenhouse” gases such as CO2 in the atmosphere. The term “greenhouse” is not used here to refer to the Earth’s warming to avoid equivocation.

Estimates of climate sensitivity differ widely suggesting that this characteristic of the climate system is not well-understood (Schwartz et al., 2014).

A simple model predicts that a doubling of the CO2 concentration in the atmosphere would result in a small increase of the Earth’s surface temperature, from approximately 0.5 to < 0.7°C (Kissin, 2015).

“[A] doubling the CO2 concentration in the Earth’s atmosphere would lead to an increase of the surface temperature by about +0.5 to 0.7 °C, hardly an effect calling for immediate drastic changes in the planet’s energy policies.” – Kissin, 2015

A best estimate of 2.0°C (Otto et al., 2013) is assumed here. If CO2 increases at the current rate of approximately 2 ppmv per year, a temperature increase of approximately 0.009°C/yr could be expected.
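The paper does not spell out the arithmetic, but the ~0.009°C/yr figure is easy to reproduce: at ~2 ppmv per year, today’s ~410 ppmv would take about 205 years to double, and spreading the assumed 2.0°C over that span gives roughly 0.01°C per year. A back-of-envelope check, using the document’s own numbers as assumptions:

import math

S = 2.0        # assumed climate sensitivity, degC per CO2 doubling
C0 = 409.65    # CO2 concentration, ppmv (Mauna Loa, May 2017)
rate = 2.0     # assumed constant growth rate, ppmv per year

print(S / (C0 / rate))   # linear estimate: ~0.0098 degC/yr, i.e. ~0.009

# The logarithmic forcing relation gives a somewhat higher current rate:
print(S * math.log((C0 + rate) / C0) / math.log(2))   # ~0.014 degC/yr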

To date the impact of CO2 is assessed universally within a global reference frame. Although atmospheric CO2 has steadily increased the average satellite global temperatures have flattened since approximately 1995.

 From such trends, it must be inferred that changes in global lower troposphere average temperature correspond to fundamental changes in the climate system beyond internal variability.

Riverside, California

The impact of future atmospheric CO2 warming on the Riverside locational reference frame must be estimated.  GCMs [climate models] could be applied to project future global temperatures and those projections could be downscaled to the Riverside area. However, such efforts would be potentially misleading because of the limitations of GCMs discussed previously.  Detailed assessments of the CO2 effect have been performed analyzing the Earth’s energy balance in the total atmosphere column and the reduction of the upward infrared radiation emission at the tropopause. The impact of CO2 on warming of the Earth is expressed in terms of “climate sensitivity,” which is the amount of warming that could be  expected as a result of doubling of the CO2 concentration.

Available temperature data from both the Riverside Fire Station No. 3 and the Riverside Municipal Airport demonstrate horizontal trends within a wide band of  variability. Historical evidence of a significant increase in surface temperatures due to increases in atmospheric CO2 is absent from these data.   [C]limate models are useful but limited in their representation of underlying physical processes.  Uncertainties and other limitations discussed previously render such models unreliable for long-term global temperatures or local climate change prediction.

Climate sensitivity may be applied to estimate the warming effect of CO2 on the locational reference frame.  Factors affecting Climate Sensitivity are not well-understood and estimates differ among researchers. Alternatively, a site-specific model could be developed to estimate the future impact of CO2 warming on a particular location. If atmospheric CO2 continues to increase at its current rate the small annual temperature increase expected at Riverside will likely be insignificant (e.g. < 0.01°C/yr) compared to natural temperature variability.

A slight increase in minimum daily temperature is noticeable at Riverside Fire Station No. 3 after 1998 (Figure 8, lower) with a corresponding slight decrease in the daily temperature range (Figure 9). This trend is most likely due to the urban heat island effect (Tam et al., 2015) resulting from increased development within and around downtown Riverside over this extended period.

Potsdam Institute’s Stefan Rahmstorf Uses Tricks To Warn Against “Trickster Skeptics”

It’s safe to say that the only people who still believe the ultra-alarmist scenarios of the Potsdam Institute for Climate Impact Research are the leftist media and green activists. Even the government funders of this institute know they aren’t really true. After all, Germany hasn’t cut CO2 emissions in close to 10 years.

Dr. Sebastian Lüning at Die Kalte Sonne exposes the latest dubious attempt by Potsdam scientist Stefan Rahmstorf to spread climate fear and to attack journalist Daniel Wetzel of flagship daily Die Welt, who not long ago dared to question the science.

Rahmstorf’s method of attack is the same every time:

  • Smear the dissenting journalist as a con-man.
  • Insist the science has long been settled (it isn’t).
  • Float out charts that use statistical trickery to mislead.

===================================

Again and again: Stefan Rahmstorf and his solar trick

By Dr. Sebastian Lüning and Prof. Fritz Vahrenholt
(Translated/edited by P Gosselin)

The Sherlock Holmes of climate science, Stefan Rahmstorf, warned of climate con-men, fraudsters and hustlers on 29 July 2017 in his blog post “Klimawandel XY Ungelöst” [Climate Change XY Unsolved] at Klimalounge:

The global CO2 increase: the facts and the tricks of con-men
The facts surrounding CO2 rise are clear, unequivocal and agreed on – yet Die Welt again and again gladly recycles old, worn-out climate skeptic myths. Are forests to blame for the CO2 rise?”

Here Stefan Rahmstorf rails against an article by Daniel Wetzel, “Kurzschluss bei der Energiewende” [The Energiewende shorts out], in Die Welt, where Wetzel dared to question Rahmstorf’s favorite project. The main focus was man’s share of the total CO2 budget, which is a rather dry issue in itself. The article also looked at the magnitude of that share and its significance; depending on toxicity, even small amounts of a substance can have an impact. The same old stuff.

But looking at his Figure 5, Rahmstorf’s seriousness really needs to be called into question. It involves his favorite chart which he regularly presents. Here it is (Fig. 2):

Figure 2.  Chart from Rahmstorf’s Blog posting “The global CO2 rise: the facts and the tricks of the con-men” dated 29 July, 2017.

Rahmstorf’s text concerning the chart follows:

Curves showing global temperature, CO2 concentration and solar activity. Temperature and CO2 are scaled so that they correspond to the expected CO2 effect on climate (e.g. the best estimate of climate sensitivity). The amplitude of the solar curve is scaled so as to correspond to the observed correlation between solar data and temperature data. (Details are explained here.) You can generate this chart here and copy a code there that allows you to install the chart as a widget on your own website (like at my home page), where it is updated with the latest data every year. (Thanks to Bernd Herd, who programmed it.)

First remark: Contrary to Rahmstorf’s claim, there is no “best estimate of climate sensitivity”. The 5th IPCC report intentionally left this value open, as no agreement could be reached among the report’s authors. Instead a very broad range of 1.5°C to 4.5°C for a doubling of CO2 was given, which ranges from the manageable to the catastrophic.

Second remark: The scaling of the solar curve was designed so as to make it impossible to detect a trend. Also the solar curve that was purposely selected is not really representative if one looks at the solar reconstructions of isotopes and cosmic rays. A more scientifically robust version of the chart would look as follows:

Figure 3: Global temperature (GISS), CO2-concentration and solar activity (Steinhilber et al. 2009).
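The scaling point can be made concrete. How a solar curve is stretched before plotting decides whether its trend is visible at all: scaling by a (weak) regression coefficient against temperature flattens the curve toward a line, while scaling both series to a common variance preserves the solar curve’s shape. A small sketch, with solar and temp as placeholder NumPy arrays:

import numpy as np

def regression_scaled(solar, temp):
    """Scale the solar anomaly by its least-squares coefficient against
    temperature; a weak correlation yields an almost flat curve."""
    s = solar - solar.mean()
    beta = np.dot(s, temp - temp.mean()) / np.dot(s, s)
    return beta * s

def variance_scaled(solar, temp):
    """Scale the solar anomaly to the temperature series' standard
    deviation; the curve keeps its visible shape and trend."""
    s = solar - solar.mean()
    return s * temp.std() / s.std()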

Rahmstorf complains about con-men and tricksters, but completely fails himself when put to the test. Is this person, who gladly speaks at Green Party campaign events, really as credible as he fancies himself to be?

 

Now It’s “Global Stilling” …Researchers Amazed: Global Wind Speeds Have SLOWED DOWN Since 1960s!

German public radio DLF here reports an astonishing finding by scientists: global wind speeds are slowing down!

A number of European research groups, as well as a science magazine of the EU Commission, have just reported on this.

Global wind speeds are slowing down, European researchers believe. Image: anemometer, NOAA public domain photo.

According to the researchers, worldwide wind speeds have slowed down by about half a kilometer per hour since the 1960s.

The phenomenon is known as “stilling”, and scientists are not sure why it is happening. They speculate that it may have something to do with urbanization, climate change and cumulus clouds. But then the report admits: “Or it could be due to ageing wind speed instruments producing inaccurate results.”

Normally this should come across as being good news amid the claims that “global warming” is leading to more powerful and destructive storms. With slower wind speeds, one would naturally assume less storm destruction.

But instead the researchers see only dark clouds ahead and warn that this could have “terrible consequences for things like agriculture” and that weak winds “also mean that smog over cities will stick around longer”. It adds:

And while it may sound deceptively calm, it could be a vital, missing piece of the climate change puzzle and a serious threat to our societies.”

We are damned no matter what happens.

Send more funding

Naturally there’s the thinly veiled call for MORE MONEY as University of Gothenburg climatologist Dr. Cesar Azorin-Molina “believes there is an urgent need to determine the causes of stilling in a changing climate”.

Ironically, another problem the report hints at is that wind farms may also see less output as a result. Now aren’t wind farms supposed to make the weather tamer in the first place? This is like blaming harsher winters for obstructing the fight against warming because people have to emit more CO2 to keep warm. Maybe wind farms are themselves a factor in slowing down winds, as they extract energy from the wind.

Not surprisingly, the results of the researchers are getting only tiny blurbs of reporting in the media. For years people have been brainwashed into thinking man-made global warming is leading to stronger winds. But now they are supposed to believe the opposite is happening? The public must never hear that winds are calming down.

All the contradictions climate science has put out are starting to catch up and cast the field’s credibility into serious question.

The junk science never ceases to amaze us.

 

Storms Expose Just How Huge Model Uncertainty Can Be…Even With 6-Hour Forecasts!

Last Wednesday evening, Florida State University graduate student Levi Cowan showed at his Tropical Tidbits site his analysis of what was later to develop into tropical storm Nate in the Gulf of Mexico.

His analysis exposed the great differences – thus huge uncertainty – between the US GFS and the European ECMWF models for the early projected tracks of Nate.

Levi shows the two different model projections below:

In the above figure, NOAA’s GFS model run takes the storm track over Louisiana (left) while the European model showed landfall occurring some 400 miles away to the east at the start of the Florida panhandle (right). If that doesn’t illustrate the huge uncertainty within models, then what does? Source: Image cropped from Tropical Tidbits.

Keep in mind that these two projections for Nate coming from two different models are for just 4 days out, despite being generated by supercomputers fed with reams of data.

In Levi’s latest analysis here, he shows that the latest storm track now favors the GFS and that the European model had been wrong.
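Such track spreads are easy to quantify: the great-circle distance between two model landfall points follows from the haversine formula. The coordinates below are invented Gulf coast placeholders, not the actual GFS/ECMWF positions for Nate:

import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))

# Placeholder landfall points: SE Louisiana vs. the western Florida panhandle
print(haversine_km(29.9, -90.1, 30.4, -86.6))   # roughly 340 km apart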

Great uncertainty even with forecasts just hours ahead

Meanwhile, on Thursday storm Xavier passed through northern Germany. Mid-morning on Thursday the German DWD national weather service tweeted here that the forecast track of the storm over northern Germany over the coming 6 hours was still uncertain!

In the chart above, the low pressure system Xavier, designated by “T”, was centered over the North Sea just west of Hamburg and moving eastward. The DWD chart shows the uncertainty of the storm’s projected path, with a range of some 150 km upon reaching the Polish border just hours later. Models cannot even pin down a storm’s location 6 hours in advance!

Too much emphasis on models?

Surely meteorologists will be the first to admit that the complexity of storm systems is still far too great to allow predictions of any reliable certainty.

In a recent daily summary at Weatherbell Analytics, veteran meteorologist Joe Bastardi even cautioned against relying too heavily on models, which he said sometimes flip-flop between scenarios in just 6 hours. Bastardi feels there has to be greater emphasis on patterns observed over the decades. Already two weeks earlier, on September 22, Bastardi had warned of a Gulf storm developing between October 1 and October 10. He was right. His forecast was based on patterns, and was made long before the models sniffed out the storm. Models indeed still have a very long way to go.

If a 6-hour projection is uncertain, then projections out 20 years are worthless

This gives us an idea of what to expect of climate models going out 20, 50 or even 100 years in the future, which woefully lack long-term historical data from major climate drivers such as the oceans, continents, sun and atmosphere. Little wonder that IPCC climate model temperature projections made 10 years ago are already wrong.

But if you happen to be someone who is still sold on the projections made by these climate models, then I’ve got a great deal on a bridge in Brooklyn for you.

Scientists: Expansion Of Wind Turbines ‘Likely To Lead To Extinction’ For Endangered Vulture Species

Wind Energy Expansion: Endangering Wildlife

Photo from Ferrão da Costa et al., 2017

When pondering the future of wind power and its ecological impacts, it is well worth re-considering this seminal analysis from Dr. Matt Ridley.


[W]orld energy demand has been growing at about 2 per cent a year for nearly 40 years. Between 2013 and 2014, […] it grew by just under 2,000 terawatt-hours.

If wind turbines were to supply all of that growth but no more, how many would need to be built each year? The answer is nearly 350,000, since a two-megawatt turbine can produce about 0.005 terawatt-hours per annum. That’s one-and-a-half times as many as have been built in the world since governments started pouring consumer funds into this so-called industry in the early 2000s.

At a density of, very roughly, 50 acres per megawatt, typical for wind farms, that many turbines would require a land area half the size of the British Isles, including Ireland. Every year.

If we kept this up for 50 years, we would have covered every square mile of a land area half the size of Russia with wind farms. Remember, this would be just to fulfill the new demand for energy, not to displace the vast existing supply of energy from fossil fuels, which currently supply 80 per cent of global energy needs.  
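Ridley’s arithmetic is straightforward to reproduce. Taking ~2,000 TWh of new demand per year and ~0.005 TWh per year from one 2 MW turbine as the inputs (his “nearly 350,000” implies a slightly lower demand figure than the rounded 2,000):

growth_twh = 2000.0        # assumed annual growth in world demand, TWh
per_turbine_twh = 0.005    # assumed annual output of one 2 MW turbine, TWh

print(growth_twh / per_turbine_twh)   # 400,000 turbines per year

# Sanity check of the 0.005 TWh figure: the implied capacity factor of a
# 2 MW machine over the 8760 hours in a year.
print(0.005e12 / (2e6 * 8760))        # ~0.285, a plausible ~29%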


The profound costs to wildlife of planning to expand wind energy to the levels demanded by “green” advocates — just to meet the world population’s additional energy demands with 350,000 more turbines each year — have been increasingly documented by scientists.

The last remaining vulture species native to southeastern Europe is “likely” faced with extinction in the next few decades due to an “eight to ten times greater” mortality rate associated with the rapid expansion of wind energy projects in the region (Vasilakis et al., 2017).

Bat species can be found dwelling in a wide variety of terrestrial habitats, including deserts and sea coasts. Each species may play a fundamental role in its local ecosystem. For example, Kuntz et al. (2011) indicate that 528 different plant species rely on bat pollination and seed dispersal for sustainability. Boyles et al. (2011) estimated that by controlling pest populations (insects), the agricultural benefits of bats may reach $22.9 billion (USD) annually in the continental U.S. alone.

In addition to White Nose Syndrome, deaths connected to collisions with wind turbines are now the leading cause of multiple mortality events in bats (O’Shea et al., 2016). Roughly 25% of North American bats are now classified as at risk of extinction (Hammerson et al., 2017), in large part due to the explosion of wind turbines across the landscape. If the expansion of wind turbines continues at its current pace, the hoary bat population is projected to be reduced by 90% within the next 50 years (Frick et al., 2017). As Hein and Schirmacher (2016) conclude, the “current and presumed future level of fatality [for bat populations] is considered to be unsustainable.”

Even large mammals like the already endangered Portuguese wolf (“between 200 and 400 individuals” left) have had their reproduction rates reduced by the recent addition of nearly 1,000 new turbines in their shrinking habitat range (Ferrão da Costa et al., 2017).

So what, exactly, are we gaining in exchange for increasingly endangering critically important wildlife species?  Slightly above nothing.

According to the IEA, wind energy provided 0.39% of the world’s total energy demand as of 2013.

At what point may we ask: Are the benefits of wind energy worth the ecological and wildlife costs?


Vasilakis et al., 2017

Numerous wind farms are planned in a region hosting the only cinereous vulture population in south-eastern Europe. We combined range use modelling and a Collision Risk Model (CRM) to predict the cumulative collision mortality for cinereous vulture under all operating and proposed wind farms. Four different vulture avoidance rates were considered in the CRM.  Cumulative collision mortality was expected to be eight to ten times greater in the future (proposed and operating wind farms) than currently (operating wind farms), equivalent to 44% of the current population (103 individuals) if all proposals are authorized (2744 MW). Even under the most optimistic scenario whereby authorized proposals will not collectively exceed the national target for wind harnessing in the study area (960 MW), cumulative collision mortality would still be high (17% of current population) and likely lead to population extinction.
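The skeleton of a collision risk model of this kind is simple even though its inputs are hard to estimate: expected mortality is the number of rotor-swept-zone transits times the collision probability per transit, discounted by an avoidance rate. The numbers below are invented placeholders, not the paper’s parameters; they only illustrate why the choice among the four avoidance rates matters so much:

def expected_fatalities(transits_per_year, p_collision, avoidance):
    """Expected annual collisions for one species at one wind farm."""
    return transits_per_year * p_collision * (1.0 - avoidance)

for avoidance in (0.95, 0.98, 0.99, 0.995):
    print(avoidance, expected_fatalities(5000, 0.1, avoidance))
# Halving the assumed non-avoidance halves the predicted mortality.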


Hammerson et al., 2017

Conservationists are increasingly concerned about North American bats due to the arrival and spread of the White-nose Syndrome (WNS) disease and mortality associated with wind turbine strikes. To place these novel threats in context for a group of mammals that provides important ecosystem services, we performed the first comprehensive conservation status assessment focusing exclusively on the 45 species occurring in North America north of Mexico. Although most North American bats have large range sizes and large populations, as of 2015, 18–31% of the [North American bats] species were at risk (categorized as having vulnerable, imperiled, or critically imperiled NatureServe conservation statuses) and therefore among the most imperiled terrestrial vertebrates on the continent.


Frick et al., 2017

Large numbers of migratory bats are killed every year at wind energy facilities. However, population-level impacts are unknown as we lack basic demographic information about these species. We investigated whether fatalities at wind turbines could impact population viability of migratory bats, focusing on the hoary bat (Lasiurus cinereus), the species most frequently killed by turbines in North America. Using expert elicitation and population projection models, we show that mortality from wind turbines may drastically reduce population size and increase the risk of extinction. For example, the hoary bat population could decline by as much as 90% in the next 50 years if the initial population size is near 2.5 million bats and annual population growth rate is similar to rates estimated for other bat species (λ = 1.01). Our results suggest that wind energy development may pose a substantial threat to migratory bats in North America. If viable populations are to be sustained, conservation measures to reduce mortality from turbine collisions likely need to be initiated soon. Our findings inform policy decisions regarding preventing or mitigating impacts of energy infrastructure development on wildlife.
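A sketch of the kind of projection behind the “90% in 50 years” figure: geometric growth at λ = 1.01 minus a fixed annual turbine kill. The starting population (~2.5 million) comes from the abstract, but the kill rate below is a placeholder chosen to illustrate the shape of the decline, not the paper’s expert-elicited estimate:

def project(n0, lam, annual_kill, years):
    """Project a population with growth factor lam and a fixed annual kill."""
    n = float(n0)
    for _ in range(years):
        n = max(n * lam - annual_kill, 0.0)
    return n

n0 = 2_500_000
remaining = project(n0, lam=1.01, annual_kill=60_000, years=50)
print(1.0 - remaining / n0)   # ~0.90, i.e. a ~90% decline in 50 years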


Hein and Schirmacher, 2016

Two recent attempts were made to estimate bat fatality in the United States for 2012. Hayes (2013) followed a similar approach to Cryan (2011) and based his analysis primarily on the limited dataset from Arnet et al. (2008). Hayes (2013) indicated that >600,000 bats were killed at wind energy facilities in 2012 and suggested that this was a conservative estimate. Smallwood (2013) estimated up to 888,000 bats were killed in the United States in 2012. … We suggest that each of these be considered an order of magnitude estimate; taken together, they highlight the almost certain large number of bats being killed (i.e., on the order of hundreds of thousands per year) in the United States and Canada. Given that bats have a low reproductive rate—typically only having 1 or 2 pups/year—and require high adult survivorship to avoid population declines (Barclay and Harder 2003), this level of impact presumably puts bat populations at risk. Moreover, many species were thought to be declining prior to the onset and expansion of wind energy development, including species impacted by white-nose syndrome (Winhold et al. 2008, Frick et al. 2010). Although population data are sparse or lacking for many bat species, current and presumed future level of fatality is considered to be unsustainable, and actions to reduce impact of wind turbines on bats should be implemented immediately.


Ferrão da Costa et al., 2017

Over the last 15 years, more than 900 wind turbines were built inside the range of the Portuguese wolf. Due to the endangered status of this large carnivore in Portugal, several monitoring plans were conducted, resulting in a reasonable amount of information being collected on the effects of wind farms on wolves. We reviewed the methodological approaches, compiled major findings and summarised the mitigation/compensation measures used in Portuguese wind farms. The overall outcomes show increasing human disturbance in wind farm areas, resulting in lower wolf reproduction rates during construction and the first years of operation, as well as shifts in denning site locations of more than 2.5 km away from the wind farm. … According to a review by Lovich and Ennen (2013), the construction and operation of wind farms have both potential and known impacts on terrestrial vertebrates, such as: (i) increase in direct mortality due to traffic collisions; (ii) destruction and modification of the habitat, including road development, habitat fragmentation and barriers to gene flow; (iii) noise effects, visual impacts, vibration and shadow flicker effects from turbines; (iv) electromagnetic field generation; (v) macro and microclimate change; (vi) predator attraction; and (vii) increase in fire risks.

Green Energy Debacle: Multi-Million-Euro Geothermal Power Plant Shuts Down After 8 Years Of Endless Troubles

Like so many (highly subsidized) green energy projects in Germany, the Kalina geothermal power plant in Unterhaching was put into operation with great fanfare some 8 years ago, in 2009.

Look at all the great things we’re doing, high-ranking politicians seemed to say as they cut the opening ribbon.

Today the online Merkur.de reports that the “prestige project of Germany” has been out of operation since summer, and “possibly forever”. According to the Merkur, the plant has produced as much trouble as it has energy. Efforts to rescue the project have failed.

To produce power in the low temperature range between 90 and 200°C, a complex and special power plant process is used. In Unterhaching, just south of Munich, Kalina geothermal power plant technology was used for the first time. Recently it was reported that the project may be permanently shut down due to costs and technical problems. Chart: www.geothermie-unterhaching.de

According to the company’s promotional video here, the plant was designed to produce 3.4 megawatts of electric power and 38 megawatts of district heating “for thousands of households“…”emissions-free, sustainable and renewable“.

Unterhaching geothermal plant: Image cropped here.

The partner for the district heating part of the Unterhaching plant, the community of Grünwald, has already invested 23.5 million euros in the cooperation with Unterhaching over the past five years. Credits were also extended by Grünwald and Unterhaching in an attempt to save the project from insolvency, but the geothermal plant has since turned into “a bottomless pit”, and members of the town councils spoke of “an immense burden” and a “failed project” due to the losses from the electricity generation part of the plant.

According to the Merkur, the technical problems are not related to the Kalina technology itself, but rather due to material used for the heat exchangers of the power plant.

When the plant was switched on for the first time, just after its opening by then German Environment Minister Sigmar Gabriel, it began to stink of ammonia within half an hour. The heat exchanger leaked, and rubber seals did not help; it had to be welded. Other technical problems plagued the plant as well.

The Merkur points out that the district heating part of the plant, which is the main part, “functions excellently“, adding: “More than 50% of the households in the community were provided with a hook-up to the geothermal plant in 2015.”

Yet, for the tens of millions of euros invested, that may be a very tiny consolation. Alternative energy seems to be burning cash rather than generating power.

 

Leading German Economics Professor Calls Germany’s Energiewende An Energy Policy Calamity

In a recently released video interview by journalist Jörg Rehmann, University of Magdeburg economics professor Joachim Weimann explains why renewable energies have been a terrible idea for Germany so far.

Recently a high ranking expert commission set up by the German government even sharply criticized the German Energiewende (transition to renewable energies), saying it was leading the country down the wrong path. But as Prof. Weimann explains, the commission’s results fell on deaf ears.

Weimann starts the interview by explaining that the target of the Energiewende is to replace carbon-dioxide-emitting fossil fuels in order to protect our climate. One instrument used to achieve that target was Cap and Trade, in combination with the Energiewende, which Weimann says has not worked well at all. The University of Magdeburg professor says that every cut achieved in Germany gets offset elsewhere, and so no net CO2 is saved at all.

Weimann says that over the years policymakers promised and obstinately insisted that renewables were the way to go, and so ended up putting themselves in a position from which it is now impossible to back out. What leading politician is going to step forward and tell us that it was all a big mistake? “We find ourselves in quite a bind,” says Weimann.

Weimann recommends that citizens step up and tell their leaders that what is currently happening is not in their interest, and that they need to exert influence on media reporting of the issue. Weimann says:

It is very, very difficult. Currently we have over 1000 citizens’ initiatives against wind power in Germany, yet they go practically unmentioned in media reporting. Compared to the resistance to nuclear energy, it is a crass disproportion. This shows us just how difficult it is to bring the issue to the forefront.”

Weimann hopes that the protests will grow until they reach a critical mass and can no longer be ignored.

The professor points out that for years a number of institutions and experts have shown that the feed-in act is not functioning properly, that it wastes resources, and is bad policy that is having no impact on climate protection. He adds that the feed-in act entails extremely high costs, not only in terms of capital but also in terms of damage to the country’s landscape. “That means we are producing costs, and no yields. That is not good policy,” says Weimann.

Policymakers, in Weimann’s view, have long been ignoring what the scientific data and the experts have been telling us about renewable energies; they refuse to back out because they are in so deep that doing so would be too embarrassing.

Public kept in the dark by media, policymakers

According to Weimann, 80% of the German population is still in favor of renewable energies because people are poorly informed and unaware of the near-zero impact the policy is having on CO2 emissions. In fact, it is only when a wind park gets proposed nearby that citizens really begin to take an interest in what is at stake and find out what the true implications are. “Then they suddenly recognize the nonsense that is in fact happening.”

In Weimann’s view, renewable energy topics and calculations are far too complicated for the average citizen to deal with when they don’t feel they have to.

Total destruction of our landscape

Weimann notes that according to the Ministry of Environment, wind and solar energy in 2016 made up only 3.3% of Germany’s primary energy supply and that so far it represents only a “thimble” of the energy that is needed. And “when you compare it to the cost needed for it, not only financial, but also in terms of the burdens to the citizens who have these energy systems next door, we have to say it is first totally disproportional, and secondly that if we wish to meet our targets using wind, it would mean the total destruction of our landscape.”

So far, only 3.3% of our primary energy needs are supplied by wind (28,000 turbines to date) and solar. Weimann asks us to imagine what it would take to reach the 95% target. He says the entire German landscape would be profoundly and fundamentally transformed into one massive industrial park that would lose all its attraction. In short: it’s a policy calamity.

Those were just some of Weimann’s comments and claims in just the first 17 minutes of the interview. More on this soon.

 

28 New Papers: Solar, Ocean Cycles Modulate Rainfall Trends

A Human Influence On Precipitation

‘Has Yet To Be Detected’

“Climate model output suggests decreasing rainfall as a consequence of anthropogenic greenhouse gas radiative forcing.”

“[I]f anthropogenic forcing has impacted the [regional rainfall pattern], the signal has yet to be detected above the level of natural climate variability.” – Lachniet et al., 2017


According to climate models, precipitation trends were supposed to have intensified as a consequence of human activity.

And yet, after compiling decades of observational and proxy (paleoclimate) evidence, it has been determined that there has been no detectable global-scale human influence on rainfall patterns over the last hundred years (or even the last several hundred). Instead, variability in the hydrological cycle can be strongly linked to non-anthropogenic forcing mechanisms, namely solar activity and natural oceanic/atmospheric oscillations (NAO, PDO, AMO, ENSO).


Miralles et al., 2013

The hydrological cycle is expected to intensify in response to global warming. Yet, little unequivocal evidence of such an acceleration has been found on a global scale. This holds in particular for terrestrial evaporation, the crucial return flow of water from land to atmosphere. Here we use satellite observations to reveal that continental evaporation has increased in northern latitudes, at rates consistent with expectations derived from temperature trends. However, at the global scale, the dynamics of the El Niño/Southern Oscillation (ENSO) have dominated the multi-decadal variability. 

Modern Precipitation Trends Similar To Past Centuries


Verdon-Kidd et al., 2017

Overall, the inter-annual and inter-decadal variability of rainfall and runoff observed in the modern record (Coefficient of Variation (CV) of 22% for rainfall, 42% for runoff) is similar to the variability experienced over the last 500 years (CV of 21% for rainfall and 36% for runoff). However, the modern period is wetter on average than the pre-instrumental (13% higher for rainfall and 23% higher for runoff). Figure 9 also shows that the reconstructions contain a number of individual years (both wet and dry) of greater magnitude than what has been recorded in the instrumental record.
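The coefficient of variation used here is simply the standard deviation expressed as a percentage of the mean, a scale-free way to compare the variability of the modern record with the 500-year reconstruction. A short helper, with the input series as a placeholder:

import numpy as np

def cv_percent(series):
    """Coefficient of variation: standard deviation as a percent of the mean."""
    x = np.asarray(series, dtype=float)
    return 100.0 * x.std() / x.mean()

Per the quoted values, applied to the modern rainfall record it would return roughly 22, versus roughly 21 for the reconstructed series.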


Kostyakova et al., 2017

A nested July–June precipitation reconstruction for the period AD 1777–2012 was developed from multi-century tree-ring records of Pinus sylvestris L. (Scots pine) for the Republic of Khakassia in Siberia, Russia. … The longest reconstructed dry period, defined as consecutive years with less than 25th percentile of observed July–June precipitation, was 3 years (1861–1863). There was no significant difference in the number [of] dry and wet periods during the 236 years of the reconstructed precipitation.


Shi et al., 2017

Five of the six coupled ocean-atmosphere climate models of the Paleoclimate Modeling Intercomparison Project Phase III (PMIP3), can reproduce the south-north dipole mode of precipitation in eastern China, and its likely link with ENSO. However, there is mismatch in terms of their time development. This is consistent with an important role of the internal variability in the precipitation field changes over the past 500 years.


Conroy et al., 2017

20th century precipitation variability in southern Tibet falls within the range of natural variability in the last 4100 yr, and does not show a clear trend of increasing precipitation as projected by models. Instead, it appears that poorly understood multidecadal to centennial internal modes of monsoon variability remained influential throughout the last 4100 yr. … Until we have a predictive understanding of multidecade to multi-century variability in the Asian monsoon system, it would be wise to consider the risk of prolonged periods of anomalously dry and wet monsoon conditions to be substantial (Ault et al., 2014). Such variability may also explain why the predicted anthropogenic increase in Asian monsoon precipitation is not widely observed.

Clarke et al., 2017

Corresponding ~4-8 year periodicities identified from Wavelet analysis of particle size data from Pescadero Marsh in Central Coast California and rainfall data from San Francisco reflect established ENSO periodicity, as further evidenced in the Multivariate ENSO Index (MEI), and thus confirms an important ENSO control on both precipitation and barrier regime variability.


McCabe et al., 2017

In this study, a monthly water-balance model is used to simulate monthly runoff for 2109 hydrologic units (HUs) in the conterminous United States (CONUS) for water-years 1901 through 2014. … Results indicated that … the variability of precipitation appears to have been the principal climatic factor determining drought, and for most of the CONUS [conterminous US], drought frequency appears to have decreased during the 1901 through 2014 period.

Lachniet et al., 2017

[M]onsoon dynamics appear to be linked to low-frequency variability in the ENSO and NAO, suggesting that ocean-atmosphere processes in the tropical oceans drive rainfall in Mesoamerica. … Climate model output suggests decreasing rainfall as a consequence of anthropogenic greenhouse gas radiative forcing (Rauscher et al., 2008; Saenz-Romero et al., 2010). Our data show, however, that the response of the monsoon will be strongly modulated by the changes in ENSO and the NAO mean states … Our data also show that the magnitude of Mesoamerican monsoon variability over the modern era when the anthropogenic radiative forcing has dominated over solar and volcanic forcings (Schmidt et al., 2012) is within the natural bounds of rainfall variations over the past 2250 years. This observation suggests that if anthropogenic forcing has impacted the Mesoamerican monsoon, the signal has yet to be detected above the level of natural climate variability, and the monsoon response to direct radiative forcing and indirect ocean-atmosphere forcings may yet to be fully realized.

Past, Modern Precipitation Patterns Modulated By Solar Forcing


Lei et al., 2017

The precipitation variability on decadal to multi-centurial generally always reflects changes in solar activity and large-scale circulation, e.g., the ENSO and the EASM [East Asian Summer Monsoon] (Chen et al., 2011; Vleeschouwer et al., 2012; Feng et al., 2014). [D]uring the MWP [Medieval Warm Period], the wetter climate in this region was consistent with more frequent ENSO events, stronger EASM and higher solar activity, whereas the opposite was found for the LIA. In particular, d13Cac fluctuations on multi-decadal to centennial scales is consistent with the changes in solar activity, with fewer dry intervals corresponding to periods of minimum solar activity within dating errors, which are referred to as the Oort Minimum (AD 1010-1050), Wolf Minimum (AD 1280-1340), Sporer Minimum (AD 1420-1530), Maunder Minimum (AD 1645-1715) and Dalton Minimum (AD 1795-1820).


Warrier et al., 2017

Climatic periodicities recorded in lake sediment magnetic susceptibility data: Further evidence for solar forcing on Indian summer monsoon … The results obtained from this study show that solar variations are the main controlling factor of the southwest monsoon.

Zhang et al., 2017

The frequencies represent the influence of the Pacific Decadal Oscillation (PDO) and solar activity on the precipitation from the southwestern United States. In addition, solar activity has exerted a greater effect than PDO on the precipitation in the southwestern United States over the past 120 years. By comparing the trend of droughts with the two fundamental frequencies, we find that both the droughts in the 1900s and in the 21st century were affected by the PDO and solar activity, whereas the droughts from the 1950s to the 1970s were mainly affected by solar activity.

Munz et al., 2017

Decadal resolution record of Oman upwelling indicates solar forcing of the Indian summer monsoon (9–6 ka) … We use geochemical parameters, transfer functions of planktic foraminiferal assemblages and Mg/Ca palaeothermometry, and find evidence corroborating previous studies showing that upwelling intensity varies significantly in coherence with solar sunspot cycles. The dominant ∼80–90-year Gleissberg cycle apparently also affected bottom-water oxygen conditions.

Zhai, 2017

The time series of sunspot number and the precipitation in the north-central China (108° ∼ 115° E, 33° ∼ 41° N) over the past 500 years (1470–2002) are investigated, through periodicity analysis, cross wavelet transform and ensemble empirical mode decomposition analysis. The results are as follows: the solar activity periods are determined in the precipitation time series of weak statistical significance, but are found in decomposed components of the series with statistically significance; the Quasi Biennial Oscillation (QBO) is determined to significantly exist in the time series, and its action on precipitation is opposite to the solar activity; the sun is inferred to act on precipitation in two ways, with one lagging the other by half of the solar activity period.

Sun et al., 2017

[A]t least six centennial droughts occurred at about 7300, 6300, 5500, 3400, 2500 and 500 cal yr BP. Our findings are generally consistent with other records from the ISM [Indian Summer Monsoon]  region, and suggest that the monsoon intensity is primarily controlled by solar irradiance on a centennial time scale.

Zhu et al., 2017

Abrupt enhancements in the flux of pedogenic magnetite in the stalagmite agree well with the timing of known regional paleofloods and with equatorial El Niño−Southern Oscillation (ENSO) patterns, documenting the occurrence of ENSO-related storms in the Holocene. Spectral power analyses reveal that the storms occur on a significant 500-y cycle, coincident with periodic solar activity and ENSO variance, showing that reinforced (subdued) storms in central China correspond to reduced (increased) solar activity and amplified (damped) ENSO. Thus, the magnetic minerals in speleothem HS4 preserve a record of the cyclic storms controlled by the coupled atmosphere−oceanic circulation driven by solar activity.

Zielhofer et al., 2017

Western Mediterranean Holocene record of abrupt hydro-climatic changes: Imprints of North Atlantic meltwater discharges, NAO and solar forcing … Early Holocene winter rain minima are in phase with cooling events and millennial-scale meltwater discharges in the sub-polar North Atlantic. … [A] significant hydro-climatic shift at the end of the African Humid Period (∼5 ka) indicates a change in climate forcing mechanisms. The Late Holocene climate variability in the Middle Atlas features a multi-centennial-scale NAO-type pattern, with Atlantic cooling and Western Mediterranean winter rain maxima generally associated with solar minima.

Matveev et al., 2017

An increase in atmospheric moisture for the warm period of the year (May–September) since 1890s, and mean annual temperatures since the 1950s was identified. During the same time period, there was a marked increase in amplitude of the annual variations for temperature and precipitation. … These fluctuations [atmospheric moisture, mean annual temperatures] are consistent with 10–12-years Schwabe–Wolf, 22-years Hale, and the 32–36-years Bruckner Solar Cycles. There was an additional relationship found between high-frequency (short-period) climate fluctuations, lasting for about three years, and 70–90-years fluctuations of the moisture regime in the study region corresponding to longer cycles.

Luthardt and Rößler, 2017

The 11 yr solar cycle, also known as Schwabe cycle, represents the smallest-scaled solar cyclicity and is traced back to sunspot activity (Douglass, 1928; Lean, 2000), which has a measurable effect on the Earth’s climate, as indicated by the Maunder minimum (Usoskin et al., 2015). Global climate feedback reactions to solar irradiance variations caused by sunspots are complex and hypothesized to be triggered by (1) variation in total energy input (Cubasch and Voss, 2000), (2) the influence of ultraviolet light intensity variation on composition of the stratosphere (Lean and Rind, 2001), (3) the effect of cosmic rays on cloud formation (Marsh and Svensmark, 2000; Sun and Bradley, 2002), and/or (4) the effect of high-energy particles on the strato- and mesosphere (Jackman et al., 2005). …  [L]ike today, sunspot activity caused fluctuations of cosmic radiation input to the atmosphere, affecting cloud formation and annual rates of precipitation.

Park, 2017

[S]olar activity drove Holocene variations in both East Asian Monsoon (EAM) and El Niño Southern Oscillation (ENSO).

Shi et al., 2017

Our results imply that the synchronous change in the Asian–Australian monsoon may be caused by inherent solar variations, further strengthening previous findings.

Past, Modern Precipitation Patterns Modulated By AMO/PDO/NAO/ENSO

Macdonald and Sangster, 2017

Statistically significant relationships between the British flood index, the Atlantic Meridional Oscillation and the North Atlantic Oscillation Index are identified. The use of historical records identifies that the largest floods often transcend single catchments affecting regions and that the current flood-rich period is not unprecedented. … Solar forcing can manifest itself in a variety of different ways on flood patterns through modification of the climate (Benito et al., 2004). Several series indicated increased flood frequency during the late eighteenth century corresponding to the Dalton Minimum (AD 1790–1830), with notable flooding across catchments in the 8-year period AD 1769–1779, which was a climatic period considered to include the sharpest phases of temperature variability during the “Little Ice Age” (Lamb, 1995; Wanner et al., 2008).

Malik et al., 2017

[W]e investigate the impact of internal climate variability and external climate forcings on ISMR on decadal to multi-decadal timescales over the past 400 years. The results show that AMO, PDO, and Total Solar Irradiance (TSI) play a considerable role in controlling the wet and dry decades of ISMR [Indian summer monsoon rainfall]. Resembling observational findings, most of the dry decades of ISMR occur during a negative phase of AMO and a simultaneous positive phase of PDO.

Valdés-Pineda et al., 2017

This study analyzes these low-frequency patterns of precipitation in Chile (>30 years), and their relationship to global Sea Surface Temperatures (SSTs), with special focus on associations with the Pacific Decadal Oscillation (PDO) and the Atlantic Multi-decadal Oscillation (AMO) indices. … We conclude that a significant multi-decadal precipitation cycle between 40 and 60 years is evident at the rain gauges located in the subtropical and extratropical regions of Chile. This low-frequency variability seems to be largely linked to PDO and AMO modulation.

Riechelmann et al., 2017

We document that long-term patterns in temperature and precipitation are recorded in dripwater patterns of Bunker Cave and that these are linked to the North Atlantic Oscillation (NAO).

Lapointe et al., 2017

This paper investigates an annually-laminated (varved) record from the western Canadian Arctic and finds that the varves are negatively correlated with both the instrumental Pacific Decadal Oscillation (PDO) during the past century and also with reconstructed PDO over the past 700 years, suggesting drier Arctic conditions during high-PDO phases, and vice versa. These results are in agreement with known regional teleconnections, whereby the PDO is negatively and positively correlated with summer precipitation and mean sea level pressure respectively.
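
The core statistic here is a simple (negative) correlation between varve thickness and the PDO index. A minimal sketch with made-up stand-in series, just to show the shape of the test:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical stand-ins: an annual PDO index and varve thicknesses that
# thin (drier conditions) when the PDO is high, as Lapointe et al. report.
rng = np.random.default_rng(2)
pdo = rng.normal(0, 1, 100)
varve = -0.5 * pdo + rng.normal(0, 1, 100)

r, p = pearsonr(pdo, varve)
print(f"r = {r:+.2f}, p = {p:.3g}")   # expect a significantly negative r
```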

Lim et al., 2017

Our study demonstrated that flood frequency and climate changes at centennial-to-millennial time scales in South Korea have been coupled mainly with ENSO activity, suggesting that the hydrologic changes, including flooding and drought, in East Asia are coupled to the centennial-to-millennial-scale atmospheric-oceanic circulation changes represented by the ENSO pattern.

Reynolds et al., 2017

Evidence derived from instrumental observations suggests that Atlantic variability, associated with changes in SSTs and fluctuations in the strength of the Atlantic Meridional Overturning Circulation (AMOC), is directly linked with broader scale climate variability, including Brazilian and Sahel precipitation (Folland et al., 1986 and Folland et al., 2001), Atlantic hurricanes and storm tracks (Goldenberg et al., 2001 and Emanuel, 2005), and North American and European temperatures (Sutton and Hodson, 2005, Knight et al., 2006 and Mann et al., 2009).

Park et al., 2017

According to our results, the central Mexican climate has been predominantly controlled by the combined influence of the 20-year Pacific Decadal Oscillation (PDO) and the 70-year Atlantic Multidecadal Oscillation (AMO).

Bianchette et al., 2017

Seven periods of increased water level, varying in duration, occurred during the backbarrier period, with El Niño-Southern Oscillation (ENSO) likely the main climatic mechanism causing these periodic shifts in the paleo-precipitation levels. We suggest that the deepest water levels detected over the last ~3200 years correlate with periods of increased ENSO activity.