German Scientists Back Findings By Gebbie et al 2019, Believe Climate CO2 Sensitivity Even “Likely To Be Lower” Than 1.3°C


At the Die kalte Sonne site here, Prof. Fritz Vahrenholt and Frank Bosse published an analysis of Gebbie et al. 2019. What follows is the English translation.

Climate surprises

A paper from the USA, published in Science in January 2019 and very much worth reading (Geoffrey Gebbie of the Woods Hole Oceanographic Institution and Peter Huybers of Harvard University, hereinafter GH19), is titled “The Little Ice Age and 20th-century deep Pacific cooling”.

It shows fascinating science.

The authors evaluated temperature measurements made in the deep sea by the famous expedition of HMS Challenger in the 1870s. The ship sailed the Atlantic and Pacific, and probably provided the first data on the oceans at depths of over 2,000 meters. The recalibration of the old data alone is a work of art! What the paper found: the deep Pacific has cooled from the 1870s to today; the Atlantic has not.

With a model of the global waters down to such depths, the authors got to the bottom of the cause and concluded: the circulation of the deep sea means that the deep Pacific today is still influenced by the Medieval Warm Period (MWP, about 950-1250 AD) and the transition to the Little Ice Age (about 1500-1800 AD).

The water warmed 1,000 years ago needs that long to reach depths of 3,000 meters in the Pacific! This implies two things: the MWP was a large-scale global event, as we have also demonstrated in the MWP project (it is not represented this way in climate models and so is an “unknown factor” to the IPCC), and it is still at work today. The GH19 paper is evidence of this.

The temperature development in the Pacific from the paper:

Fig. 3: Temperature development in the Pacific down to depths of 5,500 meters. The MWP warmed the surface water until about 1300 AD, and the subsequent cooling of the “Little Ice Age” (LIA) later had a cooling effect on the Pacific Ocean. After 1750 the ocean was still “warmed up”, and the cooling continues even today. Source: Supplement Fig. S5 of GH19.

Today’s climate still impacted by Medieval Warm Period

This is a wonderful example showing that our climate was NOT in equilibrium around 1750, as all climate models assume. We are still feeling the effects of the MWP today.

If we now assume an equilibrium in the ocean around 1750 for the determination of the sensitivity “ECS”, i.e. the long-term (several centuries) temperature increase for a doubling of the CO2 concentration, and if we attribute the warming up to the present day to anthropogenic impulses, then we neglect that there was still residual heat in the system in the 1750s (completely unknown to models, as already mentioned).

CO2 effect on ocean warming overstated

The growth in the total heat content of the oceans up to today is therefore smaller than models assume, and this leads to a lower sensitivity to anthropogenic effects. Nic Lewis made the same point in his commentary on the paper, noting that a significant reduction in the estimated climate sensitivity follows from the findings of GH19, even if one follows the IPCC guidelines in the subsequent calculations.

IPCC report needs fundamental revising

Should the findings of Gebbie and Huybers be confirmed, then the IPCC report needs to be fundamentally revised, especially regarding long-term temperature forecasts. This also applies to the warming to be expected by 2100, as reflected in estimates of the TCR (transient climate response), the warming at the time the CO2 concentration doubles.

Reader F.B. posted a comment on Judith Curry’s blog for discussion at the beginning of January 2019. In it he tried to derive the natural variability of 1950-2016 from observations and NOT from models. Taking into account all assumptions made by the IPCC, what ultimately results is a climate sensitivity (TCR) of about 1.3°C per doubling of the carbon dioxide content of our atmosphere. If one adds the known influences of volcanoes, the Pacific El Niño/La Niña cycles (ENSO for short) and the solar influences (taken exactly as the IPCC assumes them), and takes into account the internal variability since 1950, the annual temperature trend can be reconstructed only with this magnitude:


Fig. 4: Reconstructed global temperature since 1950 (dark blue) and the real observed (green).

The agreement is astounding, and it was not produced by excessive fudging and tuning of many parameters, as is done in many climate models, where even the order in which the parameters are “tuned” plays an important role in the result.

Sensitivity of only 1.3°C

The good reconstruction results when a TCR of 1.3 is used, unlike the models of the IPCC’s 5th Assessment Report, which use about 1.85. The TCR of 1.3 is the most important parameter for the results. That is 30% less warming than the models project. This is also the result of Lewis/Curry (2018), which we reported on here.
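The 30% figure follows directly from the ratio of the two TCR values, since transient warming scales linearly with TCR under the standard logarithmic forcing relation. A hedged sketch of that arithmetic (the CO2 concentrations below are round illustrative numbers of our own choosing, not values from the post):

```python
import math

# Transient warming for a CO2 increase from c0 to c, given a TCR defined
# per doubling of concentration: dT = TCR * log2(c / c0).
def transient_warming(tcr, c0_ppm, c_ppm):
    return tcr * math.log2(c_ppm / c0_ppm)

c0, c = 280.0, 410.0                        # illustrative pre-industrial and recent ppm
dT_low = transient_warming(1.30, c0, c)     # observation-based TCR estimate
dT_ar5 = transient_warming(1.85, c0, c)     # AR5 model value cited above

print(f"TCR 1.30: {dT_low:.2f} °C of transient warming")
print(f"TCR 1.85: {dT_ar5:.2f} °C of transient warming")
print(f"reduction: {100 * (1 - dT_low / dT_ar5):.0f}%")  # ≈ 30%
```

Because the logarithmic term cancels in the ratio, the roughly 30% reduction holds regardless of which concentration values are chosen.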

An additional result of this IPCC-conforming approach is that (almost) all longer-term warming, in the example shown since 1950, is assumed to be caused by “anthropogenic forcing”, essentially CO2. Longer-term studies show the same: if a similar approach is chosen and time spans from 1870 onwards are investigated, it inevitably follows that all trend warming since 1870 comes from anthropogenic drivers. All such methods contain this built-in assumption. Any long-term natural warming source is excluded by assuming our climate was in “equilibrium” in the 1750s, i.e. that it was not driven by long-term drivers and that only (short-term, on the scale of decades) internal variability, volcanoes and ENSO had an impact on the climate, with the exception of the even longer-term changes caused by the Earth’s orbit (ice ages and interglacials).

Climate sensitivity “likely to be lower”

Against the background of the paper by Gebbie and Huybers presented above, it is highly questionable whether influences acting on longer timescales can be neglected from the outset, as the IPCC approach does. In plain language, this means the above-mentioned sensitivity estimate is rather an upper limit and, considering that the ocean was not in climate equilibrium in the 1750s, the true value is likely to be lower.

Focusing on observations leads to surprises

We keep hearing, again and again, that science can hardly find anything new on anthropogenic climate change: “The science is settled.” Don’t fall for it! GH19 is a shining example of true science when it comes to climate. We look forward to further news and observations. Those who concentrate not on models but on empirical work are always in for a surprise!


Pacific Ocean Tide Gauges Of 100+ Years: ‘Both The Relative Rate Of Rise And Acceleration Are Negative’


“As the sea levels have been oscillating, but not accelerating, in the long-term-trend tide gauges of Japan since the start of the 20th century, the same as all the other long-term-trend tide gauges of the world, it is increasingly unacceptable to base coastal management on alarmist predictions that are not supported by measurements.” – Parker, 2019

Image Source: Parker, 2019

I. A Pacific Ocean-induced sea level rise catastrophe by 2100?

The Pacific Ocean encompasses more than half of the Earth’s water volume.

Consequently, measured relative sea level trends in the Pacific Ocean have important implications for analyzing the climate-modeled projections of future sea level rise.

Anywhere from about an additional meter (IPCC, 2013) to multiple meters (Garner et al., 2017; University of Hawaii, 2018) of sea level rise by 2100 is assumed in climate modeling due primarily to the exponential rise in anthropogenic CO2 emissions since the late 19th century.

To achieve these catastrophic sea level rise amplitudes, however, both the rate of rise and the overall acceleration would need to be elevated dramatically, by more than an order of magnitude, from what has been observed over the last century.

“Our analyses do not indicate acceleration in sea level in U.S. tide gauge records during the 20th century. Instead, for each time period we consider, the records show small decelerations that are consistent with a number of earlier studies of worldwide-gauge records. The decelerations that we obtain are opposite in sign and one to two orders of magnitude less than the +0.07 to +0.28 mm/yr accelerations that are required to reach sea levels predicted for 2100 by Vermeer and Rahmsdorf (2009), Jevrejeva, Moore, and Grinsted (2010), and Grinsted, Moore, and Jevrejeva (2010). Bindoff et al. (2007) note an increase in worldwide temperature from 1906 to 2005 of 0.74°C. It is essential that investigations continue to address why this worldwide-temperature increase has not produced acceleration of global sea level over the past 100 years, and indeed why global sea level has possibly decelerated for at least the last 80 years.” (Houston and Dean, 2011)

II. Due to natural variability, relative sea level changes may only be gauged with long-term records

Pacific tide gauges sited in the same location for a century or longer (with consistent rates of subsidence or uplift) can reliably assess the overall rate and acceleration of sea level changes.

Shorter-term (less than 80 years) measurements may easily compromise the accurate computation of rate or acceleration trends due especially to the prominence of 60-year cycles of natural origin (Chambers et al., 2012).

Separating an alleged anthropogenically-forced sea level rise trend from natural variability has been a well-documented problem that has been intensively debated in the scientific literature.

• “Global sea levels have been rising through the past century and are projected to rise at an accelerated rate throughout the 21st century. This has motivated a number of authors to search for already existing accelerations in observations, which would be, if present, vital for coastal protection planning purposes. No scientific consensus has been reached yet as to how a possible acceleration could be separated from intrinsic climate variability in sea level records. This has led to an intensive debate on its existence and, if absent, also on the general validity of current future projections.” (Visser et al., 2015)
• “Previous research has shown that sea-level acceleration determined from individual tide gauge records has remarkably large scatter as record lengths decrease due to decadal variations in sea level. We extend previous data sets to the present time and find even greater acceleration scatter. Using analytic solutions, sinusoidal oscillations with amplitudes and periods of typical decadal variations are shown to basically account for the relationship between record length and both acceleration and trend difference. Data show that decadal variations will obscure estimates of underlying accelerations if record lengths of individual gauges are not greater than at least 75 years. Although worldwide data are less affected by decadal variations than individual gauge data, decadal variations still significantly affect estimates of underlying accelerations, in particular for record lengths less than about 60 years. We give two examples of recent studies that use record lengths of about 30 to 60 years to determine acceleration or related trend difference. Previous authors dismissed the importance of decadal variations on their results and, as a result, reached invalid conclusions.” (Houston and Dean, 2013)
• “A proper coastal management requires an accurate estimation of sea level trends locally and globally. It is claimed that the sea levels are rising following an exponential growth since the 1990s, and because of that coastal communities are facing huge challenges. Many local governments throughout Australia, including those on the coast, have responded to the various warnings about changes in climate and increases in sea levels by undertaking detailed climate change risk management exercises. … It is shown here that the exponential growth claim is not supported by any measurement of enough length and quality when properly analysed. The tide gauge results do not support the exponential growth theory. The projections by the relevant state bodies should therefore be revised by considering the measurements and not the models to compute the future sea level rises for the next 30 years following the same trend experienced over the last 30 years.” (Parker et al., 2013)
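The record-length problem described in the quotes above can be illustrated with a synthetic example of our own (the numbers below are assumptions for illustration, not data from the cited studies): a tide-gauge series with an assumed true trend plus a 60-year oscillation, fitted over windows of different lengths.

```python
import numpy as np

# Synthetic sketch: an assumed true trend of 1 mm/yr plus a 60-year
# oscillation of 20 mm amplitude. A short window picks up the local
# slope of the oscillation; a century-plus record largely averages it out.
TRUE_RATE, AMP, PERIOD = 1.0, 20.0, 60.0   # mm/yr, mm, yr (all assumed)

t = np.arange(120.0)                        # 120 annual samples
level = TRUE_RATE * t + AMP * np.sin(2 * np.pi * t / PERIOD)

def fitted_rate(years):
    """Least-squares linear rate (mm/yr) over the last `years` of the record."""
    return np.polyfit(t[-years:], level[-years:], 1)[0]

for n in (40, 120):
    print(f"{n:3d}-yr record: fitted rate = {fitted_rate(n):+.2f} mm/yr "
          f"(true rate {TRUE_RATE:+.2f})")
```

With these assumed parameters, the 40-year fit is biased by the better part of a millimeter per year, while the 120-year fit lands close to the underlying trend, which is the qualitative point Houston and Dean (2013) and Chambers et al. (2012) make analytically.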

III. Long-term negative Pacific Ocean sea level rise rates and acceleration belie a human influence

It has already been suggested that an anthropogenic fingerprint has yet to be detected in Pacific Ocean measurements of sea level change due to the overriding presence of natural variability.

• “Sea level rates up to three times the global mean rate are being observed in the western tropical Pacific since 1993 by satellite altimetry. From recently published studies, it is not yet clear whether the sea level spatial trend patterns of the Pacific Ocean observed by satellite altimetry are mostly due to internal climate variability or if some anthropogenic fingerprint is already detectable. We show that subtraction of the IPO contribution to sea level trends through the method of linear regression does not totally remove the internal variability, leaving significant signal related to the non-linear response of sea level to El Niño Southern Oscillation (ENSO). In addition, by making use of 21 CMIP5 coupled climate models, we study the contribution of external forcing to the Pacific Ocean regional sea level variability over 1993–2013, and show that according to climate models, externally forced and thereby the anthropogenic sea level fingerprint on regional sea level trends in the tropical Pacific is still too small to be observable by satellite altimetry.”  (Palanisamy et al., 2015)
• “The tropical Pacific has featured some remarkable trends during the recent decades such as an unprecedented strengthening of the Trade Winds, a strong cooling of sea surface temperatures (SST) in the eastern and central part, thereby slowing global warming and strengthening the zonal SST gradient, and highly asymmetric sea level trends with an accelerated rise relative to the global average in the western and a drop in the eastern part. These trends have been linked to an anomalously strong Pacific Walker Circulation, the major zonal atmospheric overturning cell in the tropical Pacific sector, but the origin of the strengthening is controversial. Here we address the question as to whether the recent decadal trends in the tropical Pacific atmosphere-ocean system are within the range of internal variability, as simulated in long unforced integrations of global climate models. We show that the recent trends are still within the range of long-term internal decadal variability.” (Hadi Bordbar et al., 2016)

Further supporting the conclusion that an anthropogenic fingerprint has not yet been detected in sea level data, a compilation of records from 30 long-term Pacific Ocean tide gauges has been made available in a new paper (Parker, 2019) published in the journal Ocean and Coastal Management.

The averaged measurements from these 30 tide gauge instruments reveal a negative trend in both the rate of sea level rise (-0.02 mm/yr) and acceleration (-0.00007 mm/yr2) since the early 20th century.

Observations of negative sea level trends for the Earth’s largest ocean basin while CO2 concentrations rose from about 300 ppm (1900 C.E.) to well over 400 ppm obviously do not support the narrative that says anthropogenic CO2 emissions are a driver of relative sea level changes.

Climate model-based projections of meters of sea level rise by 2100 promulgated by proponents of human-induced climate alarm may therefore be categorized as both speculative and dubious.

Parker, 2019

Sea level oscillations in Japan and China since the start of the 20th century

“Japan has strong quasi-20 and quasi-60 years low frequencies sea level fluctuations. These periodicities translate in specific length requirements of tide gauge records. 1894/1906 to present, there is no sea level acceleration in the 5 long-term stations. Those not affected by crustal movement (4 of 5) do not even show a rising trend. … In Japan tide gauges are abundant, recording the sea levels since the end of the 19th century. Here I analyze the long-term tide gauges of Japan: the tide gauges of Oshoro, Wajima, Hosojima and Tonoura, that are affected to a lesser extent by crustal movement, and of Aburatsubo, which is more affected by crustal movement. Hosojima has an acceleration 1894 to 2018 of +0.0016 mm/yr2. Wajima has an acceleration 1894 to 2018 of +0.0046 mm/yr2. Oshoro has an acceleration 1906 to 2018 of −0.0058 mm/yr2. Tonoura has an acceleration 1894 to 1984 of −0.0446 mm/yr2. Aburatsubo, has an acceleration 1894 to 2018 of −0.0066 mm/yr2. There is no sign of any sea level acceleration around Japan since the start of the 20th century. The different tide gauges show low frequency (>10 years) oscillations of periodicity quasi-20 and quasi-60 years. The latter periodicity is the strongest in four cases out of five.”
“As the sea levels have been oscillating, but not accelerating, in the long-term-trend tide gauges of Japan since the start of the 20th century, the same as all the other long-term-trend tide gauges of the world, it is increasingly unacceptable to base coastal management on alarmist predictions that are not supported by measurements.”
“Recently the Japan Meteorological Agency (2018) reported an analysis of the measured sea levels around Japan in the stations less affected by crustal movement. It showed substantial stability of sea level since the early 20th century.”
“The Japan Meteorological Agency (2018) has shown that the relative rise in sea level on the coast of Japan has stabilized since the beginning of the 20th century and has not accelerated. The analysis presented here has further strengthened this result. We have also shown that the only accelerating record, the one of Hamada, is an artifact of coupling of the long-term tide gauge record of Tonoura, discontinued in 1984, with the short, recent tide gauge record of Hamada II, established in 1984. This latter result is consistent with the findings of Okunaka and Hirahara (2016). They verified, by correcting the tide gauge results at 3 stations with GPS data, that a sea level rise of Japan was estimated largely due to the subsidence at Hamada.”
“At least 60 years of data collected by the same location without any perturbing event are needed to compute a reasonably accurate sea level rate of rise by linear fitting, and almost double that length, at least more than 100 years, are needed to compute a reasonably accurate sea level acceleration by parabolic fitting. In Japan, with its short time window, it may be wrongly concluded that the sea levels have been accelerating since the 1980s. By considering the data since 1906, as done by Japan Meteorological Agency (2018), or by considering the data since 1894 for Hosojima, Wajima, Aburatsubo and Tonoura, and since 1906 for Oshoro as done here, the sea levels are perfectly stable.”
“The relative sea level rise measured by a tide gauge has a sea and a land component. The relative sea level may rise, or fall, not only because the volume of the water is increasing, or reducing. It may also rise, or fall, because the tide gauge instrument is sinking, or uplifting. The sea component has important multi-decadal periodicities of quasi 60 years. Hence, not less than 60 years of data are needed to infer a rate of rise, and many more years, not less than 100 years, are needed to infer an acceleration. … Only where the land component is characterized by a stable pattern of subsidence or uplift, an absolute sea level rise can be computed. In Japan, only few areas are relatively stable, with the rest affected by crustal movements strongly variable in time and space.”
“The East coast of Asia has the five long-term tide gauges, Oshoro, Wajima, Hosojima and Tonoura affected to a lesser extent by crustal movement, and Aburatsubo, that is affected by crustal movement, all in Japan. Without Aburatsubo, “Japan-4” data set, the average relative rate of rise is +0.08 mm/yr, and the average acceleration is negative, −0.01105 mm/yr2. With Aburatsubo, “Japan-5” data set, the average relative rate of rise is +0.79 mm/yr, and the average acceleration is still negative, −0.01016 mm/yr2.”
“Oceania has similarly five long-term tide gauges, Fremantle, that is however in the Indian Ocean, and Sydney in Australia, Auckland and Dunedin in New Zealand, and Honolulu, in the Hawaii Islands. With Fremantle, ‘Oceania-5’ data set, the average relative rate of rise is +1.306 mm/yr, and the average acceleration is +0.00490 mm/yr2. Without Fremantle, the average relative rate of rise is +1.209 mm/yr, and the average acceleration is +0.00469 mm/yr2.”
“Along the West Coast of North America, by also accepting tide gauge records with more than 80 years of data to properly cover all the Pacific coast from Alaska to Panama, in the 20 long term stations of Ketchikan, AK, USA; Sitka, AK, USA; Juneau, AK, USA; Unalaska, AK, USA; Prince Rupert, Canada; Point Atkinson, Canada; Vancouver, Canada; Victoria, Canada; Tofino, Canada; Friday Harbor, WA, USA; Seattle, WA, USA; Neah Bay, WA, USA; Astoria, OR, USA; Crescent City, CA, USA; San Francisco, CA, USA; Santa Monica, CA, USA; Los Angeles, CA, USA; La Jolla, CA, USA; San Diego, CA, USA, and Balboa, Panama, the average rate of rise is −0.47 mm/yr, and the average acceleration is +0.0015 mm/yr2, ‘West Coast of North America-20’ data set.”
“The Pacific Ocean covers about 46% of Earth’s water surface, and more than 50% of the Earth’s water volume. Thinking even more global[ly], even larger data sets also including segmented records do not show accelerations larger than a few micrometers per year squared (Parker and Ollier, 2017a, b). By considering the average of the tide gauges included in the Japan-5, Oceania-4 and West Coast of North America-20 data sets, both the relative rate of rise and the acceleration are negative, −0.02139 mm/yr and −0.00007 mm/yr2 respectively. These values compare very badly with the models’ projections and speculations adopted for coastal management, such as:
• the Intergovernmental Panel on Climate Change (2013) models’ projection of 850 mm by 2100, which requires an acceleration of +0.1268 mm/yr2 to come true;
• the speculation by Garner et al. (2017) of 2650 mm by 2100, which requires an even larger acceleration of +0.5556 mm/yr2 to come true. This is 5 times more than the already exaggerated IPCC values;
• the speculation by University of Hawaii (2018) of 3 feet, or 914.4 mm, by 2050, which requires an even larger acceleration of +1.525 mm/yr2 to come true. This is 15 times more than the already exaggerated IPCC values.
The measured values of sea level rate of rise and acceleration are much smaller than the models’ predictions and the speculations.”
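Required accelerations of the kind quoted above follow from simple constant-acceleration kinematics: a rise S over t years starting from rate r0 satisfies S = r0·t + a·t²/2. A hedged sketch of that calculation (the horizon year and starting rate below are our own assumptions, since Parker's exact baseline is not given in the excerpt, so the outputs illustrate the method rather than reproduce his figures):

```python
# Acceleration a quadratic sea-level curve needs to reach a target rise S
# (mm) in t years, starting from rate r0 (mm/yr):
#   S = r0*t + 0.5*a*t^2  =>  a = 2*(S - r0*t) / t^2
def required_acceleration(target_mm, years, start_rate_mm_yr):
    return 2.0 * (target_mm - start_rate_mm_yr * years) / years**2

horizon = 2100 - 2019   # years remaining; baseline year is our assumption
r0 = -0.02              # mm/yr, the averaged Pacific rate quoted from Parker

for label, target in [("IPCC 2013 (850 mm by 2100)", 850.0),
                      ("Garner et al. 2017 (2650 mm by 2100)", 2650.0)]:
    a = required_acceleration(target, horizon, r0)
    print(f"{label}: a = {a:+.3f} mm/yr^2")
```

Whatever baseline is chosen, the required accelerations come out several orders of magnitude above the −0.00007 mm/yr2 measured average, which is the comparison the quote is drawing.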

Image Source: Parker, 2019
Share this...
Share on Facebook
Tweet about this on Twitter

German Scientists To More Closely Investigate Cloud Formation, A Vital Component In Climate


Leipzig, 20 December 2018

Researchers from Leipzig cooperate with scientists from Punta Arenas (Chile) to learn more about the relationship between air pollution, clouds and precipitation.

Leipzig/Punta Arenas. How do airborne particles, so-called aerosols, affect the formation and life cycle of clouds and precipitation? In order to come one step closer to solving this question, atmospheric scientists from the Leibniz Institute for Tropospheric Research (TROPOS) and the Leipzig Institute for Meteorology (LIM) at Leipzig University will observe the atmosphere at one of the cleanest places in the world for at least a year. The choice fell on Punta Arenas because the city lies at a geographical latitude comparable to Germany’s and will thus enable comparisons between the northern and southern hemispheres. The measurement campaign is part of the International Year of Polar Prediction (YOPP), which aims to improve weather and climate forecasts for the polar regions through intensive measurements.

“The experiment will allow us to explain more precisely the observed regional differences in the efficiency of ice formation in clouds and to describe the role of aerosols in more detail,” says Dr. Patric Seifert, scientist at TROPOS and initiator of DACAPO-PESO. “The synergies of the versatile remote sensing instruments and the development of new evaluation algorithms based on them will provide more precise insights into the clouds. One objective will be to clarify whether the ice crystals in the virtually aerosol-free atmosphere of Punta Arenas grow more frequently through riming in liquid water layers than through aggregation as compared to the northern hemisphere,” explains Junior Prof. Heike Kalesse from Leipzig University.

The measuring station in Punta Arenas was set up in November and December 2018 by three scientists from TROPOS and UMAG. Patric Seifert returned to Leipzig shortly before Christmas and is satisfied with the progress made so far: “All instruments are working reliably and the first interesting observations have already been made. Pure liquid water clouds were observed at temperatures of around -20°C, which is extremely rare during comparable measurements in Germany. This could be an indication of few ice nuclei in the southern hemisphere.”

The formation of clouds and precipitation essentially depends on three parameters: humidity, temperature, and the availability and type of aerosol particles that serve as nuclei for cloud droplets and ice crystals. Even though it is widely accepted that water vapour and temperature dominate cloud processes, scientists still debate the actual influence of aerosol particles on the weather. It is known that every cloud droplet and every ice crystal that forms at temperatures higher than -40°C requires a condensation or ice nucleus.

In Germany and other regions of the mid-latitudes of the northern hemisphere, most precipitation (in the free troposphere) forms between two and twelve kilometres above the ground. These air layers are characterised by aerosol particles from man-made air pollution, desert dust and forest fires. In the mid-latitudes of the southern hemisphere these particles are largely absent, because there is more ocean and much less industry, desert and forest. If the amount of aerosol particles available in the atmosphere changes, the number of cloud droplets and ice crystals formed also changes. This explains why liquid water clouds over regions of Europe or Southeast Asia that are heavily contaminated with aerosols consist of considerably more droplets, while liquid water clouds in less polluted regions, such as over the oceans or polar regions, consist of fewer and often larger droplets.

A similar influence of aerosol particles on the formation of ice crystals has already been observed by TROPOS scientists in clouds above Leipzig: clouds formed under the influence of Sahara dust form ice more frequently and at higher temperatures than clouds formed in clean air. The life cycle of a cloud thus takes a different course, because the formation of the tiny ice crystals and cloud droplets is the beginning of numerous processes that accompany the development of the cloud and ultimately lead to the formation of precipitation.

The knowledge about the connections between aerosol particles and clouds comes mainly from measurements in the more polluted northern hemisphere. However, relatively little is known in detail about how these processes take place under the much cleaner conditions of the southern hemisphere. The idea for the field experiment DACAPO-PESO (Dynamics, Aerosol, Cloud and Precipitation Observations in the Pristine Environment of the Southern Ocean) was therefore born at TROPOS; it is to provide a reference data set for the southern mid-latitudes.

In order to be able to compare the situation in the northern hemisphere with that in the southern hemisphere, the choice was therefore made for a location that lies approximately at the same latitude as Germany and therefore has similar temperature and climate conditions. With the exception of South America, the southern hemisphere is covered by oceans in these latitudes and long-term measurements at sea are not practicable to this extent. Therefore, the choice fell on Chile, where the researchers from Leipzig (52°N) found a cooperation partner with Magellan University in Punta Arenas (53°S).

South America is currently in the focus of international atmospheric research: almost 3,000 km to the north, a large-scale measurement campaign by various universities and institutions from the USA is currently underway in Argentina and southern Brazil: RELAMPAGO-CATI investigates the formation of thunderstorms in order to improve the prediction of hail and tornadoes.

In recent years, the available techniques for the continuous observation of aerosols and clouds have advanced enormously. Lidar and radar equipment such as the Leipzig Aerosol and Cloud Remote Observations System (LACROS) can be used to record the structure of cloud and aerosol layers as well as precipitation from the ground, with resolutions in the range of seconds and meters. In the DACAPO-PESO measurement campaign, a total of three lidar systems are used to record aerosol properties. In addition, three differently configured Doppler radar systems observe the structure of clouds and precipitation. The radars and a so-called Doppler lidar also provide information on the vertical movement of the air, which is so important for cloud formation: these vertical movements lead to cooling, which increases the relative humidity in the air so that cloud formation can occur. Rain sensors record the size and type of precipitation arriving on the ground, a microwave radiometer determines the amount of water vapour and liquid water present in the atmosphere, and radiation measuring instruments document the influence of aerosols and clouds on the energy arriving at the Earth’s surface. Most of the instruments are part of the LACROS station of TROPOS, which during DACAPO-PESO is extended by a new Doppler radar from LIM and a lidar from UMAG.

All measurements together provide a detailed picture of the weather in the clean atmosphere over Punta Arenas. The long time series of at least one year of measurements will provide a comprehensive data set. This will serve the scientists as a reference for comparison with the weather conditions in the heavily aerosol-burdened temperate latitudes of the northern hemisphere.

Since TROPOS was founded in the 1990s, researchers at the institute have been working on the question of how aerosols and clouds interact. They discovered, for example, that ice forms in clouds in the northern hemisphere at much higher temperatures than in clouds in the southern hemisphere. This was the result of lidar-assisted investigations at TROPOS, UMAG in Chile and Stellenbosch University in South Africa, and during transatlantic transects of the research vessel Polarstern, collected as early as 2008-2010. In Central Europe, around 70 percent of clouds already form ice at temperatures around -18 degrees Celsius. In southern Chile and South Africa, on the other hand, only 20 and 35 percent respectively form ice at the same temperature. The reason for such a contrast is most probably the larger number and greater diversity of aerosol particles floating in the air of the northern hemisphere.

The two project coordinators, Seifert and Kalesse, are confident that DACAPO-PESO will significantly increase the level of knowledge on the interaction between aerosols, clouds and precipitation, especially given the good staffing. Thanks to approved projects of the DFG and the ESF and to TROPOS’s and UMAG’s own funds, the DACAPO-PESO team already comprises 4 experienced researchers and 4 PhD students. The data set will provide an unprecedented wealth of information on aerosols and clouds in the mid-latitudes of South America.


Foth, A., Kanitz, T., Engelmann, R., Baars, H., Radenz, M., Seifert, P., Barja, B., Kalesse, H., and Ansmann, A.: Vertical aerosol distribution in the Southern hemispheric Midlatitudes as observed with lidar at Punta Arenas, Chile (53.2° S and 70.9° W) during ALPACA, Atmos. Chem. Phys. Discuss., in review, 2018.

The study was funded by the BMBF (HD(CP)2 ; FKZ: 01LK1504C and 01LK1502N), the European Union (ACTRIS grant no. 262254; EUCAARI grant no. 036 833-2), the Leibniz Association (OCEANET) and the Deutsche Forschungsgemeinschaft (PROM, DFG-KA 4162/2-1).

More info:

Dr Patric Seifert
Working group for ground based remote sensing at the
Leibniz Institute for Tropospheric Research (TROPOS)
Tel. +49-341-2717-7080

Junior Prof Heike Kalesse
Working group “Remote Sensing and the Arctic System” at the
Leipzig Institute for Meteorology (LIM) of the Leipzig University
Tel.: +49-341-97-36650

Tilo Arnhold
Public Relations, TROPOS
Tel. +49-341-2717-7189

Susann Huster
Media Editorial Office, Leipzig University
Tel. +49-341-97-35022


The Green Energies Of Instability…Swiss Power Grid Requires 200 Fold More Intervention Than 8 Years Ago!


Switzerland’s power grid operator, Swissgrid, seems to have its hands full nowadays.

Less than 10 years ago, Switzerland’s power grid was among the most stable worldwide, running with the same precision as the country’s famous Swiss-made watches.

Today however, thanks to green energies like wind and sun, this is no longer the case, so reports the country’s Baseler Allgemeine Zeitung (BAZ).

According to the BAZ, “In 2011 Swissgrid, the operator of the Swiss electricity grid, only intervened twice in the electricity grid to prevent major problems.”

But last year, in order to keep the power supply stable, “these interventions reached a new record number with 382 interventions.”

In other words, on average more than once a day. Hence the article by the BAZ is titled: “The power grid is fluctuating like never before.”

The number of interventions needed to keep the Swiss power grid stable jumped from just 2 in 2011 to 382 last year, a new record. Chart source: Swissgrid, via BAZ.

Stabilizing the Swiss grid can mean throttling one or more of its dozen power plants, says the BAZ. Power fluctuations in the grid not only heighten the risk of overload and blackouts, but also threaten sensitive production equipment that relies on a steady supply of power for critical process control.

The BAZ adds that the problem is not only getting worse in Switzerland, but also in the grids of all the neighboring countries, such as Germany, France, Austria and Italy.

What’s the reason for all the instability? The BAZ reports:

With more and more electricity from the sun and wind, whose production depends on weather that can change at short notice, it is becoming increasingly difficult to ensure an even supply. The production of these renewable electricity producers is much higher than consumption when it is windy and sunny, but too low at night and when there is no wind.”

The BAZ also notes that Switzerland is sorely lacking in energy storage systems to balance out the peaks that are seen in sun and wind power production.

“Customers bearing the costs”

The Swiss daily also writes that the country’s power grid at times is so unstable that on July 2 (2018) alone, Swissgrid “had to throttle power plants eight times or have them started up so that grid operation was not endangered.”

In Switzerland, whenever a power plant is asked to throttle its power production in order to keep the green-energy-flooded grid stable, it is entitled to financial compensation. No problem here: The power consumers pick up the tab, reports the BAZ.

When asked why the grid has become increasingly unstable with each passing year, Swissgrid replied that it was “due to sun and wind energy”, the BAZ reports.

Europe dodged a bullet two weeks ago

Not only the Swiss power grid has been rickety and unstable, but so has the entire European grid, writes the BAZ. An example of this occurred recently, as Swissgrid here reported in a press release:

A drop in frequency on the synchronously interconnected Continental Europe system was registered on 10 January 2019 at around 21.00.

The causes of this drop are still under investigation by the transmission system operators (TSOs) of ENTSO-E Regional Group Continental Europe.

A mismeasurement on lines between Germany and Austria was identified and corrected by TenneT Germany. However, this mismeasurement cannot explain the entire frequency drop on 10 January. The investigation, which is still on-going, is reviewing the significant variation in European production around 21.00 which coincided with changes in trade between different countries.

The frequency drop was sufficient to alert the TSOs but did not at any moment endanger security of supply.

TSOs of ENTSO-E Continental Europe Regional Group are taking collective actions to restore frequency as foreseen in such cases and continue their technical analysis of the incident.”

Next time Europeans may not be so lucky.

Unfortunately it may just take a widespread blackout to get European leaders back to their senses. But even then, don’t count on it. Some policymakers are in fact insisting that the grid instability problems can be solved by “adding more wind and solar power.”


The Climate Non-Problem…Major Aspects Of Japan’s Climate Have Seen No Change In 100 Years


By Kirye in Tokyo

In Japan we often hear about “climate change” and how citizens are told by their elected officials that it is necessary to act quickly to combat it.

Leaders worldwide want trillions of already scarce dollars to “mitigate climate change”.

But what if there haven’t been any real changes in climate, and so really nothing much to mitigate, except changes that exist only in models?

What if the changes that have been taking place are all within the range of natural variability? Does it really make any sense to risk bankrupting the planet to fight a problem that doesn’t really exist?

In my home country of Japan we also keep hearing how our own weather is becoming more extreme and storms more intense because of rising CO2. Yet, when we look at the statistics for Japan over the past decades, we see some very surprising developments.

Precipitation over the past 100 years

Firstly, annual precipitation in Japan has not seen any trend change – in 100 years.

Completely flat annual precipitation trend for Japan over the past 100 years. Where’s the climate change? Source:

As the chart above shows, there haven’t been many extremes over the past two decades, and precipitation has been quite steady for more than a decade. Japan saw a wetter period during the 1950s than anything we have seen in the 2010s. The 1970s and 80s were marked by much variability.

No rise in landfalling typhoons

As mentioned earlier, alarmists and media like telling the public that storm activity has been increasing as well. But again the data do not show this. Like hurricanes and tornadoes in the US, or cyclones globally, typhoons making landfall in Japan have also not been increasing:

No trend. Data source:

The Japan Meteorological Agency (JMA) has been keeping records on typhoons since 1951. As the chart above shows, there’s been no trend over the past seven decades. Is this “change” that needs to be combatted?
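A “no trend” statement like this is normally checked by fitting an ordinary least-squares line to the annual landfall counts and looking at the slope. A minimal sketch of that check follows; the counts here are synthetic, not JMA data:

```python
import numpy as np

# Illustrative only: the counts below are randomly generated with no
# built-in trend, standing in for a series of annual landfall counts.
rng = np.random.default_rng(42)
years = np.arange(1951, 2021)                 # 70 years of records
counts = rng.poisson(3.0, size=years.size)    # ~3 landfalls per year on average

# Least-squares linear fit: the slope is the trend in landfalls/year/year
slope, intercept = np.polyfit(years, counts, 1)
print(f"fitted trend: {slope:.4f} landfalls per year per year")
```

On a real series one would also report a confidence interval for the slope, since annual counts are noisy and a small nonzero slope is expected by chance alone.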

In terms of typhoons and precipitation, it seems we are fighting — at a gigantic expense — changes that don’t even exist.

Very little sea level rise

Japan is an island country, and so sea level rise is an important issue. Yet, even sea level around Japan is not much different than it was in 1950:

Sea levels around Japan were almost just as high 70 years ago as they are today, according to the meticulous data of the JMA.

An abrupt temperature jump

Finally, we look at Japan’s mean annual temperature. Though it has risen by almost 1°C over the past century, we observe that there has been no change over the past two decades.

Source: JMA.

The chart above has some notable features. Namely, annual temperatures from 1951 to the mid 1980s were flat, even cooling a bit. Why wasn’t “greenhouse gas” CO2 driving the temperatures up? Was it taking time off?

Very poor correlation

Then, at the very end of the 1980s, in just a matter of a couple of years, the temperature literally jumped to a whole new plateau, and it has not climbed much since. What’s going on here?

Does CO2 take 30 years off, work for a couple of years, and then disappear for another 30 years again? The correlation between CO2 and Japan mean annual temperature is very poor.

The abrupt jump at the end of the 1980s likely had something to do with natural oceanic cycles.

In summary, major aspects of the Japanese climate have seen no trend change over the past 100 years, and the temperature increase hardly appears to have much to do with CO2.

Pierre contributed to this post


Atmospheric Physicists: A Human Signature Hasn’t Shown Up In 40 Years Of Temperature Change Observations


“It is not possible to reliably support the view of the presence of
global warming in the sense of an enhanced greenhouse effect
due to human activities.” — Drs. Varotsos and Efstathiou, 2019

Image Source: Varotsos and Efstathiou, 2019

In a step-by-step dissection of the anthropogenic global warming (AGW) hypothesis, or “greenhouse hypothesis of global warming”, two prominent Greek atmospheric physicists – Dr. Costas Varotsos and Dr. Maria Efstathiou – expose the withering contradictions between (a) what is hypothesized to occur atmospherically according to AGW models and (b) what was actually observed from satellite measurements from 1978 to 2018.

According to AGW models, there was supposed to be “a consistent warming with gradual increase from low to high latitudes in both hemispheres” in response to the dramatic increase in greenhouse gases over the last 40 years.

According to temperature change observations in the satellite era (December, 1978, to present), this pattern did not occur.

According to AGW models, there was supposed to be an evident intrinsic relationship between lower stratospheric temperatures and tropospheric temperatures in accordance with the explosive increase in anthropogenic CO2 emissions during 1978-2018.

Satellite observations do not indicate that such a stratospheric-tropospheric relationship existed during this period.

The fundamental discrepancies between AGW models and real-world observations led these climate scientists to conclude that (a) “climate models are not able to simulate real climate”, and (b) the view that increases in greenhouse gases from human activities are what caused the global warming over the last 40 years cannot be reliably supported by observed evidence.

Varotsos and Efstathiou, 2019

Has global warming already arrived? 

• “The enhancement of the atmospheric greenhouse effect due to the increase in the atmospheric greenhouse gases is often considered as responsible for global warming (known as greenhouse hypothesis of global warming). In this context, the temperature field of global troposphere and lower stratosphere over the period 12/1978–07/2018 is explored using the recent Version 6 of the UAH MSU/AMSU global satellite temperature dataset. Our analysis did not show a consistent warming with gradual increase from low to high latitudes in both hemispheres, as it should be from the global warming theory.”
• “In addition, in the lower stratosphere the temperature cooling over both poles is lower than that over tropics and extratropics. To study further the thermal field variability we investigated the long-range correlations throughout the global lower troposphere-lower stratosphere region. The results show that the temperature field displays power-law behaviour that becomes stronger by going from the lower troposphere to the tropopause. This power-law behaviour suggests that the fluctuations in global tropospheric temperature at short intervals are positively correlated with those at longer intervals in a power-law manner. The latter, however, does not apply to global temperature in the lower stratosphere. This suggests that the investigated intrinsic properties of the lower stratospheric temperature are not related to those of the troposphere, as is expected by the global warming theory.”
• “In summary, the tropospheric temperature has not increased over the last four decades, in both hemispheres, in a way that is more amplified at high latitudes near the surface. In addition, the lower stratospheric temperature did not decline as a function of latitude. Finally, the intrinsic properties of the tropospheric temperature are different from those of the lower stratosphere. Based on these results and bearing in mind that the climate system is complicated and complex with the existing uncertainties in the climate predictions, it is not possible to reliably support the view of the presence of global warming in the sense of an enhanced greenhouse effect due to human activities.”
• “Over the last decades, the rise in surface air temperature in regions of our planet has led to a debate in the scientific community about the causes and impacts of this temperature rise, especially if it comes from anthropogenic activities or is of natural origin.”
• “[R]eal climate change results from the non-linear interactions between numerous components of the climatic system. In these should also be taken into consideration and possible contributions by external forcings e.g., cosmic factors, such as solar activity. Despite the projected exponential growth in computer power, these processes will not be adequately resolved in numerical climate models in the near future (Franzke et al., 2015). Stochastic methods for numerical climate prediction may allow for an adequate representation of uncertainties, the reduction of systematic biases and improved representation of long-term climate variability (e.g., Droegemeier, 2009). Some analyses show that current models are not able to simulate real climate. The main reason is that climate is a high-dimensional forced and dissipative complex system with chaotic dynamics that displays different physical and chemical properties of its various components and coupling mechanisms.”
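The “long-range correlations” and “power-law behaviour” the authors describe are typically quantified with a scaling analysis such as detrended fluctuation analysis (DFA). Their code is not published; the following is a generic DFA sketch applied to white noise, for which the expected scaling exponent is about 0.5 (no long-range correlation), not a reproduction of the paper’s method:

```python
import numpy as np

def dfa_exponent(x, scales):
    """First-order DFA: slope of log F(s) versus log s."""
    y = np.cumsum(x - np.mean(x))            # integrated profile of the series
    F = []
    for s in scales:
        n = len(y) // s                      # non-overlapping windows of length s
        segs = y[:n * s].reshape(n, s)
        t = np.arange(s)
        # Detrend each window with a linear fit, collect residual variances
        res = []
        for seg in segs:
            c = np.polyfit(t, seg, 1)
            res.append(np.mean((seg - np.polyval(c, t)) ** 2))
        F.append(np.sqrt(np.mean(res)))      # fluctuation function at scale s
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope

rng = np.random.default_rng(1)
white = rng.standard_normal(4096)
alpha = dfa_exponent(white, [8, 16, 32, 64, 128])
print(f"DFA exponent for white noise: {alpha:.2f}")  # near 0.5: uncorrelated
```

An exponent well above 0.5 at every scale would indicate the kind of persistent, power-law-correlated fluctuations the authors report for the troposphere.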

Image Source: Varotsos and Efstathiou, 2019
• “An identifiable signature of [anthropogenic] global warming is the combination of tropospheric heating and stratospheric cooling leading to an increase in the height of tropopause. However, according to Figure 1 this is not the case, because the TA [tropopause air] trend at the tropopause is near zero.”
• “According to Hoskins (1991) the expectation for global warming is to be more enhanced at high latitudes near the surface. That is, in the case of global warming occurrence, warming would have been stronger at the poles and would gradually decrease by approaching the equator. However, the pattern depicted in Table 1 does not comply with the gradual increase of the warming with latitude as predicted by the global warming theory.”
• “[A]ccording to the results obtained by Xia et al. (2018) the upper troposphere and lower stratosphere are heated by ozone, which affects the high clouds (due to its effect on relative humidity) and the stratospheric water vapor (due to its impact on the tropical tropopause temperature). Consequently, the thermal regime in the lower stratosphere is mainly affected by the ozone dynamics in this area and not by the thermal regime in the troposphere alone. Therefore, the observed cooling in the lower stratosphere can not be attributed unambiguously to the warming of the troposphere, as dictated by the theory of global warming.”

Surprise: Decline In GDP-Adjusted Global Catastrophe Losses Over Past 25 Years, Violent Tornadoes Trending Downward


By Die kalte Sonne

(German text translated/edited into English by P Gosselin)

World economic output keeps rising, as can be nicely seen in the growth of total global gross domestic product. It is therefore not surprising that losses from natural catastrophes are also rising steadily: as there is more value that can be destroyed, the amount of damage would increase even if the number and severity of natural catastrophes remained constant.

This is an aspect that is often concealed when MunichRe and other companies disseminate statistical loss figures.

A new study by Roger Pielke documents precisely this effect. Over the past 25 years, losses have risen sharply, but when standardized by GDP, a decline has been recorded. The study was published in the journal Environmental Hazards on 27 October 2018. Abstract:

Tracking progress on the economic costs of disasters under the indicators of the sustainable development goals
The Sustainable Development Goals indicator framework identifies as an indicator of progress the objective of reducing disaster losses as a proportion of global gross domestic product. This short analysis presents data on this indicator from 1990. In constant 2017 US dollars, both weather-related and non-weather related catastrophe losses have increased, with a 74% increase in the former and 182% increase in the latter since 1990. However, since 1990 both overall and weather/climate losses have decreased as proportion of global GDP, indicating progress with respect to the SDG indicator. Extending this trend into the future will require vigilance to exposure, vulnerability and resilience in the face of uncertainty about the future frequency and magnitude of extreme events.”

Violent tornadoes in the USA have declined over the past 70 years

Severe tornadoes in the USA since 1950. Source: Roger Pielke Jr.


Munich Conference: Leading Danish Astrophysicist Says Solar Activity Has Significant Impact On Global Climate


Danish Professor Henrik Svensmark is a leading physicist in the field of cosmic radiation. At the end of last year he gave a presentation at the 12th International Climate Conference in Munich, where he demonstrated that the climate is indeed modulated in large part by cloud cover, which in turn is modulated by solar activity in combination with cosmic rays.

His theory is that cosmic rays – extremely fast-flying particles originating from dying supernovae – travel through the cosmos, strike the Earth’s atmosphere, and have a major impact on cloud cover and thus on the climate at the Earth’s surface.

This, Svensmark says, has been confirmed in numerous laboratory experiments.

Video source: EIKE.

In his presentation, the renowned Danish scientist showed how solar activity modulates the cosmic rays striking the atmosphere, and thus the climate-impacting cloud cover. Dr. Svensmark showed that there are powerful correlations worldwide between solar activity and climatic cycles, and so the sun, in combination with the cosmic cloud-seeding rays, is clearly playing a role. Hundreds of studies confirm this.

Observations and proxy data show that “when you have high cosmic rays, you have a cold climate” because of greater cloud cover. According to Svensmark, the net effect of clouds is to cool the Earth by up to 30 W/m2.

Clouds are extremely important for the Earth’s energy budget. The net effect is about 20 to 30 watts per square meter.”

That figure is large in terms of impact on climate change, and it is grossly neglected by CO2-fixated climate scientists.

His research shows there is a clear link between low cloud cover formation and galactic cosmic rays:


Sun modulates the cosmic ray intensity hitting the Earth’s atmosphere

In his presentation (see video) he explains the mechanism by which cosmic rays seed low-level clouds, which act to cool the climate. In periods of intense solar activity, the sun’s magnetic field engulfs the Earth and shields its atmosphere from the cloud-seeding cosmic rays; thus fewer low-level clouds form and the Earth warms.

Vice versa, during periods of low solar activity the sun’s magnetic field is weaker, so more cosmic rays are able to penetrate the atmosphere and seed clouds. The resulting clouds act to cool the planet.

Confirmed by experiments

Svensmark’s experiments confirm that solar cycles impact energy changes in the oceans by an order of 1.5 W/m2 over an 11-year cycle and that his findings are consistent with climate changes over the Holocene and even geological times going back more than 100 million years.

Over geological history, especially when the Earth traveled through one of the spiral arms of the Milky Way, cosmic rays striking the atmosphere were very intense and thus led to extremely cold conditions known as the Snowball Earth episodes. Other scientists insist the episodes were caused by intense volcanic eruptions.

Significant solar changes in Earth’s energy budget

Dr. Svensmark summarizes the solar activity/cosmic ray climate modulation system with the following chart:

In the end, changes in solar activity lead to significant changes in the earth’s energy budget, and thus climate change, Svensmark believes. This explains why the Earth has seen “coolings and warmings of around 2°C repeatedly over the past 10,000 years.”

He concludes:

The Sun became unusually active during the 20th Century and as a result part of the ‘global warming’ observed.”


New Paper: Modern Warming Was Driven By ‘Primarily Natural’ Factors. Global Cooling Has Now Begun.


Four climate scientists assert (1) the last ~130 years of temperature changes fit “perfectly” into statistical indices of natural variation, and (2) a long-term deep cooling of the Earth system has recently commenced.

Image Source: Mao et al., 2019

An analysis published in the journal Atmospheric and Climate Sciences by four climate scientists reveals that the 1880-2013 temperature changes fit “perfectly” (0.9 correlation) into a calculation utilizing 15,295 periodic functions of natural variation.

The authors claim this affirms that the non-anthropogenic “major climate factors” (i.e., solar-cloud and ENSO forcing) can still be considered the “main reason” driving modern warming (Lakshmi and Tiari, 2015; Hassan et al., 2016; McLean, 2014; Yeo and Kim, 2015; Wielicki et al., 2002; Douglass and Knox, 2014; Sejrup et al., 2010; Large and Yeager, 2012; Irvine, 2015; Cess and Udelhofen, 2003; Clark, 2010; Ogurtsov et al., 2017; Fleming, 2018; Zherebtsov et al., 2019).

Mao, Tan, Chen, and Fan (2019) effectively suggest we humans do not exert fundamental control over the Earth’s climate-modulating “Ocean Stabilization Machine”.

Consequently, their statistical analysis further indicates that a global cooling trend has recently begun, and that the overall decline in global temperature will reach precisely −0.6051˚C below the long-term average in the year 2111.
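The basic idea of representing a temperature series as a sum of periodic functions can be sketched as an ordinary least-squares fit of sines and cosines at candidate periods. This makes no claim to reproduce the authors’ 15,295 functions; the periods and the synthetic “anomaly” series below are assumptions for illustration:

```python
import numpy as np

# Synthetic stand-in for an annual anomaly series, 1880-2013:
# two hidden cycles plus noise (purely illustrative, not GLST data).
rng = np.random.default_rng(0)
t = np.arange(1880, 2014)
signal = 0.4 * np.sin(2 * np.pi * t / 60) + 0.2 * np.sin(2 * np.pi * t / 22)
anom = signal + 0.05 * rng.standard_normal(t.size)

periods = [60.0, 22.0]  # candidate cycle lengths in years (assumed)
# Design matrix: one sine and one cosine column per candidate period
X = np.column_stack([f(2 * np.pi * t / p)
                     for p in periods for f in (np.sin, np.cos)])
coef, *_ = np.linalg.lstsq(X, anom, rcond=None)
fitted = X @ coef

corr = np.corrcoef(anom, fitted)[0, 1]  # goodness of the periodic fit
print(f"correlation between series and periodic fit: {corr:.2f}")
```

Note the caveat such fits invite: with enough periodic terms, almost any finite series can be matched closely, so a high correlation alone does not establish that the cycles are physical.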

Mao et al., 2019

The “Ocean Stabilization Machine” May
Represent a Primary Factor Underlying the
Effect of “Global Warming on Climate Change”

• “Contemporary references to global warming pertain to the dramatic increase in monthly global land surface temperature (GLST) anomalies since 1976. In this paper, we argue that recent global warming is primarily a result of natural causes.”
• “Global climate changes are controlled by major periodic [natural] factors that represent basic principles in climatology, such as solar radiation, atmospheric circulation and oceans.”
• “A number of scientists subjectively consider that the recent dramatic upward trends in monthly GLST anomalies represent non-periodic and irreversible changes and postulate that warming related to the global greenhouse effect has primarily been caused by anthropogenic emissions. However, with the decline of global warming, an increasing number of scientists have started to question this view [Chen and Tung, 2014; Easterling and Wehner, 2009; Fyfe et al., 2013; Meehl et al., 2011; Risbey et al., 2014; Curry and Webster, 2011; Loehle, 2007; Lindzen, 2007; Holland, 2013; Seneviratne et al., 2014; Kosaka and Xie, 2014; England et al., 2014].”
Image Source: Loehle, 2007
• “There are two primary methods challenging the hypothesis that recent global warming is caused by anthropogenic emissions: the first method is to prove that the recent dramatic upward trend of monthly GLST anomalies is periodic, and the second method is to link global warming to major factors in nature.”
• “In this paper, we have found that the dramatic upward rising signals can be perfectly fitted with periodic functions, which suggests that the major climate factors can still be the main reason for the recent global climate warming, and the secondary climate factor such as anthropogenic emissions might be the secondary reason.  … We have identified 15,295 periodic functions that perfectly fit the monthly GLST anomalies from 1880 to 2013 and show that the monthly SST anomalies in six domains in different oceans are highly correlated [0.9 coefficient] with the monthly GLST anomalies.”
• “If we use the best function to predict the future behaviour of GLST, we can know that the downward trend for the monthly anomaly of GLST had already begun, and it will reach −0.6051˚C in 2111.”
• “The correlation study tells us that the dramatic anomalies can be seen in SST fields of different oceans, which might be the results of OSM [“Ocean Stabilization Machine”], and with the k-line diagram technique, we can see that most of the annual dramatically increasing GLST anomalies occur in El Niño years; and most of the annual dramatically decreasing GLST anomalies occur in La Niña years. These findings show us how OSM works. In a word, although there are many academic topics need to study further in future, we can still make a conclusion: “OSM” might play a very important role to cause global climate changes.”

Image Source: Mao et al., 2019

Meteorologist Joe Bastardi Warns Brutal Cold About To Grip Large Areas Of Northern Hemisphere


As winter progresses through January and heads into February, the latest forecast tells us one thing: Global warming is not putting an end to brutal cold winter conditions like experts said it would in the early 2000s.

At his Weatherbell Analytics Saturday Summary yesterday, meteorologist Joe Bastardi pretty much gave his seal of approval to the latest longer-term NCEP forecast for North America and Europe. Bastardi has long said that the current winter would turn out to be “severe”.

It fits with the pattern we were thinking would evolve for this winter.”

Well, the winter party is about to start in earnest. What follows is the NCEP CFS forecast temperature anomaly for North America for the next 45 days:

Image cropped from Weatherbell Saturday Summary.

Brutal cold by end of January

For the end of January, the situation looks especially brutal, as temperatures are expected to plummet to some 15°C below normal across wide areas of the Midwest and Canada:

Image cropped from Weatherbell Saturday Summary.

Snow is also forecast “to plaster” the Northeast and fall across all 48 states over the next 15 days.

Pattern change in Europe

In Europe the story for the end of January is similar: large parts of the continent will be caught in a deep freeze:

Image cropped from Weatherbell Saturday Summary.

Also snow is expected to fall across Europe over the coming weeks, with the weather pattern having flipped from one of milder westerly winds to one with colder northerly and easterly winds.

Expect alarmist climate scientists to launch another disinformation campaign in which they blame global warming for all the cold weather.

Arctic sea ice stable for more than 10 years

Sea ice volume is also creeping towards normal levels, as shown by Japanese blogger Kirye:

Chances are good that Arctic ice volume will rise above normal, thus keeping alive the more-than-decade-long trend of no ice melt.

As Joe points out, Arctic temperatures have been dropping and are now close to the mean value.


Slowing Down For The Climate: German Evangelical Church Petitioning For Speed Limit On ‘Autobahns’


The days of unlimited speed on Germany’s famed autobahns are indisputably coming to an end – probably soon. And thank God!


The online Mitteldeutsche Zeitung (MZ) here reports that the Evangelical Church in Central Germany (EKM) is pushing to impose a speed limit on German autobahns, the motorways where drivers on certain stretches are free to drive as fast as they dare.

The days of no speed limit on Germany’s famed autobahns are about to end. Photo: Darkone, CC BY-SA 2.5, (Wikipedia)

Quoting Christian Fuhrmann of the EKM, the MZ reports that cars should be limited to 130 km/hr and that the Church will mobilize to send a public petition to the German Bundestag.

The MZ adds that the petition will be launched on Ash Wednesday and 50,000 signatures are needed.

According to the MZ: “The Central German Church covers large parts of Saxony-Anhalt and Thuringia and has about 700,000 members.”

The Church’s main reason for limiting cars to 130 km/hr on the autobahn is to reduce greenhouse gas emissions and exhaust fumes, which the Church says would translate into an emissions reduction of 2 – 2.5%. The high speeds and the associated braking also lead to more fine-particle emissions from tire abrasion on asphalt surfaces, as well as more noise.

“A Confession to the Creator”

The MZ also reports that potential CO2 savings could be 2 million tonnes annually. Currently Germany emits some 900 million tonnes of CO2 equivalent gases annually.

The MZ adds:

“We see a world responsibility for ourselves, assigned by God,” said state bishop Ilse Junkermann. Mankind is destroying the foundations of life. The fight against it is “a confession to the Creator”.

Makes sense from safety alone

Personally I support a 130 km/hr limit, but not for the silly climate reasons. Overall, a limit is far safer, and some studies show it would likely improve overall traffic flow efficiency and help reduce long traffic jams, which often result from messy car accidents. In the end, fewer traffic jams mean reaching your destination just as quickly as you would by madly driving at breakneck speeds.

Also, an autobahn speed limit could mean lower maintenance costs, because the surface specifications for unevenness could be relaxed a bit. Unevenness really becomes a factor when a vehicle is flying along at 280 km/hr. With speeds at 130 km/hr, road crews would not have to be sent out as often to “repair” the road, which in turn means lane closures and yet more traffic obstruction and slowdown.

One compromise could be to tolerate speeds of 160 km/hr (100 mph) during times and areas of light traffic.


PIK Scientist Rahmstorf Goes After Hebrew University’s Nir Shaviv …But Gets Caught Fudging Inconvenient Data


Stefan Rahmstorf caught red-handed manipulating temperature charts

By Michael Krueger
(Text translated/edited by P. Gosselin)


Three days ago, climate researcher Stefan Rahmstorf published an article at his KlimaLounge blog on the hearing of Jewish climate scientist Nir Shaviv in the German Bundestag concerning the Climate Change Conference in Katowice.

Accuses Shaviv of presenting “outlandish theories”

There he describes Shaviv as a “climate skeptic” who espouses outlandish theories and is courted by the fossil fuel lobby and the AfD party. During the hearing, the German Left Party even accused Shaviv of obviously being paid to publish climate denialism. Stefan Rahmstorf went even further, claiming: “This is a targeted misleading of the layperson audience.”

Just who is misleading whom, I would like to pursue here.

The conflict between Jewish climate scientist Nir Shaviv and Stefan Rahmstorf dates back to 2003, when Stefan Rahmstorf wrote the following e-mail to his colleagues:

“I feel another recent paper may require a similar scientific response, the one by Shaviv & Veizer (attached). … This paper got big media coverage here in Germany and I guess it is set to become a climate skeptics classic: …”

Since then, Shaviv has fallen out of favor with Stefan Rahmstorf.

Dissenters get labelled as right wingers

In the comment area of Mr. Rahmstorf’s article, some commentators, who were promptly labelled by other commenters as belonging to the right-wing spectrum, criticized that NASA’s temperature curve in Figure 5 was truncated at 2016, exactly when the last El Niño pushed up the global temperature. Mr. Rahmstorf vehemently rejected the criticism.

In the article, Mr. Rahmstorf refers to a link showing how to create your own version of the Figure 5 widget (the link is here). There the year 2017 is included and the curve is not cut off at 2016. Between 2016 and 2017 the global temperature dropped by 0.1°C, and in 2018 by a further 0.1°C.

Hide the decline

In the year 2016 we were at a +1°C temperature anomaly according to NASA (a new record!), but today in the year 2018 only at 0.8°C. Mr. Rahmstorf obviously wanted to hide this by cleverly truncating at 2016, probably hoping his lay audience would not notice.

Mr. Rahmstorf first showed the following Figure 5 in the article, which was truncated at 2016:

When the “deception” was discovered, he quickly changed Figure 5 without comment and added the year 2017. Now the chart looks like this:

The year 2018 is still missing; just before the end of the year it stands at only 0.8°C above the mean, i.e. 0.1°C lower than 2017.

Poor correlation

With the chart he also tries to give the impression that there is a close relationship between CO2 rise and temperature rise. If one looks at the correlation coefficients, i.e. whether there is a linear relationship between CO2 and temperature rise, one can immediately see that between 1880-1970/80 there is only a moderate correlation between CO2 and temperature. It is around 0.6. A value of zero means no correlation, 1 means a perfect correlation. Only between 1980 and today does the correlation increase to around 0.9.

Thus one can say that actually only since 1980 does a good correlation exist.
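The correlation values discussed above are straightforward to reproduce. Below is a minimal sketch of a Pearson correlation computation; the data arrays are purely illustrative toy numbers, not the actual NASA temperature or Mauna Loa CO2 series.

```python
# Pearson r measures the strength of a *linear* relationship:
# r = 0 means no linear correlation, r = 1 a perfect one.
def pearson_r(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Toy illustrative series (NOT real measurements): CO2 in ppm,
# temperature anomaly in degrees C.
co2 = [310, 315, 325, 340, 360, 380, 400]
temp = [0.00, -0.05, 0.05, 0.20, 0.35, 0.55, 0.80]
print(round(pearson_r(co2, temp), 2))  # strongly positive for these toy numbers
```

Splitting the real series at 1970/80 and computing r separately for each segment, as described above, is how the ~0.6 versus ~0.9 comparison would be reproduced.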

To be correct, it has to be taken into account that other climatic factors also contribute to the rise in temperature, not just CO2 alone. In addition, in the recent geological past (roughly the last 1 million years), as evidenced by ice cores in the Antarctic and Arctic, it has always been the temperature that rose first, followed by CO2.

So it may well be that even today the temperature increase precedes the CO2 increase and the CO2 increase is partly due to temperature, e.g. because less CO2 can be stored in warm seas. (Half of the CO2 remains in the air, the other half goes into the ocean).

Natural factors kept hidden

Of course, Mr. Rahmstorf does not mention this in his article, in the belief nobody would notice. Or in other words, it’s about “deliberately misleading the lay audience”, and here not by Mr. Shaviv.

Mr. Rahmstorf’s supporters in the comments section obviously do not care whether Mr. Rahmstorf uses the very methods he accuses others of using. It is about deliberately discrediting those who hold a different opinion, not about having a debate at the factual level. The climate activists consider themselves scientifically and politically legitimized to force their policy on their opponents, using smears and violence if necessary. Sometimes a Jewish scientist gets placed in the “right-wing corner”, and persons with a different opinion get lumped in with the right-wing AfD. The ends justify the means.

More misleading claims by Rahmstorf

In the following I would like to briefly go into further “deceptions” in Mr. Rahmstorf’s article. He writes:

“30 years ago, in 1988, the American climate researcher James Hansen famously declared in the US Senate that the long predicted warming was now there and recognizable in the data.”

But Hansen was completely wrong with his 1988 scenarios, as we know today. See the following figure:


Furthermore, Stefan Rahmstorf defends the hockey stick curve of his friend Michael Mann from 1998/99.

Busted hockey stick chart

According to Rahmstorf, recent reconstructions still show the same result. It should be noted that Mann’s 1998/99 hockey stick was truncated in 1980 because the proxy data showed no increase at the end of the time series. His colleague Briffa even truncated his chart in 1960. Used in place of the proxy data were temperatures from weather stations, which show much larger rises than the proxy data averaged.

What follows is the Briffa version, where the proxy data are cut off at 1960 and replaced by data from weather stations, shown alongside the proxy data (green) up until the present.


The same was done with the more recent “reconstructions” mentioned by Mr. Rahmstorf.

When Mr. Rahmstorf was asked, he replied, claiming “these are well-known ‘talking points’ of the ‘climate sceptics’, and almost everything is wrong or misleading.”

Original article in German at Science Skeptical here.


New Science: 89% Of The Globe’s Islands – And 100% Of Large Islands – Have Stable Or Growing Coasts


Rapid sea level rise was supposed to shrink Earth’s coasts. It hasn’t.

Image Source: Mörner, 2018

“Over the past decades, atoll islands exhibited no widespread sign of physical destabilization in the face of sea-level rise. 88.6% of islands were either stable or increased in area, while only 11.4% contracted. It is noteworthy that no island larger than 10 ha decreased in size. These results show that atoll and island areal stability is a global trend, whatever the rate of sea-level rise.” – Duvat, 2019

Image Sources: Donchyts et al., 2016 and BBC (press release)

I. Despite sea level rise, “the coasts are growing all over the world”

Sea levels aren’t rising fast enough to deleteriously affect coastal areas on a net global scale.

Satellite observations indicate there has been 13,565 km2 of net growth in land area across the globe’s coasts between 1985-2015.

In other words, the Earth’s coasts gained more land area than was lost to rising sea levels.

“Earth’s surface gained 115,000 km2 of water and 173,000 km2 of land over the past 30 years, including 20,135 km2 of water and 33,700 km2 of land in coastal areas.” (Donchyts et al., 2016)

As a visual example, Ahmed et al. (2018) find that Bangladesh’s coastal land area grew by 7.9 km2 per year during 1985-2015.

“This paper draws upon the application of GIS and remote sensing techniques to investigate the dynamic nature and management aspects of land in the coastal areas of Bangladesh. … This research reveals that the rate of accretion [coastal land growth] in the study area is slightly higher than the rate of erosion. Overall land dynamics indicate a net gain of 237 km2 (7.9 km2 annual average) of land in the area for the whole period from 1985 to 2015.” (Ahmed et al., 2018)

Image Source: Ahmed et al., 2018

II. Even with ~4 mm yr−1 local sea level rise, Pacific islands grew in size during 1971-2014

Between 1958 and 2014, the globe’s sea levels rose at a rate of about 1.4 mm yr−1, or 14 centimeters (5.5 inches) per century (Frederikse et al., 2018).

Ice melt from Greenland and Antarctica contributed a grand total of 1.5 cm of the 7.9 cm (3.1 inches) of sea level rise during those 56 years.

“The global-mean sea level reconstruction shows a trend of 1.5 ± 0.2 mm yr−1 over 1958–2014 (1σ), compared to 1.3 ± 0.1 mm yr−1 for the sum of contributors.” (Frederikse et al., 2018)
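As a quick sanity check on the unit conversions used in this section (a rate in mm per year restated as centimeters and inches per century), here is a small sketch using the rates quoted above:

```python
# Convert a sea level trend from mm/yr to cm per century, then to inches.
CM_PER_INCH = 2.54

def cm_per_century(rate_mm_per_yr):
    # 100 years per century, 10 mm per cm
    return rate_mm_per_yr * 100 / 10.0

global_rate = 1.4  # mm/yr, the global rate quoted above
print(round(cm_per_century(global_rate), 1))                # 14.0 cm/century
print(round(cm_per_century(global_rate) / CM_PER_INCH, 1))  # 5.5 inches/century

tuvalu_rate = 3.9  # mm/yr, the local Tuvalu rate quoted above
print(round(cm_per_century(tuvalu_rate), 1))                # 39.0 cm/century
```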

However, there are regions of the world where sea levels are rising at rates two or three times the global average.  Tuvalu, representing over 100 islands located in the central west Pacific, has  undergone “twice the global average” rate of sea level rise (~3.90 ± 0.4 mm yr−1) since the 1970s.

It would be expected that such high rates of local sea level change would result in shrinking island coasts and overall land area during this period.

But the opposite has occurred.  There has been a net increase in the coastal land area of Tuvalu between 1971-2014 in 8 of 9 atolls.

“We specifically examine spatial differences in island behaviour, of all 101 islands in Tuvalu, over the past four decades (1971–2014), a period in which local sea level has risen at twice the global average. Surprisingly, we show that all islands have changed and that the dominant mode of change has been island expansion, which has increased the land area of the nation.”
“Using remotely sensed data, change is analysed over the past four decades, a period when local sea level has risen at twice the global average (~3.90 ± 0.4 mm yr−1). Results highlight a net increase in land area in Tuvalu of 73.5 ha (2.9%), despite sea-level rise, and land area increase in eight of nine atolls.” (Kench et al., 2018)

III. The stability or coastal net growth of islands in recent decades to century “is a global trend”

Coastal stability and expansion for atoll and island land area is not just a regional trend, but a global one.

A comprehensive (709 islands) review of coastal changes that have been observed in the last decades to century (Duvat, 2019) reveals that no atoll island destabilization has occurred due to the effects of rising sea levels.

In fact, 88.6% of the globe’s islands have coasts that are either stable or expanding in size.

Further, not a single island larger than 10 hectares [1 ha = 10,000 square m, or 2.5 acres] has decreased in size in recent decades.

None of these observed trends affirm the popularized claim that modern sea level rise is currently threatening the globe’s coasts.

Duvat, 2019

A global assessment of atoll island

planform changes over the past decades

Image Source: Duvat, 2019
“This review first confirms that over the past decades to century, atoll islands exhibited no widespread sign of physical destabilization by sea level rise. The global sample considered in this paper, which includes 30 atolls and 709 islands, reveals that atolls did not lose land area, and that 73.1% of islands were stable in land area, including most settled islands, while 15.5% of islands increased and 11.4% decreased in size. Atoll and island areal stability can therefore be considered as a global trend.”
“Importantly, islands located in ocean regions affected by rapid sea-level rise showed neither contraction nor marked shoreline retreat, which indicates that they may not be affected yet by the presumably negative, that is, erosive, impact of sea-level rise.”
“It is noteworthy that no island larger than 10 ha decreased in size, making this value a relevant threshold to define atoll island areal stability.”
“[A]mong the 27 islands having a land area lying between 100 and 200 ha (9 in French Polynesia, 6 in the Marshall Islands, 6 in Kiribati, 5 in Tuvalu and 1 in the Federated States of Micronesia), only 3 increased in area, while 24 were stable.”
“The great majority of Pacific islands showed positional stability, as illustrated by the Tuamotu atolls, where 85–100% of islands were stable, depending on atolls (Duvat & Pillet, 2017; Duvat, Salvat, et al., 2017).”
“Importantly, the reanalysis of available data on atoll island planform change indicates that over the past decades to century, no island larger than 10 ha and only 4 out of the 334 islands larger than 5 ha (i.e., 1.2%) underwent a reduction in size.”

IV. Sea level rise is not the coastal threat that natural geological processes are

Rapid sea level rise rates due to global warming are not the threat to coastal communities and wildlife that they have often been claimed to be.  What is?  Natural geological/crustal changes, or vertical land movement.

Tied to the Earth’s gravitational attraction and shifting plates, geologic subsidence (land sinking) or uplift (land rising) processes are far more of a determinant of coastal structure and positioning than the rather small (by comparison) seawater volume changes in the world’s ocean basins.  This is why sea level rise (or fall) rates are local, not global, as they vary quite dramatically across the world.

Along the coast of Juneau, Alaska, for example, the land surface has been rapidly rising due to gravitational uplift for many decades.  Consequently, relative sea levels are plummeting in this region at a rate of over -13 mm yr−1 (minus-5 inches per decade) according to NOAA.

The opposite is occurring along the U.S. Gulf coast (Grand Isle, Louisiana), where the land area is sinking and thus sea levels are rising at a rate of over +9 mm yr−1.

Scientists have concluded that “sea level rise is not the primary factor controlling the shoreline changes” in regions where sea levels are rapidly rising (Testut et al., 2016).   Even localized rates of sea level rise as high as 5 mm yr−1 are not rapid enough to overcome the much more pronounced geologic changes (accretion and uplift).

“We show that Grande Glorieuse Island has increased in area by 7.5 ha between 1989 and 2003, predominantly as a result of shoreline accretion [growth]: accretion occurred over 47% of shoreline length, whereas 26% was stable and 28% was eroded. Topographic transects and field observations show that the accretion is due to sediment transfer from the reef outer slopes to the reef flat and then to the beach. This accretion occurred in a context of sea level rise: sea level has risen by about 6 cm in the last twenty years and the island height is probably stable or very slowly subsiding. This island expansion during a period of rising sea level demonstrates that sea level rise is not the primary factor controlling the shoreline changes. This paper highlights the key role of non-climate factors in changes in island area, especially sediment availability and transport.” (Testut et al., 2016)

V. Vertical motions of the Earth’s crust exert “dominant control” over relative sea level

Along the eastern coast of the U.S., Piecuch and colleagues (2018) find that the “dominant control” over the disparate rates of sea level rise during the modern era has been exerted by “vertical motions of the Earth’s crust”, not climate.

“Here we analyse instrumental data and proxy reconstructions using probabilistic methods to show that vertical motions of Earth’s crust exerted the dominant control on regional spatial differences in relative sea-level trends along the US East Coast during 1900–2017, explaining most of the large-scale spatial variance. … Rates of coastal subsidence caused by ongoing relaxation of the peripheral forebulge associated with the last deglaciation are strongest near North Carolina, Maryland and Virginia [locations where the sea level rise rates are highest]. Our results indicate that the majority of large-scale spatial variation in long-term rates of relative sea-level rise on the US East Coast is due to geological processes that will persist at similar rates for centuries. … We note that negative VLM [vertical land motion] reflects subsidence and hence contributes to sea-level rise. Correspondingly, the most negative VLM [vertical land motion] rate (−2.5 ± 0.6 mm yr−1) is likely (P = 0.75) to occur in the states that host the maximum sea-level rise, North Carolina or Virginia, whereas the most positive rate of VLM (0.7 ± 0.8 mm yr−1) is very likely (P = 0.90) to occur in Maine.” (Piecuch et al., 2018)

Pfeffer and colleagues (2017) assessed 849 coastal sites and determined that geophysical processes, or vertical land motion (VLM) trends (ranging from −13 to +16 mm yr−1 ), “have been recognized as a dominant component of the total relative sea-level variations observed at coasts” at locations throughout the globe.

VLMs contribute actively to the sea-level changes felt by coastal populations, as they can amplify, diminish or counteract the effects of climate-induced signals (e.g. Pfeffer & Allemand 2016). In many cases, for example, Torres Islands (Ballu et al. 2011), western Tropical Pacific (Becker et al. 2012), southern Europe (Woppelmann & Marcos ¨ 2012) or Indian Ocean (Palanisamy et al. 2014), VLMs [vertical land motions] have been recognized as a dominant component of the total relative sea-level variations observed at coasts. … VLMs are estimated at 258 GPS stations and 591 tide gauges, for a total of 849 coastal sites. Trend values range from −13 to +16 mm yr−1 (Fig. 3a). A strong spatial variability in VLM is observed at all scales, including locally, which is evidenced for example by the dense instrumental network in Japan. High rates of crustal uplift are observed at high latitudes, while subsidence of coastal areas is more often observed at medium latitudes (e.g. the East American coast).” (Pfeffer et al., 2017)

VI. Meter-scale sea level rise is “related to geologic events only”, not climate change

In a 2018 paper published in the journal Geoscience Frontiers, geophysicist and tectonics expert Dr. Aftab Khan asserts that “both regional and local sea-level rise and fall in meter-scale is related to the geologic events only and not related to global warming and/or polar ice melt.”

Very high rates of land subsidence and uplift persist today.  Vertical land motions as profound as ±10 to 30 mm yr−1 have been observed by geologists – easily overwhelming even the highest measured relative sea level changes.

The conclusion that rapid and high-amplitude (i.e., meters-per-century) sea level changes occur primarily as a consequence of natural geologic processes effectively leaves no room for global warming and/or polar ice melt to significantly contribute to the alarming meters-per-century sea level rise projected to engulf the Earth’s coasts by the end of the century.

Modeled predictions of multiple meters of sea level rise by 2100 (for example, 2.6 meters of global sea level rise by 2100 according to authors like Dr. Michael Mann and Dr. Richard Alley in Garner et al. 2017) are dismissed as “highly erroneous” by Dr. Khan.

Suggestions of a controlling anthropogenic influence on coastal and shoreline changes — the scariest aspect of climate modeling predictions — may therefore be significantly undermined by real-world scientific observations.

Khan, 2018

Why would sea-level rise for global

warming and polar ice-melt?

“Geophysical shape of the earth is the fundamental component of the global sea level distribution. Global warming and ice-melt, although a reality, would not contribute to sea-level rise. Gravitational attraction of the earth plays a dominant role against sea level rise.”
“Geological processes are responsible of two types of major movements of the crustal block viz., uplift and subsidence. Hence, the relation of sea level and crustal motion is attributed to sea level drops when there is an uplift while it rises when there is subsidence.”
“Snay et al. (2016) have found large residual vertical velocities [land uplift], some with values exceeding 30 mm/yr, in southeastern Alaska. The uplift occurring here is due to present-day melting of glaciers and ice fields formed during the Little Ice Age glacial advance that occurred between 1550 A.D. and 1850 A.D.”
“Johansson et al. (2002) conducted research on a project BIFROST (Baseline Inferences for Fennoscandian Rebound Observations, Sea-level, and Tectonics) that combines networks of continuously operating GPS receivers in Sweden and Finland to measure ongoing crustal deformation due to glacial isostatic adjustment (GIA). They have found the maximum observed uplift rate 10 mm/yr for Fennoscandian region analyzing data between August 1993 and May 2000. Sella et al. (2007) and Lidberg et al. (2010) suggested that postglacial rebound continues today albeit very slowly wherein the land beneath the former ice sheets around Hudson Bay and central Scandinavia, is still rising by over a centimeter a year, while those regions which had bulged upwards around the ice sheet are subsiding such as the Baltic states and much of the eastern seaboard of North America.”
“Transgression commences when continental block undergoes subsidence with respect to continental shelf and abyssal plain, while regression occurs when this process is reverse i.e., when continental block is uplifted with respect to continental shelf and abyssal plain. Prograding delta system in low lying areas and other geologic events may cause local/relative sea-level fall as new sedimentary deposition advances as accretion pushing sea further down the coast irrespective of global warming and polar ice-melt.”
“Hence, both regional and local sea-level rise and fall in meter-scale is related to the geologic events only and not related to global warming and/or polar ice melt.”
“Prediction of 4–6.6 ft sea level rise in the next 91 years between 2009 and 2100 is highly erroneous.”

700,000 Square Kilometers Of Added Green Vegetation, Climate Change Shrinks Sahara Desert By Whopping 8%!


Almost daily the CO2 Science site brings reports on the impact of climate change on the living world. Hat-tip: Die kalte Sonne here

Recently, CO2 Science brought up a paper in Nature Communications.

Using satellite images, Venter et al. 2018 found an eight percent increase in woody vegetation in sub-Saharan Africa over the last three decades, underscoring the global “greening trend”.

A recent study by Venter et al. finds that the Sahara has shrunk by 8% over the past three decades. NASA image, public domain.

According to Wikipedia, the Sahara covers a vast area of some 9.2 million square kilometers. Eight percent of that translates into more than 700,000 square kilometers. That’s an area that’s almost as big as Germany and France combined! This is profound.

In other words, it’s well over 10,000 Manhattans!
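The area comparisons above are easy to verify with integer arithmetic. A back-of-envelope sketch; the Sahara figure comes from the text, while Manhattan’s area of roughly 59 square kilometers is an assumption added here for the comparison:

```python
# 8% of the Sahara's area, in km2, using the Wikipedia figure cited above.
sahara_km2 = 9_200_000
greened_km2 = sahara_km2 * 8 // 100
print(greened_km2)  # 736000 -> "more than 700,000 square kilometers"

# How many Manhattans is that? (~59 km2 per Manhattan -- an assumed figure.)
manhattan_km2 = 59
print(greened_km2 // manhattan_km2)  # 12474 -> "well over 10,000 Manhattans"
```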

If the added green area were effectively used for agriculture, it could produce enough food to feed the African continent. Unfortunately, this is a fact that the doomsday-obsessed media, activists and ruling politicians fear will become publicly known. They instead would prefer that the globe returns to a climate of the 1980s, when drought and famine ravaged the vast North African region.

According to the recent study, the cause was a decline in vegetation fires in a warmer and more humid climate. Abstract:

Drivers of woody plant encroachment over Africa
While global deforestation induced by human land use has been quantified, the drivers and extent of simultaneous woody plant encroachment (WPE) into open areas are only regionally known. WPE has important consequences for ecosystem functioning, global carbon balances and human economies. Here we report, using high-resolution satellite imagery, that woody vegetation cover over sub-Saharan Africa increased by 8% over the past three decades and that a diversity of drivers, other than CO2, were able to explain 78% of the spatial variation in this trend. A decline in burned area along with warmer, wetter climates drove WPE, although this has been mitigated in areas with high population growth rates, and high and low extremes of herbivory, specifically browsers. These results confirm global greening trends, thereby bringing into question widely held theories about declining terrestrial carbon balances and desert expansion. Importantly, while global drivers such as climate and CO2 may enhance the risk of WPE, managing fire and herbivory at the local scale provides tools to mitigate continental WPE.

Read more at CO2 Science.

Another element left unmentioned is the fertilization effect that the added CO2 in the atmosphere surely provides.

Relotian media

This is positive news that no one will find in the Relotian mainstream media, which are fixated on purveying propaganda, falsehoods, half truths and censorship with the aim of distorting public opinion and vigorously marginalizing dissenting views.



Swiss Daily NZZ: More Than 1000 Citizen Wind Energy Protest Groups …Germany “On Path To The Unknown”


Although resistance to littering the landscape with industrial wind turbines continues to grow strongly and the power grid is becoming ever more unstable, the German government refuses to back off its expansion of wind and solar energy.

Protest groups now number over 1000 as wind turbines litter the German landscape. Photo: Windwahn.

Jonas Herrmann of the Swiss NZZ here comments how Germany “is struggling with its wind turbines”.

It started 28 years ago, when the German government enacted a law that forced power companies to buy up any green power produced, pay exorbitant prices for it and feed it into the power grid whether it was needed or not. Over time the installation of solar panels and wind turbines exploded and today many parts of the countryside have become littered with unsightly wind parks.

Ruined landscape, yet a long way to go

Yet, Germany today remains far away from supplying its energy needs through “green” sources.

The NZZ comments: “The landscape has changed in many places as a result. A longer drive through Germany inevitably leads past dozens of wind turbines.”

Moreover, every community has been impacted, says Nikolai Ziegler, chairman of the resistance group Vernunftkraft, the major umbrella association of wind power opponents. According to Ziegler, “In Germany there are more than 1000 citizens’ initiatives that are mobilizing against wind energy,” and these groups are getting involved in politics.

Wind and sun will never be able to do the job

Not only are wind turbines ruining Germany’s idyllic landscape, but the NZZ writes that much of the resistance is also based on wind energy’s technical unreliability as a power supply. Herrmann of the NZZ cites engineering expert Dr. Detlef Ahlborn, who says wind energy is too erratic and thus unreliable.

According to the NZZ:

“With the umbrella organization Vernunftkraft, everyone is convinced that wind and solar energy will never be able to ensure a secure power supply.”

Proponents in denial

This is a claim that the German government and green energy proponents refuse to acknowledge. Proponents in Germany believe that the problems with green energies will somehow go away and that the supply will miraculously smooth out as more and more volatile wind and solar capacity gets installed.

Critics like Nikolai Ziegler also point out that there really hasn’t even been any real Energiewende (transition to green energies) so far, because electricity makes up only one fifth of Germany’s total energy demand. Green energies provide only one third of that one measly fifth, and so “it’s relatively meaningless”.

The NZZ notes that most of the leaders of the protest group Vernunftkraft are “male in the second half of their lives” who “are united by their anger at wind energy”, but adds that most have a background in natural science; almost half of them are professors.

“Path to the unknown”

Vernunftkraft is calling for an end to green energy subsidies, a stop to the construction of additional wind turbines, and instead greater investment in gas-fired power plants.

The NZZ writes:

“However, a political majority is not in sight. Although the AfD and parts of the FDP [parties] believe the Energiewende is a mistake, the German government can hardly prevent a further expansion of renewable energies. Germany is thus still on the path to the unknown with the Energiewende.”


Regional Models: 3-10°C Warming In The Next 80 Years. Observations: No Warming In The Last 40-100 Years.


There are large regions of the globe where observations indicate there has been no warming (even cooling) during the last decades to century. Climate models rooted in the assumption that fossil fuel emissions drive dangerous warming dismiss these modeling failures and project temperature increases of 3° – 10°C by 2100 for these same regions anyway.

Four decades of Southern Ocean cooling

After warming from the 1940s to the mid-1970s, the Southern Ocean has been cooling since the late-1970s, which has consequently resulted in an increase in sea ice extent (Fan et al., 2014; Purich et al., 2018; Latif et al., 2017; Turney et al., 2017).

In their paper entitled “Natural variability of Southern Ocean convection as a driver of observed climate trends”, Zhang et al. (2019) suggest that the Southern Ocean cooling was driven by natural processes.

Zhang et al., 2019

“Observed Southern Ocean surface cooling and sea-ice expansion over the past several decades are inconsistent with many historical simulations from climate models. Here we show that natural multidecadal variability involving Southern Ocean convection may have contributed strongly to the observed temperature and sea-ice trends.”

Climate models, in contrast, had projected a rapid warming and significant decreases in sea ice extent during the last few decades.

Image(s) Source: Zhang et al., 2019

The East-Central U.S. has been cooling (about -0.6°C) since the 1950s

Partridge et al., 2018

“We present a novel approach to characterize the spatiotemporal evolution of regional cooling across the eastern U.S. (commonly called the U.S. warming hole), by defining a spatially explicit boundary around the region of most persistent cooling. The warming hole emerges after a regime shift in 1958 where annual maximum (Tmax) and minimum (Tmin) temperatures decreased by 0.46°C and 0.83°C respectively.”

Image Source: Partridge et al., 2018

Alter et al., 2017

“From 1910-1949 (pre-agricultural development, pre-DEV) to 1970-2009 (full agricultural development, full-DEV), the central United States experienced large-scale increases in rainfall of up to 35% and decreases in surface air temperature of up to 1°C during the boreal summer months of July and August … which conflicts with expectations from climate change projections for the end of the 21st century (i.e., warming and decreasing rainfall) (Melillo et al., 2014).”

Image Source: Alter et al., 2017

Climate models project 3°C – 10°C warming in the Midwest (U.S.) by 2100

Even though climate models failed to simulate the last 50 to 100 years of temperatures for this region, hindcasting a dramatic warming instead of the observed cooling, the projections for 2100 are still predicated on CO2 emission scenarios (RCP4.5, RCP8.5) as the determinant of regional surface temperatures.  Consequently, the regional models project a warming of 3°C – 10°C over the next 80 years.

Hamlet et al., 2019

“For the two most widely used greenhouse gas concentration scenarios, Representative Concentration Pathways (RCP) 4.5 and 8.5 (Moss et al. 2008) (representing “medium” and “high” twenty-first century greenhouse gas concentration trajectories respectively), the Midwestern United States is projected to experience profound changes in climate by 2100, especially for (T). Projections for annual mean T over the Midwestern United States from 31 global climate models (GCMs) for the RCP8.5 scenario show an ensemble mean increase in T of about 6.5 °C (11.7 °F) by 2100 relative to the historical 1971–2000 baseline (Fig. S1) (Byun and Hamlet 2018). The projected change in the annual ensemble mean T for RCP4.5 over the Midwestern United States is about 3.3 °C (5.9 °F) by 2100 relative to the 1971–2000 baseline. The upper tail of the annual mean T distribution, represented by the 97.5th percentile of the 31 GCM projections for RCP8.5 (i.e., a “worst-case” scenario), is nearly 10 °C (18 °F) warmer than the historical baseline by 2100.”

Image(s) Source: Hamlet et al., 2019

The North Atlantic hasn’t warmed since the 1800s

Grieman et al., 2018

Image Source: Grieman et al., 2018
Image Source: Birkel et al., 2018

Climate models project 3°C warming in the North Atlantic by 2100

Gervais et al., 2018

“Recent studies have documented the development of a warming deficit in North Atlantic sea surface temperatures (SST) both in observations of the current climate (Rahmstorf et al. 2015; Drijfhout et al. 2012) and in future climate simulations (Drijfhout et al. 2012; Marshall et al. 2015; Woollings et al. 2012). This “North Atlantic warming hole” (NAWH) is characterized in the observed record as a region south of Greenland with negative trends in SSTs of 0.8 K century-1 (Rahmstorf et al. 2015). In fully coupled global climate model (GCM) future simulations, the NAWH is seen as a significant deficit in warming within the North Atlantic subpolar gyre (Marshall et al. 2015; Winton et al. 2013; Gervais et al. 2016).  This local reduction in future warming is communicated to the overlying atmosphere and may impact atmospheric circulation (Gervais et al. 2016), including the North Atlantic storm track (Woollings et al. 2012).”

Image Source: Gervais et al., 2018

Hansen (2013): CO2 emissions will cause 20°C of global warming by ~2130

Back in 1989, Dr. James Hansen, the former head of NASA’s Goddard Institute for Space Studies (GISS), predicted that New York City’s West Side Highway would be underwater within 20 years due to rapid global warming and the consequent rising sea levels.

A glance at a 2018 image of the West Side Highway indicates that it is still very much above water, sitting no lower than it did in 1936.

More than two decades later, in 2012, Hansen was the lead author of a paper published by The Royal Society (2013) indicating that ever-growing fossil fuel emissions would lead to a nearly five-fold rise in atmospheric CO2 concentrations (to 1,400 ppm) within 118 years.

He then projected this CO2 increase and presumed 9 W m-2 forcing would cause a global surface temperature warming of 20°C by about 2130, with 30°C warming at the poles.

Image Source: Hansen et al., 2013

Hansen et al., 2013

“Let us now verify that our assumed fossil fuel climate forcing of 9 W m−2 is feasible. If we assume that fossil fuel emissions increase by 3% per year, typical of the past decade and of the entire period since 1950, cumulative fossil fuel emissions will reach 10 000 Gt C in 118 years [2012 + 118 years = ~2130 C.E.] … [T]he fossil fuel source required to yield a 9 W m−2 forcing may be closer to 5000 Gt C, rather than 10 000 Gt C.”
“9 W m−2 forcing requires approximately 4.8×CO2 [1400 ppm] … Our calculated global warming in this case is 16°C, with warming at the poles approximately 30°C. Calculated warming over land areas averages approximately 20°C. … Such temperatures would eliminate grain production in almost all agricultural regions in the world. Increased stratospheric water vapour would diminish the stratospheric ozone layer. More ominously, global warming of that magnitude would make most of the planet uninhabitable by humans.”
“Given the 20°C warming we find with 4.8×CO2 [1400 ppm], it is clear that such a climate forcing would produce intolerable climatic conditions even if the true climate sensitivity is significantly less than the Russell sensitivity, or, if the Russell sensitivity is accurate, the CO2 amount required to produce intolerable conditions for humans is less than 4.8×CO2 [1400 ppm].”
“Are there sufficient fossil fuel reserves to yield 5000–10 000 Gt C? Recent updates of potential reserves, including unconventional fossil fuels (such as tar sands, tar shale and hydrofracking-derived shale gas) in addition to conventional oil, gas and coal, suggest that 5×CO2 (1400 ppm) is indeed feasible.”
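As a rough consistency check (not part of Hansen’s paper), the widely used simplified expression for CO2 radiative forcing, ΔF ≈ 5.35 ln(C/C0) W m−2 (Myhre et al., 1998), can be evaluated for the 4.8×CO2 scenario. This is a back-of-the-envelope sketch only; Hansen et al. used their own radiative calculations:

```python
import math

def co2_forcing(c_ratio):
    """Simplified CO2 radiative forcing (Myhre et al., 1998):
    dF = 5.35 * ln(C/C0), in W/m^2, where c_ratio = C/C0."""
    return 5.35 * math.log(c_ratio)

# Hansen et al. (2013) scenario: ~4.8x preindustrial CO2 (~1400 ppm)
f_48 = co2_forcing(4.8)          # ~8.4 W/m^2, in the ballpark of the quoted 9 W/m^2
# For comparison, the historical rise from ~280 ppm to ~400 ppm:
f_now = co2_forcing(400 / 280)   # ~1.9 W/m^2
print(round(f_48, 1), round(f_now, 1))
```

The simplified formula gives roughly 8.4 W m−2 for 4.8×CO2, close to the 9 W m−2 Hansen assumes, so the scenario’s arithmetic is internally consistent whatever one thinks of its plausibility.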

Given the documented modeled forecast failures and lack of extreme or dangerous warming in recent decades, is there good reason to assume that Hansen’s prediction of a 20°C warming over the next 110 years will be realized?

At what point do modeling failures lead to a reconsideration of the forcing mechanisms?


Sun As Main Driver: Japanese Scientist Cites 7 Major Examples How Real Climatic Data Contradict AGW Claims


By Kyoji Kimoto

1. Warmer period of the 1930s

In 1998, D. Dahl-Jensen et al. pointed out in the journal Science that the 1930s were 0.5 K warmer than the present, based on a borehole study of the Greenland ice sheet.

The following data, from Soon (2012), support Dahl-Jensen’s findings.

Also heat waves were far worse across the USA in the 1930s:

More heat waves in the 1930s.

The strongest hurricane was the Labor Day hurricane, which hit in 1935. Hurricanes Irma and Harvey had much higher central pressures at landfall (U.S. National Hurricane Center):

Strongest hurricane occurred in the 1930s.

2. Arctic temperature and sea ice extent

Parts of the Arctic were warmer in the 1930s:

Source: Real Climate Science.

Arctic sea ice levels were just as low in the 1930s as they are today:

Read more here.

3. NASA & NOAA altered the data

The climatic data above can be understood in terms of changes in solar activity (aa index) and ocean oscillations (Pacific Decadal Oscillation index); see the two charts that follow.


The Pacific Decadal Oscillation (PDO) index was also in its warm phase during the 1930s:

PDO index was positive from 1925 to 1945. Data: Japan Meteorological Agency.

In summary, increasing solar activity combined with a positive PDO index caused the warmer period of the 1930s. However, NASA and NOAA have tampered with the data to emphasize recent warming.

Recent temperature data show a strong influence of ocean oscillation (El Niño) and no relation to the CO2 increase:

Source: Climate4you

4. MWP & LIA caused by changes in solar activity

Solar activity proxies show the MWP & the LIA in Japan and China as follows:

Chart above: Kitagawa, H. & Matsumoto, E., Geophysical Research Letters, Vol. 22, 2155-2158, 1995


Graphic above: Quansheng GE. et al., Advances in atmospheric sciences, Vol. 34, 941-951, 2017.

There are hundreds of other proxies worldwide that support solar activity as the main climate driver.

5. Sea level rise

Almost 25 years of meticulous data gathered by the Australian Bureau of Meteorology display no discernible sea-level rise for the Solomon Islands and Nauru. See the two graphs that follow:

Source: WUWT.

6. El Nino linked to solar activity

A publication by Njau (2006) showed that El Niño events begin in years of sunspot minimum or maximum, indicating that solar activity has a major impact on oceanic oscillations, which in turn powerfully influence weather and climate.

7. Extreme weather and solar activity

Bucha (1988) showed that decreased solar activity causes meandering of the jet stream, which produces extreme weather over broad areas. Since 2006, decreased solar activity has been causing heat waves, wildfires, and heavy rain and snowfall all over the world.

Bucha (1988)


Veteran Meteorologists Warn Of “Bitter Cold” …”Areas Under The Gun” As Models Project Cold Polar Blasts!


Yesterday we wrote about a study showing that the data do not support claims that weather blocking events are occurring more often than they used to. Some alarmist media outlets and scientists have claimed that the heavy snowfalls in the Alps are happening due to manmade global warming.

Swiss meteorologist: Such snowfalls “nothing unique” for Alps

Yesterday one of Europe’s most high-profile meteorologists, Jörg Kachelmann, penned an opinion piece at t-online reminding the public that heavy snow events in the Alps, such as the one we are now experiencing, are in fact nothing unique, and that this one is not a catastrophe.

In the days ahead, many parts of the Alps are expecting up to another meter of new snow. Yet, according to Kachelmann, this should not pose any problems for buildings and structures, provided their construction adhered to the applicable building codes.

“Nothing to do with climate change”

Kachelmann adds later in his t-online piece: “1. The snowfalls are nothing unique so far for the Alps. 2. They have nothing to do with climate change.”

The veteran Swiss meteorologist adds that the heavy snowfalls in the Alps “are making people happy because they ensure the ski season will extend until Easter.”

Snow is in fact welcome

Moreover, the snowfalls are good for the glaciers and will help relieve the drought conditions seen in Europe last year. Kachelmann adds:

“They [the snowfalls] are making the media happy as well because weather catastrophes get many clicks. However, the current weather situation is not a catastrophe. Human lives (except in the mentioned 0.1 percent problem) are only in danger if people behave inappropriately.”

Kachelmann also thinks the media overhype now taking place may be unnecessarily scaring people away from booking ski trips to the Alps, a region where massive amounts of snow are common.

Models suggest severe winter conditions

Meanwhile, German meteorologist Dominik Jung reported that a Russian “Beast from the East” looks to be in the works for Europe in the weeks ahead:

According to Jung, who cites US and European weather models, beginning January 20 much greater cold will be invading large parts of Europe. And by January 25, the cold conditions could deepen and become extreme.

Northern hemisphere “under the gun”

Meanwhile, at Weatherbell, meteorologist Joe Bastardi, a 40-year veteran of the field, backs Jung’s projected development, presenting a frigid chart in yesterday’s Daily Summary that shows the forecast 11-16 days out:

Cold to grip Europe, North America and Asia in the weeks ahead, models suggest.

Joe says:

“Once this gets here, until March, these areas and these areas, are under the gun. Repetitive snow threats and bitter cold relative to averages, and it’s coming at the coldest time of the year.”


It’s Weather!…Cold, Heavy Snowfall Across Europe Not Linked To Global Warming And Melting Arctic


We’re seeing lots of headlines about heavy snowfalls and cold temperatures gripping Eastern and Southern Europe. Not surprisingly, some activist scientists are blaming manmade global warming.

Expected snow depths by January 15. Chart: WXCharts.EU.

Junk theory: Global warming causing more snow extremes

Yet global warming logically isn’t supposed to be directly causing massive snow and bitter cold, so there has to be some explanation for the unexpected cold and snowy weather. To explain it all, a gaggle of activist scientists have concocted a theory claiming that “unprecedented” Arctic sea ice loss over the past two decades has led to an increase in blocking over North America and Europe [e.g., Liu et al., 2012; Francis and Vavrus, 2012] and so is indirectly causing lots of snow and cold.

These desperate scientists then hope that the public and media will be gullible enough to buy into it.

Blocking is strongly tied to weather extremes in the midlatitudes (e.g., cold snaps, heat waves) and can persist for days to weeks [e.g., Black et al., 2004; Dole et al., 2011], so more blocking could mean more weather extremes as Arctic sea ice continues to decline (Note: Arctic sea ice in fact hasn’t declined in more than 10 years).

Analyses: no data to support the theory

However, a recent paper authored by Elisabeth A. Barnes, Department of Atmospheric Science, Colorado State University, says the data to support this just aren’t there.

The paper’s abstract:

Observed blocking trends are diagnosed to test the hypothesis that recent Arctic warming and sea ice loss has increased the likelihood of blocking over the Northern Hemisphere. To ensure robust results, we diagnose blocking using three unique blocking identification methods from the literature, each applied to four different reanalyses. No clear hemispheric increase in blocking is found for any blocking index, and while seasonal increases and decreases are found for specific isolated regions and time periods, there is no instance where all three methods agree on a robust trend. Blocking is shown to exhibit large interannual and decadal variability, highlighting the difficulty in separating any potentially forced response from natural variability.”

The results of the analyses are summed up in the following charts of the paper’s Figure 3:

Time series of blocking frequencies for the three indices and four reanalyses for (a, c, and e) Asia in DJF and (b, d, and f) the North Atlantic in JJA. Trends significantly different from zero at 95% confidence are denoted by asterisks in the legend of each panel for Asia (1990–2012) and the North Atlantic (1980–2012). Filled circles (stars) denote the seasons following the 5 highest (lowest) years of September Arctic sea ice extent over the trend period. Blocking frequencies are averaged between 40° and 80°N for the 2‐D indices. Chart: Barnes et al (2014)
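The trend test described in the caption (a linear trend significantly different from zero at 95% confidence) can be sketched generically as follows. This is an illustration with synthetic blocking-frequency data, not the Barnes et al. code or data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = np.arange(1980, 2013)
# Synthetic blocking-frequency series: no underlying trend, just noise
freq = 0.05 + 0.01 * rng.standard_normal(years.size)

# Ordinary least-squares fit of frequency against year
res = stats.linregress(years, freq)
# Two-sided p-value for the null hypothesis that the slope is zero;
# the trend is "significant at 95% confidence" only if p < 0.05
significant = res.pvalue < 0.05
print(f"slope={res.slope:.5f} per year, p={res.pvalue:.3f}, significant={significant}")
```

A trendless noisy series like this one will usually (though not always, given the 5% false-positive rate) fail the test, which is the situation Barnes et al. report for hemispheric blocking.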

Not supported by observations

The findings reiterate those of Barnes [2013]. The 2014 paper concludes that “the link between recent Arctic warming and increased Northern Hemisphere blocking is currently not supported by observations.”

Blocking events well within historical observed range

The paper adds:

“While Arctic sea ice experienced unprecedented losses in recent years, blocking frequencies in these years do not appear exceptional, falling well within their historically observed range.”

In other words, the theory that global warming is causing more extremes due to melting Arctic sea ice is just plain crap. There’s no data to support it. It’s just a hypothesis – one that was rolled out in a desperate attempt to explain events that weren’t supposed to happen.




Science: The Deep Ocean Plays A ‘Leading Role’ In Global Warming. It’s Colder Now Than During The 1700s.


Authors of a new paper published in the journal Science (Gebbie and Huybers, 2019) insist the deep ocean ultimately “plays a leading role in the planetary heat budget.” The global deep ocean has much less heat today than it had during both the Medieval Warm Period and the Little Ice Age.

Image Source: Gebbie and Huybers, 2019

A Bottom-Up Heat Flux?

The deep ocean may warm hundreds (to thousands) of years before hemispheric surface temperatures and CO2 concentrations do (Stott et al., 2007).

Image Source: Stott et al., 2007

This bottom-up hemispheric-scale heat flux – independent of CO2-forcing – may occur for land area as well.

“The increase of carbon dioxide concentrations occurred 2–3 thousands of years later than the heat flux increase and synchronously with temperature response.” (Demezhko and Gornostaeva, 2015)
“GST [ground surface temperature] and SHF [surface heat flux] histories differ substantially in shape and chronology. Heat flux changes ahead of temperature changes by 500–1000 years.” (Demezhko et al., 2017)
 “During the Last Glacial Maximum 26–19 thousand years ago (ka), a vast ice sheet stretched over North America [Clark et al., 2009]. In subsequent millennia, as climate warmed and this ice sheet decayed, large volumes of meltwater flooded to the oceans [Tarasov and Peltier, 2006; Wickert, 2016]. This period, known as the ‘last deglaciation’, included episodes of abrupt climate change, such as the Bølling warming [~14.7–14.5 ka], when Northern Hemisphere temperatures increased by 4–5°C in just a few decades [Lea et al., 2003; Buizert et al., 2014], coinciding with a 12–22 m sea level rise in less than 340 years [5.3 meters per century] (Meltwater Pulse 1a (MWP1a)) [Deschamps et al., 2012].” (Ivanovic et al., 2017)

Deep ocean heat leads surface temperature change yet today?

A new paper indicates that the deep ocean in the Pacific has continued cooling in recent decades, extending the long-term cooling trend that commenced after the warmer-than-today Medieval Warm Period ended.

Other authors (Wunsch and Heimbach, 2014) have also documented a global-scale deep ocean (below 2,000 meters) cooling trend within the last few decades.

“About 52% of the ocean lies below 2000 m and about 18% below 3600 m. … A very weak long-term [1993-2011] cooling is seen over the bulk of the rest of the ocean below that depth [2,000 meters] including the entirety of the Pacific and Indian Oceans, along with the eastern Atlantic basin.”  (Wunsch and Heimbach, 2014)

Image Source: (Wunsch and Heimbach, 2014)

Little Ice Age conditions may still dominate in the deep ocean despite the dramatic rise in CO2 concentrations during the last few hundred years — from about 280 ppm during the late 1700s to well over 400 ppm today.

The global ocean below 2000 meters may actually be colder today than during the 18th century.

Gebbie and Huybers, 2019

The Little Ice Age and 20th-century deep Pacific cooling

“The ongoing deep-Pacific cooling revises Earth’s overall heat budget since 1750 downward by 35%.”
“In the deep Pacific, we find basin-wide cooling ranging from 0.02° to 0.08°C at depths between 1600 and 2800 m that is also statistically significant. The basic pattern of Atlantic warming and deep-Pacific cooling diagnosed from the observations is consistent with our model results, although the observations indicate stronger cooling trends in the Pacific.”
“These basin-wide average trends are used to relax the assumption of globally uniform changes in surface conditions and to constrain regional temperature histories for 14 distinct regions over the Common Era by a control theory method. The result, referred to as OPT-0015, fits the observed vertical structure of Pacific cooling and Atlantic warming. Global surface changes still explain the basic Atlantic-Pacific difference in OPT-0015, but greater Southern Ocean cooling between 600 and 1600 CE leads to greater rates of cooling in the deep Pacific over recent centuries.”
“OPT-0015 indicates that the upper 2000 m of the ocean has been gaining heat since the 1700s, but that one-fourth of this heat uptake was mined from the deeper ocean. This upper-lower distinction is most pronounced in the Pacific since 1750, where cooling below 2000 m offsets more than one-third of the heat gain above 2000 m.”
“Finally, we note that OPT-0015 indicates that ocean heat content was larger during the Medieval Warm Period than at present, not because surface temperature was greater, but because the deep ocean had a longer time to adjust to surface anomalies. Over multicentennial time scales, changes in upper and deep ocean heat content have similar ranges, underscoring how the deep ocean ultimately plays a leading role in the planetary heat budget.”

Image Source: Gebbie and Huybers, 2019

A lack of long-term CO2→OHC correlation 

It may be worth taking a closer look at the graph of global ocean heat content (OHC, 0 m to bottom) during the last 2,000 years from Gebbie and Huybers (2019).

Image Source (bottom graph, heavily annotated): Gebbie and Huybers, 2019

It is interesting to note the multiple centennial-scale warming and cooling trends during the last two millennia that exceed the rate and amplitude of the ocean heat changes that have occurred since 1950, or since atmospheric CO2 concentrations began rising dramatically.

For example, despite the very modest associated changes in atmospheric CO2 concentrations (< 5 ppm), it appears that both the 1850-1875 and 1925-1945 global warming periods in the 0-700 m layer exceeded the rate and amplitude of the heat content changes since 1950.

As the global oceans rapidly warmed and cooled in the centuries preceding modern times (i.e., the Medieval Warm Period and Little Ice Age), the corresponding CO2 concentrations were remarkably stable, neither rising with the warming nor falling with the cooling.

Considering that 93% of the Earth’s heat changes are expressed in the global ocean, and that just 1% of global warming is said to be reflected in surface air temperatures (IPCC, 2013), the lack of conspicuous correlation between ocean heat content and CO2 during the last 2,000 years would seem to undermine claims that atmospheric CO2 concentration changes drive surface-to-bottom global ocean warming.

