Scientists: Antarctica Ice Sheet Thinned 400 Meters 5000 Years Ago, And Natural Oceanic Cycles Drive Climate


Today we present two papers on climate reconstruction using proxy data: one about East Antarctica and the other about Belize. Hat-tip: reader Mary Brown.

AMO behind sea surface temperatures

First we look at a paper authored by a team of German scientists: “Great Blue Hole (Lighthouse Reef, Belize): A continuous, annually-resolved record of Common Era sea surface temperature, Atlantic Multidecadal Oscillation and cyclone-controlled run-off“.

The team looked at some 2,000 years of proxy data from Belize and found interesting natural cycles at play. According to the authors, the Atlantic Multidecadal Oscillation (AMO) can be traced back 1,885 years in the record and has controlled the SW Caribbean sea surface temperature patterns on multi-decadal time scales.

The authors note that the Holocene (<11.7 kyr BP) has been characterized by several periods of distinct climate changes and that the climate remains difficult to predict “due to the lack of comprehensive, annually-resolved and continuous sea-surface temperature (SST) data”.

So what about them models?

Examining an 8.55 m long sediment core from the bottom of the Great Blue Hole (Lighthouse Reef, Belize), the scientists were able to extract “an annually-resolved, continuous and unique south-western Caribbean climate record for the last 1885 years”.

The result? The data imply a general SST rise within the south-western Caribbean and that the modulation of SST within the time series likely operated on two different time levels: (1) Solar (e.g., “Gleissberg Cycles”) and volcanic activity triggered climate changes, which in turn induced responses of the Atlantic Multidecadal Oscillation (AMO), the North Atlantic Oscillation (NAO) and the El-Niño-Southern Oscillation (ENSO).

The authors conclude further in the abstract:

We suspect long-term positive AMO and NAO modes as the primary key control mechanisms of the Dark Ages Cold and Medieval Warm Period SST patterns. ENSO mode modulation likely exerted primary control on regional SST variability during the Little Ice Age and the Modern Global Warming. (2) Our δ18O data further indicate a striking secondary control on multi-decadal time scales: δ18O variations occur with 32–64 years periodicity. This signal is clearly evidence of SST modulation controlled by AMO phase changes (50–70 years) over almost the entire Common Era. Our carbon isotope record (δ13C) exhibits two remarkable negative anomalies and a long-term up-core decreasing trend. The first excursion (drop of 0.5‰) occurred with the onset of the Medieval Warm Period, which is reconstructed to be a peak time in south-western Caribbean tropical cyclone (TC) activity. This overlap is stressing a potential context between TC activity, enhanced coastal run-off and increased soil-erosion reflected by 13C-depleted carbon isotopes. A second anomaly (>1900 CE) is more likely the result of the “Suess Effect” (anthropogenic impact of the Industrial Revolution on carbon isotopes composition) than another reflection of a TC peak activity interval.”

But since 1900, man has taken over control of the earth’s climate, the authors seem to be suggesting. That was probably written with a wink to the funders.

Antarctica suddenly lost 400 meters of ice

In another new paper: Abrupt Holocene ice-sheet thinning along the southern Soya Coast, Lützow-Holm Bay, East Antarctica, revealed by glacial geomorphology and surface exposure dating, a team of Japanese scientists led by Moto Kawamata examined the deglacial history of the East Antarctic Ice Sheet (EAIS).

Image: Figure 1 here.

The authors found that the ice sheet thinned by at least 400 meters during the Early to Mid-Holocene (9–5 ka) and say the abrupt thinning was likely caused by the natural inflow of modified Circumpolar Deep Water via submarine valleys in Lützow-Holm Bay.

Abstract:

Geological reconstruction of the retreat history of the East Antarctic Ice Sheet (EAIS) since the Last Glacial Maximum (LGM) is essential for understanding the response of the ice sheet to global climatic change and the mechanisms of retreat, including a possible abrupt melting event. Such information is key for constraining climatic and ice-sheet models that are used to predict future Antarctic Ice Sheet AIS melting. However, data required to make a detailed reconstruction of the history of the EAIS involving changes in its thickness and lateral extent since the LGM remain sparse. Here, we present a new detailed ice-sheet history for the southern Soya Coast, Lützow-Holm Bay, East Antarctica, based on geomorphological observations and surface exposure ages. Our results demonstrate that the ice sheet completely covered the highest peak of Skarvsnes (400 m a.s.l.) prior to ∼9 ka and retreated eastward by at least 10 km during the Early to Mid-Holocene (ca. 9 to 5 ka). The timing of the abrupt ice-sheet thinning and retreat is consistent with the intrusion of modified Circumpolar Deep Water (mCDW) into deep submarine valleys in Lützow-Holm Bay, as inferred from fossil foraminifera records of marine sediment cores. Thus, we propose that the mechanism of the abrupt thinning and retreat of the EAIS along the southern Soya Coast was marine ice-sheet instability caused by mCDW intrusion into deep submarine valleys. Such abrupt ice-sheet thinning and retreat with similar magnitude and timing have also been reported from Enderby Land, East Antarctica. Our findings suggest that abrupt thinning and retreat as a consequence of marine ice-sheet instability and intrusion of mCDW during the Early to Mid-Holocene may have led to rapid ice-surface lowering of hundreds of meters in East Antarctica.”

Today, if an ice sheet loses 60 cm, it’s deemed a crisis by climate bedwetters. Just imagine if an ice sheet in Antarctica were to lose 400 meters thickness.





Scientists: No Correlation Between Climate Change And Wildfires In California – Or Anywhere Else On Earth


A “potential connection” between anthropogenic global warming and the frequency or intensity of wildfires in California has yet to emerge in the trend observations.

Scientists have found a “lack of correlation between late summer/autumn wildfires” and “summer precipitation or temperature” in coastal California. In fact, “there is no long-term trend in the number of fires over coastal California” in the last 50 years (Mass and Ovens, 2019).

Image Source: Mass and Ovens, 2019

Wildfire activity in the Western USA has recently declined to its lowest point in the last 1,400+ years (Marlon et al., 2012).

Image Source: Marlon et al., 2012

As CO2 concentrations have risen from 300 ppm to 400 ppm (1900 to 2007), the decline in global burned area has been significant (Yang et al., 2014).

Image Source: Yang et al., 2014

The falling trends in global-scale wildfires can even be detected over the short-term 2001-2016 period (Earl and Simmonds, 2018).

Image Source: Earl and Simmonds, 2018

German Electricity Imports Hit New Record, Rise 43.3 Percent in First Half Of 2020!


You would think that with all the added wind and solar energy in Germany, along with all the conventional power plants on standby, all totaling up to huge unneeded capacity, there would be no need to import any power at all. Well, think again.

Photo: P. Gosselin

The German epochtimes.de here reports that German imports of electricity in fact: “rose by 43.3 percent to 25.7 billion kilowatt hours in the first half of 2020 compared with the first half of 2019.”

The epochtimes.de explains further:

One reason for this was the declining share of domestic feed-in from base-load-capable, mostly conventionally operated power plants, which mainly use coal, nuclear energy and natural gas. As a result, electricity was imported to cover demand, especially when there was no wind or during hours of darkness. The main import country for electricity was France with 8.7 billion kilowatt hours.

Overall, however, more electricity was still exported from Germany.”

What the article does not mention, however, is the reason for the rise in exports from Germany. On windy and sunny days, Germany produces more electricity than needed, and so is forced to dump the excess power into neighboring foreign markets – often at negative prices. The negative prices, in combination with the mandatory feed-in tariffs and excess production capacity, all mean higher costs for consumers.

Little wonder that at close to 35 US cents per kWh, Germany’s electricity prices are among the highest in the world.





New Study Shows German Offshore Wind Turbines May Cannibalize Each Other When Improperly Sited


Hat-tip: Die kalte Sonne

According to a new study, the expansion of offshore wind energy planned to date could lead to less electricity actually being produced at higher costs because, according to current planning, wind farms are taking the wind away from each other.

The researchers from the Technical University of Denmark in Roskilde and the Max Planck Institute for Biogeochemistry in Jena, Germany have investigated the topic. The study entitled “Making the Most of Offshore Wind” was commissioned by the Agora Energiewende and Agora Verkehrswende think tanks.

The report looks at the question of whether the energy models used today by wind farm planners and investors can adequately capture the interaction effects between turbines that arise when very large areas are covered with offshore wind farms at high installed capacity density.

Among the study’s key findings:

Offshore wind power needs sufficient space, as the full load operating time may otherwise shrink from currently around 4,000 hours per year to between 3,000 and 3,300 hours. The more turbines are installed in a region, the less efficient offshore wind production becomes due to a lack of wind recovery. If Germany were to install 50 to 70 GW solely in the German Bight, the number of full-load hours achieved by offshore wind farms would decrease considerably.”

Countries on the North and Baltic Seas should cooperate with a view to maximizing the wind yield and full-load hours of their offshore wind farms. In order to maximize the efficiency and potential of offshore wind, the planning and development of wind farms – as well as broader maritime spatial planning – should be intelligently coordinated across national borders. This finding is relevant to both the North and Baltic Seas. In addition, floating offshore wind farms could enable the creative integration of deep waters into wind farm planning.”
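The full-load hours quoted in the first finding translate directly into capacity factors. A quick sketch (nothing here comes from the study itself beyond the hour figures; 8,760 is simply the number of hours in a year):

```python
# Convert full-load hours per year into a capacity factor.
HOURS_PER_YEAR = 8760

for full_load_hours in (4000, 3300, 3000):  # figures quoted from the study above
    capacity_factor = full_load_hours / HOURS_PER_YEAR
    print(f"{full_load_hours} h/yr -> capacity factor {capacity_factor:.0%}")
# 4000 h/yr -> 46%, 3300 h/yr -> 38%, 3000 h/yr -> 34%
```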

Chart source: Study: “Making the Most of Offshore Wind“, Agora Energiewende and Agora Verkehrswende.

More unexpected costs, inefficiency

In a nutshell: a central pillar of the German and European transition to green energies threatens to become even more inefficient and more expensive than planned.





New Study Finds Weak CO2-Induced Warming An ‘Implausible’ Explanation For The End-Triassic Mass Extinction


Evidence that temperature swings of ±17°C occurred during the end-Triassic mass extinction event implies that CO2 would have needed to increase 8- to 1,024-fold (3 to 10 doublings) to have induced that magnitude of temperature change. It didn’t.

Evidence from a new study (Petryshyn et al., 2020) suggests “repeated” temperature swings of 16-17°C occurred in Cotham Marble (CM, southwest United Kingdom) at the end of the Triassic period, during one of the worst extinction events of the last 500 million years.

However, Petryshyn and colleagues acknowledge they “cannot resolve millennial-scale increases in temperatures in the region, implying that, at least locally, the initial extinction is not attributable to extreme warming.”

But even if the end-Triassic mass extinction (ETME) could be attributable to extreme warming, CO2 would be “implausible” as a mechanism.

According to models, CO2 increases up to 8 times the pre-industrial baseline (280 ppm) could only increase sea surface temperatures 5.4°C at most (Petryshyn et al., 2020). CO2 concentrations would need to increase up to 1,024-fold to elicit temperature changes reaching 16 or 17°C.
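As a rough cross-check of that doubling arithmetic, here is a minimal sketch. It assumes, as the figures above imply, about 5.4°C for three doublings, i.e. roughly 1.8°C per doubling; the variable names are mine:

```python
import math

# Rough doubling arithmetic behind the 8-fold to 1,024-fold figures quoted above.
warming_per_doubling = 5.4 / 3   # ~1.8 C per doubling, implied by 5.4 C for 3 doublings (8x CO2)
target_warming = 17.0            # C, upper end of the reconstructed temperature swings

doublings_needed = target_warming / warming_per_doubling  # ~9.4 doublings
co2_multiplier = 2 ** math.ceil(doublings_needed)         # next whole doubling

print(f"{doublings_needed:.1f} doublings -> {co2_multiplier}x CO2")  # ~9.4 -> 1024x
```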

Therefore, “the initial onset of the biodiversity crisis may necessitate another mechanism.”

Image Source: Petryshyn et al., 2020

Greta Thunberg’s Scandinavia Has Seen August Cooling Trend Over The Past Quarter Century


By Kirye
and Pierre Gosselin

Today we plot the Japan Meteorological Agency (JMA) data for Northern Europe for the month of August, 2020.

We have selected this region because it is the home of 17-year-old climate alarmist/activist Greta Thunberg, who thinks the planet is heating up rapidly and so we’re all doomed.

We plot the data for the stations for which the JMA has sufficient data going back over 2 decades. First we plot the August data for Sweden, Greta’s home country:

Data: JMA

Five of the 6 stations plotted show a cooling trend. So it’s a mystery how Greta thinks her country is warming up. The data suggest that summers have been shortening a bit. Over the course of Greta’s life, she has yet to see warming in August.

Next we examine Norway, Greta’s western neighbor:

Data: JMA

Here we see 6 of the 11 stations have seen an August cooling trend over the past quarter century. The colder stations have warmed somewhat, while the warmer ones have cooled. Overall, no warming to speak of, really.

The story is similar in Finland (further from the Atlantic), but here the colder stations have cooled, while the warmer stations have warmed slightly – but statistically insignificantly:

Data: JMA

Finally we plot the data for the Emerald Isle, Ireland, next to the tempestuous Atlantic:

Data: JMA

Four of the six stations in Ireland have been cooling in August. Those warming have done so insignificantly. Overall, the Emerald Isle has been cooling during the month of August since 1983!





Schizophrenic German Wind Power Output In August, Plagued By Wild Volatility


Here’s another example illustrating just how volatile and unreliable wind energy really is.

Wind energy proponents like to claim that although turbines installed on land don’t produce so optimally, the ones at sea are wonderful because the wind there is always blowing and so it all kind of evens out.

The chart below shows the output of all wind turbines installed in Germany, both on land and offshore, from the five major German grid operators:

The dark horizontal line denoting 60,000 MW represents the so-called installed total capacity. Readers will note that less than 10% of rated capacity often gets produced. Only rarely does output ever reach 33% of rated capacity (about 20,000 MW).
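To put those percentages into absolute terms, a trivial sketch (the 60,000 MW is the installed capacity named above; everything else is plain arithmetic):

```python
# Convert shares of installed German wind capacity into absolute output.
installed_mw = 60_000  # total installed capacity marked by the dark horizontal line

for share in (0.10, 0.33):
    print(f"{share:.0%} of rated capacity = {share * installed_mw:,.0f} MW")
# 10% of rated capacity = 6,000 MW
# 33% of rated capacity = 19,800 MW (roughly 20,000 MW, i.e. 20 GW)
```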





Austrian Analyst: Things With Greenhouse Effect (GHE) Aren’t Adding Up…”Something Totally Wrong”


Something is rotten with the GHE

By Erich Schaffer

Introduction

The greenhouse effect (GHE) is a well-established theory which most people consider a solid fact, even those who are otherwise “critical” of global warming. On the other side there are some voices who “deny” the GHE with flat-earther-like arguments, which seemingly only adds to the credibility of the theory. This is a very odd situation, since there are huge issues with the GHE hidden in plain sight.

“Without GHGs, the Earth would be a frozen planet with a temperature of only -18°C, or 255°K”. This definition is all too familiar to us all and the experts naming it are legion. The 255°K is the result of a (relatively) simple formula.

(342 x ((1-0.3) / 1)  / 5.67e-8) ^0.25 = 255

342W/m2 is the amount of solar radiation (the exact number may vary), 5.67e-8 is the Stefan-Boltzmann constant and ^0.25 (or the 4th root) represents the Stefan-Boltzmann law according to which radiation is a function of temperature to the power of 4.
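For readers who want to reproduce the number, here is a minimal Python sketch of the same calculation, using only the round figures quoted above (the variable names are mine):

```python
# Effective "no-GHG" temperature of Earth via the Stefan-Boltzmann law,
# using the round numbers quoted in the text.
SOLAR_INPUT = 342.0   # W/m^2, globally averaged incoming solar radiation
ALBEDO = 0.3          # share of sunlight reflected back to space
EMISSIVITY = 1.0      # the black-body assumption questioned in this article
SIGMA = 5.67e-8       # Stefan-Boltzmann constant, W m^-2 K^-4

absorbed = SOLAR_INPUT * (1.0 - ALBEDO)                  # ~239 W/m^2
t_effective = (absorbed / (EMISSIVITY * SIGMA)) ** 0.25  # 4th root per Stefan-Boltzmann

print(f"{t_effective:.0f} K")  # -> 255 K
```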

Black body assumption trouble

The interesting part however is ((1-0.3) / 1). 0.3 is the albedo of Earth and 1-0.3 is thus the absorptivity, which is the share of solar radiation the Earth absorbs (~70%). The 1 in the denominator, which is usually omitted, represents emissivity, which is the share of LWIR emitted by the Earth relative to a perfect black body of the same temperature. In other words, it is being assumed that Earth would emit just like a perfect black body if it were not for GHGs. And that is where the trouble starts.

The basic problem

Quite obviously there are two factors that “violate” the assumption named above.

  1. The surface of the Earth, mainly consisting of water, is not a perfect emitter, pretty much like any real surface. Although it is beyond the scope of this article, it can be shown there is a significant deviation from 1 (in the 0.91 to 0.94 range). One needs to look up Fresnel equations, the refractive index of water and so on to sort out this subject.
  2. Clouds interfere massively with LWIR emissions. Actually this is common wisdom, as “clear nights are cold nights”, an experience most people have made themselves. Even the IPCC states that clouds block some 50 W/m2 of SW radiation, retain some 30 W/m2 of LWIR and thus have a net CRE (Cloud Radiative Effect) of -20 W/m2 [1]. Of course those -50 W/m2 of SW CRE are already included in the formula above (part of the 30% albedo), while the 30 W/m2 of LW CRE are not.

For this reason we need to make some minor corrections to the GHE as presented above. Basically Earth absorbs some 240 W/m2 of solar radiation (= 0.7 x 342) and is meant to emit some 390 W/m2 at 288°K at the surface. Expressed as a flux rather than as a temperature of 33°K, the GHE would thus amount to about 150 W/m2 (= 390 - 240).

Since in reality the surface is not a perfect emitter, the 390 W/m2 figure is totally inaccurate. In fact it is easy to call it “fake science” whenever someone claims this number, or an even higher one. Rather we need to reduce this figure by at least 20 W/m2 to allow for a realistic surface emissivity. Next we need to allow for the 30 W/m2 that clouds provide, and thus our GHE shrinks from 150 W/m2 to a maximum of only 100 W/m2 (150 – 20 – 30).
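The bookkeeping of the last two paragraphs can be condensed into a short back-of-the-envelope sketch (the 20 W/m2 emissivity correction and the 30 W/m2 LW cloud term are the figures argued for above, not independently measured values):

```python
# Back-of-the-envelope GHE budget as argued in the text (all values in W/m^2).
absorbed_solar = 0.7 * 342           # ~240, solar radiation actually absorbed
blackbody_288K = 5.67e-8 * 288**4    # ~390, emission of a perfect black body at 288 K

ghe_textbook = blackbody_288K - absorbed_solar   # ~150, the usual GHE flux figure

emissivity_correction = 20   # allowance for a non-unit surface emissivity (article's estimate)
lw_cloud_effect = 30         # LW CRE retained by clouds (IPCC figure quoted above)

ghge_upper_bound = ghe_textbook - emissivity_correction - lw_cloud_effect
print(f"{ghe_textbook:.0f} -> {ghge_upper_bound:.0f} W/m^2")  # ~150 -> ~100
```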

For the sake of clarity we should rename the GHE the GHGE (greenhouse gas effect), as this is the pivotal question: how much do GHGs warm the planet? It is important to see that GHGs were assigned their specific role in a kind of “diagnosis of exclusion”. If it were not for GHGs, what would be the temperature of Earth? Any delta to the observed temperature can then be attributed to GHGs.

Such a “diagnosis of exclusion” is always prone to failure, be it in the medical field or anywhere else. Essentially a large number of variables need to be taken into account and the slightest mistake in the process, will necessarily cause a faulty outcome. For that reason it should be considered an approach of last resort, maybe helpful to treat a patient or solve a criminal case. As a starting point in physics it is a no-go, and as we can see, it delivers wrong results. But maybe that is the reason why it was chosen in the first place. Faulty approaches give a certain freedom of creativity.

GHGE being notoriously exaggerated

Still, we have not broken any barriers so far. Yes, the GHGE is notoriously being exaggerated, and anyone who claims Earth would be a cold 255°K if it were not for GHGs is either incompetent or simply lying. You cannot excuse such a claim as “simplification”, since exaggerating the GHGE by at least some 50% is certainly beyond negligible.

On the other side, this does not deny the global warming narrative at all. One might consider downgrading climate sensitivity a bit, which would only result in climate models better matching reality. Even then, this will only put things on a healthier and more appropriate basis, eventually supporting the theory of CO2 induced global warming.

Digging deeper

So far I have not introduced anything substantially new, but only pointed to what is known and yet constantly forgotten. Especially the CRE in its quoted magnitude is pretty much an undisputed fact of science. Although I do not know exactly what the origins of these estimates were, experts like Veerabhadran Ramanathan already zeroed in on it in the 1970s. Satellite-driven projects like ERBE and CERES later confirmed and refined those estimates.

The net CRE of -20 W/m2 can thus be found in the IPCC reports, NASA provides detailed satellite data on it, and even “sceptics” like Richard Lindzen name and endorse it [2]. Such solid agreement is not just good for my argumentation above, it is also great for the GHGE itself. In fact the negative CRE is pretty much a conditio sine qua non: if clouds were not cooling the planet, the scope for GHGs might become marginal.

There are indeed some issues with the CRE I need to talk about and things are not nearly as settled as I just suggested.

  1. Whenever experts cite a net CRE of about -20 W/m2, they refer to the same sources, namely ERBE and CERES satellite data.
  2. These are not satellite data at all, but models which are fed with some satellite data, among other inputs.
  3. These models were largely developed by the same people who predicted the negative CRE in the first place. There might not even be a (significant) GHGE if the result had not turned out the way it did.
  4. A closer look at these model results shows totally inconsistent outcomes over time: regions with massively negative local CREs turned into regions with positive CREs, and vice versa. [3]
  5. The only thing which really remained constant over time was the overall negative CRE of the named magnitude. Of course, that is a precondition for the GHGE and cannot be put into question if “climate science” is to keep its agenda.

There is yet another side to it. Obviously the net CRE is the sum of the SW and LW CREs, which can easily be formulated as CREsw + CRElw = CREnet. Since the CRElw is what is so notoriously forgotten (as it diminishes the GHGE), we could assume there might be a motivated tendency to minimize the CRElw. Given the logical restrictions, this can be achieved by making the CREnet as negative, and the CREsw as small, as possible. In other words, there is a trinity of issues with the CRE.

  1. The -50 W/m2 of SW CRE. This figure is pretty low compared to conventional wisdom, according to which clouds make up about two-thirds of the albedo, or almost -70 W/m2.
  2. The net CRE of some -20 W/m2. We are going to have a look into this hereafter.
  3. The LW CRE of +30 W/m2 which is reducing the GHGE as shown above, but for some strange reason tends to be “forgotten”.
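A one-glance numeric check of the identity CREsw + CRElw = CREnet with the figures listed above; the second case simply plugs the “almost -70 W/m2” value from item 1 into the same identity:

```python
# Cloud radiative effect bookkeeping (all values in W/m^2): CRE_net = CRE_sw + CRE_lw.
cre_net = -20.0                # net CRE as quoted from the IPCC
for cre_sw in (-50.0, -70.0):  # IPCC SW figure vs. the "2/3 of the albedo" figure above
    cre_lw = cre_net - cre_sw  # implied LW (warming) component
    print(f"CRE_sw = {cre_sw:+.0f}  ->  CRE_lw = {cre_lw:+.0f}")
# CRE_sw = -50  ->  CRE_lw = +30
# CRE_sw = -70  ->  CRE_lw = +50
```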

Putting things to the test

Since the net negative CRE is “confirmed” by nothing but models of dubious nature, since logic might suggest the opposite (to cut a long story short), and since the whole GHGE theory totally depends on it, this question made a perfectly legitimate target for fact checking. It is the one pivotal question it all boils down to: is the CRE indeed negative, and how could we possibly put it to the test?

As in my previous forensic work, it seemed mandatory to bypass any conventional approach subject to predictable restrictions. Rather, you have to go beyond the understanding of those who might conspire, so that their possible defences turn futile. And of course this requires intellectual brute force, creativity and a bit of luck to find appropriate leverage.

At least the latter turned out to be a friendly gift from the NOAA. Under the title “QCLCD ASCII Files” the NOAA provided all METAR data from US weather stations [4]. Regrettably they pulled these valuable data from their site soon after I downloaded them, and the alternative “Global-Hourly Files” is not quite workable [5].

The METAR data, as far as I understand, are taken at airports and contain, alongside the usual meteorological data, cloud conditions originally meant to assist aircraft operating around these airports. The data are anything but perfect for our purpose and are subject to a couple of restrictions. As a rule, cloud condition is only reported up to 12,000 ft, though individual exceptions may occur. The cloud condition is reported in 5 different “flavours” (CLR, FEW, SCT, BKN and OVC), or combinations thereof. For our purpose any combination is reduced to the maximum, i.e. densest, cloud condition.
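A minimal sketch of that reduction step, for anyone who wants to replicate it (the function name and example strings are mine; real METAR records carry layer heights and other tokens, which are simply ignored here):

```python
# Reduce a METAR sky-condition report such as "FEW020 SCT100 BKN150"
# to its single densest cover code, as described above.
COVER_RANK = {"CLR": 0, "FEW": 1, "SCT": 2, "BKN": 3, "OVC": 4}

def max_cloud_condition(sky_condition: str) -> str:
    """Return the densest reported cover code; default to CLR if none is found."""
    codes = [token[:3] for token in sky_condition.split() if token[:3] in COVER_RANK]
    return max(codes, key=COVER_RANK.get, default="CLR")

print(max_cloud_condition("FEW020 SCT100 BKN150"))  # -> BKN
print(max_cloud_condition("OVC008"))                # -> OVC
```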

Even if this is not an ideal data pool, it meets a lot of necessary requirements. First, it is a totally independent data source, which was never meant to be used for climate research. Second, these data have been collected by many people, who may have made individual mistakes in the process, but who were certainly not systematically biased. Third, these data are thus “democratic” in nature, not controlled by the bottleneck of a few experts. Finally, and this is the most important point, we need no models here; we can look straight at the empirical evidence.

The Result

First I need to say that such basic research gives you amazing insights otherwise not available anywhere.

You will not get to see what you would like to, or expect to, see, but what there is. Just like Christopher Columbus searching for India and finding America, you have to take things at face value. Analyzing the data back and forth, from different perspectives, this was not a simple look-up to confirm a certain expectation, but rather a process of continuous learning. Accordingly there are lots of results giving excellent insights into the nature of clouds and their impacts on climate.

This graph was taken from Harvard’s educational site [6] on the subject. Here, like in later iterations of the ERBE / CERES modelling, the northern Pacific is meant to be one of the areas with a massively negative CRE, which are of special interest to me.

Since the Aleutian islands are US territory, my NOAA data set included 10 stations located right there.

For the years 2016 and 2017, these stations report about 325,000 valid observations, almost 60% of which are overcast. So it is indeed a very cloudy region.

Once we resolve the cloudiness/temperature correlation by season, we find a very typical outcome. OVC skies are correlated with lower temperatures in spring and early summer, and with higher temperatures throughout the rest of the year. This pattern has been seen in all subsets featuring distinct seasons and is due to surface temperatures lagging behind solar intensity.
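A hedged pandas sketch of that season-by-season comparison (the column names month, cover and temp_c are placeholders for whatever the parsed station files actually provide):

```python
import pandas as pd

# Mean temperature per season and sky-cover class, mirroring the comparison above.
# df is assumed to hold one row per METAR observation with the columns
# 'month' (1-12), 'cover' (CLR/FEW/SCT/BKN/OVC) and 'temp_c' (air temperature, deg C).
SEASON = {12: "DJF", 1: "DJF", 2: "DJF", 3: "MAM", 4: "MAM", 5: "MAM",
          6: "JJA", 7: "JJA", 8: "JJA", 9: "SON", 10: "SON", 11: "SON"}

def seasonal_cloud_temperature(df: pd.DataFrame) -> pd.DataFrame:
    """Table of mean temp_c with one row per season and one column per cover class."""
    return (df.assign(season=df["month"].map(SEASON))
              .groupby(["season", "cover"])["temp_c"]
              .mean()
              .unstack("cover"))
```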

It is an analogy to the day/night cycle, where clouds hold down daytime temperatures, while keeping nights relatively warm. It is about the relation of incoming SW to outgoing LW radiation. As clouds interfere with both radiative fluxes, their net effect depends on which of these fluxes is stronger. In spring, surface temperatures lag behind solar intensity, LW emissions are relatively weak and thus clouds are cooling. In autumn this relation naturally reverses and then clouds are warming.

Note: These “tidal effects” are a direct representation of the LW CRE. Although it goes way beyond the scope of this article, such data are very helpful in assessing the actual magnitude of the LW CRE.

A huge surprise

Finally, if we add up the above results and look at the annual average (thus seasonally adjusted), we are in for a huge surprise (or perhaps, at this point, no surprise at all). The correlation between clouds and temperature is strictly positive: the more clouds, the warmer it is, and that in a region where the models suggest a massively negative CRE.

Obviously something is totally wrong here.

I am confident the METAR data are correct, and I am certain my analysis is correct, since I have gone over it many times and the outcome is consistent across all the different perspectives. Rather, it is the ERBE/CERES models that are wrong when compared to empirical evidence, a.k.a. “reality”. That is not much of a surprise given their track record of inconsistent results.

And as much as the Bering Sea looks like “a perfect match” for fact-checking these models, the problems go far beyond this region. No matter where I looked, a negative CRE could not be found.

Just the tip of the iceberg

Yet this is just the tip of the iceberg. Of course you need to check for biases to see how far the correlation also means causation, and there are a few. Humidity, as much as it may serve as an indicator for the GHG water vapour, is indeed correlated with cloudiness (78% rel. humidity with CLR, 85% with OVC), but this delta is a) influenced by rain and b) too small to explain what we see.

More importantly, this analysis is all about low clouds up to 12,000 ft, and it is undisputed that the net CRE turns more positive the higher up clouds are. Then there is the subject of rain chill, which makes clouds look statistically colder than they are. Finally, temperatures are sluggish relative to ever-changing cloud conditions, and we would certainly see a larger delta in temperatures if the respective cloud conditions were permanent.

Systemic GHE failure

Unlike the issues named before, this is not just a scratch on the GHE theory, but systemic failure. If clouds indeed warm the planet, and all the evidence points this way, the very foundation of the theory is annihilated. Not that GHGs might not play a certain role in Earth’s climate, but the size of the GHGE will be only a fraction of 33°K, and one that is yet to be precisely determined.

[1] 5th AR of the IPCC, page 580

[2] https://wattsupwiththat.com/2020/06/29/weekly-climate-and-energy-news-roundup-414/

[3] https://www.researchgate.net/figure/Comparison-of-annual-mean-SW-LW-and-net-CRE-of-E55H20-E61H22-and-E63H23-to-CERES-40_fig2_335351575

[4] https://www.ncdc.noaa.gov/data-access/quick-links

[5] The documentation does not match the data format, I am for some reason unable to locate temperature readings, the format itself is hard to read, and finally these data, station by station, do not correspond to those of the “QCLCD ASCII Files”.

[6] https://www.seas.harvard.edu/climate/eli/research/equable/ccf.html





Two New Temperature Records Show No Warming In Central Asia Since 1766 A.D. Or In Spain Since 1350 A.D.


Scientists continue to publish papers revealing no unusual climate trends for the last several centuries in many regions of the world.

Despite the 135 ppm increase in CO2 concentration (275 ppm to 410 ppm) since the 1700s, a new 250-year temperature (precipitation) reconstruction (Peng et al., 2020) shows there has been no net warming in Central Asia since 1766. Two other reconstructions from this region also show no warming trend in recent centuries.

Image Source: Peng et al., 2020

Earlier this year we highlighted a new study that indicated France was up to 7°C warmer than today about 7800 years ago after cooling by 3°C in the last 200 years.

Another new study (Esper et al., 2020) suggests there has been no net warming in Spain since 1350 A.D.

The period spanning 1474-1606 A.D. accounts for 7 of the 10 warmest years in the record. In contrast, only 1 of the 10 warmest years (1961), but 4 of the 10 coldest, have occurred since 1880.

The 2 warmest 30-year (climate) periods occurred in the decades surrounding the ~1530s and ~1820s.

The authors record a “striking” and abrupt (within decades) 1°C warming trend during the late 1700s to early 1800s that exceeds any temperature change in the modern record.

Image Source: Esper et al., 2020

Green Dream Arrives In Germany! But Repowering Obstacles Pose “Imminent Catastrophe” For Wind Power


The  green dream, with all its scenic beauty and nature conservation, has arrived in northern Germany. But now that green dream faces more obstacles. 


All is not so wunderbar when it comes to Germany’s wind power outlook.

Germany’s Renewable Energy Sources Act, passed in 2000, was intended to ensure the generation of “green” electricity. Operators of wind turbines were guaranteed subsidies for a period of twenty years – with the hope that the technology would develop to such an extent that it could eventually operate economically without subsidies.

Twenty years later, the wind turbines are still not competitive, reports trendsderzukunft.de here.

Plagued by high costs

The first problem with the old turbines? The costs. They require comparatively frequent maintenance. “This drives up the costs, which is why operation is not economical in many cases,” reports trendsderzukunft.de. “A study has shown that at an electricity price of 3.375 euro cents per kilowatt-hour, only 23 percent of the old plants can be operated without subsidies.”

Feed-in requirement running out

The second problem: the “feed-in priority” rule, which has forced power grid operators to purchase wind electricity. “However, it is unclear whether this regulation will continue to apply despite the expiration of subsidies,” says trendsderzukunft.de. “This question will probably have to be cleared up by the courts in the end. However, many operators will probably not wait for this and prefer to shut down the old wind turbines instead.”

Sites will have to be abandoned

Trendsderzukunft.de adds: “By 2025, there is a risk of losing 2,300 to 2,400 megawatts of capacity every year.”

Moreover, legal hurdles prevent repowering, which involves “replacing several old turbines with one new and larger one.” The problem, reports trendsderzukunft.de, is that at many existing locations “there is a height limit for wind turbines. The installation of the latest generation of wind turbines is therefore not possible there. However, smaller turbines are no longer available on the market. In many cases, therefore, the sites simply have to be abandoned.”

“Imminent catastrophe”

For that reason, around 1,000 of 1,691 wind turbine sites in Lower Saxony alone are affected and are currently not available for so-called repowering. “Lower Saxony’s Minister of Energy and Environment, Olaf Lies (SPD), speaks of an imminent ‘catastrophe’ for wind power in Germany,” writes trendsderzukunft.de.





Year Of Global Cooling. JRC Analysis Shows Land And Sea Surface Temperatures Continue To Fall


By Prof. Fritz Vahrenholt

The global mean temperature of the satellite-based measurements remained almost unchanged in August compared to July. The deviation from the 30-year average (1981 to 2010) was 0.43 degrees Celsius.

Temperatures measured on land and at sea continue to decrease, as the graph from the JRC analysis shows, especially in the southern hemisphere (blue).

Approaching La Nina

The research institutes predict with high probability a La Nina in the Pacific Ocean next winter. Therefore, a further decrease in global temperatures is expected until next spring. The following diagram shows the incipient cooling effect in the Pacific.


Scientists Just Discovered Their Past Carbon Budget Guesses Have All Along Been Twice As Wrong As They Thought


A new assumption about carbon budgets reveals climate scientists have been vastly underestimating (by a factor of 2) the amount of carbon absorbed by the ocean for decades. Every past carbon budget estimate has been twice as wrong as the current estimate.

When it comes to the ocean heat fluxes and source vs. sink carbon budget estimates, climate scientists have been providing little more than educated guesses for decades.

For example, climate models have long suggested the ocean heat fluxes may only vary around 1 W/m². But “objective” analyses of oceanic latent heat flux (LHF) using different assumptions (equations) reveal that fluxes were likely closer to 10 W/m² during 1981-2005 (Yu and Weller, 2007). So our modeled guesses were off by a factor of 10 compared to newer analyses.

Image Source: Yu and Weller, 2007

Ocean carbon sink processes not understood and driven by natural variability

McKinley et al. (2017) analyzed ocean carbon sink estimates and were willing to admit that, due to a lack of observations, we lack a “detailed, quantitative, and mechanistic understanding of how the ocean carbon sink works…”

In addition, because internal variability in oceanic carbon uptake is so massive and largely unobserved, we cannot yet detect an anthropogenic influence.

McKinley and co-authors go so far as to acknowledge the “change in CO2 flux over 10 years (1995-2005)…is due almost entirely to the internal variability” because in most ocean regions “the forced [human-induced] trends in CO2 flux are too small to be statistically significant” and the “variability in CO2 flux is large and sufficient to prevent detection of anthropogenic trends in ocean carbon uptake on decadal timescales.”

Image Source: McKinley et al. (2017)

The Southern Ocean absorbs more than 10 times less carbon than previously thought

The Southern Ocean is where the largest portion of anthropogenic carbon (from our emissions) is said to be absorbed, making it Earth’s largest oceanic CO2 sink.

Just 2 years ago, a carbon uptake analysis (Gray et al., 2018, with a Physics Today press release) that utilized estimates from biogeochemical floats instead of estimates from ships suggested the exact opposite of what had been previously thought. Instead of absorbing close to 1 petagram of carbon (PgC) per year, the Southern Ocean is barely even a carbon sink at all – just 0.08 PgC of yearly absorption. In fact, large regions of the Southern Ocean near Antarctica are a net source of CO2 to the atmosphere.

In other words, when estimates are float-based rather than ship-based, one estimate is more than 10 times different than the other.

Image Source: Physics Today

The global ocean absorbs 2 times more CO2 than previously thought

And now a new study (Buesseler et al., 2020) has scientists insisting that all of our previous estimates of global ocean carbon uptake are substantially wrong because we’ve been measuring from a fixed depth rather than varying depths.

Previously scientists had been using flux estimates from the “canonical fixed 150-m depth.” The new-and-improved way to assess carbon uptake is from varying but often much shallower depths: the euphotic zone (Ez). This is the section of the upper ocean layer that sunlight is able to penetrate, and it can “vary from less than 20 m to almost 200 m” in depth.

When we use the Ez to estimate carbon absorption versus export, the absorption changes from 2.8 petagrams of carbon (PgC) per year to 5.7. So the global ocean sink can be more than doubled just by varying the depth of measurement rather than using a fixed depth.

So, up to this point, scientists’ past guesses about ocean carbon uptake have been emphatically wrong. We can be assured, though, that our current guesses are anywhere from 2 to 10 times less wrong than the last ones.

Image Source: Buesseler et al., 2020 and press release
