Two European Professors: IPCC Climate Modeling Methodology Opens Door To “Fake Conclusions”… “Manipulations”

Two European professors recently wrote that the IPCC projections of future warming are based on huge unknowns, and do not take the past properly into account.

This means that projections of the future of the world’s climate are unreliable, according to Samuel Furfari, Professor at the Free University of Brussels,
and Henri Masson, Professor (Emeritus), University of Antwerpen.

According to the two professors, the GISS Surface Temperature Analysis (GISTEMP v4) is an estimate of global surface temperature change (one that is often used by climate scientists for their reports to the media).

This estimate is computed using data files from NOAA GHCN v4 (meteorological stations), and ERSST v5 (ocean areas). In June 2019, the number of terrestrial stations was 8781 in the GHCNv4 unadjusted dataset; but in June 1880, that figure was a mere 281 stations, the two professors write.

Scientists ignoring climate’s cyclic features

Professors Furfari and Masson write: “The climate system, and the way IPCC represents it, is highly sensitive to tiny changes in the value of parameters or initial conditions and these must be known with high accuracy. But this is not the case.”

In other words, the IPCC method is fraught with great uncertainties and much guesswork.

“This puts serious doubt on whatever conclusion that could be drawn from model projections,” the two professors write.

Open door to “fake conclusions”… “manipulations”

Masson and Furfari say IPCC scientists ignore the cyclic behavior of climate change, and that linear trend lines applied to (poly-)cyclic data whose period is similar to the length of the time window considered open the door to all kinds of fake conclusions, if not manipulations aimed at pushing one political agenda or another.

Sea surface temperature incomplete

Other factors also cast doubt over the reliability of climate models, among them the lack of data on sea surface temperatures (SSTs). The oceans represent about 70% of the Earth surface.

Sea surface temperature measurements and their (non-)representativity.

“Until very recently, these temperatures have been only scarcely reported, as the data for SST (Sea Surface Temperature) came from vessels following a limited number of commercial routes,” report Masson and Furfari.

Such gaping data holes make it impossible to correctly calibrate the models so that they can project the future climate. In many cases scientists are simply free to guess whatever they want.

Masson’s and Furfari’s conclusions follow:

  1. IPCC projections result from mathematical models which need to be calibrated by making use of data from the past. The accuracy of the calibration data is of paramount importance, as the climate system is highly non-linear, and this is also the case for the (Navier-Stokes) equations and (Runge-Kutta integration) algorithms used in the IPCC computer models. Consequently, the system and also the way IPCC represent it, are highly sensitive to tiny changes in the value of parameters or initial conditions (the calibration data in the present case), that must be known with high accuracy. This is not the case, putting serious doubt on whatever conclusion that could be drawn from model projections.
  2. Most of the mainstream climate related data used by IPCC are indeed generated from meteo data collected at land meteo stations. This has two consequences:
    (i) The spatial coverage of the data is highly questionable, as the temperature over the oceans, representing 70% of the Earth surface, is mostly neglected or “guestimated” by interpolation;
    (ii) The number and location of these land meteo stations have considerably changed over time, inducing biases and fake trends.
  3. The key indicator used by IPCC is the global temperature anomaly, obtained by spatially averaging, as well as possible, local anomalies. Local anomalies are the comparison of the present local temperature to the averaged local temperature calculated over a previous fixed reference period of 30 years, changing every 30 years (1930-1960, 1960-1990, etc.). The concept of local anomaly is highly questionable, due to the presence of poly-cyclic components in the temperature data, inducing considerable biases and false trends when the “measurement window” is shorter than at least 6 times the longest period detectable in the data; which is unfortunately the case with temperature data.
  4. Linear trend lines applied to (poly-)cyclic data whose period is similar to the length of the time window considered open the door to any kind of fake conclusions, if not manipulations aimed at pushing one political agenda or another (see the short numerical sketch after this list).
  5. Consequently, it is highly recommended to abandon the concept of global temperature anomaly and to focus on unbiased local meteo data to detect an eventual change in the local climate, which is a physically meaningful concept, and which is after all what is really of importance for local people, agriculture, industry, services, business, health and welfare in general.
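To make point 4 concrete, here is a minimal numerical sketch (plain numpy, purely synthetic data, not taken from the professors’ paper): fitting a straight line to a trendless sine wave whose period is comparable to the observation window produces a “trend” whose sign and size depend entirely on where the window happens to start.

```python
import numpy as np

window_years = np.arange(0, 30)      # a 30-year observation window
period = 60.0                        # cycle period (years), twice the window length

def fitted_trend(start_year):
    """Least-squares linear trend (°C per decade) fitted inside the window."""
    t = start_year + window_years
    signal = np.sin(2 * np.pi * t / period)   # amplitude 1 °C, no underlying trend
    slope_per_year, _ = np.polyfit(t, signal, 1)
    return 10 * slope_per_year

for start in (0, 15, 30, 45):
    print(f"window starting at year {start:2d}: {fitted_trend(start):+.2f} °C/decade")
# The same trendless cycle yields "trends" of either sign and of very different
# magnitude, depending only on where the measurement window sits.
```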

=================================

Dr. Samuel Furfari is a recognized authority on energy policy based in Brussels. From 1982 to 2018, he was a senior energy policy official at the European Commission. For 15 years he has been a professor of energy geopolitics and energy politics at the Free University of Brussels. He is the president of the European Society of Engineers and Industrialists.

12 New Papers Provide Robust Evidence The Earth Was Warmer During Medieval Times

Claims that modern temperatures are globally warmer than they were during Medieval times (~800 to 1250 A.D.) have been contradicted by a flurry of new (2019) scientific papers.

Southern Ocean/SE Pacific (SSTs)

The Medieval Warm Period (1100 years BP) was 1.5°C warmer than today (14°C vs. 12.5°C) in the SE Pacific or Southern Ocean.

Collins et al., 2019

Antarctica/Southern Ocean (SSTs)

Elephant seals used to breed on the Victoria Land Coast ~1000 years ago because there was open water access back then whereas this region is locked in sea ice today. To escape the reach of modern sea ice, elephant seal breeding colonies are now located ~2,400 km further north (sub-Antarctic islands) of where they bred during Medieval times.

Koch et al., 2019

Peru coastal region (SSTs)

There has been a rapid drop (approximately -1°C to -1.5°C) in sea surface temperatures during the last 50 years near the coast of Peru. 1-2°C warmer temperatures persisted ~1,000 years ago and there were 4°C warmer temperatures during the Mid-Holocene in this region.

Salvatteci et al., 2019

Canadian Arctic (SSTs)

During the Medieval Warm Period, bowhead whales occupied Canadian Arctic waters that are today sea ice-covered and inaccessible. During the Mid-Holocene (5000 to 3000 years ago), bowhead whales swam in the open-water Canadian Arctic.

Szpak et al., 2019

Mediterranean region

No warming during 1955-2013 is apparent in the northeastern Mediterranean region. There were far more prevalent warm extremes during the Medieval Warm Period than during the last 450 years (including just one single high temperature extreme year since 1900).

Klippel et al., 2019

Greenland Ice Sheet

Greenland was at least 1°C warmer during Medieval times. The ice sheet cooled by nearly 2°C from the 1930s to 1980s.

Adamson et al., 2019 (citing Kobashi et al., 2017).

Pacific, Atlantic Ocean (heat content)

“Finally, we note that OPT-0015 indicates that ocean heat content was larger during the Medieval Warm Period than at present”

Gebbie and Huybers, 2019

Northeastern China 

Modern mean annual temperatures range between 0.0 and 0.5°C, whereas Medieval Warm Period mean annual temperatures ranged from 2 to 6°C and Holocene Thermal Maximum temperatures reached 10.5°C.

Liu et al., 2019

Russia (360 km NW of Moscow)

Today’s mean annual temperatures are 4.1°C. During Medieval times mean annual temperatures were about 5°C, and during the Holocene Thermal Maximum mean annual temperatures were 2°C warmer than today.

Novenko et al., 2019

Alaska (western, Yukon-Kuskokwim Delta)

This region was about 1°C warmer during Medieval times and has undergone an overall cooling trend of about -0.7°C since 1800.

Sae-Lim et al., 2019

Antarctica (Whole)

Antarctica was approximately 0.5 to 1°C warmer than today during the entire 1st millennium, and the modest warming in the last century has in recent decades turned into a non-warming or cooling trend for West Antarctica, the Antarctic Peninsula, and East Antarctica.

Lüning et al. 2019

South China Sea

The South China Sea was about 0.5°C to 0.7°C warmer during Medieval times and still warmer than today during the Little Ice Age.

Lee et al., 2019

Lingen Cheated: Germany’s New All-Time Record High Resulted From DWD Weather Service Lousy Station Siting

During last week’s record-setting European heat wave, Germany’s previous record of 40.3°C was impressively shattered by the measurement station located in the northwestern city of Lingen, near the Dutch border, some 50 kilometers from where I live. The German DWD weather service and media loved it!

Controversial siting

Yet controversy now swirls around the new record-setting measurement, since it has come to light that it is fraught with considerable siting issues.

As the photo published by T-online here shows, the station is located right near a DWD office building, is shielded from the wind by mature trees, and is not far from a public swimming pool.

Meteorologist Michael Theusner told t-online.de: “The monthly average of the daily highs in Lingen has been deviating more and more upwards from the average of the highs in Lower Saxony since 2010.” The station has become increasingly shielded and thus tends to heat up more.

Veteran Swiss meteorologist Jörg Kachelmann wrote that the poor siting could possibly be heating the station readings by up to another 3 degrees!

DWD accepts overheated reading

Germany’s DWD national weather service has even confirmed that the station’s siting is no longer adequate and that a move to a new site had long been planned. But despite the acknowledged poor siting, the DWD went ahead and confirmed the reading as valid. The 42.6°C reading now stands as Germany’s new all-time recorded high temperature, no matter how questionable the reading may be.

Valid? How does Lingen compare to other temperature stations located nearby? I did a check.

Fifty kilometers to the east of Lingen, in Quakenbrück where I live, my home thermometer showed a high of 38.2°C. That’s unofficial, of course, but it made the Lingen reading look suspicious. So I decided to compare Lingen to the other official stations nearby to see if their readings were as hot as Lingen’s.

The surrounding readings do indeed confirm meteorologist Kachelmann’s gut suspicion. Lingen’s readings are a suspicious 2 – 3°C hotter than those of its neighboring stations.

What follows is a chart depicting the readings recorded by the Lingen station over the past week. The DWD itself stated that the mercury in fact reached 42.6°C on Thursday (25 July)!

Chart: wetter24.de

Now looking at the Nordhorn station, located some 20 kilometers away, here the mercury climbed to 40.9°C, which is 1.7°C below the Lingen reading:

Chart: wetter24.de

At the Meppen station, also located some 20 km away to the north of Lingen, the mercury reached 39.2°C, a far cry (3.4°C) from Lingen’s maximum reading:

Chart: wetter24.de

At the military base in Diepholz, some 65 km to the east, the high on Thursday reached 38.7 degrees, i.e. almost 4°C below the Lingen reading!

Chart: wetter24.de

In Bersenbrück, some 40 km to the east, according to wetterkontor.de, the mercury rose to 39.6°C,  which is 3°C below Lingen’s record reading.

Looking at two other stations nearby, we see that Emsdetten recorded 40.6°C, and Ahaus saw 39.5°C, both significantly below Lingen’s recorded measurement.

More than 2°C hotter than surrounding stations!

A summary of all the temperature readings recorded at Lingen compared to those of 6 nearby stations for the past 5 days, July 23 – July 27:

On every single day, Lingen handily beat its neighbors by at times large margins. Did the climate aim some sort of giant heat ray at Lingen?

2.6°C hotter than adjacent stations

On the record-setting date of Thursday, July 25, 2019, the mean of the 6 neighboring stations listed in the table was an eyebrow-raising 2.6°C below the Lingen reading. The stations cited around Lingen all share very similar topographical characteristics and are all located on the north German flatlands, and are thus comparable.
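For readers who want to reproduce the comparison, here is a minimal sketch using the maximum readings quoted in the text above (these maxima are not all from July 25, which is why the result differs slightly from the 2.6°C obtained from the table for that day):

```python
# Maximum temperatures (°C) quoted above for stations neighboring Lingen
neighbors = {
    "Nordhorn":    40.9,
    "Meppen":      39.2,
    "Diepholz":    38.7,
    "Bersenbrück": 39.6,
    "Emsdetten":   40.6,
    "Ahaus":       39.5,
}
lingen = 42.6

neighbor_mean = sum(neighbors.values()) / len(neighbors)
print(f"Mean of {len(neighbors)} neighboring stations: {neighbor_mean:.1f} °C")
print(f"Lingen above the neighbor mean by: {lingen - neighbor_mean:.1f} °C")
```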

No doubt the reading at Lingen is accurate. But it looks like the siting of the Lingen station is causing overheated conditions. The DWD should think twice before accepting Lingen’s dubious 42.6°C reading as a valid new record and allowing it to flow into the statistics.

In the contest to be the hottest city in Germany, Lingen cheated.

US Justice Also Needs To Focus On Fabricated “NASA Climate Dossier” Which Aims To Frame CO2, Citizens

As it emerges the “Russia Affair” was based on a dossier of fabricated, phony data designed to wrongfully criminalize a President, it also has become clear that trace gas CO2 has been similarly set up using a phony climate dossier.

It too was made up – in great part by NASA.

Let’s hope the Department of Justice also pursues the cast of shady figures behind the “climate dossier” and that their “Day of Reckoning” is coming too.

Video explains NASA data forgery

When it comes to data manipulations, no one has worked as tirelessly as Tony Heller in gathering evidence of the wrongdoing going on within the US weather agencies as they collude to rewrite the US temperature history in a manner that reminds us of George Orwell’s 1984.

In his latest video, “John Dillinger – An Early Victim Of Climate Change“, Tony Heller illustrates how global weather extremes were far worse in the 1930s than they are today, but are being deleted by NASA.

In his video, Heller, an expert software engineer, explains that one would think scientists would want to understand why the weather back in 1934 went so far off the rails, as unprecedented drought ravaged the globe. Instead, today’s scientists are rewriting the data in order to cover up the extreme events of the 1930s.

Straight out of Orwell’s Ministry of Truth

For example, 1934 used to be the hottest year recorded in the US, hotter than 1998. But NASA scientists tampered with the data and made 1934 much cooler while fudging the 1998 data to be warmer (8:19 mark).

“They turned cooling into warming,” Heller says. “This is straight out of Orwell’s Ministry of Truth, rewriting the past.”

Why is this being done? Heller firmly believes that at NASA, it has everything to do with propaganda, and nothing to do with science.

“These people are not scientists; they’re propagandists pretending to be scientists,” Heller says. “Actual scientists want to understand the past, not rewrite it.”

Running a racket

So blatant and rampant are the data alterations and history rewritings that Heller is compelled to compare the involved NASA scientists to 1930s Chicago gangsters. “We just have a different type of gangsters now. They dress in white lab coats, but basically they’re running the same racket,” says Heller.

Critics have been pointing out and documenting the known data alterations for years, so now is as good a time as any for the Department of Justice to investigate the data racket currently going on at NASA. Scientists are not above the law.

While Boasting About Cutting CO2, Europe In Fact Driving Up Carbon Dioxide Emissions …Through TROPICAL DEFORESTATION!

By Die kalte Sonne

Europeans are eating too many imports from tropical countries, which are cutting down the rainforest and thus fueling climate change, says Chalmers University of Technology in Gothenburg:

EU consumption linked to tropical deforestation

A sixth of all emissions resulting from the typical diet of an EU citizen can be directly linked to deforestation of tropical forests. Two new studies, from Chalmers University of Technology, Sweden, shed new light on this impact, by combining satellite imagery of the rainforest, global land use statistics and data of international trade patterns. “In effect, you could say that the EU imports large amounts of deforestation every year. If the EU really wants to achieve its climate goals, it must set harder environmental demands on those who export food to the EU,” says Martin Persson from Chalmers, one of the researchers behind the studies.

The link between production of certain foods and deforestation has been known before. But what Martin Persson and Chalmers colleague Florence Pendrill have now investigated is the extent to which deforestation in the tropics is linked to food production, and then where those foods are eventually consumed. In the first study, they focused on how the expansion of cropland, pastures, and forestry plantations has taken place at the expense of the rainforest. “We can see that more than half of deforestation is due to production of food and animal feed, such as beef, soy beans and palm oil. There is big variation between different countries and goods, but overall, exports account for about a fourth of that deforestation which is connected to food production. And these figures have also increased during the period we looked at,” says Florence Pendrill.

Using this information, the researchers investigated, in the second study, the amount of carbon dioxide emissions resulting from this production (see the picture below), and where the produce is then consumed. The figures for the EU are particularly interesting, since the EU is a large food importer. Furthermore, the EU shall soon present a plan for how to reduce its contribution to deforestation. The EU already has strict requirements in place connected to deforestation which producers of timber and wood products must adhere to in order to export their goods to the EU. This demonstrates their ability to influence other countries’ work in protecting the rainforest.

“Now, as the connection between food production and deforestation is made clearer, we should start to discuss possibilities for the EU to adopt similar regulations for food imports. Quite simply, deforestation should end up costing the producer more. If you give tropical countries support in their work to protect the rainforest, as well as giving farmers alternatives to deforestation to increase production, it can have a big impact,” says Florence Pendrill. The current studies were done in collaboration with researchers from the Stockholm Environment Institute in Sweden, Senckenberg Biodiversity and Climate Research Centre in Germany, and NTNU, the Norwegian University of Science and Technology. They are a continuation of research which was done through the Prince project (Policy Relevant Indicators for National Consumption and Environment), where the connections between Swedish consumption and emissions from deforestation were presented in the autumn.

The studies indicate that, although there is a big variation between different EU countries, on average a sixth of the emissions from a typical EU diet can be directly traced back to deforestation in the tropics. Emissions from imports are also high when compared with domestic agricultural emissions. For several EU countries, import emissions connected to deforestation are equivalent to more than half of the emissions from their own, national agricultural production. “If the EU really wants to do something about its impact on the climate, this is an important emissions source. There are big possibilities here to influence production so that it avoids expanding into tropical forests,” says Martin Persson.

Above all, Martin Persson believes the responsibility for achieving these changes lies with bigger actors, such as countries and large international organisations. But he also sees a role for the consumer to get involved and have an influence. “Public opinion is vital for the climate question — not least in influencing politicians, but also commercially. We can see already that several companies have made commitments to protecting tropical forests, through voluntarily pledging to avoid products which are farmed on deforested land. And in large part, that results from the fact that popular opinion is so strong on this issue,” he concludes.

More information on: Carbon dioxide emissions due to tropical deforestation:

For the period 2010-2014, the researchers estimate net emissions of 2.6 gigatonnes of carbon dioxide due to deforestation associated to the expansion of croplands, pastures and forestry plantations in the tropics. The main commodity groups associated with these emissions were cattle meat (0.9 gigatonnes of CO2) and oilseed products (including both palm oil and soybeans; 0.6 gigatonnes of CO2).

There are large geographic variations in what commodities are associated with deforestation-related emissions. In Latin America, cattle meat is the dominant contributor (0.8 gigatonnes of CO2), mainly attributed to Brazilian production. In Indonesia almost half of the emissions (0.3 gigatonnes of CO2) come from oilseeds (mainly oil palm). In the rest of Asia-Pacific and Africa, a more diverse mix of commodities drives emissions from deforestation.

Papers:

Florence Pendrill, U. Martin Persson, Javier Godar, Thomas Kastner, Daniel Moran, Sarah Schmidt, Richard Wood. Agricultural and forestry trade drives large share of tropical deforestation emissions. Global Environmental Change, 2019; 56: 1 DOI: 10.1016/j.gloenvcha.2019.03.002

Florence Pendrill, Martin Persson, Javier Godar, Thomas Kastner. Deforestation displaced: trade in forest-risk commodities and the prospects for a global forest transition. Environmental Research Letters, 2019; DOI: 10.1088/1748-9326/ab0d41

1970s: Earth Warmed 0.6°C From 1880-1940 And Cooled -0.3°C From 1940-1970. Now It’s 0.1°C And -0.05°C.

Image Source: National Academy of Sciences,  Understanding Climatic Change

About 45 years ago, the “consensus” in climate science (as summarized by Williamson, 1975) was quite different than today’s version.

1. The Medieval Warm Period was about 1°C warmer than present overall while the “largely ice-free” Arctic was 4°C warmer, allowing the Vikings to navigate through open waters because there was “no or very little ice” at that time.
2. The island of Spitsbergen, 1237 km from the North Pole and home to over 2000 people, “benefited” because it warmed by 8°C between 1900 and 1940, resulting in 7 months of sea-ice free regional waters. This was up from just 3 months in the 1800s.
3. Central England temperatures dropped -0.5°C between the 1930s to the 1950s.
4. Pack-ice off northern and eastern Iceland returned to its 1880s extent between 1958 and 1975.
5. In the 1960s, polar bears were able to walk across the sea (ice) from Greenland to Iceland for the first time since the early 1900s. (They had somehow survived the 7 months per year of sea-ice-free waters during the 1920s-1940s).

Image Source: Williamson, 1975

As of the mid-1970s, the “consensus” among climate scientists was the globe had warmed by +0.6°C from 1880 to 1940, and then cooled by -0.3°C (to -0.4°C) from 1940 to 1970.

Schneider, 1974; Benton, 1970; Cimorelli and House, 1974; Skeeter, 1985

As recently as 1980, it was still acceptable to publish scientific papers saying the Northern Hemisphere alone had warmed by 1°C between 1880 and 1940 and then cooled by nearly the same amount during the next 3 to 4 decades.

Image Source: Agee, 1980

During the 2000s, climate scientists with a keen interest in shaping global temperature data sets (Tom Wigley, Phil Jones, Michael Mann, Gavin Schmidt, Stefan Rahmstorf) exchanged e-mails about “correcting” the temperature data by removing warming from the 1940s “blip” – which they said would be “good” and significant for the global mean because the 1940s were “too warm”.

Phil Jones, overseer of HadCRUT temperature data, admitted the pre-1980s sea surface temperatures in the Southern Hemisphere are “mostly made up” due to insufficient coverage. He also corresponded with a colleague about “inventing” monthly temperature anomalies – which apparently was “fun” to do.

Phil Jones even wrote to his colleagues about changing temperature data so as to support the “argument” that the cooling decades coincided with the timing of a rise in anthropogenic aerosol emissions.

Apparently as a consequence of “corrections” to the temperature data, the amplitude of the 1880 to 1940 warming trend has been slashed from +0.6°C (1970s) to +0.1°C today.

Image Source: WoodforTrees.org

The 1940 to 1970 global cooling has been transformed into a -0.05°C hiatus.

Image Source: WoodforTrees.org

Apparently over 80% of the amplitude of an inconvenient warming or cooling trend can be eliminated from temperature data sets 50-80 years after the original temperatures were recorded: (0.6 - 0.1)/0.6 and (0.3 - 0.05)/0.3 both work out to a reduction of roughly 83%.

Prominent German Economist Calls For 20-Hour Work Week, Less Housing, Real Role Models To Rescue Climate!

The online, center-right Junge Freiheit of Germany here reports on how award-winning German economist Niko Paech is calling for a profound scaling back of industry in Germany, and a 20-hour workweek, in order to protect the climate.

The DLF article on the interview here summed it up: “If working hours and hence income were to fall, there would also be less need for mobility, consumption and housing. Unemployment would be overcome and everyone would have a livelihood based on a lower level of work.”

In the interview Paech spoke of rolling back the workweek to 30 or even 20 hours.

Communal living

So how would people spend all the extra free time?

Paech says the extra time could be used “to provide their own services, in addition to a less high income, for example cultivating food, the repair of goods and, thirdly, for communal use.”

This is how cars, lawnmowers or tools could be shared in the community, he suggests, which means “the demand for industrial production and transport would be decreased.”

People need to be turned off about “holiday flights, meat consumption, housing”

In the DLF interview he also criticized the construction of new housing in Germany: “Here every child knows that every square meter of living space that we develop is an ecological disaster.”

He added that a CO2 tax would only be effective if it “turned us off about holiday flights, meat consumption, housing, driving and excessive consumption.”

He also told the DLF there’s a need to start disputes among fellow citizens, saying: “that I tell my neighbor, listen, why did you book a cruise, who gives you the right to drive an SUV, why do you have to make a flight to the ski vacation, too?”

2 tonnes of CO2 per year

He also tells the DLF that every child needs to understand that in order to rescue the planet, every person should have two to two and a half tons of CO2 at their disposal every year.

He sees no chance of policymaking and technology ushering in the needed course change.

So what is needed to get society to change? It won’t be easy telling people to profoundly go without, so Paech tells the DLF it will take “a minority of people in Germany who are willing and able to live like this” and thus demonstrate “that it is possible”.

There’s no indication from the interview that Paech himself is setting an example and emitting only 2 tonnes of CO2 annually. And when we see the lifestyles of other leading activists calling for the same, then it all looks awfully hopeless for the global warmists’ movement.

 

More Data Shenanigans At NASA. “Unadjusted” Data Get Whole New Definition: No Longer “Raw”, But Now “Quality Controlled”

By Kirye
and P. Gosselin

What’s with NASA GISS? The data shenanigans there seem to have no end.

Most of us have noticed that every new version of the temperature data generated by NASA GISS comes with more warming. The first half of the 20th century often gets cooled while the present gets warmed. This has been shown repeatedly.

One example is Cuiaba, Brazil. Though in this example the past appears not to have been cooled, the more recent v4 “unadjusted” data have been warmed throughout by some 1.0°C since 1960.

Now the early 20th century in Cuiaba, Brazil is no longer warmer than the early 21st century. Presto, warming! Data: NASA GISS.

Now “unadjusted” data get a new definition

But another peculiar thing has been found. It seems NASA GISS cannot make up its mind on whether the “unadjusted” data are “raw” or if they are “quality controlled”.

When we look at the Key beneath the Cuiaba temperature chart posted by NASA GISS here, we see that “unadjusted” gets called “quality controlled” (marked yellow):

Yet, when we go back to the archives and look at the data plot for Cuiaba, the Key beneath the chart has a different definition for “unadjusted data”. Here they were defined as “raw data as reported by the weather station”!

Well, which is it? Is this another adjustment phase they snuck in? The page from the NASA website was saved on June 20.

It’s bad enough that NASA GISS is taking its Version 3 “unadjusted” data and adjusting them to Version 4 “unadjusted” (warming them up), but now NASA GISS number crunchers seem unable to make up their minds whether the unadjusted data are “raw” or “quality controlled”.

With all the confusion surrounding all the different versions and changing definitions, we have to wonder what on earth is going on at NASA GISS.

Data used to be raw, but now “quality controlled”

NASA used to claim that GHCN unadjusted are the raw data as reported by the weather station. But now they say that GHCN unadjusted are “quality controlled monthly means constructed by NCEI and other groups from raw data”.
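For anyone who wants to check such version differences themselves, here is a minimal sketch, assuming the monthly “unadjusted” series for a station have been saved locally as CSV files (the filenames and column names are hypothetical):

```python
import pandas as pd

# Hypothetical local exports of the station's monthly "unadjusted" series,
# one from the v3 product and one from v4 (columns: year, month, temp_c)
v3 = pd.read_csv("cuiaba_v3_unadjusted.csv")
v4 = pd.read_csv("cuiaba_v4_unadjusted.csv")

# Annual means per version, then the v4-minus-v3 difference by year
v3_annual = v3.groupby("year")["temp_c"].mean()
v4_annual = v4.groupby("year")["temp_c"].mean()
diff = (v4_annual - v3_annual).dropna()

# How much the "unadjusted" values changed between versions since 1960
print(diff.loc[1960:].describe())
```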

A lot of fishy data business going on at NASA GISS.

1980s Science: Ice Cores Show CO2 Naturally Rose 200 ppm (65 ppm/100 Years) During The Early Holocene

A few decades ago it was “consensus” science that CO2 levels had reached 400 ppm (and even up to 500 ppm) during the Early Holocene, with rising amplitudes of greater than 200 ppm and rates of 65 ppm in less than a century. Then the “consensus” opinion changed.

In 1982 it was still quite acceptable for Dr. Flohn, a climate scientist, to acknowledge that changes in CO2 concentration are significantly determined by temperature, “rather independent of” fossil fuel emissions, but also that Holocene CO2 concentrations reached 350 to 400 ppm between 8,000 and 6,000 years ago (Flohn, 1982).

That same year, Neftel et al. (1982) had published a paper in the journal Nature documenting a CO2 rise of about 230 ppm (~190 ppm to 420 ppm) from roughly 12,000 to 10,000 years ago for a Greenland ice core. The CO2 record showed fluctuations of >100 ppm throughout the Holocene.

A few years earlier, Berner et al. (1980) found CO2 rose to amplitudes exceeding 500 ppm in Greenland during the Early Holocene, whereas Antarctica’s CO2 concentration rose to about 400 ppm during the same time period.

As late as 1993 scientists were still publishing papers on CO2 readings from ice cores of 243.3 ppm, 435.7 ppm, and 641.4 ppm for recent centuries (Schwander et al., 1993).

Wagner et al. (1999) published a paper in Science denouncing the “consensus” claim that CO2 gently and steadily rose for millennia, varying only between 270 and 280 ppm. They too determined CO2 rose and fell quite rapidly during the Holocene, reaching amplitudes of 65 ppm in less than a century.

More recent papers also indicate there were quite substantial ±100 ppm-per-century CO2 fluctuations in stomata reconstructions. Steinthorsdottir et al., 2013 record a rise of about 190 ppm (from ~230 ppm to ~420 ppm) within less than 50 years.

Stomatal reconstructions of 800-2000 A.D. CO2 variability (Kouwenberg et al., 2005) determined there was a ~125 ppm CO2 rise from 1850 (250 ppm) to 1940 (375 ppm).

Why the discrepancy between “consensus” CO2 and historically recorded CO2?

Polish physicist Dr. Zbigniew Jaworowski (1997) was a fierce critic of the means by which ice core data have been collected to assign CO2 concentration values to past epochs.

His criticisms center around the post-1985 tendencies for fellow scientists to openly employ selection bias in making pre-determined decisions about what measurements are “right” and which ones are “wrong” – effectively rendering their results meaningless.

He cites Pearlman et al. (1986), for example. These authors collected 74 Antarctic ice core CO2 samples. Of those, 32 (43%) were rejected because they had values that were too high or too low to match with the agreed-upon pre-determination.

In what other branch of science is it acceptable to discard measured data we don’t agree with?

Jaworowski provides an illustrative example of how the rejecting-data-we-don’t-agree-with practice has been ongoing since the 1950s.

The CO2 measurements compiled by G.S. Callendar reached 375 to 550 ppm throughout the 1800s. These measurements were believed to be too high, so Callendar chose only the measurements he agreed with (circled).

Professor Tom Segalstad, a University of Oslo geologist, provides possibly the most thorough explanation of how the current CO2 “dogma” was formulated and maintained so as to advance the CO2 greenhouse warming cause.

Segalstad (1998) concludes:

“It is shown that carbon cycle modelling based on non-equilibrium models, remote from observed reality and chemical laws, made to fit non-representative data through the use of non-linear correction ‘buffer’ factors constructed from a pre-conceived hypothesis, constitute a circular argument and with no scientific validity.”

FridaysForFuture Fizzles With Kids On Summer Break… Greta Sees Only 2000 In Berlin, Mostly Adults/Organizers

Not 20,000, but only 2,000 people see Greta in Berlin!


(Text translated/edited by P. Gosselin)

Four months ago in March, 20,000-25,000 pupils and demonstrators took part with Greta in the demonstration in Berlin. But last Friday, only about 2,000 people showed up even though the demonstration with Greta was heavily advertised on social media by Fridays For Future and the Scientists For Future. It didn’t amount to anything. The hype about Greta and Fridays for Future now seems to be over.

Mostly a hard core over age 30

The students have flown with their parents to their summer holiday destinations, and so only a hard core of professional activists, mainly over age 30, gathered. They held up signs with slogans like: “Milk Chocolate Kills”, “Kale Instead of Coal”, “Vegans for Future/ Planet”, “Kerosene Tax Now”, “Parents in Panic” and “Winter Is Not Coming!” All of them climate experts thoroughly familiar with climate change, of course, and not kooky people at all.

“Longhaul Luisa”

Greta was accompanied by Luisa Neubauer, a.k.a. Longhaul-Luisa, and the lead staff of Fridays For Future. Most of them are no longer students, but rather professional activists from various NGOs. At the podium Greta delivered her well-known perseverance slogans and was then shielded off by her companions and the police and led away from the demonstrators into a secluded building.

A swan song from Fridays for Future?

I would see it that way. The hype around Greta had already died down in the spring. Greta then got a rebound from the award of the Golden Kamera, a visit to the Pope, the “spontaneous” appearance of Scientists For Future and finally the “spontaneous” appeal by YouTube vlogger and influencer Rezo and his YouTube friends. (Rezo’s real name is Yannick Frickenschmidt, and he is over age 30.)

Thus the last chapter seems to have been opened, and no fresh new face coming on stage is foreseeable. All trump cards have now been played, even if Greta still plans a sea voyage to America. In September the 16-year-old wants to participate in the United Nations climate summit in New York, and in December in the world climate conference in Santiago de Chile, says her leadership staff.

It remains to be seen whether this will trigger new hype around Greta.

“Join The Skeptical Movement” …Hip German Youths Push Back On Climate Hysteria, Post Skeptic Videos, Go Viral!

What follows today is really quite cool, and highly encouraging in a country known for lockstep thought.

Over the past months we’ve seen great media hype in Germany surrounding climate alarmist youngsters like Greta, FFF and more recently Rezo, who have played major roles in stirring up a lot of climate hysteria, all aided and abetted by the established media.

But apparently in Germany there are a few young, hip persons pushing back on all the climate hype and hysteria with their own videos that have since gone viral.

“Join the Skeptical Movement”

The latest video comes from young German teen Naomi Seibt, who has decided to think for herself and check what’s really behind the climate “science” and hysteria.

Since she uploaded what she calls her “most elaborate project to date” on July 1st, her video — dubbed “Climate change – All hot air?” — has been viewed more than 75,000 times and gotten over 8,000 thumbs up.

YouTube takes Naomi down – temporarily

She writes at YouTube: “If you want to join the skeptical movement, please share this video.”

In her video Naomi explains how many large factors are at play in the climate system, how the UN IPCC is playing loose with the facts, and how politicians are attempting to use the issue to gain control over every aspect of our individual lives. The 18-year-old demonstrates an impressive knowledge of the subject, rarely seen among today’s youth.

Naomi’s success apparently has taken the climate activists by surprise and caused them to panic. Die kalte Sonne here reports how YouTube actually took down her video, before reinstating it.

JasonHD: “Manipulations and untruths”

Another spectacularly successful video skeptical of the climate hysteria was produced by the German JasonHD on May 24th. In it he takes down climate alarmist and leftist political agitator Rezo (mentioned above) point by point.

JasonHD dismantles the “manipulations and untruths concerning climate change”.

So far JasonHD’s thoughtful video has racked up 190,000 views.

Rapper: “Climate Change – Climate Lies, Climate Swindle”

One of the earlier pioneers of German youth climate-hysteria pushback is Austrian rapper Kilez More, who already in 2011 uploaded his rap song “Climate Change – Climate Lies, Climate Swindle” to YouTube.

As of today it’s been viewed some 209,000 times.

We need to get these young leaders at the climate conferences in place of the usual old, crusty figures. They’re well connected and are reaching their generation.

2019 Climate “Ship Of Fools” Runs Into 3-Meter Thick Ice… Baffin Inlets Mid Summer Ice Extent No Trend in 50 Years

Our German skeptic friend Snowfan here keeps us up to date on the latest ODEN “Ship of Fools” attempt to travel across an Arctic that is supposed to be ice-free by now.

The incentive to cross the Arctic passages in the summer is huge. Doing so would mean at least a week of fame with the media blaring out your name along with grossly hyped headlines of an Arctic ice meltdown due to global warming. One of these years, a ship might get lucky and manage to get through the Northwest Passages.

Image from: Ship-Tracker ODEN  Snowfan

And the pressure to do so is enormous, because over the last ten years Arctic ice volume has even rebounded slightly, and if that trend continues, as some expect, the global warming alarmists may never get another chance to get through. Last year’s attempt failed.

Mid-summer Arctic ice volume has grown modestly over the past 13 years, thus casting doubt on claims that the Arctic is melting further. Chart: Kirye.

The latest “Ship of Fools” episode this year is an attempt by the above-mentioned Swedish icebreaker ODEN, which hopes to get through. Unfortunately, conditions so far this summer have not been as favorable as hoped, Snowfan reports:

Image: here.

This year Arctic sea ice in the area of interest on July 14 is (unexpectedly) thicker than it was last year at the same time, and a heck of a lot thicker than what some climate models and Al Gore projected a bit more than 10 years ago:

In Lancaster Sound and the Barrow Strait (eastern access point to the NW-Passage) sea ice in mid July 2019 is up to 3 meters thick, i.e. 2 meters thicker than in mid-July 2018! Source: DMI Arctic Sea Ice Thickness. Image: here.

Also defying the models is the extent of ice cover for July 9 at the Baffin inlets Regent – Boothia. Over the last 50 years, there’s been little trend change:

 

Source: Canadian Ice Service

The latest on ODEN, according to CruiseMapper, is that it is currently well off course. Snowfan writes:

“During the night of July 19, 2019, the Ship of Fools deviated far from its planned route southeast of Coburg Island and is positioned this morning west of this island. We await with suspense to see if the climate fools will overcome the thick ice on July 20, 2019 and arrive on time in Pond Inlet.”

Rapidly Fluctuating India Sea Levels Were 4 m Higher Than Today 6000 Years Ago, 1.5 m Higher 500 Years Ago

Scientists have found sea levels on India’s eastern coast were still 1-1.5 m higher than today as recently as 500 to 300 years ago and 3-4 m higher than today between 6000 to 4000 years ago. Seas rose and fell by multiple meters (-5 m to +3 m) within 1250 years until as recently as 4000 to 2000 years ago.

A new paper (Loveson and Nigam, 2019) reveals sea levels were still rising at a rate of 2.2 meters per century between 8100 and 7200 years ago, reaching a highstand of 4 meters above today’s sea level 6050 years ago.

For the next several millennia sea levels rapidly rose and fell within a range of 6 meters – between 4 meters above and 2 meters below present levels.

A drop in sea level at one point reached an amplitude of -5 meters in just 1250 years (4350 to 3100 years ago) followed by 3 meters of sea level rise within 1200 years (3100 to 1900 years ago).

As recently as about 300 years ago mean sea level on India’s eastern shore was still about 1 meter higher than today.

Image Source: Loveson and Nigam, 2019
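Converted to the more familiar millimeters per year, the swings quoted above work out as follows (a back-of-the-envelope sketch; the figures are simply the amplitudes and durations given in the text):

```python
# Sea level swings described above: amplitude (m) and duration (years)
episodes = {
    "fall, 4350-3100 yrs BP": (-5.0, 1250),
    "rise, 3100-1900 yrs BP": (+3.0, 1200),
}

for name, (amplitude_m, duration_yr) in episodes.items():
    rate_mm_per_yr = amplitude_m * 1000 / duration_yr
    print(f"{name}: {rate_mm_per_yr:+.1f} mm/yr")

# The early-Holocene rise of 2.2 meters per century quoted above corresponds
# to 22 mm per year.
```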

Evidence that there were seaport towns along India’s west coast that are presently located much further inland suggests sea levels were 2-3 m higher prior to 2500 years ago.

At the time of the 43 AD Roman invasion of Britain, the ocean shoreline, or beach, was located 2 miles (3.22 kilometers) from today’s shore.

Image Source: BBC

A new paper (Makwana et al., 2019) indicates there were sea port settlements that are today located “far inland” compared to where they were about 2500 years ago.

The scientists suggest sea levels may have been at least 2 meters higher than today at that time.

A visual example of how 2-meters-higher sea levels could have “submerged” the coast of western India is provided.

Image Source: Makwana et al., 2019

Another Blow! Two New Studies Show Climate Models Have “Large Deficits” …Running “Too Hot”

Climate sensitivity and the warming pattern

By Die kalte Sonne
(German text translated/edited by P Gosselin)

In March 2018, we reported on a paper that derived the sensitivity of our climate system from the best data available. Lewis/Curry (2018) reached the result: a transient climate response (TCR) of 1.3 °C for a doubling of CO2 in the atmosphere, and a long-term equilibrium climate sensitivity (ECS) of 1.7 °C (see Table 3 of the paper).

The numbers are hardly sensitive to the choice of (larger) time windows; they fluctuate very little whether one evaluates 1870…2016 or 1930…2016. A whole series of precursor studies, also from other authors, arrived at similarly small values. Papers examining historical periods (last glacial maximum to pre-industrial) likewise do not contradict these low figures.

So the much more dramatic sensitivity estimates, especially those from GCMs (General Circulation Models) – 1.86 °C for TCR and 3 °C for ECS – do not apply? “It’s not that simple,” some activists insist, because a low sensitivity of the Earth’s climate would not necessitate urgent action to reduce greenhouse gases.
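The practical difference between these estimates is easy to see with the standard assumption that warming scales with the logarithm of the CO2 concentration ratio (a minimal sketch; the 280 ppm pre-industrial baseline and 410 ppm present-day value are illustrative assumptions, not figures from the papers discussed here):

```python
import math

def warming(c_ppm, c0_ppm=280.0, sensitivity_per_doubling=1.3):
    """Warming (°C) implied by a CO2 rise from c0 to c, assuming the response
    scales with log2 of the concentration ratio."""
    return sensitivity_per_doubling * math.log2(c_ppm / c0_ppm)

for tcr, label in [(1.3, "Lewis/Curry 2018"), (1.86, "GCM estimate")]:
    print(f"TCR {tcr:.2f} °C ({label}): "
          f"{warming(410, sensitivity_per_doubling=tcr):.2f} °C at 410 ppm, "
          f"{warming(560, sensitivity_per_doubling=tcr):.2f} °C at doubled CO2")
```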

So how can the GCMs, with their worrisome projections, be saved from empiricism? A key argument so far is this: the models predict a different spatial distribution of ocean warming than what we observe:

 

Fig. 3: The warming patterns derived by models (top) and the observed patterns. Of particular importance is the fact that the CMIP5 models indicate a rather uniform warming of the tropical Pacific as a result of the (mainly man-made) forcing (hence the model mean), but the observations show a significantly stronger warming of the western tropical Pacific compared to the eastern one. The images were generated with the KNMI Climate Explorer.

So it could well be, activists say, that the deviations are just a “whim of nature”, an internal variability, and that after the end of this rather random episode the warming will become much stronger on a global scale, in accordance with the models. There is talk of “trajectories” that were and will be possible, with the observations deviating strongly downward because they are just one random realization of the possible warming patterns. In short: “What we have observed so far is not the real reality, but it will certainly get much worse. Believe the climate models!”

2 new papers

Here we present two current papers that provide an explanation. To start with: the observations of the warming rate are correct; the deviating patterns of the climate models are caused by their inadequacies, and these patterns will not change.

In Dong et al (2019), the authors show that if the convective regions with many clouds in the western Pacific warm up more strongly than those with hardly any convection in the eastern Pacific, the overall global warming is much less pronounced.

Let’s take a look at the clouds in the tropical Pacific:

Fig. 4: The convection (CAPE index) over the tropical Pacific. The west-east gradient can be seen clearly. Source.

Convection in the western tropical Pacific leads to increased heat radiation into space, which means that warming there can be damped much more effectively than would be possible with a stronger warming of the less convective eastern Pacific (see Fig. 4).

The paper finds:

For the west Pacific patch, warming is communicated to the upper troposphere, which warms the whole troposphere across all latitudes, causing a large increase in outgoing radiation at the TOA. Furthermore, the patch of warming locally decreases tropospheric stability, measured here as estimated inversion strength (EIS), but increases EIS remotely over tropical marine low clouds regions, yielding an increase in global low cloud cover (LCC) which enhances the global SW reflection….The results first highlight the radiative response to surface warming in tropical ascent regions as the dominant control of global TOA radiation change both in the past and in the future. …This surface warming pattern yields a strong global outgoing radiative response at TOA that can efficiently damp the surface heating, therefore producing a very negative global feedback.”

There is therefore a clear physical mechanism by which the observed stronger warming of the tropical West Pacific leads to lower global sensitivities (= a stronger negative global feedback).

The second paper, Seager et al (2019), deals with the same phenomenon and concludes that the observed pattern is not random, but a direct result of forcing. It states:

The main features of observed tropical Pacific climate change over past decades are consistent with a response to rising CO2, according to fundamental atmosphere and ocean physics….However, the strength of the tropical Pacific influence on global climate implies that past and future trends will diverge from those simulated by coupled climate models that, due to their cold tongue bias (a strip of cooler water near the equator in the eastern Pacific, ed.), misrepresent the response of the tropical Pacific to rising CO2.”

Climate models have such large deficits in the depiction of events in the tropical Pacific that they are globally incorrect in determining the response to the forcing (see Fig. 3) and systematically overestimate the sensitivity to the forcing (according to Seager et al, and Dong et al).

So will we read anything about this in the media? A possible headline might be: “Climate models calculate the future too hot!” Don’t hold your breath.

PlayStation climatology

We eagerly await seeing whether the results of these two important studies will even be included in the IPCC’s forthcoming progress report. Hundreds of pages dealing with model projections would have to be critically revised. One more reason for us to trust empiricism over “PlayStation climatology”.

But what is to become of the “panic” which Fridays for Future wishes to impose on us? Policymaking is running hot because the models run too hot. Which scientists will have the courage to take responsibility and enlighten FFF and policymakers?

NASA GISS Surface Station Temperature Trends Based On Sheer Guess Work, Made-Up Data, Says Japanese Climate Expert

By Kirye
and Pierre Gosselin

Whenever NASA GISS announces how recent global temperatures are much hotter than, for example, 100 years ago, just how statistically reliable are such statements?

Most will agree, based mainly on sundry observations, that today is indeed warmer than it was when surface temperatures began to be recorded back in 1880. But we will never really know by how much.

Surface station datasets full of gigantic voids

When we look at NASA GISS’s site here, we can see how many surface stations have data going back to earlier years. Today we see that 2089 stations are at work in Version 3 unadjusted data.

Yet, when we go back 100 years (to 1919), we see only 997 of these surface stations have Version 3 unadjusted data that is complete:

Source: NASA GISS

Note how the Version 3 unadjusted datasets going back to 1919 are poorly distributed and sorely lacking over Africa, Canada, the Arctic and all across the Southern Hemisphere. Never mind the oceans.

Only a measly 174 surface stations go back to 1880!

And when we look at the number of stations in Version 3 unadjusted data going back to 1880, ONLY 174 stations actually provide us with a complete thermometer dataset:

As is shown, Version 3 unadjusted data going back to 1880 cover only some parts of the US and Europe. All of Canada and Russia are void of data, and so it is impossible to know what the temperatures there really were.

The same is true for the entire southern hemisphere, let alone the entire globe. The bottom line: There is no way of knowing what the global temperature really was back in the late 19th century and early 20th century.
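A minimal sketch of how such coverage counts could be reproduced, assuming a locally prepared table listing, for each station, the years in which it reported a complete set of monthly means (the filename and column names are hypothetical):

```python
import pandas as pd

# Hypothetical flat file: one row per station and per year with complete
# monthly data (columns: station_id, year)
records = pd.read_csv("ghcn_v3_unadjusted_complete_years.csv")

def stations_complete_since(start_year, end_year=2019):
    """Count stations reporting complete data in every year of the interval."""
    n_years = end_year - start_year + 1
    span = records[(records["year"] >= start_year) & (records["year"] <= end_year)]
    years_per_station = span.groupby("station_id")["year"].nunique()
    return int((years_per_station == n_years).sum())

for start in (1880, 1919):
    print(f"Stations with complete records since {start}: {stations_complete_since(start)}")
```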

Japanese expert: data of “no scientific value”

This tells us that the global temperature trends since the start of the Industrial Revolution presented by NASA are fraught with huge uncertainty.

“This is nothing new,” says Japanese climate expert Dr. Mototaka Nakamura in an email to NTZ. “We simply did not have many observing stations in the 1800s and early 1900s. They can produce ‘new data sets’ and claim that they have ‘better data sets’ all day long, but they just can’t make any meaningful difference for periods up to 1980.”

“Not real data”

“These datasets are products of simulation models and data assimilation software, not real data,” Dr. Nakamura added. “This problem has been present in data products produced by all institutions from the beginning – NASA, NOAA, NCEP, ECMWF, UMet, etc.”

“Spatial bias before 1980 cannot be dealt with”

But the data shortcomings get even worse. Dr. Nakamura wrote: “A far more serious issue with calculating ‘the global mean surface temperature trend’ is the acute spatial bias in the observation stations. There is nothing they can do about this either.  No matter what they do with the simulation models and data assimilation programs, this spatial bias before 1980 cannot be dealt with in any meaningful way. Just look at the locations of the observation stations used in GISS products for various years on their page.”

Dr. Nakamura commented earlier here at NTZ: “The global surface mean temperature change data no longer have any scientific value and are nothing except a propaganda tool to the public.”

So how can we be sure about the globe’s temperatures, and thus its trends, before 1980? We can’t. The real data just aren’t there.

Scientists Find Antarctica Is Rapidly Cooling And Any Ice Sheet Melt Is Not Due To CO2, But Natural

Natural variability rules in Antarctica. Scientists identify clouds, wind, and localized solar heating – not CO2 – as the factors driving ice melt. Rising CO2 leads to Antarctic cooling.

Image Source: Lüning et al. 2019

Antarctica rapidly cooling in recent decades

In a review of the scientific literature, Lüning et al. 2019 report Antarctica as a whole has undergone a cooling trend in recent decades.

The Antarctic Peninsula has cooled at a rate of -0.5°C per decade since the late 1990s.

West Antarctica as a whole has “slightly cooled” (or the warming has “plateaued”) over the past two decades.

East Antarctica “has not experienced any significant temperature change since the 1950s” with ice sheet mass gains and cooling during the past 15 years.

Rising CO2 leads to Antarctic cooling

Antarctica contains about 90% of the world’s ice.

Because the continent averages -28.2°C in summer and -60°C in winter, inducing even partial retreat for an ice sheet that averages 2.3 kilometers in height would require a substantial amount of heat energy.

This effectively rules out a significant human influence.

According to scientists, rising CO2 concentrations do not even lead to warming in Antarctica. Actually, scientists find Antarctica cools in response to rising CO2 concentrations: over the extremely cold, high-elevation Antarctic plateau the surface is often colder than the stratosphere above it, so additional CO2 increases the outgoing longwave radiation to space (Schmithüsen et al., 2015). This means we humans may be contributing more to ice mass gains than to losses.

Image Source: Schmithüsen et al., 2015

Natural variability – clouds, wind, localized solar heating – drive Antarctic ice melt

The surface melting of portions of the West Antarctic Ice Sheet (WAIS) has received quite a bit of attention in media circles, often accompanied by scary warnings of ice sheet collapse and catastrophic sea level rise.

For example, Dr. James Hansen – admitting his doomsday predictions are tendentiously designed to be “persuasive” – has claimed sea levels will rise by 10 feet by 2065 mostly due to Antarctic ice sheet melt.

Image Source: Slate

These harrowing warnings often seem to arise in response to observations of glacier calving events – large glaciers fissuring and breaking off from the ice sheet.

But glaciologists know that calving events are indicative of ice sheet thickening, not thinning. Glaciers calve when the ice accumulation has become so heavy and thick that the base of the ice sheet can no longer bear the load.

Image Source: Christmann et al., 2016

Yes, portions of Antarctica are undergoing ice melt. But ice sheet recession and advancement are both natural. And modern ice melt is well within the range of what occurs naturally for Antarctica.

Indeed, as Jones et al. (2016) conclude, natural variability “overwhelms” any forced response in satellite era trend observations.

Image Source: Jones et al., 2016

In two new papers, scientists identify the natural mechanisms driving the recession of some of West Antarctica’s glaciers in recent decades.

Scott et al. (2019) conclude surface melt is driven by wind currents and downwelling longwave radiation from clouds.

Stewart et al. (2019) find localized solar heating of surface water can explain melting in small portions of the Ross Ice Shelf.

Considering the total Antarctic meltwater contribution to sea level rise may amount to only 0.34 centimeters since 1958 (Frederikse et al., 2018), it is quite reasonable to conclude that nothing unusual, unprecedented, or concerning is occurring in Antarctica that falls outside the range of natural variability.
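For scale, a quick back-of-envelope conversion of that figure into an annual rate (assuming, purely for illustration, that the period runs from 1958 to roughly the mid-2010s; the exact end year is not stated here):

```python
# Rough annual rate implied by the 0.34 cm figure cited above.
# The end year is an assumption made only for this illustration.
total_rise_cm = 0.34
years = 2015 - 1958                      # ~57 years

rate_mm_per_year = total_rise_cm * 10 / years
print(f"~{rate_mm_per_year:.2f} mm per year")   # ≈ 0.06 mm/yr
```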

Image Source: Scott et al. (2019)

Image Source: Stewart et al. (2019)

Surprise JULY SNOW Falls In Poland… June Temperature Trends See No Rise Across Canada, Iceland


“Snow in July – this surprised everyone. We remember times when it fell in April or even in May, but not during summer vacation.”

By Kirye
and Pierre

The site Ice Age Now reported here earlier this week that Poland saw “snow and record low temperatures” in July.

According to Polish sources, there was fresh snow on the highest peaks of the Tatras, and the temperature fell below zero (-0.2°C).

June mean temperatures see no rise in Canada, Iceland

While Europe saw some record heat in June, temperatures have since fallen considerably, with many regions reporting well-below normal readings.

Elsewhere over the northern hemisphere, June temperature trends show a decline over the past two decades or more.

Taken as a whole, 9 temperature stations scattered across Canada show June mean temperatures have not increased in 25 years. Chart: Kirye. Data:

Two of three stations in Iceland also show no warming. This is hardly what one would expect to find when looking behind the alarmist headlines and claims coming from global warming media and activists.

2/3 Iceland stations for June show no warming trend since 2000. Chart by Kirye. Data:
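For readers who want to reproduce this kind of check themselves, here is a minimal sketch of the underlying trend calculation. The station series below is hypothetical; the charts above use the stations’ actual records.

```python
# Minimal sketch of the kind of check behind such charts: fit a linear trend
# to June mean temperatures for one station. The series below is hypothetical.
import numpy as np

years = np.arange(1995, 2020)                       # 25 Junes
# made-up June means (°C) with year-to-year noise but no underlying trend
rng = np.random.default_rng(0)
june_means = 14.0 + rng.normal(0.0, 0.8, size=years.size)

slope, intercept = np.polyfit(years, june_means, 1)  # slope in °C per year
print(f"trend: {slope * 10:+.2f} °C per decade")
# A trend indistinguishable from zero is what "no rise in 25 years" refers to.
```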

When it’s hot, the activists are loud. But when cold offsets it all, suddenly you can hear a pin drop.

As the saying goes regarding climate science: When it’s warm and stormy, it’s climate. But when it’s cool and calm, it’s just weather.


Alarmists Red-Faced As Satellite Image Analyses Show Globe Has Greened Markedly Over Past 4 Decades


German climate science skeptic Michael Kruger of Science Skeptical here writes that the earth has become GREENER and more fertile due to more CO2 and warming.

Source: Zhu et al

Hard to believe, but the earth is not turning into an ever more arid desert due to the rising CO2 in the atmosphere, as alarmist scientists and the media would have us believe. Rather, it is becoming greener and more fertile.

This is what scientists have found through the analysis of satellite data over the last four decades.

A study published in 2016 in Nature Climate Change proves that the earth has become considerably greener over the past decades.

For their study, the researchers led by Zaichun Zhu evaluated vegetation data recorded by three satellites between 1982 and 2009. The evaluation showed that since 1982, the plant world has become more luxuriant and thus greener on a large part of terrestrial land surfaces.

Area “twice the size of the USA”

“The biggest greening trends can be seen in the southeast of North America, in the northern Amazon region, in Europe, Central Africa and Southeast Asia,” said Zhu and his colleagues.  “This greening, which we have observed, is comparable in scale to an additional green continent twice the size of the USA,” says Zhu.

To find out exactly what is responsible for this increase in plant material, the scientists fed ten global ecosystem models with data on greenhouse gas emissions, land use and the development of climate factors such as temperature and precipitation. The result: 70% of the earth’s greening is due to the fertilizing effect of rising CO2 levels, and the remaining 30% to other factors such as climate change, nitrogen deposition and changes in land cover.

In the high latitudes, and in Tibet and other high mountain regions, for example, the rise in temperatures is responsible for the vegetation there becoming more luxuriant. “Warming promotes photosynthesis and prolongs the growing season,” the researchers explain.

Increasing precipitation in the Sahel and South Africa

In the Sahel and South Africa, on the other hand, increasing precipitation is becoming noticeable. This makes the region more fertile and greener.

The rise in CO2 emissions and climate change therefore favor the greening of the earth and plant growth. Even Syria has greened.

The earth has become greener over the past 4 decades. This is the main conclusion of an international study published in Nature Climate Change on 25 April 2016. In 40 percent of the world’s regions, a significant increase in leaf biomass was observed between 1982 and 2015, while only 4 percent showed significant losses of vegetation. The added vegetation corresponds to a green continent twice the size of the USA.

The desert regions have also become greener, such as the Sahel on the border with the Sahara, the Fertile Crescent, which stretches across Turkey, Syria, Iraq and Iran, and the former region of Carthage in North Africa, which used to be the granary of Ancient Rome.

These areas were already green and fertile in the climate optimum of the Holocene directly after the last ice age. From there, in the course of the Neolithic revolution, agriculture spread to Europe and Northern Europe.


The Sahel region has been greening for four decades.

This has been shown by a variety of studies.

Sahara shrinks by over 700,000 sq. km.

In 2018, Venter et al. recorded an eight percent increase in woody vegetation in sub-Saharan Africa over the last three decades using satellite imagery.

According to Wikipedia, the Sahara covers an area of around 9.2 million square kilometers. Eight percent of this corresponds to more than 700,000 square kilometers. This is an area almost as large as Germany and France together!
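The “more than 700,000 square kilometers” figure is simply the eight percent applied to the cited Sahara area:

```python
# Arithmetic behind the figure quoted above.
sahara_area_km2 = 9_200_000     # area cited from Wikipedia
increase_fraction = 0.08        # eight percent, per Venter et al. 2018

print(f"{sahara_area_km2 * increase_fraction:,.0f} km²")   # 736,000 km²
```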

Lake Chad is growing

Even Lake Chad at the south edge of the Sahara is growing again and getting greener.

A greener Europe

Germany, too, has not become more arid over the last four decades, as the media and climate impact researchers have recently claimed. Quite the contrary:


Above all the north, the east, the low mountain regions and the Alps have become greener, as the satellite data show. The forest area is also growing in Germany, which is over 30% wooded. Between 1992 and 2008, the forest area in Germany “grew by an average of 176 square kilometers per year”.


Prominent Veteran Meteorologist Pleads For Moderation In The Climate Debate, Slams Both Hardcore Alarmists, “Deniers”


Averting a bloody climate science civil war

It’s time to take the debate away from the extreme factions and to move it to cooler heads, says veteran Swiss meteorologist Jörg Kachelmann

Former German public television meteorologist Jörg Kachelmann wrote a commentary for the Swiss online weekly Die Weltwoche here, titled “Tell me where you stand”, in which he compares today’s white-hot intolerance in climate science to the political intolerance witnessed in former communist East Germany.

“Religious furor”

In East Germany, either you adhered to the state’s hard communist doctrine, or you were the enemy. Such is the atmosphere we see today in climate science. Kachelmann writes: “Again today it’s either-or, and nothing in between.”

“Debates are important, and we must conduct them with arguments and without the religious furor that is being practiced today by both sides.”

The veteran meteorologist describes how on the climate issue “there are two irreconcilable camps”: the climate deniers and the climate hysterics – each arrogantly claiming “infallibility”. It is no longer possible to stand in between; those who try get insulted and attacked by both sides.

Natural sciences at schools have been “almost completely gutted”

While Kachelmann writes that the “deniers” are mostly older, right-wing persons, the alarmists are made up mostly of “ugly young people” who are the products of school systems that have had their “natural science subjects almost completely gutted out”.

“These people don’t need graphs, but rather the godlike feeling of being on the correct side, which is why you don’t have to be so precise with facts,” Kachelmann comments.

“Today, the divining militancy of both sides is preventing a serious debate on priorities,” he says. “The green and brown nuts try to recruit for their political advantage those who have a big opinion on the subject, but no idea about it.”

Academic (green) supremacists

The militant intolerance Kachelmann describes is illustrated by a recent Twitter comment from Potsdam scientist Stefan Rahmstorf, reacting to German parliamentarian Philipp Lengsfeld, who had earlier tweeted about a “remarkable” statement recently published by 90 leading Italian scientists challenging alarmist climate science and the claims of consensus.

Lengsfeld wrote:

Remarkable statement by scientists in Italy. For my taste a bit too hard, but so is the climate debate now: heated.
From this I discovered a new site, by @NoTricksZone – a list of interesting studies. https://twitter.com/notrickszone/status/1146728105761038336?s=21 …”

 

That comment was obviously too much for Rahmstorf, who was unnerved that a parliamentarian would make such an observation. He not only attacks Lengsfeld and this site here, but also insults and slanders the 90 Italian scientists.

His Twitter comment in English:

It is particularly noteworthy when a member of parliament cannot distinguish a climate denier website from serious science and falls for a list of signatures by predominantly unprofessional and emeritus people – even more untrustworthy than the lung physicians.”

In Rahmstorf’s view, anyone who challenges the Potsdam alarmist climate position is an enemy of the climate state, and any scientist who questions the science is equivalent to a tobacco scientist.

Yes, it’s time to move the debate and the science over to moderate voices.

Jörg Kachelmann is an entrepreneur and a 40-year veteran meteorologist operating the site: www.kachelmannwetter.com.


Physicists: Clouds ‘Practically Control’ Climate, Whereas Human Warming Amounts To 0.01°C Per 100 Years


Two University of Turku (Finland) physicists have determined a) the climate’s sensitivity to a doubling of CO2 is 0.24°C, b) the human contribution to the warming of the past century is only about 0.01°C, c) the IPCC and climate modeling dramatically overestimate CO2’s climate impact, and d) variations in low cloud cover control the climate.

Cloud cover changes “explain the linear trend of global temperature” since the 1980s

In a new paper, O.M. Povrovsky of the Russian State Hydrometeorological University analyzes satellite-observed cloud cover changes during 1983-2009 and their relation to global temperature change.

Povrovsky found global and regional cloudiness decreased by 2-6% during these decades, and “the correlation coefficient between the global cloud series on the one hand and the global air and ocean surface temperature series on the other hand reaches values (–0.84) — (–0.86).”

Consequently, Povrovsky (2019) concluded changes in cloud cover explain not only the increasing global temperature during 1984-2009, but also the interannual variability.
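As a rough illustration of the statistic being quoted, the following sketch computes the same kind of correlation coefficient, but on made-up annual series standing in for the actual satellite cloud-cover and temperature records:

```python
# Minimal sketch of the correlation computation described above, using
# hypothetical annual series in place of the actual cloud and temperature data.
import numpy as np

years = np.arange(1983, 2010)
rng = np.random.default_rng(1)

# made-up series: cloud cover drifting down a few percent, temperature drifting up
cloud_cover = 65.0 - 0.15 * (years - years[0]) + rng.normal(0, 0.5, years.size)
temperature = 14.0 + 0.02 * (years - years[0]) + rng.normal(0, 0.05, years.size)

r = np.corrcoef(cloud_cover, temperature)[0, 1]
print(f"Pearson correlation: {r:+.2f}")   # strongly negative for these series
# Values around -0.85, as reported in the paper, indicate the two series
# move in opposite directions over the period considered.
```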

Anthropogenic climate change isn’t supported by experimental evidence

Dr. Jyrki Kauppinen was an expert reviewer for the IPCC’s last climate report (AR5, 2013).

In a comment to the IPCC overseers, Kauppinen strongly suggested the “experimental evidence for the very large sensitivity [to anthropogenic CO2 forcing] presented in the report” is missing (Kauppinen and Malmi, 2019).

In response, the IPCC overseers claimed experimental evidence could be found in the report’s Technical Summary.

But the Technical Summary merely contained references to computer models and non-validated assumptions. Kauppinen writes:

We do not consider computational results as experimental evidence. Especially the results obtained by climate models are questionable because the results are conflicting with each other.”

Upon examination of satellite data and cloud cover changes, Dr. Kauppinen concluded the IPCC’s claims of high climate sensitivity to CO2 forcing (2 to 5°C) are about ten times too high, and “the models fail to derive the influences of low cloud cover fraction on the global temperature.”
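The “about ten times too high” statement follows directly from the numbers quoted above:

```python
# Ratio of the cited IPCC sensitivity range to the paper's 0.24 °C estimate.
ipcc_range_degC = (2.0, 5.0)    # per CO2 doubling, as quoted above
paper_estimate_degC = 0.24      # per CO2 doubling, as quoted above

low, high = (x / paper_estimate_degC for x in ipcc_range_degC)
print(f"{low:.0f}x to {high:.0f}x")   # ≈ 8x to 21x, i.e. roughly one order of magnitude
```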

Evidence for natural climate change supported by satellite observations

When low cloud cover data from satellite observations are considered, a very clear correlation emerges.

As low cloud cover decreases, more solar radiation can be absorbed by the oceans rather than reflected back to space. Thus, decadal-scale decreases in low cloud cover elicit warming.

When cloud cover increases, cooling ensues.

In this manner, Kauppinen and Malmi (2019) find “low clouds practically control the global temperature,” which leaves “no room for the contribution of greenhouse gases i.e. anthropogenic forcing.”

In fact, Kauppinen and Malmi boldly conclude that the total warming contribution from anthropogenic CO2 emissions reached only 0.01°C during the last 100 years, which means “anthropogenic climate change does not exist in practice.”


Kauppinen and Malmi, 2019

No experimental evidence for the significant anthropogenic climate change

“The IPCC climate sensitivity is about one order of magnitude too high, because a strong negative feedback of the clouds is missing in climate models. If we pay attention to the fact that only a small part of the increased CO2 concentration is anthropogenic, we have to recognize that the anthropogenic climate change does not exist in practice. The major part of the extra CO2 is emitted from oceans [6], according to Henry‘s law. The low clouds practically control the global average temperature. During the last hundred years the temperature is increased about 0.1°C because of CO2. The human contribution was about 0.01°C.”
“We have proven that the GCM-models used in IPCC report AR5 cannot compute correctly the natural component included in the observed global temperature. The reason is that the models fail to derive the influences of low cloud cover fraction on the global temperature. A too small natural component results in a too large portion for the contribution of the greenhouse gases like carbon dioxide. That is why IPCC represents the climate sensitivity more than one order of magnitude larger than our sensitivity 0.24°C. Because the anthropogenic portion in the increased CO2 is less than 10 %, we have practically no anthropogenic climate change. The low clouds control mainly the global temperature.”

Image Source: Kauppinen and Malmi, 2019
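Taking the quoted figures at face value, the 0.01°C number is a simple multiplication of the two quantities the authors cite:

```python
# ~0.1 °C of warming attributed to CO2 over the last century, times the
# "less than 10 %" anthropogenic share of the CO2 increase asserted above.
co2_warming_degC = 0.1
anthropogenic_share = 0.10      # upper bound used in the quote

print(f"~{co2_warming_degC * anthropogenic_share:.2f} °C")   # 0.01 °C
```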
