NASA GISS Keeps Warming The Data, And Mysteriously Comes Out With 104 New Stations Going Back To 1882

By Kirye
and P Gosselin

It’s well known that scientists at NASA GISS have been rewriting the global temperature record by adjusting the data. Not surprisingly to many, much of the globe now appears to have been warming faster. This is quite remarkable.

One example is how cooling was transformed into warming in Australia.

What follows are the plots of six Australian stations that go back to the late 19th century. The comparison shows the plots of GISS Version 4 “unadjusted” data against the Version 4 “adj homogenized” data:


Before the homogenization, the unadjusted data from 4 of the 6 stations showed cooling. But after NASA GISS “adjusted” the data, the cooling disappeared and all 6 stations showed warming. This seems Orwellian, to say the least.

Now: a mysterious growth in number of stations going back to 1882

Strangely, when we looked at the NASA site last year, it showed only 6 stations with V4 unadj data from January 1882 to November 2019 for Australia, and only 325 stations globally; see the following screenshot from last year:

But when we look at the NASA GISS site TODAY, we suddenly find they have 17 stations showing V4 unadj data from January 1882 to November 2019 for Australia.

Where did the 11 new stations with such a long dataset suddenly come from?

Extra 104 stations going back to 1882

And globally, as readers may have already noticed above, NASA GISS seems to have found 104 new stations that go all the way back to 1882, now showing a total of 429! Why weren’t these visible last year?

Version 3 only 144 stations going back to 1882

Things get even stranger at NASA GISS. You’d think that Version 4 unadj data are derived from a Version 3 dataset, meaning there would have to be at least as many stations with Version 3 data as with Version 4. But surprise!

So what are we missing here? Maybe someone knows more about these mysterious new stations and can tell us what’s going on.

25 responses to “NASA GISS Keeps Warming The Data, And Mysteriously Comes Out With 104 New Stations Going Back To 1882”

  1. Weekly_rise

    Lots of confusion going on here. First, NASA does not compile the GHCN data; that is done by NOAA. NASA just has a tool on their website for finding GHCN station data (somewhat ironically, it is a much better tool than those on NOAA’s website, which may be why so many get confused). NOAA is the data steward; NASA takes NOAA’s data and uses it to compile their surface temperature analysis.

    Now, why do new versions of GHCN have new station records? Well, they just do. The documentation* for GHCN V4 states that they’ve added “many thousands” of new stations to the dataset. This happens for a number of reasons, but often it’s either because:

    a. They receive new station data from the provider (which I guess would be BOM for Australia).

    b. They digitize more station records.

    c. They determine that the quality of older station records is good enough to include in the database.

    As it stands, NOAA certainly does not have data for every station on earth, and it’s unlikely that they ever will. Every new update to the GHCN will contain new station records. No tomfoolery is happening, just good old fashioned incremental improvements.

    *Documentation here:

    1. S.K.

      Homogenizing the data is not good old fashioned incremental improvements, it’s propaganda.

      Sell your snake oil to someone else.

      In my opinion, NASA is losing credibility by the day.

      1. toorightmate

        Homogenization of raw temperature data is akin to winding back odometers EXCEPT the people who wind odometers back are more honorable.

      2. tom0mason

        Homogenization of temperature data is similar to blurring a picture to fill in missing parts of the image, then saying you’ve improved the resolution of the image.

        1. Weekly_rise

          Rather, homogenization is taking an image where half of the pixels are missing and using the existing pixels to try and get some idea of what the image might look like.
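          The pixel analogy can be made concrete with a toy sketch: estimating a station’s missing value from nearby stations by inverse-distance weighting. All distances and anomalies below are hypothetical, and this simple weighting is only an illustration of the idea, not the actual GISTEMP/GHCN infilling method.

```python
# Toy sketch of the "use the existing pixels to infer the missing ones"
# idea: estimate a station's missing monthly anomaly from nearby
# stations, giving closer stations more weight. Hypothetical numbers;
# NOT the actual GISTEMP/GHCN infilling scheme.

def infill(neighbours):
    """Estimate a missing value from (distance_km, anomaly_degC) pairs
    using inverse-distance weighting."""
    weights = [1.0 / d for d, _ in neighbours]
    values = [a for _, a in neighbours]
    return sum(w * a for w, a in zip(weights, values)) / sum(weights)

# Three hypothetical neighbouring stations at 100, 200 and 400 km
estimate = infill([(100.0, 0.4), (200.0, 0.6), (400.0, 0.2)])
```

          With these made-up numbers the estimate lands between the neighbouring values, weighted toward the closest station.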

          1. The Indomitable Snowman, Ph.D.


            But the analogy is useful and telling.

            “Homogenization” is indeed a lot like “image processing.”

            This type of “image processing”:


          2. Shoki Kaneda

            Your analogy is flawed, but to extend it…

            It is as if the interpolated pixels are only ever inserted in one spectrum extreme, red.

        2. tom0mason

          Glad you all enjoyed my analogy :-).
          You’ve all failed to convince me that it should be anything else, so I still feel it is the one closest to the truth.
          To cover over the parts that are missing they just blur the whole picture to get a smear, or bleed, of color — mostly red — all over the image.

          Whichever way you look at it, homogenization is a method to hide the truth: the truth that many records are truncated, show cooling, or have other ‘unhelpful’ features that the ‘modelers’ and data torturers dislike.

          Temperature homogenization is as useful as a chocolate coffee-pot for filling incomplete records with useful data (±0.1°C) 8-(

    2. Richard Greene

      Weekly rise

      You just typed a lot of words to say nothing meaningful.

      I guess your message is: “Just Trust NASA-GISS, They Are Experts”.

      A classic appeal to authority logical fallacy.

      Interesting that NASA-GISS “improvements” almost always increase the rate of global warming — their “adjustments” do not appear to be correcting random errors.

      The biggest NASA-GISS “adjustment” I know of is the change to the 1940 to 1975 period, originally reported as 0.3 to 0.5 degrees C. of global cooling.

      After many years of “adjustments”, NASA-GISS now conveniently claims no global cooling in that period, while CO2 levels rose — that’s a better fit with their favored ‘CO2 is the Climate Controller’ narrative.

      The biggest problem with surface temperature data is not the frequent “adjustments” that magically create more warming out of thin air.

      The biggest problem with surface numbers, by far, is the sparse surface coverage before World War II.

      The coverage of the Southern Hemisphere was near zero in the 1800s.

      Nothing could be more important than including more data for the Southern Hemisphere.

      But even with a few more “discovered” weather station records, the surface global average temperature is not fit for scientific analysis, due to excessive guessing (infilling) before 1979.

      Since 1979, we have had satellite temperature data with much better global coverage, with little infilling required, measured in a consistent environment, where the greenhouse gas effect occurs.

      The satellite methodology has the potential for better accuracy than surface data, because of much better global coverage and far less infilling.

      Your ‘just trust the government’ attitude lacks the skepticism needed for good science.

      This article presents some of that real science skepticism, without jumping to any conclusions.

      Your comment does jump to a ‘trust the government’ conclusion, but fails to persuade anyone why your conclusion should be believed.

      In plain English, you struck out here.

      Because this is not a pro-government, always-trust-the-authorities, gullible crowd of leftists who love appeal-to-authority comments!

      1. The Indomitable Snowman, Ph.D.

        Leftists only love appeals to “authority” when THEY are “the authority.”

        When they are not “the authority,” they drive around with bumperstickers that say “Question Authority.” (Remember those? Where’d they all go?)

        (The ghost of Joe Stalin approves their viewpoint.)

      2. Weekly_rise

        Richard, I am pointing out that the GHCN dataset is not produced by NASA. The article says things like “NASA GISS seems to have found 104 new stations that go all the way back to 1882,” but that is factually incorrect: NASA has nothing to do with whether NOAA adds new station records to the dataset.

        The NOAA has a station metadata repository online here:

        So it should be quite simple to go identify the 104 stations. In fact it took me about 10 minutes to identify all of the Australian stations. Here is one of them:

        The Coonabarabran Namoi Street station is in fact real and has been in operation since 1879:

        Being quite frank, I think if the aim of this article were merely to dispassionately report on the appearance of these stations in the updated GHCN dataset, this kind of basic review should have taken place before it was published.

        1. Richard Greene

          A dataset that desperately needs a lot more temperature data prior to World War II, especially in the 1800s, should have found those “lost” weather stations many decades ago.

          At the least, every global average temperature announcement should be accompanied by an announcement that pre-WWII data are for the Northern Hemisphere only, and all pre-1900 data should be truncated due to insufficient global coverage in BOTH hemispheres.

          Once again, your comment added no value to this website.

          The global average temperature estimates before 1979 are not fit for scientific analysis.

          It does not matter who collected the data.

          The planet has warmed since the Maunder Minimum period in the late 1600s.

          The claim that we know exactly how much warming there was since 1880 (or since 1850), to one decimal place, is nonsense.

          Sea surface temperature accuracy is likely to be even worse than the accuracy of land surface weather stations:

          Measuring ocean temperatures with buckets and thermometers for many decades, mainly in Northern Hemisphere shipping lanes, cannot possibly result in an accurate average temperature for the planet’s oceans.

          And then there were about five different methodologies for sea surface temperature measurements that were never compared to see if a change in methodology was creating a warming or cooling trend.

          Where is any study comparing all ocean temperature measurement methodologies used IN ONE PLACE, to determine the effect of repeatedly changing sea surface temperature measurement methodologies?

  2. bdgwx

    Massive confusion indeed. In addition to the above comment, GHCN-M provides the QCU (unadjusted) and QCF (adjusted) files. QCF applies pairwise homogenization to correct known biases like station moves, time-of-observation changes, instrument changes, etc., and even unknown biases, by comparing each site to its neighbors.
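    The idea behind that pairwise comparison can be sketched in a few lines. Everything below is hypothetical and drastically simplified; the real GHCN-M pairwise homogenization algorithm is far more involved (significance testing, many neighbours, multiple breakpoints).

```python
# Toy illustration of the idea behind pairwise homogenization: a step
# change that appears in a target station's record but NOT in a
# neighbour's record is treated as a non-climatic break (e.g. a station
# move) and removed. Hypothetical data; a drastic simplification of the
# real GHCN-M algorithm.

def detect_break(diff):
    """Return the index and size of the largest mean shift in the
    target-minus-neighbour difference series (a crude breakpoint test)."""
    best_idx, best_shift = None, 0.0
    for i in range(2, len(diff) - 2):
        before = sum(diff[:i]) / i
        after = sum(diff[i:]) / (len(diff) - i)
        if abs(after - before) > abs(best_shift):
            best_idx, best_shift = i, after - before
    return best_idx, best_shift

# Hypothetical annual anomalies (degC): the target jumps by about +1.0
# at index 5; the neighbour shows no such jump.
target    = [0.0, 0.1, -0.1, 0.0, 0.1, 1.1, 1.0, 1.2, 1.1, 1.0]
neighbour = [0.0, 0.1, -0.1, 0.0, 0.1, 0.1, 0.0, 0.2, 0.1, 0.0]

diff = [t - n for t, n in zip(target, neighbour)]
idx, shift = detect_break(diff)

# Remove the estimated non-climatic step from the post-break segment
adjusted = [t - shift if i >= idx else t for i, t in enumerate(target)]
```

    Because the step shows up in the difference series against the neighbour but not in the neighbour itself, it is attributed to the target station and subtracted out; the adjusted series then tracks the neighbour again.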

    You can actually download the source code GHCN uses here.

    You can view the station metadata here.

    Record digitization projects are still ongoing and new stations are constantly being added so continue to expect changes with the GHCN repository in the future.

    It is a good thing NASA is using the complete GHCN-M station inventory and the QCF (adjusted) file. We want that. If NASA did otherwise it would be unethical at best.

    BTW… NASA switched from GHCN v3 to GHCN v4 in 2019. This change increased the number of stations considered from about 7,500 to 27,000. So 104 additional stations is hardly noteworthy.

    1. ColA

      bdgwx, Sir,
      You should understand in better detail what the BoM call “neighbour” record sites. Australia is a very large, barren land. It is not uncommon for a neighbour site to be 500 km or more away from the site to be “adjusted”. That could be on the other side of a mountain range, adjusting a coastal record with desert data, etc. The BoM do not justify why a site is “adjusted”, or why the sites used for the “adjustment” are more correct than the site being “adjusted”!!
      Add to that the number of sites that do not comply with the BoM’s basic physical site requirements and you might start to see what a farce our Australian records are! The BoM refuse to release the homogenising algorithm or detail how it works, saying it is too complicated for us commoners to understand.
      Try this site:
      It is interesting that NOAA are using Australian records back to 1882; our BoM do not like to use records prior to 1910 – there are a number of “ANOMALIES” in the 1800s that make them ‘uncomfortable’! (They don’t fit the narrative.)

      1. Aussie

        For those who have not read the kenskingdom study, which analysed the siting of all BOM sites: it found a complete horror story of non-compliance with even the BOM’s own siting standards.

        Gauges in enclosed yards, next to bitumen roads and even one on a roof…

        This review showed that over HALF of the BOM’s sites seriously did not comply with their own standards, which are modelled on the WMO standards.

        This is a complete and utter farce. Garbage in, garbage out. And all the siting issues mean higher temperatures… Appalling, unprofessional, and if it occurred in company finances, auditors would close the company down until it could demonstrate proper practice.

        But in climate anything is OK as long as it supports the farcical “global warming” case.

  3. Chris Hanley

    I can’t find a concise definition of ‘homogenization’ in a quick search of the GISS website, so I’m relying on the Energy Matters site:
    “Homogenization is a process whereby raw temperature records in the same general area are adjusted by computer algorithms to match each other to within acceptable limits”.
    Sites like Cape Otway, Wilson’s Promontory, Yamba, Moruya are relatively isolated today let alone in 1880.
    These locations, staffed over time by responsible, diligent people, would seem to be ideal for retrieving reliable, unadulterated historical temperature data.
    The fact that all the ‘adjustments’ have the same net effect speaks for itself.

    1. The Indomitable Snowman, Ph.D.

      “Homogenization is a process whereby raw temperature records in the same general area are adjusted by computer algorithms to match the climate models.”


    2. Richard Greene

      Homogenization is like shaking up a carton of spoiled milk so it looks fresh, and then selling it for $5 as organic, non GMO, gluten free, “sour milk”, from cows fed fresh salad greens every day.

      In the climate business, homogenization hides raw data to such an extent that no one can ever verify that it makes sense!

  4. Scott

    Jo Nova will be able to help verify the Aussie data for you. The Aussie BOM have a history of warming the recent temperature data, and even deleted Australia’s heat record from the 1800s.

    And the Aussie BOM is where NOAA get their data from.

  5. RoHa

    Jennifer Marohasy can help as well.

  6. oebele bruinsma

    As our progressive elites try to re-write history, they also want to write the future.

  7. John Shewchuk

    Thanks for the great temperature comparison graphs. Now to get them shown on Fox News, etc. But Obama would say “nothing to see here” since John Holdren, Obama’s Climate Genius, has assured everyone that global warming creates more cold temperatures …

  8. Climate Mafia Downs The Satellites – Newsfeed Hasslefree Allsort

    […] Related: NASA GISS Keeps Warming The Data, And Mysteriously Comes Out With 104 New Stations Going Back To 188… […]

  9. 8 Longterm Australia Stations Altered From Warming To Cooling by NASA GISS. – Climate-

    […] In our last post, we examined how cooling was transformed into warming in Australia by NASA GISS. […]
