German geologist Dr. Sebastian Lüning and chemist Prof. Fritz Vahrenholt react to Anthony Watts’s press release concerning the quality of US temperature data at their blog.
Unlike myself (I expected an unsurvivable nuclear bomb to be dropped), Sebastian Lüning and Fritz Vahrenholt find the implications of the new findings to be potentially devastating. They write (excerpt):
Anthony Watts did not disappoint. Indeed, it is about the release of an important new publication concerning the warming trend of the last 30 years. We had already introduced the problem here at this blog a few days ago (see our blog article “The wonderful world of temperature data corrections: And suddenly the trend reversed…“). It comes down to the question of how justified the ‘corrections’ to the official data are. We had reported that the data changes oddly always produced a significant acceleration in warming with respect to the raw data, even though factors like the urban heat island effect intuitively suggest that the opposite correction is needed.
Authors in addition to Anthony Watts included Stephen McIntyre and John Christy. McIntyre is known for his impressive error analyses of the famous hockey stick diagram. Christy is a renowned expert on satellite temperature data at the University of Alabama in Huntsville. The manuscript will soon be submitted for publication to a journal. […]
[…] The result of the new study is shocking: Instead of correcting downwards temperatures that were heated by the urban heat island effect, the official US administration offices apparently corrected data from qualitatively reliable stations upwards, which appears to be unjustifiable. If the result is confirmed, then it would be a shocking development. The warming in the USA over recent years would be far less rapid than has always been assumed. And because similar errors are suspected elsewhere, the issue could quickly gain global relevance.”
I asked Dr. Lüning to comment more on why this could have global implications. He kindly replied by e-mail:
“Bit by bit, others who might jump onto the train will now use the same methodology worldwide and will probably find that it really affects the global curve.”
Lüning and Vahrenholt also provide the press release in German at their site.
Take it from them; they’re experts. This could very well have global consequences. One thing is sure: It’s an utter embarrassment for the NOAA, and the perceived reliability of US and international data will be in question for years to come. Faith in the surface station data is crumbling.
It seems that anyone who gains a large enough following in the climate debates immediately comes down with Publicus Hubristicus: inordinate pride in themselves, and a blatant and irritating partisan prejudice that publicly trumpets any widely recognized news that agrees with them. I don’t see any mention of Steven Goddard’s ongoing clear exposé of the blatant data manipulation, so evidently his work doesn’t get their attention as Watts’s work now has. I blame all of this on the suborning of climate science by our divided politics, as I just commented on in your last post about self-deluded Australian shrinks attacking climate-consensus skeptics:
Politics Spreading the Delusion: A Case in Point
I’m not sure I agree 100% here. Steve Goddard is widely read and is often featured at Climate Depot, for example. I visit his site at least twice a day. Steve provides lots of historical anecdotes that are evidence that climate is not really different today. However, we have to recognise that science doesn’t progress one anecdote at a time, but one formal paper at a time. Anthony has presented an impressive paper.
Anthony deserves all the credit he’s received. The guy has so many things going it boggles the mind. The whole surface station project is really impressive. Not even the US government could have accomplished that without blowing billions. Anthony got it done on a shoestring! Where would we be without Watts and McIntyre?
In fact we cited a lot of Steve Goddard’s fantastic work in our recent overview of temperature corrections: http://www.kaltesonne.de/?p=4743
Steve’s work is highly regarded and appreciated!
It would have been nice if the canals on Mars had really existed, but they disappeared with better observations. Even Lubos Motl, in his blog post of today, does not appreciate what Lüning and Vahrenholt apparently see much better: the crucial aspect of the study by Watts et al. It is not a silly contest between Watts’s and Muller’s credentials, or a contest over whether things are more or less bad than we thought. Neither is it about a new trend slope for the USA, which is only a small part of the world, so that the study would add little to the global result. It would have been more constructive if Motl had suggested that this study should be replicated in another big country. Why? Not because we should cover somewhat more of the world in a war lasting for decades. Scientifically, the war can be over within one or two years.
This study (if accepted) shows that there is a systematic relationship between the quality of measurement and data handling on the one side and the results on the other.
I can understand that physicists do not see this, because they know little about research problems in the biological, medical, and social sciences. If a meta-analysis finds that effect sizes depend on the quality of the research, that relationship is considered devastating. We need a replication study, because if the relationship is found again, AGW is almost certainly based on fakery. This study is really a game changer.
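To make that point concrete, here is a minimal sketch (all station names, trend values and group sizes invented) of the kind of check being described: testing whether the estimated warming trend depends systematically on measurement quality, analogous to checking whether effect size depends on study quality in a meta-analysis.

```python
import numpy as np
from scipy import stats

# Hypothetical per-station warming trends (degC per decade), grouped by
# the siting-quality class each station was assigned. All values invented.
trends_good_sites = np.array([0.14, 0.18, 0.11, 0.20, 0.16, 0.13])
trends_poor_sites = np.array([0.29, 0.34, 0.26, 0.31, 0.38, 0.27])

# If measurement quality had no systematic effect on the result, the two
# groups should show indistinguishable mean trends.
t_stat, p_value = stats.ttest_ind(trends_good_sites, trends_poor_sites,
                                  equal_var=False)
print(f"difference in mean trend: "
      f"{trends_poor_sites.mean() - trends_good_sites.mean():+.2f} degC/decade "
      f"(p = {p_value:.3f})")
```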
Agree about Lubos. He has a good understanding of mathematics and likely string theory, but as a physicist he has little experience in actual measurement and the analysis of errors. While he is skeptical of AGW, he has demonstrated that he really does not understand the engineering discipline of heat & mass transfer. It will be clear to anyone who has blown on their hot soup to cool it that radiation is not the only heat transfer mechanism. Strangely, for an expert in string theory he has little to say about gravity, which comes into the lapse rate.
Getting back to the Watts paper, it is useful to point out that temperature is not being measured consistently and that inappropriate adjustments have been made, but temperature is only one aspect. Measurements need to be made of humidity, incoming radiation, cloud cover, precipitation, evaporation, and wind velocity & direction at the same time. The work of W. Kreutz (1941) should be replicated with automated stations around the world. One of the outcomes will be that temperature lags incoming radiation and that CO2 lags temperature on a daily and seasonal basis, i.e. CO2 gives no feedback.
As far as I can see the main novelty is that the weather station classification scheme of Leroy (2010) is better than Leroy (1999).
It would have been more elegant if Watts had stated in his press release that the differences in temperature trends he found between stations of various quality classes are only visible in the raw data. In the homogenized (adjusted) data the trends are about the same for all quality classes. No more sign of errors due to the urban heat island.
In the press release and this blog it is also emphasized that the temperature trend after homogenization is stronger than in the raw data. That the trend is stronger in the homogenized data is no surprise, the transition to automatic weather stations during the study period has caused an artificial cooling in the raw data. Maybe Mr Watts thinks this is new, but, e.g., Menne et al. (2009) already stated that the introduction of automatic weather stations (the transition from Liquid in Glass thermometers to the maximum–minimum temperature system) caused a temperature decrease in the raw data of 0.3 to 0.4 °C. This temperature jump has to be and was removed by homogenization.
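As an aside for readers unfamiliar with what such an adjustment looks like in practice, below is a minimal sketch of removing a documented step change such as a LiG-to-MMTS instrument swap from a single station series. This is an illustration only, with invented numbers; it is not NOAA’s actual pairwise homogenization algorithm.

```python
import numpy as np

def adjust_documented_break(series, break_index):
    """Shift the pre-break segment so its mean matches the post-break mean.
    Illustration only: operational homogenization (e.g. pairwise comparison
    against neighbouring stations) is considerably more involved."""
    x = np.asarray(series, dtype=float)
    jump = x[break_index:].mean() - x[:break_index].mean()
    adjusted = x.copy()
    adjusted[:break_index] += jump   # adjust the older data to match the newer
    return adjusted, jump

# Toy series: a 0.35 degC drop when an MMTS sensor replaces a LiG thermometer.
rng = np.random.default_rng(0)
raw = np.concatenate([15.0 + rng.normal(0, 0.2, 50),           # LiG era
                      15.0 - 0.35 + rng.normal(0, 0.2, 50)])    # MMTS era
homogenized, estimated_jump = adjust_documented_break(raw, 50)
print(f"estimated jump: {estimated_jump:+.2f} degC")
```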
The Urban Heat Island effect is best known in the blogosphere, but many other inhomogeneities are more important, and many of them have the reverse sign.
For a somewhat more detailed review, please visit my blog.
The problem with homogenization is that this procedure can be found nowhere in the professional statistics literature, nor can the Mannian version of principal components analysis. If we start with some faith, we may homogenize a million data points on NASA photographs, giving us those funny Mars canals back. Someone has to tell me how the validity of this method should be judged by outsiders like me.
If you know so little about homogenization, why do you think you know that statisticians are not working on it? Knowing that something does not happen requires a lot of expertise. Where do you get your information from?
Two statisticians currently working on homogenization of climate data are Philippe Naveau and Robert Lund. Hawkins (a mathematician) has performed great work on homogenization. Most statisticians are only interested in the single-breakpoint problem, which allows you to prove nice theorems, but Hawkins (1972) actually worked on the climatologically more important multiple-breakpoint problem.
The problem of homogenization does not lie in the statistics. That part is easy. Designing a specific implementation for climatology requires expertise about the statistical properties of the climate and the inhomogeneities, which a statistician does not have. It is thus logical that climatologists develop their own homogenization methods based on standard statistics. These methods have been validated in a blind benchmarking study. This study found that homogenization improves the accuracy of the temperature data and does not introduce artificial trends, as you may have expected.
Hawkins, D. M.: On the choice of segments in piecewise approximation, J. Inst. Math. Appl., 9, 250–256, doi:10.1093/imamat/9.2.250, 1972.
Hawkins, D. M.: Testing a sequence of observations for a shift in location, J. Amer. Stat. Assoc., 72, 180-186, 1977.
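For readers wondering what the “easy” statistics part looks like, here is a minimal sketch of single-breakpoint detection on a candidate-minus-neighbour difference series: scan every split point and keep the one with the largest shift in mean. This is a simplified illustration, not any specific operational method, and real homogenization must handle the multiple-breakpoint case discussed above.

```python
import numpy as np

def most_likely_breakpoint(diff_series):
    """Scan all splits of a candidate-minus-neighbour difference series and
    return the index where the shift in mean is most pronounced, measured
    with a simple two-sample t-statistic (single-breakpoint case only)."""
    x = np.asarray(diff_series, dtype=float)
    best_k, best_t = None, 0.0
    for k in range(2, len(x) - 2):               # at least 2 points per segment
        a, b = x[:k], x[k:]
        se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
        t = abs(a.mean() - b.mean()) / se
        if t > best_t:
            best_k, best_t = k, t
    return best_k, best_t

# Toy example: a 0.4 step inserted at position 60 of a noisy difference series.
rng = np.random.default_rng(1)
diff = np.concatenate([rng.normal(0.0, 0.1, 60), rng.normal(0.4, 0.1, 40)])
print(most_likely_breakpoint(diff))              # breakpoint found near index 60
```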
“the introduction of automatic weather stations (the transition from Liquid in Glass thermometers to the maximum–minimum temperature system)”
Do you want to imply that people were not able to record maximum/minimum temperatures before they had automatic weather stations? Or that it was not possible to record max/min temps with “liquid in glass” thermometers?
Dear DirkH. No, they are just called that way in the USA. It makes it easier to look for more information if you know the right terms.
The Liquid in Glass thermometers are minimum and maximum thermometers.
The automatic weather stations of the voluntary network of NOAA are called MMTS.
Pierre
You are spot on. The Watts et al. paper has involved an inordinate amount of effort, and not a little heartache when NOAA/NASA tried to sabotage it at the intermediate stage. Now one can see why. This is a very large smear on the reputation of US government agencies and their data. I hope they come clean soon but will not be holding my breath.
This could be the BBC’s response to the paper:
http://www.bbc.co.uk/news/science-environment-19047501
When Muller now says “Call me a converted skeptic”, we can say that this is a false-flag operation, since he earlier said:
“It is ironic if some people treat me as a traitor, since I was never a skeptic — only a scientific skeptic,” he said in a recent email exchange with The Huffington Post. “Some people called me a skeptic because in my best-seller ‘Physics for Future Presidents’ I had drawn attention to the numerous scientific errors in the movie ‘An Inconvenient Truth.’ But I never felt that pointing out mistakes qualified me to be called a climate skeptic.”
http://suyts.wordpress.com/2012/07/29/muller-never-was-a-skeptic/
Muller is a publicity-hungry opportunist who speaks out of both ends of his mouth.
Muller is an impostor. He wrote in the Opinion Pages of the NYT on 28 July 2012: “My total turnaround, in such a short time, is the result of careful and objective analysis by the Berkeley Earth Surface Temperature project, which I founded with my daughter Elizabeth (…). Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases”.
So he tries to sell his greenhouse-gas opinion under the heading of a careful and objective analysis. A good treatment of this was given by Monckton:
http://joannenova.com.au/2012/07/muller/
Monckton wrote: “The greatest error in the Berkeley team’s conclusion is in Dr. Müller’s assertion that the cause of all the warming since 1750 is Man. His stated reason for this conclusion is this: “Our result is based simply on the close agreement between the shape of the observed temperature rise and the known greenhouse gas increase.” No Classically trained scientist could ever have uttered such a lamentable sentence in good conscience. For Dr. Müller here perpetrates a spectacular instance of the ancient logical fallacy known as the argument from false cause – post hoc, ergo propter hoc”.
The latter sentence can be understood by reading the article about the ‘close agreement’ between the number of storks around Berlin and the number of home births (not in hospitals, because these keep their windows closed, preventing the stork from delivering his present):
http://www.psychologytoday.com/blog/perfect-health-diet/201203/the-theory-the-stork-rises-again
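For what it is worth, the fallacy is easy to demonstrate numerically: any two series that both rise over the same period will show “close agreement in shape”. A minimal sketch with entirely invented numbers, nothing to do with the Berkeley Earth analysis itself:

```python
import numpy as np

# Two quantities with no causal connection that both happen to rise 1960-2010.
years = np.arange(1960, 2011)
co2_like = 315 + 1.5 * (years - 1960)                 # smooth linear rise
unrelated = np.maximum(0, years - 1985) ** 2          # takes off after 1985

r = np.corrcoef(co2_like, unrelated)[0, 1]
print(f"correlation of two unrelated rising curves: r = {r:.2f}")
# r comes out strongly positive (about 0.8 here): close agreement between
# the shapes of two trending series is not, by itself, evidence of causation.
```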
Now we have one single paper to refer to when we tell the fake Global Warmers to SHUT UP!
And we have an interesting review of Watts et al. to refer to when you tell rational people to SHUT UP!
http://www.skepticalscience.com/watts_new_paper_critique.html
NOAA manipulations seem similar to those being considered in New Zealand.
Although the NZ case was well underway before the Watts paper.
Both will be significant in terms of global effect.
While NZ is a small country, it represents a biggish chunk of the South Pacific.
[…] "Bit by bit others who might jump onto the train will now use the same methodology worldwide and will probably find that it really affects the global curve.” https://notrickszone.com/2012/07/30/reaction-from-germany-on-wattss-press-release-shocking-developmen… […]