Now that the preliminary draft of the IPCC Assessment Report 5 (AR5) is available online, thanks to Alec Rawls, readers have had the chance to analyze it in detail.
Michael Krüger at Readers Edition here takes a look at some of the charts and graphics for temperature and sea level development, and compares the IPCC charts to those from alarmist Stefan Rahmstorf.
As Krüger shows, scientists have very different perspectives, and anything but consensus exists:
IPCC Draft vs Rahmstorf’s Manipulations
Stefan Rahmstorf, one of the most well-known and influential climate scientists in Germany, and a lead author for the last IPCC report, announced at his KlimaLounge blog in 2008 that the temperature development at the time was at the upper limit of the IPCC scenarios:
Chart from Stefan Rahmstorf: Global annual mean temperatures using datasets from NASA (red) and the Hadley Center (blue) through 2007. The heavy line is the 11-year smoothing. The gray area with the dashed lines beginning in 1990 depicts the range of the IPCC scenarios. Here the temperature development is at the upper range of the IPCC scenarios.
But much to our astonishment, the recent draft of the IPCC AR5 contains a completely different graphic. In this chart the temperature development is not in the upper range of the scenarios; rather, it is barely within the lower range of the IPCC scenarios:
Temperature development depicted in the draft of the coming IPCC AR5. The scenario ranges from the four previous IPCC reports are shown along with the observed temperature development, depicted by black dots (with error bars). It is easy to see that the current temperature development lies in the lower range of the IPCC scenarios and that the temperature increase has not accelerated in recent years.
The same is true with sea level rise. Stefan Rahmstorf announced at his KlimaLounge blog in 2011 that sea level rise had accelerated and that it was in the far upper range of the IPCC scenarios:
Figure from Stefan Rahmstorf: Global sea level relative to 1990 in tide gauge data (red) and satellite data (blue), compared to the scenarios of the IPCC (gray area and dashed lines).
But now take a look at the observed data and scenarios shown in the draft of the upcoming IPCC AR5. It shows the opposite is true:
Sea level development depicted in the draft of the upcoming IPCC AR5: The overall range of the four previous IPCC reports is shown along with the actual observed sea level (black dots with error bars). Clearly the current sea level rise has not accelerated and is in the lower range of the IPCC scenarios.
Summary: These examples show that data can be bent one way or the other. Some climate scientists, such as Mr. Rahmstorf, intentionally use such methods in order to suggest to readers that we are in the middle of a crisis. And the climate protection faithful are ready and willing to believe it. I believe the charts in the draft of the IPCC AR5, which did not succumb to alarmism.
By the way, within climate protection circles, the new graphics of the IPCC are already being discredited as the work of skeptics. It’ll be interesting to see what changes will be made in the final draft of the IPCC report.
I expect to see both graphics disappear entirely.
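How can the very same measurements sit at the top of one scenario envelope and near the bottom of another? A large part of the answer is the choice of how observations and projections are aligned at the start year. Here is a minimal sketch of the effect, with toy numbers and a hypothetical align helper that come from neither chart:

```python
# Toy illustration (invented numbers): how the alignment choice moves the
# same observations from "below the projections" to "above" them.

obs      = [0.22, 0.10, 0.13, 0.25, 0.24, 0.28]  # observed anomalies, deg C
proj_mid = [0.10, 0.12, 0.14, 0.16, 0.18, 0.20]  # midline of a scenario envelope

def align(series, reference):
    """Shift a series so that its first value equals `reference`."""
    offset = reference - series[0]
    return [round(v + offset, 2) for v in series]

# Pin the projection to the single (possibly anomalous) start-year value:
print(align(proj_mid, obs[0]))            # ends at 0.32 -> observations (0.28) finish below
# Pin it instead to a multi-year mean of the early observations:
print(align(proj_mid, sum(obs[:3]) / 3))  # ends at 0.25 -> observations (0.28) finish above
```

Neither choice changes a single data point; only the visual verdict changes.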
They just need to offset their old projections again. They don’t care about the absolute “average temperature” (a meaningless construct) anyway.
Schellnhuber said 15.3 deg C in 2009. He knew the average temperature of the planet to one decimal place.
The IPCC said 14.5 deg C in 2007.
Nobody in the “scientific” institutions ever bothered to explain this difference.
Because they just don’t care.
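A minimal sketch (made-up numbers, hypothetical helper function) of why a 0.8 deg C disagreement in the absolute mean can hide in plain sight: once each series is expressed as an anomaly against its own baseline, the published curves become indistinguishable:

```python
# Invented absolute global-mean series, deg C: one pegged near the IPCC's
# 14.5, one near Schellnhuber's 15.3. The 0.8 deg C gap never shows up in
# the anomaly plots that are actually published.

ipcc_style         = [14.5, 14.6, 14.7, 14.9]
schellnhuber_style = [15.3, 15.4, 15.5, 15.7]

def to_anomaly(series, base_index=0):
    """Re-express a series relative to a chosen baseline value."""
    base = series[base_index]
    return [round(t - base, 2) for t in series]

print(to_anomaly(ipcc_style))          # [0.0, 0.1, 0.2, 0.4]
print(to_anomaly(schellnhuber_style))  # [0.0, 0.1, 0.2, 0.4]  <- identical
```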
Have you seen Donna’s blog article “Who is Hans Joachim Schellnhuber?”
http://nofrakkingconsensus.com/2012/12/21/who-is-hans-joachim-schellnhuber/
—————
According to his CV, Hans Joachim (John) Schellnhuber earned his PhD in theoretical physics in 1980 and spent the next eight years as a postdoctoral fellow, an assistant professor, and a visiting professor. In 1988 he landed a job as a full professor.
…
We’re advised that he is a member of the editorial boards of six scientific journals and that he has taught at a number of universities. But the bulk of Schellnhuber’s time isn’t spent in a physics lab. Rather, for at least two decades, he has been an administrator, a bureaucrat, and a political advisor.
…
Despite the long list of pompous pronouncements in that Guardian piece, not once did Schellnhuber explain why his physics training gives him special insight into economic, social, or political questions.
…
When Schellnhuber looks at the IPCC, he sees perfection. He sees a thoroughness that hundreds of other people say doesn’t exist.
———
The whole article is well worth reading. And do follow the link to Bishop Hill.
If you want a good laugh try this:
These are “clearly not freak events,” but “systemic changes,” said climate scientist Stefan Rahmstorf of the Potsdam Institute in Germany. “With all the extremes that, really, every year in the last 10 years have struck different parts of the globe, more and more people absolutely realize that climate change is here and already hitting us.”
http://news.slashdot.org/story/12/12/21/1552258/2012-another-record-setter-for-weather-fits-climate-forecasts
You’ll see from the comments that the Slashdot community is very pro-AGW…
Oi. Our IT intelligentsia doesn’t know the first thing about modelling.
Those who have some real experience in modelling physics, etc. are HORRIFIED not just by the wild, baseless claims made on the basis of Playstation Climatology; we also realize that the main models are a swill of uncontrolled software bits and pieces, apparently written largely by undergrads and junior postgrads who probably think that their software works because the compiler generates an executable program.
Comments in the code are, more often than not, left stale as modifications are made. Factors are introduced out of thin air, often as constants when they should notionally be inter-dependent variables. There is no bounds-checking on input data. No trapping of execution errors. No tracing of accumulating errors.
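For readers who don’t write code: below is a minimal sketch, in Python, of the defensive checks being described — bounds-checking of input, trapping of execution errors, and a running trace of accumulated error. Every name in it (validate_input, step_forward, the bounds) is invented for illustration and is taken from no real climate model:

```python
import math

# Hypothetical toy "model" loop showing the missing safeguards:
# input bounds-checking, error trapping, and error tracing.

MIN_TEMP_K, MAX_TEMP_K = 150.0, 350.0  # crude physical plausibility bounds

def validate_input(temps_k):
    """Bounds-check raw input instead of trusting it blindly."""
    for i, t in enumerate(temps_k):
        if not (MIN_TEMP_K <= t <= MAX_TEMP_K):
            raise ValueError(f"sample {i}: {t} K is outside physical bounds")
    return temps_k

def step_forward(temp_k, forcing_w_m2, sensitivity=0.8):
    """Toy update rule; `sensitivity` stands in for the kind of constant
    that, as argued above, ought to be an inter-dependent variable."""
    return temp_k + sensitivity * forcing_w_m2

def run(initial_temps_k, forcings, rel_tol=1e-6):
    state = validate_input(list(initial_temps_k))[-1]
    traced_error = 0.0                      # running round-off estimate
    for f in forcings:
        try:
            state = step_forward(state, f)  # trap, don't silently continue
        except (OverflowError, FloatingPointError) as exc:
            raise RuntimeError(f"execution error trapped: {exc}") from exc
        if math.isnan(state) or math.isinf(state):
            raise RuntimeError("non-finite state: aborting, not propagating garbage")
        traced_error += abs(state) * rel_tol  # trace the accumulating error
    return state, traced_error

print(run([288.0], [0.1, -0.2, 0.05]))  # (287.96, ~8.6e-4)
```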
Those “models” are then fed data which have been pre-mangled and in-filled, based on assumptions that predetermine the outcome.
Running the models on the newest supercomputers deifies their output.
GIGO: Garbage In; Grants Out.