Hamburg-Based Max Planck Institute for Meteorology: Aerosols Cool Less Than Previously Thought
By Sebastian Lüning, Fritz Vahrenholt
[Translated, edited by P Gosselin]
In our book “The Neglected Sun” we puzzled at length over the cooling effect assigned to aerosols in the climate models. Aerosols are tiny dust particles and droplets that scatter sunlight and thus, as a rule, act to cool the earth. But by how much? In Chapter 5 of our book we wrote:
According to the IPCC, the cooling effect of aerosols offsets about two thirds of the power of CO2. In the IPCC’s view, aerosols reduce the warming generated by all greenhouse gases by 45 percent. But the uncertainty is large – it could be 15 percent, or even 85 percent – because scientific “understanding of the relationships” is only modest to low.
Today very few are aware that the climate models generate far more warming than has actually been observed over the last 100 years. The IPCC’s strategy: all the surplus heat is cancelled out by aerosols until the models “fit”. This cooling wildcard is thus badly needed in order to maintain CO2’s high climate sensitivity.
In March 2015 we saw some progress in the aerosol discussion. One of the authors of the latest IPCC report claimed that the range of uncertainty concerning the effect of aerosols on climate had been greatly reduced thanks to new research findings, and in the meantime there has been much talk that the cooling potential of aerosols had indeed been significantly exaggerated in the past. The real cooling value actually lies at the lower limit of the range assumed up to now by the IPCC.
The most important and boldest claims come from a paper by Bjorn Stevens, one of the three directors at the Hamburg-based Max Planck Institute for Meteorology (MPIM), which appeared in the Journal of Climate. What follows is the paper’s abstract:
Rethinking the lower bound on aerosol radiative forcing
Based on research showing that in the case of a strong aerosol forcing, this forcing establishes itself early in the historical record, a simple model is constructed to explore the implications of a strongly negative aerosol forcing on the early (pre 1950) part of the instrumental record. This model, which contains terms representing both aerosol-radiation and aerosol-cloud interactions, well represents the known time history of aerosol radiative forcing, as well as the effect of the natural state on the strength of aerosol forcing. Model parameters, randomly drawn to represent uncertainty in understanding, demonstrate that a forcing more negative than −1.0 W m−2 is implausible, as it implies that none of the approximately 0.3 K temperature rise between 1850 and 1950 can be attributed to northern-hemispheric forcing. The individual terms of the model are interpreted in light of comprehensive modeling, constraints from observations, and physical understanding, to provide further support for the less negative ( −1.0 W m−2 ) lower bound. These findings suggest that aerosol radiative forcing is less negative and more certain than is commonly believed.
In general one should be careful not to overuse the word “sensational”. But here the word is most fitting. Surprisingly, the German media have been deathly quiet on this. A Google News search reveals that not a single article has been written about the paper. Undesirable news that the media prefer not to make public?
The implications of the paper were immediately recognized within the scientific community. On March 19, 2015, Nic Lewis explained its far-reaching implications at Steve McIntyre’s Climate Audit and Judith Curry’s Climate Etc.: the climate sensitivity is also further constrained, and most likely lies near the lower limit of the range given by the IPCC. Lewis’s calculations using the new Stevens value yield a most probable mean value for CO2 climate sensitivity (specifically the long-term “ECS”) of 1.45°C of warming for each doubling of CO2. The new overall range suggested by Lewis runs from 0.9 to 1.65°C per doubling of CO2. This is far below the IPCC’s latest range of 1.5 to 4.5°C per doubling of CO2.
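The logic behind Lewis’s recalculation can be sketched with the standard observational energy-budget ratio, ECS = F_2x × ΔT / (ΔF − ΔQ), where F_2x is the forcing from a CO2 doubling, ΔT the observed warming, ΔF the total forcing change, and ΔQ the change in planetary heat uptake. The numbers below are purely illustrative placeholders, not Lewis’s actual inputs; the point is only to show why a less negative aerosol forcing (a larger net ΔF) pushes the ECS estimate down:

```python
# Energy-budget ECS estimate (Gregory-style ratio); all inputs illustrative.
F_2X = 3.71  # W/m^2, canonical forcing for a doubling of CO2

def ecs(delta_t, delta_f, delta_q):
    """ECS = F_2x * dT / (dF - dQ), with dT in K and dF, dQ in W/m^2."""
    return F_2X * delta_t / (delta_f - delta_q)

# Hypothetical inputs: 0.8 K observed warming, 0.3 W/m^2 ocean heat uptake.
# Case 1: strongly negative aerosol forcing offsets more of the net forcing.
ecs_strong_aerosol = ecs(delta_t=0.8, delta_f=1.7, delta_q=0.3)
# Case 2: weaker aerosol cooling (a Stevens-style bound) leaves more net forcing.
ecs_weak_aerosol = ecs(delta_t=0.8, delta_f=2.3, delta_q=0.3)

print(round(ecs_strong_aerosol, 2))  # ~2.12 K per doubling
print(round(ecs_weak_aerosol, 2))    # ~1.48 K per doubling
```

With the same observed warming, moving ΔF from 1.7 to 2.3 W/m^2 drops the estimate from about 2.1°C to about 1.5°C per doubling, which is the direction of Lewis’s argument.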
Figure 1: Range of CO2 climate sensitivity according to calculations by Nic Lewis using the latest Stevens 2015 values. Source.
Bjorn Stevens was fully aware of the avalanche of reactions this would unleash. It is going to take a while before his IPCC colleagues get over their indigestion and allow the new findings to flow into their modeling work. Until that happens, it is best to avoid any media storm. The MPIM intentionally did not issue a press release to announce the paper. As the English-language media busily discussed the logical consequences of the paper, the MPIM in Hamburg eventually found it necessary to put out a statement. On April 2, 2015, Stevens declared that his paper addressed only aerosols and was not a suitable basis for speculation about CO2 sensitivity. With that he buys himself a little public peace – for the time being. However, the scientific community will not be able to dodge the consequences of the paper over the mid to long term.
37 responses to “Science Under Siege: Max Planck Institute Study Shows Climate Models Severely Overstate Warming”
Cue the attack-dogs… I hope Stevens has never had any fossil-fuel funding of his research.
Further to my comment above, I note in his letter of April 2nd that Stevens does re-state his belief in The Cause. Is anybody here familiar with Stevens’ writing style? To me, his letter seems very polished – it has a PR feel to it. Any views?
The authors write:
“Today very few are aware that the climate models generate far more warming than what we really produced over the last 100 years.”
I’d like to see proof of that. Because when I looked at one of GISS’s big models, I found it slightly understates the warming we’ve had. Those results:
OK, that’s one model out of 90. We will need to wait another ten years to see if that one is falsified.
Yes, it’s one model, but I doubt the big models will differ much.
Spencer’s graph is about the lower troposphere, not the surface. (Most people live on the surface.)
One could also ask why they used 5-year running means, instead of cumulative warming. The total warming from UAH’s LT dataset is now +0.51 C. That’s certainly not the impression given by Spencer’s graph.
Moreover, why are there no error bars or confidence limits on Spencer’s graph? It would never pass peer review as it is.
Why doesn’t Spencer *ever* give error bars on his data? I’ve asked for them several times, but AFAIK he’s never responded. He’s even said that whether he gives his data to 2 significant figures or 3 depends on how he feels.
“I’ve asked for them several times, but AFAIK he’s never responded”
Why would he ever bother responding to a non-entity like you? Your email would go straight to the garbage where it belongs.
Why would he waste his time responding to a non-entity like you ?
Seriously, your egotistical self-importance is quite sickening.
Well I notice that the “observations” used by GISS obviously are GISS’s own fiddled, forged temperatures, not e.g. the UAH or RSS series, which have the great advantage of not violating the Shannon theorem as much.
So. Not impressed here. GISS as reliable as the US CPI or as Pravda in Soviet Times.
“Well I notice that the “observations” used by GISS obviously are GISS’s own fiddled, forged temperatures, not e.g. the UAH or RSS series,”
Is this your approach to science?
So you want to compare models based on land data to temperature data from satellites, which do not measure that very thing?
sod, 15 April 2015 at 20:28:
“Is this your approach to science?”
Nope. It’s my approach to scoundrels.
GISS temperatures are adjusted to remove biases.
So are RSS’s and UAH’s. Their adjustment algorithms are even more complicated than GISS’s. See
Climate Algorithm Theoretical Basis Document (C-ATBD)
RSS Version 3.3 MSU/AMSU-A
Mean Layer Atmospheric Temperature
GISS is rewriting the past, pure and simple.
There is a big difference between adjustments for known scientific and technical issues, as in UAH and RSS, and the wholesale cooling of the whole of the past once-was-data to create a warming trend to forward an agenda.
The first is science (which DA ignores and does not understand).
The second is political and tantamount to fraud.
As said above, GISS is rewriting the past.
GISS must be using a model to compile the history data, which then, no wonder, confirms the data of the other model.
When the relative values change with respect to each other over several iterations, it invalidates the whole series.
GISS’s raw data are adjusted to remove biases.
So are the raw data from RSS and UAH. Their adjustment processes are even more complicated than for surface data. See
Climate Algorithm Theoretical Basis Document (C-ATBD)
RSS Version 3.3 MSU/AMSU-A
Mean Layer Atmospheric Temperature
Oh, David, you’re paranoid again. You think Pierre has inserted “Appell” into the spam list.
Try less caffeine. Known side effect is paranoia.
Of course, not many climate scientists think that ECS < 2 C. And not many agree with Lewis & Curry, which has some problems:
1) They did not use the latest numbers for ocean heat content, but only up to 2011. Changes since then have been large, especially for the 0-2000 m region:
2) They did not use Cowtan & Way's dataset, which many think is superior (because it uses lower tropospheric data to assist in the temperature calculation over large regions with no temperature stations). Since 2005, C&W has averaged +0.05 C warmer than HadCRUT4.
Note that Stevens himself wrote in his statement:
"However others have used my findings to suggest that Earth’s surface temperatures are rather insensitive to the concentration of atmospheric CO2. I do not believe that my work supports these suggestions, or inferences."
1. No, the change below 2000 m is tiny and insignificant. Even if it’s real, and not a modelled artefact or an effect of the buoys moving with warming currents, it is equivalent to some tiny, insignificant fraction of a degree.
2. Cowtan and Way has been busted. Only people from the cult pay any attention to it. It smears slightly warm temperatures over whole regions to which they don’t apply, just like Gavin does.
Proof has been hiding in plain sight that change to the level of atmospheric carbon dioxide (CO2) does not cause climate change. Only existing data and the relation between physics and math are needed or used.
The proof and identification of the two factors that do cause climate change are at http://agwunveiled.blogspot.com
Let us keep looking at this graph from your link, for a couple of years. Shall we?
The graph is also Figure 9 in the peer reviewed paper at Energy and Environment, vol. 25, No. 8, 1455-1471 so it will be available for more than ‘a couple years’.
Pierre, if you’re going to keep censoring legitimate comments, then the hell with this. You can keep your little bubble.
I’m not censoring. Your proneness to becoming obsessed with things is once again getting the better of you. I explained all this to you in an earlier comment. I’m not available to moderate 24 hours a day, and so readers often have to wait to get their comments approved. You’re no exception. If you want censorship, go to your friends at RC.
Yes, you’re censoring — at times my initial comments appear immediately, until you realize you’d better turn on moderation.
I don’t apologize for being obsessed with good science. So much of it here is bad.
You have no manners and obviously your upbringing was a failure. That was your last comment here.
Cry me a river. !
At least 50% of my comments go into auto-moderation.
Get over yourself, you are NOT important or special.
That’s so funny. He runs a blog himself, yet he is totally devoid of any knowledge of how stuff works. Just the right mind to get behind climate models, one would think.
And now he even managed to get booted out finally. David, start a Hollywood career, something like Dumb and Dumberer, team up with John Cook, he’s got the looks.
Does he actually have any visitors?
Or is it one of the “only me” ego blogs?
No, don’t post a link. !!
Davey reminds me of Lemongrab: ‘Unacceptabllllllllllllllllllllle!’
Paranoia, need destroyer. Paranoia, they destroy ya’ …
Amazingly, our toy troll army still defends GISS.
For your entertainment, all the blink comparators you need to understand the science of history rewriting:
Here’s a thought: NASA, a space agency that earns 1.2 bn USD in taxpayer money for “climate research”, produces two types of temperature products:
1) From an office in NYC, a fudged, adjusted, homogenized, extrapolated hodgepodge of land-based thermometer data… (which constantly changes its past)
2) Via satellites – which is NASA’s key domain, obviously – the UAH and RSS temperature products, which show a strikingly different behaviour.
Now.. which one of these get the big play in the media…
…why the imaginary product from NYC of course… something’s gotta make those 1.2 bn USD roll in every year after all…
…while NASA keeps a close eye on the real climate via their satellites (Is the name “realclimate” an inside joke by Gavin? When there’s a REAL climate, does this not imply that there’s a fake one?)
This post inspires a too-long-delayed thank you from this lurker for your work.
For someone like me who has been unable to dig into the recent developments in aerosol forcing, this pulled things together nicely. (And, of course, thanks to Drs. Lüning and Vahrenholt.)
The MOST CONVINCING PROOF EVER that GREENHOUSE RADIATIVE FORCING is FALSE PHYSICS
I’ve shown below, using the AGW conjecture, that we should expect some areas on Earth to reach temperatures of over 100°C when the flux from the atmosphere is added to the Solar flux in Stefan Boltzmann (S-B) calculations. Hence the whole concept that such flux can be added is obviously false, and so too are the rest of the energy budget diagrams (K-T, NASA, IPCC etc) and of course the computer models.
It is obvious that in climatology physics courses they brainwash students, who end up like Joel Shore and Roy Spencer being adamant that we must add the back radiation to the solar radiation. That is shown to be wrong in my March 2012 paper and also by a professor of applied mathematics in “Mathematical Physics of Blackbody Radiation” written towards the end of that year.
Let me explain it simply:
The AGW proponents correctly estimate the mean solar radiation being absorbed by the surface as 168W/m^2. They then add 324W/m^2 of back radiation and deduct 102W/m^2 to allow for the simultaneous losses by evaporative cooling and conduction, convection etc. This gives them the “right” 390W/m^2 that coincides with 288K that they claim is the mean surface temperature. (Personally I think it’s closer to 10°C than 15°C.)
Now we need to understand that this 168W/m^2 of solar radiation is a 24-hour annual mean for an average location at a latitude of 45°(S or N) that is half way between the Equator and the relevant Pole. Even that location will receive a mean of twice that in 12 hours of average daylight with average cloud cover. Locations such as this in the far south of New Zealand have mean annual temperatures around 9°C or 10°C.
You need to remember that we start with the solar constant of 1360W/m^2 which is what the one location on Earth where the Sun is directly overhead would receive if there were no atmosphere. But, again on average, there is about 30% reflection and 20% absorption which reduces that to about half. So, on an average day with average cloud cover at noon where the Sun is directly overhead, that location would receive half of that 1360W/m^2, namely 680W/m^2. But on a clear day there is only about 10% reflection, not 30% because two-thirds of the albedo is due to clouds. So that location receives about 70% of 1360W/m^2, namely about 950W/m^2.
However, the AGW proponents claim that an average location at 288K receives back from the atmosphere 83% (324/390) of the radiation it emits. I don’t dispute that. But the electromagnetic energy in that radiation from a cooler source is not converted to thermal energy in the surface. The AGW proponents say it is. Hence they add it to the solar radiation in S-B calculations.
Now, the solar radiation does not achieve the S-B temperature we might expect, for two reasons: first, there is simultaneous energy loss by non-radiative processes; second, there may not be enough time in the day for the solar radiation to reach the maximum temperature. However, if we deduct 200W/m^2 from that 950W/m^2 as a reasonable estimate for losses by non-radiative processes (which are only half that amount at 288K), the resulting 750W/m^2 does explain the observed temperatures, which have been recorded in the forties and fifties °C. But let’s just use 600W/m^2 (which has a blackbody temperature of 48°C, which is realistic), thus making an allowance for the limited time in the day.
But, if we were to now add 83% back radiation (that is, 83% of something like 600W/m^2 that would be emitted by regions like this on clear days) we get about 1100W/m^2 which of course gives ridiculously high temperatures in the vicinity of 100°C.
Hence it is obviously wrong to add back radiative flux to solar flux and use the total in Stefan Boltzmann calculations. And so the whole GH radiative forcing paradigm is wrong, as are those models, and that’s why you need to consider the totally different 21st century paradigm that is based on the Second Law of Thermodynamics.
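For what it’s worth, the flux-to-temperature conversions this comment quotes can be checked arithmetically with the Stefan-Boltzmann law, T = (F/σ)^(1/4). The sketch below verifies only the arithmetic of the quoted figures (390 W/m^2 at 288 K, 600 W/m^2 near 48°C, 1100 W/m^2 near 100°C), not the physical argument built on them:

```python
# Check the Stefan-Boltzmann flux -> temperature figures quoted in the comment.
SIGMA = 5.670e-8  # W m^-2 K^-4, Stefan-Boltzmann constant

def bb_temp_k(flux_w_m2):
    """Blackbody temperature in kelvin for a given radiative flux in W/m^2."""
    return (flux_w_m2 / SIGMA) ** 0.25

print(bb_temp_k(390))            # ~288 K, the claimed mean surface temperature
print(bb_temp_k(600) - 273.15)   # ~48 C, as stated above
print(bb_temp_k(1100) - 273.15)  # ~100 C, the "ridiculously high" figure
```

The three quoted flux/temperature pairs are at least internally consistent with the S-B law; whether the fluxes may be summed that way is exactly what the wider thread disputes.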
The abstract clearly states that models overestimate the negative forcing of aerosols, i.e. the cooling, not any warming. (“These findings suggest that aerosol radiative forcing is less negative and more certain than is commonly believed.”)
In my opinion that man is trying to sell his books.
Matthias. I fear you have no idea what this discussion is about. Please read again and think.