Three of the world’s most distinguished scientists, Professors Happer, Koonin and Lindzen, recently offered as evidence in a Californian court a “Climate science overview”. It begins:
Our overview of climate science is framed through four statements:
- The climate is always changing; changes like those of the past half-century are common in the geologic record, driven by powerful natural phenomena.
- Human influences on the climate are a small (1%) perturbation to natural energy flows.
- It is not possible to tell how much of the modest recent warming can be ascribed to human influences.
- There have been no detrimental changes observed in the most salient climate variables and today’s projections of future changes are highly uncertain.
The third statement is crucial to the Paris Agreement. There is little point in constraining human-caused global warming to an additional 0.5 °C if, during the same period, non-human-caused temperatures were to increase or reduce by 3 °C.
In a previous post, Unpacking Climate Alarm, I identified the two great uncertainties that lie at the heart of climate science. There are two simple prerequisites for the “dangerous anthropogenic global warming” hypothesis (DAGW) to apply:
- The Equilibrium Climate Sensitivity (ECS) temperature must be a high number (>3 °C).
- Natural Internal Variability (NIV) and Natural Forcings (NF) must sum to a low percentage (<50%).
Let’s start by looking at the available evidence for the second of these key postulates.
Natural variability
NIV encompasses all the known vagaries of nature, such as the El Niño Southern Oscillation (ENSO), the Atlantic Multidecadal Oscillation (AMO) and the Pacific Decadal Oscillation (PDO), as well as all the unknown unknowns (the “Unk-Unks”, as engineers call them). NF refers to known events that directly impact the earth’s ‘energy budget’, such as volcanoes or changes in solar irradiance. Taken together, NIV and NF are the sum total of all non-anthropogenic influences on the global mean surface temperature (GMST).
In the IPCC’s latest report, AR5 (Working Group 1 – the Physical Science Basis (WG1)), this statement is made in bold and highlighted in the Summary for Policy Makers (SPM):
Human influence has been detected in warming of the atmosphere and the ocean, in changes in the global water cycle, in reductions in snow and ice, in global mean sea level rise, and in changes in some climate extremes (see Figure SPM.6 and Table SPM.1). This evidence for human influence has grown since AR4. It is extremely likely that human influence has been the dominant cause of the observed warming since the mid-20th century. {10.3-10.6, 10.9}
This bullet then follows in standard font:
- It is extremely likely that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic increase in greenhouse gas concentrations and other anthropogenic forcings together. The best estimate of the human-induced contribution to warming is similar to the observed warming over this period. {10.3}
It is no surprise that the world’s press focused on these two paragraphs to the virtual exclusion of the other 1500+ pages of AR5. The really big news was that “extremely likely” had replaced AR4’s “very likely”, and that “more than half” and “dominant” had replaced the “majority” found in AR4.
Opinion, not science
A word search of Chapter 10, WG1, for the word ‘dominant’ in the referenced sections 10.3-10.6 scores no hits at all, but the phrase “more than half” is found in one section (10.3.1.1.3). After an extensive analysis of publications – some of which ascribe greater importance to natural variability and others less – the authors conclude:
Overall, given that the anthropogenic increase in GHGs likely caused 0.5 °C to 1.3 °C warming over 1951-2010, with other anthropogenic forcings probably contributing counteracting cooling, that the effects of natural forcings and natural internal variability are estimated to be small, and that well-constrained and robust estimates of net anthropogenic warming are substantially more than half the observed warming (Figure 10.4) we conclude that it is extremely likely that human activities caused more than half of the observed increase in GMST from 1951 to 2010 [emphasis added].
The conclusion in the paragraph is expressed as an opinion, not a scientific finding. Given that this is undoubtedly the most important paragraph in the entire AR5, let’s look more closely at the three arguments that the authors have selected from the slew of conflicting views in the literature.
- the anthropogenic increase in GHGs likely caused 0.5 °C to 1.3 °C warming over 1951-2010, with other anthropogenic forcings probably contributing counteracting cooling.
This appears to beg the question. The stated warming range is merely the modelled arithmetic result of assuming that ECS (the effect of doubling GHGs) is 1.5 – 4.5 °C. It is no more ‘likely’ than any other assumed ECS and has nothing to do with observations or other evidence. The level of counteracting cooling, mainly by anthropogenic aerosols, is unknown—and might be anything from 1% to 100% of the theoretical warming. There is nothing here to show that the net anthropogenic contribution was “more than half.”
- the effects of natural forcings and internal variability are estimated to be small.
This is simply circular reasoning—the natural effects are small because they are “estimated to be small.” This argument is an embarrassment. It tells us nothing at all.
- well-constrained and robust estimates of net anthropogenic warming are substantially more than half the observed warming (Figure 10.4).
The “robust estimates” are set out at Figure 10.4(b) (p. 110), which shows “the estimated contributions of forced changes to temperature trends over the 1951-2010 period.” But this includes only forced changes, and takes no account whatever of internal variability. As a result, it can tell us nothing about the NIV percentage of the 0.6 °C observed warming. The estimates are based on a completely unjustified (and unjustifiable) assumption that the value of NIV is always zero.
In any event, the Figure 10.4(b) results are mere guesswork. They are taken from eight model runs analysed in Jones et al. (2012), which estimated GHG-attributable warming in a range of 0.6-1.4 °C, reflecting the range of ECS assumptions. Jones et al. then equated the top of this range to the actual warming of 0.6 °C by subtracting a made-up figure of 0.8 °C for anthropogenic aerosols.
Equilibrium climate sensitivity
So, on the basis that ECS is 4.5°C, Jones et al. have gross anthropogenic global warming (AGW) at 1.4°C, which reduces to 0.6°C (100% of the observed figure) after netting off 0.8°C for human aerosols. But if ECS should be 1.5 °C, then AGW would be 0.6 °C gross and (after subtracting aerosols) would be less than zero. In order to get to a net 0.31 °C—i.e., “more than half”—gross AGW needs to be at least 1.1 °C, which could only happen when ECS is in the top one-third of its range (i.e., 3-4.5 °C).
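The arithmetic in this paragraph can be sketched in a few lines, assuming (as the post implies) that gross AGW over 1951-2010 scales linearly with ECS across the Jones et al. (2012) range – 0.6 °C gross at ECS = 1.5 °C, up to 1.4 °C gross at ECS = 4.5 °C – with a fixed 0.8 °C of assumed aerosol cooling netted off. The linear interpolation is my own simplification for illustration, not anything stated in AR5:

```python
# Sketch of the ECS/aerosol arithmetic above. The linear scaling of gross
# AGW with ECS is an assumption made here for illustration only.

AEROSOL_COOLING = 0.8   # deg C, the figure the post attributes to Jones et al.
OBSERVED = 0.6          # deg C, observed warming 1951-2010

def gross_agw(ecs):
    """Gross AGW, linearly interpolated across the 1.5-4.5 deg C ECS range."""
    return 0.6 + (ecs - 1.5) * (1.4 - 0.6) / (4.5 - 1.5)

def net_agw(ecs):
    """Net AGW after subtracting the assumed aerosol cooling."""
    return gross_agw(ecs) - AEROSOL_COOLING

for ecs in (1.5, 3.0, 3.375, 4.5):
    n = net_agw(ecs)
    print(f"ECS {ecs:>5} C -> net AGW {n:+.2f} C ({n / OBSERVED:+.0%} of observed)")
```

Under that interpolation, net AGW reaches half the observed 0.6 °C only once ECS passes roughly 3.4 °C – consistent with the “top one-third of the range” point above.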
Clearly, the AR5 conclusion is based solely on model runs, which are in turn based on assumptions. Observation plays no role in arriving at the conclusion and there is a total absence of scientific evidence. Even if the CMIP5 models had been validated (and they have not), human activities cause “more than half” the 0.6 °C of observed warming only where the model is programmed to find that climate sensitivity is greater than 3.0 °C.
Put at its highest, the WG1 report could only lay the basis for this conclusion:
Our preferred climate models estimate that human activities caused more than half of the observed increase in GMST from 1951 to 2010, in the extremely unlikely event that (a) ECS exceeds 3.0 °C, and (b) the net impact of NIV was zero during that period, and (c) anthropogenic aerosols contributed cooling of 0.8 °C.
Do not know… unable to quantify… models at odds with observations…
The IPCC do not know how much NIV occurred during any period, because their studies have always been restricted to human-caused climate change. AR5 does, however, suggest that NIV has indeed been occurring in the period since 1998 and that this explains “the pause” of 1998-2010. A finding of zero net NIV for six decades requires that the aggregate of all the natural warming impacts was exactly offset by the aggregate of the natural cooling impacts, to a pin-point accuracy of one-hundredth of a degree. This is extremely unlikely.
By the AR5 cut-off date, the IPCC had been unable to quantify the volume of anthropogenic aerosols, but several lead authors have subsequently agreed with Nicholas Lewis that the previously assumed quantities were exaggerated. Nothing about this component is “extremely likely”.
As to ECS, the very same WG1 report was unable to fix a “likely” value for ECS because models were at odds with observations. So they cannot subsequently claim the “more than half” statement to be “extremely likely”, when that statement relies upon ECS being greater than 3 °C.
Two implausible assumptions
The second number (NIV + NF <50%) mentioned at the beginning of this post turns out to be dependent on the first number (ECS >3 °C). The uncertainty about NIV is merely the ECS uncertainty in another guise. So the entire house of cards of modern climate science, and the international policy structure built atop that science, is wholly dependent upon a single driver—climate sensitivity—and ECS is its single metric.
This analysis exposes that there is no evidence at all for the “more than half” finding of WG1 Chapter 10, which is the foundation stone of the entire AR5. Nor is there even a climate model run that supports it. Instead, it is merely a product of at least two implausible assumptions.
— Barry Brill is Chairman of the New Zealand Climate Science Coalition
Thanks Barry, a beaut winkling-out from AR5. I think NZ 1st should read this. The others too, though they seem less likely to give credence……
By definition, NIV=0 for long time periods. NF is currently < 0 as we would be slowly heading towards an ice age under the current Milankovitch cycle. Jones (2013) is not the only attribution paper to consider. Read http://www.realclimate.org/index.php/archives/2013/10/the-ipcc-ar5-attribution-statement/
I remember an interview with Simon Upton where he laughs at the ham-fisted ways that Richard Lindzen tried to mislead him. Unfortunately, other National MPs were more gullible.
Simon, the Real Climate piece merely discusses the Chap 10 conclusions – with an esoteric commentary on degrees of likelihood. Jones et al (2013), and the other two referenced papers, were published after the cut-off date for WG1 and I presume that’s why their CMIP5 runs were not mentioned in Chap 10.3.1.1.3. The sole authority for the “more than half” statement was Jones et al (2012).
You say that natural internal variability is zero “by definition”, because it all cancels out over the long term or “at equilibrium”. Changes caused by cycles or oscillations will eventually revert to the mean.That view is explained at https://judithcurry.com/2013/08/29/what-is-internal-variability.
Two problems: (a) this is all unprovable theory/opinion; and (b) “the long term” is as long as a piece of string. It is most unlikely that this theoretical equilibrium has ever occurred in the 4.3 billion years of the planet’s existence, to date. If equilibrium were to be reached in (say) the year 4000 AD, how does this help the accuracy of climate models or the content of policy making in the current century?
In assessing the influences on temperature change within a defined period – e.g. 1951-2010 – it is absurd to ignore the NIV that actually occurred because you think it will revert in a future period. How is it relevant that the warming from the super-El Niño of 1998 will likely be cancelled out at some unknowable point in the future? In reality, that El Niño itself caused “more than half” of the observed warming of 0.6°C during 1951-2010, but the models are instructed to pretend it (and other NIV) just didn’t happen. With nowhere else to go, the models attribute that warming to humans.
Really, Simon, Upton vs Lindzen? What were you smoking?
It’s incorrect to think of NIV as a mean reversion process. NIV is the random error which can’t be forecast.
A history lesson for you, Brett: https://youtu.be/IGf4maDU7Ps?t=713
The difference Simon is that I have studied the Physics of the problem for ten years . You just recite what suits your beliefs. Which are different, too, to what you pretend…..
I have also spent seven decades dealing with Climate as it happens and have questions you have not dreamt of. Lindzen however is far better versed than I. Your outright lies are just showing desperation, as your cloud-castle vanishes. Money stopped, cooling oceans from Quiet Sun plus normal cycles (ignored but not ignorable), the bell tolls for thee. And Upton.
Please try to really understand statistics. And data as opposed to scenarios. Just don’t bother with propaganda videos here.
Simon – I don’t think of NIV as either mean reversion or random error. But you are right to suspect that the CMIP5 bundles all items “which can’t be forecast” (i.e. all influences other than GHGs) under the heading of “internal variation” and then simply leaves them out of all calculations.
If every influence except human-caused GHGs is omitted, it is no surprise that AGW is seen to be the dominant cause of temperature change.
If NIV does comprise aggregated random errors, then its omission obviously ensures that all forecasts will be in error, and also invalidates the IPCC’s “hindcast” of the 1951-2010 period.
James Hansen omitted NIV from his 1988 model because there was no way to forecast a chaotic non-linear system. He thought it sufficient to model the GHG impacts alone, because they would dwarf such effects as ENSO (which had been mild for 40 years) and PDO (which wasn’t identified until 1996). Besides, in his view, long-term GMST was all about the energy balance at equilibrium. Despite subsequent experience, the climate modellers have continued to treat these early assumptions as axioms.
SR15 is out, with forecasts that GMST will increase by 0.5°C over the next 30-40 years.
If AGW was “more than half” 0.6° for the last 60 years, then the only reasonable prediction is that it will continue on the same trend-line for the next 60 years.
I make that an increase of 0.31°C by 2078 – if we believe the IPCC and if we continue as before to add 2 ppm per annum to atmospheric GHGs. But, as we all know, the warming effects are logarithmic, so we should be confident that total (1.31°C since 1850) won’t be reached until about 2100.
And what about the 0.29°C of non-anthropogenic forcing? What if that is a minus in the coming decades? And then there’s the question of natural internal variation … and technology changes…
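The logarithmic point in the comment above can be sketched numerically: with a constant 2 ppm per annum addition to CO2 and a logarithmic temperature response, each successive increment of warming is smaller than the last, pushing any given warming total further into the future than a straight-line extrapolation would suggest. The starting concentration (408 ppm in 2018) and the per-doubling coefficient (1.35 °C) are illustrative assumptions of mine, not figures from the comment:

```python
import math

# Illustrative assumptions only: 408 ppm starting point in 2018 and a
# transient response of 1.35 deg C per CO2 doubling, chosen for the sketch.
C0 = 408.0
SENS = 1.35

def warming_since_2018(year):
    """Warming (deg C) under a logarithmic response to +2 ppm per annum."""
    c = C0 + 2.0 * (year - 2018)
    return SENS * math.log(c / C0) / math.log(2)

for year in (2048, 2078, 2100):
    print(year, round(warming_since_2018(year), 2))
```

The first 30-year increment exceeds the second, which is the sense in which a linear extrapolation of past decadal trends overstates the approach to any fixed warming threshold.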
ENSO and PDO are heat fluxes between ocean and atmosphere, despite some cranks here suggesting that heat exchange can’t take place. The deep ocean warms during a “cool” phase, which releases further CO2…
The attribution of AGW is “almost certainly” more than half, i.e. at a 95% confidence interval. AGW attribution is “likely” to have been slightly higher than 100% of the observed warming.
SR15 actually says:
A1. Human activities are estimated to have caused approximately 1.0°C of global warming above pre-industrial levels, with a likely range of 0.8°C to 1.2°C. Global warming is likely to reach 1.5°C between 2030 and 2052 if it continues to increase at the current rate. (high confidence) {1.2, Figure SPM.1}
A1.1. Reflecting the long-term warming trend since pre-industrial times, observed global mean surface temperature (GMST) for the decade 2006–2015 was 0.87°C (likely between 0.75°C and 0.99°C) higher than the average over the 1850–1900 period (very high confidence). Estimated anthropogenic global warming matches the level of observed warming to within ±20% (likely range). Estimated anthropogenic global warming is currently increasing at 0.2°C (likely between 0.1°C and 0.3°C) per decade due to past and ongoing emissions (high confidence).
A1.2. Warming greater than the global annual average is being experienced in many land regions and seasons, including two to three times higher in the Arctic. Warming is generally higher over land than over the ocean. (high confidence) {1.2.1, 1.2.2, Figure 1.1, Figure 1.3, 3.3.1, 3.3.2}
A1.3. Trends in intensity and frequency of some climate and weather extremes have been detected over time spans during which about 0.5°C of global warming occurred (medium confidence). This assessment is based on several lines of evidence, including attribution studies for changes in extremes since 1950. {3.3.1, 3.3.2, 3.3.3}
I won’t ask Simon the scientific basis for his above quotes. But, seeing as he seems interested, how about telling us what gas caused the last recent warming of the Arctic, and how far above freezing it got? Also, why did it cause the recorded lift in Global Temperatures?
Regarding natural internal variability, Tamino posted this morning on how to quantify these factors and the effect on trend:
https://tamino.wordpress.com/2018/10/09/the-global-warming-signal/
The only way surface water can enter the deeps is from the polar environs, where sea-ice formation results in sinking of super-saline water at c. -1.9C. Where shallow, this can cause those fearsome freeze-ups of bottom-dwelling life. Before the loss of circulation, e.g. Central America rising, deep waters reached c. 18C IIRC; now c. 2-6C. Life did fine back then, better than now in the Ice Age… yet still the claimed 200/day species-loss cannot be actually observed. Like the modelled warming. Polar deep bottom-water formation, however, has been observed since the Challenger Expedition.
You might want to read up on thermohaline circulation.
Simon: You might want to read up on the sun.
Simon please give us your version. Also please answer my questions or be just another troll. Brett
Simon – your SR15 quotes are from the SPM which is drafted by Policymakers – imbued with propaganda incentives and having no particular insight into relevant science. If you want to treat the report as an authority, please cite the scientists and the particular chapters/sections. Reading chapter 1, I can’t find any evidence to depart from the “more than half” statement in AR5.
Tamino starts his piece by acknowledging that “many things affect global temperature” but then deals with only one aspect of NIV. He then somehow separates ENSO effects from greenhouse effects without explaining how he does it. This is the very trick AR5 was unable to perform.
And yes, I understand that ENSO (not PDO) comprises part of the heat fluxes between ocean and atmosphere. But why does an El Nino increase (and a La Nina decrease) the entire planetary temperature anomaly, which is supposed to represent the average of BOTH oceanic and atmospheric temperatures? Is the ocean heat content an unknown quantity until it emerges into the open air?
The global mean temperature refers to the surface temperatures. (Higher for land than sea surface.)
No. The energy budget is calculated with reference to both atmospheric and oceanic heat uptakes. That’s why the consensus scientists were unable to explain “the pause” of (then) 1998-2012 which was discussed in AR5. Trenberth and others then suggested that the missing heat must have been hiding in the “deep ocean”, below 2,000 metres.
However, Liang (2015) https://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-14-00550.1?af=R found that “the global integral of vertical heat flux shows an upward heat transport in the deep ocean, suggesting a cooling trend in the deep ocean.”
So how can models hindcast the AGW contribution to the observed warming in a past period without knowing how much heat was contributed by ENSO (and other natural variation) during that period? Only by making evidence-free assumptions about NIV always being zero and surpluses being neutered by aerosols.
Michael is correct.
Ocean heat content is increasing. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5345929/
ENSO must by definition have a net zero effect on the trend over long time periods. If El Niño events were predominating, then ocean heat content would have to decrease (or at least increase more slowly).
Read the discussion section of Liang (2015) for an explanation of what is going on:
“Evidently, if the deep ocean in 1992 was slightly warmer than its equilibrium value, no physical contradiction exists with an upward movement of heat.”
“All existing estimates of the deep ocean states, including this present one, are based on very limited in situ observations in the deep ocean, and the uncertainties are large. Furthermore, upper ocean warming may have been generally underestimated: Any bias errors in the initializing state rendering the upper ocean warmer than is correct would produce such an underestimate. Note the historical emphasis on measurements of the relatively warm North Atlantic Ocean and the tendency for shipborne observations to focus on lower latitudes generally, particularly in winter.”
“An upward heat transport in the deep ocean may appear to be in conflict with the widespread idea that a large portion of the extra heat added to the Earth system in the past decades should be transported into the deep ocean (e.g., FAQ 3.1, Fig. 1 in Stocker et al. 2013). That inference is based on the assumption that the ocean was in equilibrium with the atmosphere before any extra heat entered. “
Simon,
Liang (2015) says:
The primary heater of the ocean and thus the biosphere is the sun, whose strong short-wave radiation penetrates 100 m and more into the water with substantial warming effect. Long-wave infrared radiation from our airborne carbon dioxide emissions penetrates less than 1 mm, with minuscule or no warming effect. The topmost molecules evaporate, immediately cooling the surface layer and there’s no evidence that energy from the surface layer penetrates further. If any warming eventuates from accumulated airborne carbon dioxide, we would be interested to know its contribution to total warming and in the fraction attributable to the human contribution. It’s not easy to find an authority on the human greenhouse contribution, but it’s possibly less than 5% of total greenhouse warming.
Still, the major impediment to accepting your implication that human activity is warming the ocean is, as I’ve told you before multiple times, the lack of a credible mechanism.
Liang also says:
So modern observations of ocean heat content are unlikely to reflect only recent influxes of energy and thus it’s an unreliable proxy for anthropogenic warming.
The oceans are warming, expanding and melting the Antarctic ice shelves.
Whether you believe it or not.
The explanation is simple physics.
The minor Antarctic ice shelf melting is not caused by a rising ocean but by warmer currents and geothermal activity beneath the ice shelf.
If you can show that the oceans are warming enough to measurably expand them, you will note that it’s the sun doing the warming, not our emissions. Or you can bring evidence for that also.
Of course it’s energy from the Sun warming the oceans.
The greenhouse effect slows the cooling. Without the GHE the oceans would be frozen.
Simple physics. (Geothermal activity is trivial.)
Science available to any layman who wants to learn.
“Without the GHE the oceans would be frozen.”
Is this “settled science” or a conjecture?
Michael,
Don’t believe you. Prove it.
“Science available to any layman who wants to learn”
My guess is that you’re not a layman then, Michael? …. ( is the name short for Michael Joseph Savage?)
Let me guess, you’re a science teacher?…primary, secondary, or tertiary, Michael ?
Simon says “ENSO by definition has a net zero effect over long time periods”. True. How long is a long period? How long will it be until the earth is in equilibrium? Nobody knows but it doesn’t appear to have been in equilibrium during the last million years or so.
How much utility is there in referring to nebulous concepts which are relevant to nobody alive today and are highly unlikely to be relevant at any time during humankind’s span of existence? This equates science to absence of knowledge.
Oh, Barry! “Equates science to absence of knowledge.” Nice!!
The AR5 reported that equilibrium climate sensitivity (ECS) was between 1.5° and 4.5°C (the range assumed since 1979), but WG1 was unable to agree on a “most likely” figure because there was disagreement between observations and climate models.
Paul Matthews now points out that the AR5 itself clearly admitted that the CMIP5 climate models were wrong, in that they exaggerated recent warming. http://tinyurl.com/y2ntulqz
The following excerpt is from Box TS3 of the Technical Summary:
“an analysis of the full suite of CMIP5 historical simulations reveals that 111 out of 114 realizations show a GMST trend over 1998–2012 that is higher than the entire HadCRUT4 trend ensemble. This difference between simulated and observed trends could be caused by some combination of (a) internal climate variability, (b) missing or incorrect RF, and (c) model response error.”
The 97% systemic error rate leaves little doubt that the main cause is “model response error”, i.e. the CMIP5 models are all programmed to exaggerate ECS.
The IPCC’s AR5 itself produced a graph of land-based model errors that supports John Christy’s famous graph of troposphere-based CMIP5 errors: https://wattsupwiththat.com/2016/05/25/climate-models-dont-work
This graph is to be found at Figure 11.25 of the WG1 report and is repeated in the Technical Summary as TS-14. Paul Matthews reproduces it at http://tinyurl.com/y2ntulqz
“Box TS.3, Figure 1a; CMIP5 ensemble mean trend is 0.21°C per decade” reveals the extent of the exaggeration. If the models were made consistent with the ACTUAL mean trend of 0.12°C per decade, they would be projecting that the Paris Agreement target of 2°C will not be exceeded during this century.
All of this, of course, is based on the evidence-free assumption that human activities will be responsible for 100% of the changes in temperature trends that occur over the next 80 years.
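The trend-scaling argument two paragraphs up can be checked on the back of an envelope: rescale the CMIP5 ensemble-mean trend (0.21 °C/decade) to the observed 0.12 °C/decade and extrapolate linearly to 2100. The ~1.0 °C of warming assumed already realised since 1850 is my own baseline choice, taken roughly from the SR15 A1 figure quoted earlier in the thread:

```python
# Straight-line extrapolation to 2100 under the two trends quoted from
# Box TS.3. The 1.0 deg C since-1850 baseline is an assumption here.

MODEL_TREND = 0.21      # deg C/decade, CMIP5 ensemble mean
OBSERVED_TREND = 0.12   # deg C/decade, HadCRUT4 actual
BASELINE = 1.0          # deg C since 1850 (assumed, approx. SR15 A1)

decades_to_2100 = (2100 - 2020) / 10  # 8 decades
model_total = BASELINE + MODEL_TREND * decades_to_2100        # 1.0 + 1.68
rescaled_total = BASELINE + OBSERVED_TREND * decades_to_2100  # 1.0 + 0.96

print(f"model-trend total by 2100:    {model_total:.2f} C")
print(f"observed-trend total by 2100: {rescaled_total:.2f} C")
```

On those assumptions the observed trend stays below the 2 °C Paris target through 2100 while the model-mean trend exceeds it, which is the sense of the claim above.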