How to use open threads

This page is for discussion of climate change generally.


Open threads

To access the open threads, click on the menu item “OPEN THREADS” in the navigation bar above, hover over “HOW TO USE OPEN THREADS”, then over “CLIMATE SCIENCE”. If the resulting menu is too long to fit on the screen, it will scroll with the mouse wheel or the down-arrow key.



173 Thoughts on “How to use open threads”

  1. THREAD on 16/10/2010 at 8:15 pm said:

    CLIMATE BLOGS

  2. THREAD on 16/10/2010 at 8:16 pm said:

    CLIMATE MODELS

    • THREAD on 17/10/2010 at 5:17 am said:

      CLIMATE MODEL PAPERS

    • Richard C (NZ) on 21/10/2010 at 10:57 am said:

      WHAT SURFACE TEMPERATURE IS YOUR MODEL REALLY PREDICTING?
      Roy Clark

      SUMMARY
      The whole anthropogenic global warming (AGW) argument falls apart when we ask a rather simple question: what surface temperature is your model really predicting? Most large-scale climate models still use radiative forcing to predict changes in surface temperature. The original radiative forcing derivation, by Manabe and Wetherald in 1967, clearly defined an ‘equilibrium surface’ that interacted with an ‘equilibrium surface flux’ to produce an ‘equilibrium surface temperature’.[1] However, the Earth’s surface is never in thermal equilibrium, so radiative forcing is, by definition, based on invalid assumptions. The surface temperature is set by the dynamic energy flux balance at the surface. When real surface temperatures are calculated using the thermal properties of the surface material and measured values of the surface energy flux terms, the small change in downward LWIR flux from a 100 ppm increase in atmospheric CO2 has no measurable effect on the surface temperature. The whole AGW argument disappears.

      Now, the meteorological surface air temperature (MSAT) has indeed shown an increase over the last 50 years or so, but this has nothing to do with the ground surface temperature or CO2. The MSAT is the temperature of the air in an enclosure at eye level, 1.5 to 2 m above the ground. To make their models appear to work, the modelers have switched from the ground surface temperature to the MSAT without changing their ‘equilibrium’ predictions. The minimum MSAT is usually a measure of the bulk air temperature of the local weather system as it passes by the weather station. The maximum MSAT is just a measure of the temperature of the warm air that is circulated by convection from the ground as it is heated by the sun during the day. Since 75% of the Earth’s surface is ocean, most weather systems are formed over the oceans, so the bulk air temperature of a weather system is set by the ocean surface temperatures along its approach path. The ‘hockey stick’ temperature increase is just the part of the ocean surface temperature that shows up in the MSAT. There was a convenient coincidence between the increase in ocean surface temperatures and the increase in CO2 concentration, and that coincidence has now ended.

      However, instead of simply shouting ‘fraud’ and trading insults, we can make good use of the ocean temperature signal. It provides a baseline temperature reference that can be used to probe urban heat island effects and look for anomalies in the weather station record. Once we understand which surface temperature we are dealing with, we can make quantitative predictions of both the ground temperature and the MSAT. This provides us with a way forward. We no longer have to hide the decline. We simply require that our climate models predict it – quantitatively.
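
      [Illustration – to make the “dynamic energy flux balance” idea concrete, here is a minimal Python sketch. It is not Clark’s calculation, and every flux value and coefficient in it is an assumed placeholder; it simply time-steps a single surface energy balance with and without a ~1 W/m^2 increase in downward LWIR flux:]

      # Hedged sketch (NOT Clark's model): time-step a surface energy balance
      #   C * dT/dt = F_sw(t) + F_lwir_down - eps*sigma*T^4 - h*(T - T_air)
      # and compare two assumed values of the downward LWIR flux.
      import math

      SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
      C = 2.0e5          # assumed areal heat capacity of the surface layer, J m^-2 K^-1

      def run(f_lwir_down, days=60, dt=600.0):
          t_surf, t_air = 288.0, 288.0   # starting temperatures, K (assumed)
          eps, h = 0.95, 15.0            # emissivity and convective coefficient (assumed)
          for n in range(int(days * 86400 / dt)):
              hour = (n * dt / 3600.0) % 24.0
              # crude diurnal solar cycle: half-sine in daylight, zero at night
              f_sw = max(0.0, 500.0 * math.sin(math.pi * (hour - 6.0) / 12.0))
              net = f_sw + f_lwir_down - eps * SIGMA * t_surf**4 - h * (t_surf - t_air)
              t_surf += net * dt / C     # surface warms or cools with the flux imbalance
          return t_surf

      base = run(f_lwir_down=340.0)
      plus = run(f_lwir_down=341.0)      # ~1 W/m^2 extra downward LWIR (illustrative)
      print(f"end-of-run surface T: {base:.2f} K vs {plus:.2f} K")

      [With these assumed numbers the difference is a few hundredths of a degree, which is the shape of Clark’s argument; whether the flux terms themselves are realistic is of course the substantive question.]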

    • Richard C (NZ) on 21/10/2010 at 2:23 pm said:

      A HotLinked, HotList of HotPapers.

      Generally model-centric

      Wyant, M.C., Khairoutdinov, M. & Bretherton, C.S., 2006. Climate sensitivity and cloud response of a GCM with a superparameterization.

      Bretherton, C.S., 2006. Low-Latitude Cloud Feedbacks on Climate Sensitivity.

      [Steve McIntyre at CA noted that Bretherton (2006), which shows negative cloud feedbacks contrary to the positive-feedback orthodoxy, was NOT cited in AR4, even though the paper must have been known to the authors.]

      David H. Douglass, John R. Christy, Benjamin D. Pearson and S. Fred Singer, A comparison of tropical temperature trends with model predictions

      P. W. Thorne, D. E. Parker, B. D. Santer, M. P. McCarthy, D. M. H. Sexton, M. J. Webb, J. M. Murphy, M. Collins, H. A. Titchner, G. S. Jones, 2007, Tropical vertical temperature trends: A real discrepancy? – Abstract only, behind paywall.

      Pincus, R., C. P. Batstone, R. J. P. Hofmann, K. E. Taylor, and P. J. Gleckler (2008), Evaluating the present-day simulation of clouds, precipitation, and radiation in climate models

      Limits on CO2 Climate Forcing from Recent Temperature Data of Earth, David H. Douglass and John R. Christy, 2008 – [PDF] from arxiv.org

      On the Effective Number of Climate Models, Pennell, C.; Reichler, T., 2010 – Fulltext Article not available

      What Do Observational Datasets Say about Modeled Tropospheric Temperature Trends since 1979?, John R. Christy, Benjamin Herman, Roger Pielke, Sr., Philip Klotzbach, Richard T. McNider, Justin J. Hnilo, Roy W. Spencer, Thomas Chase and David Douglass, 2010

      Spencer, R. W., and W. D. Braswell (2010), On the diagnosis of radiative feedback in the presence of unknown radiative forcing

      A STATISTICAL ANALYSIS OF MULTIPLE TEMPERATURE PROXIES: ARE RECONSTRUCTIONS OF SURFACE TEMPERATURES OVER THE LAST 1000 YEARS RELIABLE? BY BLAKELEY B. MCSHANE AND ABRAHAM J. WYNER, 2010

      Panel and Multivariate Methods for Tests of Trend Equivalence in Climate Data Series, Ross McKitrick, Stephen McIntyre, and Chad Herman, 2010, [MMH10]

      Evaluation of tropical cloud and precipitation statistics of Community Atmosphere Model version 3 using CloudSat and CALIPSO data, Y. Zhang, S. A. Klein, J. Boyle, G. G. Mace, 2010 – Abstract only, behind paywall

      The following two links are from the 2009 report of the Nongovernmental International Panel on Climate Change (NIPCC), Climate Change Reconsidered:

      Global Climate Models and Their Limitations

      Feedback Factors and Radiative Forcing

    • Richard C (NZ) on 26/10/2010 at 12:32 pm said:

      Scafetta on 60 year climate oscillations

      George Taylor, former Oregon State climatologist writes:

      Nicola Scafetta has published the most decisive indictment of GCMs I’ve ever read, in the Journal of Atmospheric and Solar-Terrestrial Physics. His analysis is purely phenomenological, but he claims that over half of the warming observed since 1975 can be tied to 20- and 60-year climate oscillations driven by the 12- and 30-year orbital periods of Jupiter and Saturn, through their gravitational influence on the Sun, which in turn modulates cosmic radiation.

      If he’s correct, then all GCMs are massively in error, because they fail to show any of the observed oscillations.
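
      [Illustration – a minimal Python sketch of a phenomenological fit in the spirit described above. It is not Scafetta’s actual analysis; the temperature series below is synthetic, and the method is just least squares on 20- and 60-year sinusoids plus a linear trend:]

      # Hedged sketch: fit 20- and 60-year sinusoids plus a trend to an annual series.
      import numpy as np

      t = np.arange(1850, 2011, dtype=float)
      rng = np.random.default_rng(3)
      temps = (0.005 * (t - 1850)                      # toy trend, deg C per year
               + 0.1 * np.sin(2 * np.pi * t / 60.0)    # toy 60-year oscillation
               + 0.05 * rng.normal(size=t.size))       # noise

      # Design matrix: constant, linear trend, sin/cos at 20- and 60-year periods
      cols = [np.ones_like(t), t - t.mean()]
      for period in (20.0, 60.0):
          cols += [np.sin(2 * np.pi * t / period), np.cos(2 * np.pi * t / period)]
      X = np.column_stack(cols)

      coef, *_ = np.linalg.lstsq(X, temps, rcond=None)
      amp20 = np.hypot(coef[2], coef[3])   # amplitude of the 20-year component
      amp60 = np.hypot(coef[4], coef[5])   # amplitude of the 60-year component
      print(f"fitted amplitudes: 20-yr {amp20:.3f} C, 60-yr {amp60:.3f} C")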

      See “Controversy and scandal”

    • THREAD on 26/10/2010 at 3:52 pm said:

      Tuesday, October 5, 2010

      Paper: Models lead to overly confident climate predictions

      A paper published today in the Journal of Climate finds that ensembles of climate models used by the IPCC to predict future climate change “may lead to overly confident climate predictions.” The authors find that many models share the same computer code, have the same limitations, and “tend to be fairly similar,” resulting in confirmation bias. Indeed, empirical observations have shown far less warming than the “90% confident” IPCC models in AR4, as shown in this poster by John Christy: [Graphic]

      Journal of Climate doi: 10.1175/2010JCLI3814.1

      On the Effective Number of Climate Models

      Christopher Pennell and Thomas Reichler

      Department of Atmospheric Sciences, University of Utah, Salt Lake City, UT

      Abstract:

      Projections of future climate change are increasingly based on the output of many different models. Typically, the mean over all model simulations is considered as the optimal prediction, with the underlying assumption that different models provide statistically independent information evenly distributed around the true state. However, there is reason to believe that this is not the best assumption. Coupled models are of comparable complexity and are constructed in similar ways. Some models share parts of the same code and some models are even developed at the same center. Therefore, the limitations of these models tend to be fairly similar, contributing to the well-known problem of common model biases and possibly to an unrealistically small spread in the outcomes of model predictions.

      This study attempts to quantify the extent of this problem by asking how many models there effectively are and how to best determine this number. Quantifying the effective number of models is achieved by evaluating 24 state-of-the-art models and their ability to simulate broad aspects of 20th century climate. Using two different approaches, we calculate the amount of unique information in the ensemble and find that the effective ensemble size is much smaller than the actual number of models. As more models are included in an ensemble the amount of new information diminishes in proportion. Furthermore, we find that this reduction goes beyond the problem of “same-center” models and that systemic similarities exist across all models. We speculate that current methodologies for the interpretation of multi-model ensembles may lead to overly confident climate predictions.
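
      [Illustration – a minimal Python sketch of one generic way to put a number on “effective ensemble size”: form the inter-model error-correlation matrix and apply the eigenvalue estimator N_eff = (Σλ)² / Σλ². This is a standard effective-degrees-of-freedom device, not necessarily Pennell and Reichler’s exact method, and the 24-member “ensemble” here is synthetic:]

      # Hedged sketch: effective number of independent members in a model ensemble.
      import numpy as np

      rng = np.random.default_rng(0)

      # Toy ensemble: 24 "models", each an error pattern over 500 grid points,
      # sharing a common bias component so the members are not independent.
      shared = rng.normal(size=500)
      errors = np.array([0.7 * shared + 0.7 * rng.normal(size=500) for _ in range(24)])

      corr = np.corrcoef(errors)               # 24 x 24 inter-model correlations
      w = np.linalg.eigvalsh(corr)             # eigenvalues of the correlation matrix
      n_eff = w.sum() ** 2 / (w ** 2).sum()    # (sum of eigenvalues)^2 / sum of squares

      print(f"nominal models: {len(errors)}, effective: {n_eff:.1f}")

      [With the assumed inter-model correlation of about 0.5 this prints an effective size of roughly 3 to 4, illustrating how shared code and shared biases shrink the information content of a 24-member ensemble.]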

    • Richard C (NZ) on 27/10/2010 at 6:22 pm said:

      Bob D says:
      October 27, 2010 at 5:56 pm

      On the performance of models:

      G. G. Anagnostopoulos et al. (2010) “A comparison of local and aggregated climate model outputs with observed data”

      http://pdfserve.informaworld.com/943561__928051726.pdf

      Abstract:
      We compare the output of various climate models to temperature and precipitation observations at 55 points around the globe. We also spatially aggregate model output and observations over the contiguous USA using data from 70 stations, and we perform comparisons at several temporal scales, including a climatic (30-year) scale. Besides confirming the findings of a previous assessment study that model projections at point scale are poor, results show that the spatially integrated projections are also poor.
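
      [Illustration – a minimal Python sketch of the kind of comparison the abstract describes, not the authors’ code: correlate modelled against observed annual series at a single station and at the aggregated scale, and again after 30-year smoothing for the climatic scale. Both series here are synthetic stand-ins:]

      # Hedged sketch: point-scale vs spatially aggregated model/observation comparison.
      import numpy as np

      rng = np.random.default_rng(1)
      years, stations = 100, 70

      obs = rng.normal(size=(stations, years)).cumsum(axis=1) * 0.1   # toy observations
      mod = rng.normal(size=(stations, years)).cumsum(axis=1) * 0.1   # toy model output

      def climatic(x, window=30):
          # 30-year moving average (the paper's "climatic scale")
          return np.convolve(x, np.ones(window) / window, mode="valid")

      def corr(a, b):
          return np.corrcoef(a, b)[0, 1]

      print("point scale, annual:  ", corr(obs[0], mod[0]))
      print("point scale, 30-year: ", corr(climatic(obs[0]), climatic(mod[0])))
      print("aggregated, annual:   ", corr(obs.mean(axis=0), mod.mean(axis=0)))
      print("aggregated, 30-year:  ", corr(climatic(obs.mean(axis=0)), climatic(mod.mean(axis=0))))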

    • Richard C (NZ) on 31/10/2010 at 6:00 pm said:

      Download error – alternative link:-

      http://pdfserve.informaworld.com/218198__928051726.pdf

    • Richard C (NZ) on 31/10/2010 at 7:13 pm said:

      On the performance of models:

      “A comparison of local and aggregated climate model outputs with observed data”

      G. G. Anagnostopoulos et al. (2010)

      CONCLUSIONS AND DISCUSSION
      It is claimed that GCMs provide credible quantitative estimates of future climate change, particularly at continental scales and above. Examining the local performance of the models at 55 points, we found that local projections do not correlate well with observed measurements. Furthermore, we found that the correlation at a large spatial scale, i.e. the contiguous USA, is worse than at the local scale.

      However, we think that the most important question is not whether GCMs can produce credible estimates of future climate, but whether climate is at all predictable in deterministic terms. Several publications, a typical example being Rial et al. (2004), point out the difficulties that the climate system complexity introduces when we attempt to make predictions. “Complexity” in this context usually refers to the fact that there are many parts comprising the system and many interactions among these parts. This observation is correct, but we take it a step further. We think that it is not merely a matter of high dimensionality, and that it can be misleading to assume that the uncertainty can be reduced if we analyse its “sources” as nonlinearities, feedbacks, thresholds, etc., and attempt to establish causality relationships. Koutsoyiannis (2010) created a toy model with simple, fully-known, deterministic dynamics, and with only two degrees of freedom (i.e. internal state variables or dimensions); but it exhibits extremely uncertain behaviour at all scales, including trends, fluctuations, and other features similar to those displayed by the climate. It does so with a constant external forcing, which means that there is no causality relationship between its state and the forcing. The fact that climate has many orders of magnitude more degrees of freedom certainly perplexes the situation further, but in the end it may be irrelevant; for, in the end, we do not have a predictable system hidden behind many layers of uncertainty which could be removed to some extent, but, rather, we have a system that is uncertain at its heart.
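
      [Illustration – a minimal Python sketch of the point about deterministic systems. This is a generic slow–fast toy map, not Koutsoyiannis’s actual model: two state variables, fixed dynamics, constant forcing, yet the 30-step means wander like “trends”:]

      # Hedged sketch: two deterministic degrees of freedom, no random input at all.
      x, y = 0.3, 0.0
      series = []
      for n in range(30000):
          x = 3.99 * x * (1.0 - x)            # fast chaotic component (logistic map)
          y = 0.999 * y + 0.01 * (x - 0.5)    # slowly responding component
          series.append(y)

      # 30-step ("climatic scale") means of the purely deterministic trajectory:
      means = [sum(series[i:i + 30]) / 30 for i in range(0, 30000, 30)]
      print(f"30-step means range from {min(means):.3f} to {max(means):.3f}")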

      Do we have something better than GCMs when it comes to establishing policies for the future? Our answer is yes: we have stochastic approaches, and what is needed is a paradigm shift. We need to recognize the fact that the uncertainty is intrinsic, and shift our attention from reducing the uncertainty towards quantifying the uncertainty (see also Koutsoyiannis et al., 2009a). Obviously, in such a paradigm shift, stochastic descriptions of hydroclimatic processes should incorporate what is known about the driving physical mechanisms of the processes. Despite a common misconception of stochastics as black-box approaches whose blind use of data disregard the system dynamics, several celebrated examples, including statistical thermophysics and the modelling of turbulence, emphasize the opposite, i.e. the fact that stochastics is an indispensable, advanced and powerful part of physics. Other simpler examples (e.g. Koutsoyiannis, 2010) indicate how known deterministic dynamics can be fully incorporated in a stochastic framework and reconciled with the unavoidable emergence of uncertainty in predictions.

    • Richard C (NZ) on 31/10/2010 at 8:03 pm said:

      Modeling the Dynamics of Long-Term Variability of Hydroclimatic Processes

      OLI G. B. SVEINSSON AND JOSE D. SALAS
      Department of Civil Engineering, Colorado State University, Fort Collins, Colorado
      DUANE C. BOES
      Department of Statistics, Colorado State University, Fort Collins, Colorado
      ROGER A. PIELKE SR.
      Department of Atmospheric Science, and Colorado Climate Center, Colorado State University, Fort Collins, Colorado

      (Manuscript received 26 December 2001, in final form 9 September 2002)

      ABSTRACT
      The stochastic analysis, modeling, and simulation of climatic and hydrologic processes such as precipitation, streamflow, and sea surface temperature have usually been based on assumed stationarity or randomness of the process under consideration. However, empirical evidence of many hydroclimatic data shows temporal variability involving trends, oscillatory behavior, and sudden shifts. While many studies have been made for detecting and testing the statistical significance of these special characteristics, the probabilistic framework for modeling the temporal dynamics of such processes appears to be lacking. In this paper a family of stochastic models that can be used to capture the dynamics of abrupt shifts in hydroclimatic time series is proposed. The applicability of such ‘‘shifting mean models’’ is illustrated by using time series data of annual Pacific decadal oscillation (PDO) indices and annual streamflows of the Niger River.

      Concluding remarks (abridged)
      Empirical evidence has shown that some hydroclimatic processes exhibit abrupt shifting patterns in addition to autocorrelation. Also, it has been documented that outputs from a physically based climate model exhibit abrupt changes on decadal timescales. Modeling the dynamics of such types of processes by using stochastic methods has been the main subject of the research reported herein. Certain stochastic models can replicate such abrupt shifting behavior, particularly when apparent abrupt shifts in the mean occur. Such stochastic models do not seek to explain the underlying physical mechanism of the observed sudden shifts. However, they can be capable of generating or simulating equally likely traces or scenarios with features (e.g., the abrupt shifts) that are statistically similar or comparable to those shown by the observed records. Such simulated traces of the hydroclimatic process under consideration (e.g., annual rainfall over an area) may be useful for assessing the availability of resources in a future horizon and identifying the vulnerability space of specified resources.
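
      [Illustration – a minimal Python sketch of a generic “shifting mean” process, purely to show the idea; the paper’s model family is more elaborate than this. The mean jumps to a new random level at random epochs, with noise around the current level:]

      # Hedged sketch: a toy shifting-mean process with abrupt regime shifts.
      import random

      random.seed(42)

      def shifting_mean_series(n=500, p_shift=0.02, level_sd=1.0, noise_sd=0.3):
          level = random.gauss(0.0, level_sd)
          out = []
          for _ in range(n):
              if random.random() < p_shift:   # abrupt shift to a new mean level
                  level = random.gauss(0.0, level_sd)
              out.append(level + random.gauss(0.0, noise_sd))
          return out

      series = shifting_mean_series()
      # Mean epoch length is 1/p_shift = 50 steps, so a 500-step trace typically
      # shows several abrupt shifts, qualitatively like PDO-type regime behaviour.
      print(series[:5])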

    • Richard C (NZ) on 31/10/2010 at 8:25 pm said:

      Stochastic Hydrology

      Advanced application of statistics and probability to hydrology as applied in the modeling of hydro-climatic sequences

      Computer rendering of stochastic models

      Fournier 1982

    • Richard C (NZ) on 29/10/2010 at 10:18 am said:

      Radiative forcing by well-mixed greenhouse gases:
      Estimates from climate models in the IPCC AR4

      Collins et al. 2006

      Abstract.
      The radiative effects from increased concentrations of well-mixed greenhouse gases (WMGHGs) represent the most significant and best understood anthropogenic forcing of the climate system. The most comprehensive tools for simulating past and future climates influenced by WMGHGs are fully coupled atmosphere-ocean general circulation models (AOGCMs). Because of the importance of WMGHGs as forcing agents, it is essential that AOGCMs compute the radiative forcing by these gases as accurately as possible. We present the results of a Radiative-Transfer Model Intercomparison (RTMIP) between the forcings computed by the radiative parameterizations of AOGCMs and by benchmark line-by-line (LBL) codes. The comparison is focused on forcing by CO2, CH4, N2O, CFC-11, CFC-12, and the increased H2O expected in warmer climates. The models included in the intercomparison include several LBL codes and most of the global models submitted to the Intergovernmental Panel on Climate Change (IPCC) 4th Assessment Report (AR4). In general, the LBL models are in excellent agreement with each other. However, in many cases, there are substantial discrepancies among the AOGCMs and between the AOGCMs and LBL codes. In some cases this is because the AOGCMs neglect particular absorbers, in particular the near-infrared effects of CH4 and N2O, while in others it is due to the methods for modeling the radiative processes. The biases in the AOGCM forcings are generally largest at the surface level. We quantify these differences and discuss the implications for interpreting variations in forcing and response across the multi-model ensemble of AOGCM simulations assembled for the IPCC AR4.

      [Beware – IPCC incestuosity, scepticism required]

    • Richard C (NZ) on 31/10/2010 at 5:50 pm said:

      From – Discussion and conclusions

      This paper discusses the findings from the Radiative Transfer Model Intercomparison Project (RTMIP). The basic goal of RTMIP is to compare the radiative forcings computed with AOGCMs in the IPCC AR4 against calculations with LBL models.

      [But no comparison with real-world measurements of radiation – Note]

      These results suggest several directions for development of the radiative parameterizations in AOGCMs. First, tests of the accuracy of shortwave and longwave forcings at the surface should be given special attention. Second, the shortwave parameterizations in all the AOGCMs should be enhanced to include the effects of CH4 and optionally N2O on near-infrared radiation. Third, AOGCMs should evaluate the convergence of shortwave radiation in the atmosphere using benchmark calculations. This is a particularly clean test of the radiation physics, and the current models exhibit an improbably large spread of the convergence.

      Efforts to address these issues would have several benefits for the climate-modeling community and for groups using their models in scientific and societal applications. Better agreement of AOGCMs with LBL calculations would lead to greater confidence in simulations of past and future climate. It would also facilitate the analysis of forcing-response relationships from the complex and heterogeneous multi-model ensembles that have become a standard component of international climate-change assessments.

      [But no extension of the “Better agreement of AOGCMs with LBL calculations” recommendation to “LBL calculations’ agreement with empirical observations” (i.e. the scientific method), as in the AER Radiative Transfer Working Group’s stated foundation of “validation of line-by-line radiative transfer calculations with accurate high-resolution measurements”. See “Climate Science” – Atmospheric & Environmental Research, Inc.’s (AER) Radiative Transfer Working Group.]

    • Richard C (NZ) on 29/10/2010 at 11:00 am said:

      Effects of bias in solar radiative transfer codes on global climate model simulations

      Arking 2005

      Department of Earth and Planetary Sciences, Johns Hopkins University, Baltimore, Maryland, USA

      Discussion and Conclusion
      [19] The radiative properties of the clear atmosphere are such that about half the solar radiation incident at TOA is absorbed by the surface, and only 25% is absorbed by the atmosphere. Hence, it is the surface that is the primary source of heat for the troposphere, most of which is in the form of emitted (infrared) radiation, but some of it is in the form of a thermodynamic heat exchange at the surface (comprising sensible and latent heat) that is carried upward by convection. Enhancing atmospheric absorption of solar radiation would transfer to the atmosphere some of the solar energy that would otherwise heat the surface.

      [20] As one might expect from a change in the radiation code which increases the absorption of solar radiation in the atmosphere, energy that is otherwise primarily absorbed by the surface is captured by the atmosphere. Hence, as we see in Figure 3, the convective flux necessarily decreases. The magnitude of the change seen, 15–17 W/m^2 at the surface, probably exaggerates the effect because the simulations were done under clear-sky conditions. There is also a small increase in tropospheric temperatures, related to a small increase in column absorption below the tropopause, and a larger increase in stratospheric temperatures due to the increase of absorption above the tropopause.

      [21] In examining the response of the atmosphere to a doubling of CO2, we find the effects of enhanced absorption are much smaller because we are now looking at differences of differences. The effect on the tropospheric temperature is negligible, and the effect on the convective flux response is non-negligible only near the surface when we allow water vapor feedback.

    • Richard C (NZ) on 31/10/2010 at 8:33 pm said:

      Note – Arking 2005 predates recent findings of negative feedbacks from clouds

    • Richard C (NZ) on 29/10/2010 at 11:29 am said:

      Effects of bias in solar radiative transfer codes on global climate model simulations – Google Scholar Search

      Note: this is a better search than:

      radiative transfer codes global climate model simulations

      and contains for example;

      “An accurate parameterization of the infrared radiative properties of cirrus clouds for climate models” Yang 1998

      Also see – “Clouds in Climate Models”

    • THREAD on 17/10/2010 at 6:06 am said:

      On Climate Models, the Case For Living with Uncertainty

      by Fred Pearce

      (or not as the case may be)

      http://e360.yale.edu/feature/on_climate_models_the_case_for_living_with_uncertainties/2325/

    • THREAD on 17/10/2010 at 6:24 am said:

      My Communications With Dr David Wratt, Chief Scientist, NIWA

      Re: a challenge to him in regard to natural forcings simulation

      https://www.climateconversation.org.nz/2010/10/hal-lewis-resigns-from-the-aps-in-protest/#comment-25336

    • Richard C (NZ) on 25/10/2010 at 10:55 am said:

      Modelling the climate – UKMO

      Introduction to Climate Models

      The Unified Model (UM)

      The atmospheric part of the model used by climateprediction.net is the UK Met Office’s state-of-the-art Unified Model.

      NIWA employs the “atmospheric part of the UM model”

      i.e. The “A” part of the Atmosphere-Ocean coupled General Circulation Model (AOGCM)

    • Richard C (NZ) on 25/10/2010 at 10:59 am said:

      Met Office climate prediction model: HadGEM2 family

    • Richard C (NZ) on 25/10/2010 at 11:01 am said:

      Met Office Hadley Centre technical notes

    • Richard C (NZ) on 25/10/2010 at 11:05 am said:
    • THREAD on 25/10/2010 at 1:42 pm said:

      “circular reasoning” at work in NIWA’s UKMO UM model

      Please Note:

      IPCC simulations categories:-

      1. Models using only natural forcings.

      2. Models using both natural and anthropogenic forcings

      Don’t be fooled by this internal IPCC distinction.

      1. and 2. simulations are run on the SAME models: SAME formulations; SAME spin-up datasets (just the addition or deletion of the ACO2 driver); and SAME IPCC Radiative Forcing (RF) methodology (i.e. in their own terms).

      NIWA uses the “A” module of UKMO’s UM model that is in the IPCC’s 1. and 2. categories.

    • Richard C (NZ) on 09/11/2010 at 9:47 am said:

      From

      Overconfidence in IPCC’s detection and attribution: Part III

      October 24, 2010

      by Judith Curry

      Circularity in the argument

      Apart from the issue of the actual logic used for reasoning, there is circularity in the argument that is endemic to whatever reasoning logic is used. Circular reasoning is a logical fallacy whereby the proposition to be proved is assumed in one of the premises.

      The most serious circularity enters into the determination of the forcing data. Given the large uncertainties in forcings and model inadequacies (including a factor of 2 difference in CO2 sensitivity), how is it that each model does a credible job of tracking the 20th century global surface temperature anomalies (AR4 Figure 9.5)? This agreement is accomplished through each modeling group selecting the forcing data set that produces the best agreement with observations, along with model kludges that include adjusting the aerosol forcing to produce good agreement with the surface temperature observations. If a model’s sensitivity is high, it is likely to require greater aerosol forcing to counter the greenhouse warming, and vice versa for a low model sensitivity. The proposition to be proved (#7) is assumed in premise #3 by virtue of kludging of the model parameters and the aerosol forcing to agree with the 20th century observations of surface temperature. Any climate model that uses inverse modeling to determine any aspect of the forcing substantially weakens the attribution argument owing to the introduction of circular reasoning.
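
      [Illustration – a minimal Python sketch of the compensation Curry describes, as toy linear forcing–response bookkeeping, not any actual model: pick a sensitivity, then back out the aerosol forcing needed to match the observed warming. All numbers are assumed:]

      # Hedged sketch: different sensitivities "matching" the same observed warming.
      OBSERVED_DT = 0.7   # assumed 20th-century warming, deg C
      F_GHG = 2.5         # assumed greenhouse-gas forcing, W/m^2

      for sensitivity in (0.3, 0.5, 0.8):      # deg C per W/m^2, illustrative values
          f_aerosol = OBSERVED_DT / sensitivity - F_GHG
          print(f"sensitivity {sensitivity:.1f} -> required aerosol forcing "
                f"{f_aerosol:+.2f} W/m^2")
      # The higher the sensitivity, the more negative the aerosol forcing needed,
      # and every pairing reproduces the same observed warming -- the circularity.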

    • Richard C (NZ) on 09/11/2010 at 10:00 am said:

      From

      Overconfidence in IPCC’s detection and attribution: Part III

      October 24, 2010

      by Judith Curry

      Circularity in the argument

      Richard S Courtney | October 25, 2010 at 4:48 am | Reply

      Dr Curry:

      I think the problem with the models is more profound than you state when you write:

      “The most serious circularity enters into the determination of the forcing data. Given the large uncertainties in forcings and model inadequacies (including a factor of 2 difference in CO2 sensitivity), how is it that each model does a credible job of tracking the 20th century global surface temperature anomalies (AR4 Figure 9.5)? This agreement is accomplished through each modeling group selecting the forcing data set that produces the best agreement with observations, along with model kludges that include adjusting the aerosol forcing to produce good agreement with the surface temperature observations.”

      I stated my assessment on a previous thread of your blog and I take the liberty of copying it here because I think it goes to the heart of the issue of “Overconfidence”.

      My comment was in the thread titled “What can we learn from climate models” that is at
      http://judithcurry.com/2010/10/03/what-can-we-learn-from-climate-models/

      It was as follows:

      “Richard S Courtney | October 6, 2010 at 6:07 am | Reply Ms Curry:
      Dr Curry:

      Thank you for your thoughtful and informative post.

      In my opinion, your most cogent point is:

      “Particularly for a model of a complex system, the notion of a correct or incorrect model is not well defined, and falsification is not a relevant issue. The relevant issue is how well the model reproduces reality, i.e. whether the model “works” and is fit for its intended purpose.”

      However, in the case of climate models it is certain that they do not reproduce reality and are totally unsuitable for the purposes of future prediction (or “projection”) and attribution of the causes of climate change.

      All the global climate models and energy balance models are known to provide indications which are based on an assumed degree of forcing resulting from human activity, with anthropogenic aerosol cooling input to each model as a ‘fiddle factor’ to obtain agreement between past average global temperature and the model’s indications of average global temperature. This ‘fiddle factor’ is wrongly asserted to be parametrisation.

      A decade ago I published a peer-reviewed paper that showed the UK’s Hadley Centre general circulation model (GCM) could not model climate and only obtained agreement between past average global temperature and the model’s indications of average global temperature by forcing the agreement with an input of assumed anthropogenic aerosol cooling.

      And my paper demonstrated that the assumption of anthropogenic aerosol effects being responsible for the model’s failure was incorrect.

      (ref. Courtney RS ‘An assessment of validation experiments conducted on computer models of global climate using the general circulation model of the UK’s Hadley Centre’ Energy & Environment, Volume 10, Number 5, pp. 491-502, September 1999).

      More recently, in 2007, Kiehl published a paper that assessed 9 GCMs and two energy balance models.

      (ref. Kiehl JT, Twentieth century climate model response and climate sensitivity. GRL vol. 34, L22710, doi:10.1029/2007GL031383, 2007).

      Kiehl found the same as my paper except that each model he assessed used a different aerosol ‘fix’ from every other model.

      He says in his paper:

      “One curious aspect of this result is that it is also well known [Houghton et al., 2001] that the same models that agree in simulating the anomaly in surface air temperature differ significantly in their predicted climate sensitivity. The cited range in climate sensitivity from a wide collection of models is usually 1.5 to 4.5 deg C for a doubling of CO2, where most global climate models used for climate change studies vary by at least a factor of two in equilibrium sensitivity.

      The question is: if climate models differ by a factor of 2 to 3 in their climate sensitivity, how can they all simulate the global temperature record with a reasonable degree of accuracy?

      Kerr [2007] and S. E. Schwartz et al. (Quantifying climate change–too rosy a picture?, available at http://www.nature.com/reports/climatechange ) recently pointed out the importance of understanding the answer to this question. Indeed, Kerr [2007] referred to the present work and the current paper provides the ‘‘widely circulated analysis’’ referred to by Kerr [2007]. This report investigates the most probable explanation for such an agreement. It uses published results from a wide variety of model simulations to understand this apparent paradox between model climate responses for the 20th century, but diverse climate model sensitivity.”

      And Kiehl’s paper says:

      “These results explain to a large degree why models with such diverse climate sensitivities can all simulate the global anomaly in surface temperature. The magnitude of applied anthropogenic total forcing compensates for the model sensitivity.”

      And the “magnitude of applied anthropogenic total forcing” is fixed in each model by the input value of aerosol forcing.

      Kiehl’s Figure 2 can be seen at http://img36.imageshack.us/img36/8167/kiehl2007figure2.png
      Please note that it is for 9 GCMs and 2 energy balance models, and its title is:
      “Figure 2. Total anthropogenic forcing (W/m^2) versus aerosol forcing (W/m^2) from nine fully coupled climate models and two energy balance models used to simulate the 20th century.”

      The graph shows that the anthropogenic forcings used by the models span a large range of total anthropogenic forcing, from 1.22 W/m^2 to 2.02 W/m^2, with each of these values compensated to agree with observations by use of assumed anthropogenic aerosol forcing in the range -0.6 W/m^2 to -1.42 W/m^2. In other words, the total anthropogenic forcings used by the models vary by a factor of almost 2, and this difference is compensated by assuming values of anthropogenic aerosol forcing that vary by a factor of almost 2.4.

      Anything can be adjusted to hindcast observations by permitting that range of assumptions. But there is only one Earth, so at most one of the models can approximate the climate system which exists in reality.

      The underlying problem is that the modellers assume additional energy content in the atmosphere will result in an increase of temperature, but that assumption is very, very unlikely to be true.

      Radiation physics tells us that additional greenhouse gases will increase the energy content of the atmosphere. But energy content is not necessarily sensible heat.

      An adequate climate physics (n.b. not radiation physics) would tell us how that increased energy content will be distributed among all the climate modes of the Earth. Additional atmospheric greenhouse gases may heat the atmosphere, they may have an undetectable effect on heat content, or they may cause the atmosphere to cool.

      The latter could happen, for example, if the extra energy went into a more vigorous hydrological cycle with resulting increase to low cloudiness. Low clouds reflect incoming solar energy (as every sunbather has noticed when a cloud passed in front of the Sun) and have a negative feedback on surface temperature.

      Alternatively, there could be an oscillation in cloudiness (in a feedback cycle) between atmospheric energy and hydrology: as the energy content cycles up and down with cloudiness, then the cloudiness cycles up and down with energy with their cycles not quite 180 degrees out of phase (this is analogous to the observed phase relationship of insolation and atmospheric temperature). The net result of such an oscillation process could be no detectable change in sensible heat, but a marginally observable change in cloud dynamics.

      However, nobody understands cloud dynamics so the reality of climate response to increased GHGs cannot be known.

      So, the climate models are known to be wrong, and it is known why they are wrong: i.e.

      1. they each emulate a different climate system and are each differently adjusted by use of ‘fiddle factors’ to get them to match past climate change,

      2. and the ‘fiddle factors’ are assumed (n.b. not “estimated”) forcings resulting from human activity,

      3. but there is only one climate system of the Earth so at most only one of the models can be right,

      4. and there is no reason to suppose any one of them is right,

      5. but there is good reason to suppose that they are all wrong because they cannot emulate cloud processes which are not understood.

      Hence, use of the models is very, very likely to provide misleading indications of future prediction (or “projection”) of climate change and is not appropriate for attribution of the causes of climate change.

      Richard”

      Richard

    • THREAD on 17/10/2010 at 6:34 am said:

      Climate Modeling Under Fire From Other Fields – nuclear, chemical, aeronautics, etc.

      https://www.climateconversation.org.nz/2010/10/filmed-for-free-but-for-nothing/#comment-25129

    • THREAD on 17/10/2010 at 6:40 am said:

      A SIMULATION that is NOT SIMILAR to the observed condition is NOT a SIMULATION.

      https://www.climateconversation.org.nz/2010/09/seventy-years-is-plenty/#comment-24665

    • THREAD on 18/10/2010 at 6:18 pm said:

      Shock! Climate models can’t even predict a linear rise – JoNova

      http://joannenova.com.au/2010/10/shock-climate-models-cant-even-predict-linear-rise/

    • Richard C (NZ) on 19/10/2010 at 12:12 am said:

      Comments are a must-read – they build up to some good stuff circa #100

    • THREAD on 18/10/2010 at 7:40 pm said:

      Feedbacks

      Peer Reviewed Study: CO2 warming effect cut by 65%, climate sensitivity impossible to accurately determine

      http://wattsupwiththat.com/2010/10/12/peer-reviewed-study-co2-warming-effect-cut-by-65-climate-sensitivity-impossible-to-accurately-determine/#comment-505817

    • THREAD on 20/10/2010 at 4:17 pm said:

      How to Lie with Bad Data
      Richard D. De Veaux and David J. Hand

      “To create accurate models, however, it can be important to know the source and therefore the accuracy of the measurements. Consider a study of the effect of ocean bottom topography on sea ice formation in the southern oceans (De Veaux, Gordon, Comiso and Bacherer, 1993). After learning that wind can have a strong effect on sea ice formation, the statistician, wanting to incorporate this predictor into a model, asked one of the physicists whether any wind data existed. It was difficult to imagine very many Antarctic stations with anemometers and so he was very surprised when the physicist replied, “Sure, there’s plenty of it.” Excitedly he asked what spatial resolution he could provide. When the physicist countered with “what resolution do you want?” the statistician became suspicious. He probed further and asked if they really had anemometers set up on a 5 km grid on the sea ice. He said, “Of course not. The wind data come from a global weather model—I can generate them at any resolution you want!” It turned out that all the other satellite data had gone through some sort of preprocessing before it was given to the statistician. Some were processed actual direct measurements, some were processed through models and some, like the wind, were produced solely from models. Of course, this (as with curbstoning) does not necessarily imply that the resulting data are bad, but it should at least serve to warn the analyst that the data may not be what they were thought to be.

      Each of these different mechanisms for data distortions has its own set of detection and correction challenges. Ensuring good data collection through survey and/or experimental design is certainly an important first step. A bad design that results in data that are not representative of the phenomenon being studied can render even the best analysis worthless. At the next step, detecting errors can be attempted in a variety of ways, a topic to which we will return in Section 4.”

    • THREAD on 21/10/2010 at 10:20 pm said:

      Climate Model Deception – See It for Yourself

    • Richard C (NZ) on 22/10/2010 at 3:25 pm said:

      McKitrick, McIntyre, and Herman 2010 [MMH10]

      “It’s a big step forward in the search for the hot-spot. (If the hot spot were a missing person, McKitrick et al have sighted a corpse.)

      In 2006 the CCSP quietly admitted with graphs (in distant parts of various reports) that the models were predicting a hot spot that the radiosondes didn’t find (Karl et al 2006).”

      Joanne Nova

      A MUST READ

    • Richard C (NZ) on 22/10/2010 at 3:34 pm said:
    • THREAD on 22/10/2010 at 8:46 pm said:

      Model projections for 2100 in perspective

    • Andy on 22/10/2010 at 10:33 pm said:

      There’s a video presentation of Prof Mike Hulme talking about climate models

      Challenging Models in the Face of Uncertainty – Conference Keynote: Professor Mike Hulme (UEA): How do Climate Models Gain and Exercise Authority?

      http://www.crassh.cam.ac.uk/page/195/media-gallery.htm

      h/t Bishop Hill, who has a thread on this here:

      http://bishophill.squarespace.com/blog/2010/10/22/mike-hulme-on-climate-models.html

      The comments at Bishop Hill are usually enlightening, sometimes entertaining.

    • Richard C (NZ) on 23/10/2010 at 9:47 am said:

      Which is why we should be watching NIWA’s use of their UKMO UM like hawks – I know I am, and hopefully that realization has set in with Dr David Wratt now that he knows I can’t be fobbed off like some ignoramus.

      I’ll be putting UM info and links in “Climate” – “Climate Models” periodically along with NASA GISS ModelE and a few others, but I suggest we get expert in UM.

      That means: parameterizations, spin-up datasets (garbage in), formulations and the general functionality and limitations IMO. The info is VERY hard to obtain as compared to ModelE so this is no easy task.

      Remember, UM was the model that predicted the “barbecue summer” in Britain and NIWA predicted a “mild” spring without the assistance of UM.

    • Richard C (NZ) on 23/10/2010 at 9:51 am said:

      Just to give an indication of what other people are doing elsewhere.

      Steve Mosher has trawled through EVERY LINE of GISS ModelE.

      100,000+ lines!

    • Richard C (NZ) on 24/10/2010 at 6:28 pm said:

      Gavin Schmidt (GISS) in defensive mode here.

      Scroll down to “Some comments at RealClimate on models”

    • Richard C (NZ) on 25/10/2010 at 11:16 am said:

      See – IPCC AR4 Climate Model Simulations

    • Richard C (NZ) on 30/10/2010 at 9:50 am said:

      See – Climate Models

      NON IPCC and Natural Forcings ONLY

      Atmospheric & Environmental Research, Inc.’s (AER)
      Radiative Transfer Working Group

      The foundation of our research and model development is the validation of line-by-line radiative transfer calculations with accurate high-resolution measurements.

  3. THREAD on 16/10/2010 at 8:17 pm said:

    CLOUDS in CLIMATE MODELS

    • THREAD on 17/10/2010 at 6:01 am said:

      The Effect of Clouds on Climate: A Key Mystery for Researchers

      (Warm-biased article – but readable nevertheless)

      http://e360.yale.edu/feature/the_effect_of_clouds_on_climate_a_key_mystery_for_researchers/2313/

    • Richard C (NZ) on 18/10/2010 at 1:13 pm said:

      Some relevant links here:

      https://www.climateconversation.org.nz/2010/10/royal-society-humiliated/#comment-26206

      Also:

      “Does CO2 Drive the Earth’s Climate System? Comments on the Latest NASA GISS Paper” – Dr Roy Spencer.

      http://www.drroyspencer.com/2010/10/does-co2-drive-the-earths-climate-system-comments-on-the-latest-nasa-giss-paper/

    • Richard C (NZ) on 18/10/2010 at 11:49 pm said:

      Top link above leads to:

      “Comments on Miskolczi’s (2010) Controversial Greenhouse Theory”

      The classic Spencer – Miskolczi show-down at the Spencer corral – amazing

      “Our JGR Paper on Feedbacks is Published”

      Spencer, one of THE BIG papers – a must read.

      “The Persistence of Paradigms” – Cloud Feedbacks

      Spencer

      “Five Reasons Why Water Vapor Feedback Might Not Be Positive”

      Spencer – brilliant!

      “The saturated greenhouse effect theory of Ferenc Miskolczi”

      A very significant and controversial paper – an unfinished story

    • Richard C (NZ) on 18/10/2010 at 11:59 pm said:

      Comment On The Science Paper “Atmospheric CO2: Principal Control Knob Governing Earth’s Temperature” By Lacis Et Al 2010 (NASA GISS)

      Pielke

      http://pielkeclimatesci.wordpress.com/2010/10/15/comment-on-the-science-paper-atmospheric-co2-principal-control-knob-governing-earth%E2%80%99s-temperature-by-lacis-et-al-2010/

    • Richard C (NZ) on 19/10/2010 at 2:47 am said:

      “Does CO2 Drive the Earth’s Climate System? Comments on the Latest NASA GISS Paper” – Dr Roy Spencer.

      I asked Richard S. Courtney to comment on my analysis that is posted here:

      https://www.climateconversation.org.nz/2010/10/royal-society-humiliated/#comment-26206

      Richard Courtney replied at JoNova, comment # 141
      here:

      http://joannenova.com.au/2010/10/shock-climate-models-cant-even-predict-linear-rise/#comments

      [He is on fire versus oh dear et al]

      My question:

      BTW, I would really value your observations on this:

      https://www.climateconversation.org.nz/2010/10/royal-society-humiliated/#comment-26206

      Not the RS post, but the O/T discussion brought about by Dr Roy Spencer’s post:

      “Does CO2 Drive the Earth’s Climate System? Comments on the Latest NASA GISS Paper”

      http://www.drroyspencer.com/2010/10/does-co2-drive-the-earths-climate-system-comments-on-the-latest-nasa-giss-paper/

      i.e. Is my analysis way off?

      His answer:

      It is spot on and not “way off”.

      He goes on to make an extensive analysis of the GISS paper (way more than I requested – thank you, Richard Courtney):

      [The following is better viewed at JoNova]

      Please see my post at #60 above.

      I there quoted (and referenced) Kiehl reporting:

      The cited range in climate sensitivity from a wide collection of models is usually 1.5 to 4.5 deg C for a doubling of CO2, where most global climate models used for climate change studies vary by at least a factor of two in equilibrium sensitivity.

      It shows that the applied forcing in each model is an assumed (n.b. not estimated) anthropogenic effect. But, in each case, the assumed anthropogenic forcing is very large.

      As I said at #60,

      The graph shows that the anthropogenic forcings used by the models span a large range of total anthropogenic forcing, from 0.8 W/m^2 to 2.02 W/m^2, with each of these values compensated to agree with observations by use of assumed anthropogenic aerosol forcing in the range -0.6 W/m^2 to -1.42 W/m^2. In other words, the total anthropogenic forcings used by the models vary by a factor of over 2.5, and this difference is compensated by assuming values of anthropogenic aerosol forcing that vary by a factor of almost 2.4.

      And, as Kiehl reports, the models use a variety of climate sensitivities from 1.5 to 4.5 deg C for a doubling of CO2 to get this large range of assumed anthropogenic forcings. But there is good reason to consider that the real climate sensitivity is much lower. For example, Idso snr. reports his 8 natural experiments that indicate a “best estimate” of climate sensitivity of 0.10 C/W/m^2, which corresponds to a temperature increase of 0.37 Celsius for a doubling of CO2.
      His summary can be read at http://members.shaw.ca/sch25/FOS/Idso_CO2_induced_Global_Warming.htm
      and lists a range, obtained from his 8 natural experiments, of 0.1 to 0.173 C/W/m^2.
      And there is a link from that URL to his paper in Climate Research (1998).
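
      [Note – the arithmetic behind that 0.37 Celsius figure appears to assume the commonly quoted ~3.7 W/m^2 of forcing for a CO2 doubling: ΔT(2×CO2) ≈ 0.10 C/W/m^2 × 3.7 W/m^2 ≈ 0.37 C.]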

      So, nobody knows what climate sensitivity really is.
      The models use a wide range of assumed climate sensitivities.
      The lowest assumed climate sensitivity used by a model is probably too large by about an order of magnitude.
      And on the basis of that, the modellers assert that climate sensitivity is large.
      Their assertion is a purely circular argument.
      It is the logical fallacy of arguing the affirmative.
      And it is a clear example of pure pseudoscience.

      I hope this answer is sufficient. [Yes, more-so]

      Richard [Richard S. Courtney]

    • Mike Jowsey on 01/11/2010 at 11:53 pm said:

      Richards S. Courtenay rocks

      Love this one:
      “So, the difference between a model’s results and observed reality informs about the model, and this difference is not “evidence” for the existence or otherwise of any postulated effect – for example, anthropogenic global warming – in the real climate system.
      If you cannot grasp this simple point then you should consider the following. Computer models based on fundamental physical laws can very accurately emulate the behaviours of battling spaceships, but this cannot provide any “evidence” for the existence of alien monsters in real space.”

    • Mike Jowsey on 01/11/2010 at 11:55 pm said:

      Oops – apologies for mis-spelling name : hate it when that happens! S/b Richard S. Courtney

    • Andy on 02/11/2010 at 9:06 am said:

      I’d be getting a bit worried about this particular computer model:

      Climate change game launched

      An educational computer game in which users have to save the world from climate change offers an interesting solution – decide the problem is overpopulation and design a virus to kill millions.

      http://www.telegraph.co.uk/earth/environment/climatechange/8101281/Climate-change-game-launched.html

      Fancy a bit of genocide in your stocking this Christmas, Johnny?

    • Mike Jowsey on 02/11/2010 at 9:31 am said:

      OMG – this is for real! Reminds me of Hitler Youth camps – indoctrinate, indoctrinate, indoctrinate. THE END JUSTIFIES THE MEANS!

    • Andy on 02/11/2010 at 9:49 am said:

      Mike,
      Sounds like you and Richard North are reading the same messages into this:

      http://eureferendum.blogspot.com/2010/11/hitler-youth.html

    • Richard C (NZ) on 02/11/2010 at 10:01 am said:

      Is this another own goal like 10:10?

      I’m putting this in “Controversy and scandal” anyway.

    • Richard C (NZ) on 21/10/2010 at 7:06 pm said:

      From JoNova

      Author: allen mcmahon
      Comment:
      John@96;

      An argument is not automatically good because it supports your position; you actually need to sift the useful from the rubbish.

      A good point, but one that seems to be lost on many who support the AGW hypothesis. One could look at many issues over the past decade or so – such as ice cores, the combined Arctic/Antarctic ice extent, the ‘hot spot’, paleo evidence or lack thereof, the failure of model forecasts – where reason suggests that the hypothesis has failed. But this has not happened; the general fallback position is not evidence-based but relies heavily on the manipulation and distortion of existing data coupled with the latest, and generally more alarming, model scenarios.
      Coupled with this is the extension of the doomsday time frame: for example, Keenlyside 2008 suggests AGW will be back with a vengeance in 2015, while Tsonis 2009 opts for 2020. The discourse from the AGW camp has changed as well, and a good example of this is Michael Tobis, who has gone from “act now or your children suffer” to “act within the next couple of decades or your grandchildren will suffer”. The cynic in me suggests that the main players are providing enough wiggle room to remain relevant, and employed, until retirement.
      What I find interesting is that the evidence that casts serious doubt on – I would say refutes – the AGW hypothesis is found within the IPCC reports.
      There are many examples, but for brevity I will choose only one: aerosols. The most difficult problem for the hypothesis is the cooling from the 1940s to the 1970s, and the solution for the GCMs is the cooling properties of aerosols; the weighting given to aerosols is also a major factor in the differences between models. In the wonderful world of GCMs, without the infinite elasticity of aerosols the model output would not hindcast as well as it does. When we go to the font of all knowledge, the IPCC, we find that the understanding of aerosols is classified as poor. How can one have >90% confidence in the evils of CO2 when a key factor that could lead to this conclusion is poorly understood? To my mind it suggests the modelers are basically guessing, and it becomes worse when you look at clouds and at the failure of the models to incorporate natural cycles such as the PDO/AMO. Even for cycles that are better understood, such as El Nino/La Nina events, the models are split roughly 50/50 on which will dominate this century. The increased frequency and intensity of El Nino events are a feature of the more apocalyptic model scenarios, but the few models that approximate the timing of El Nino events reasonably well overestimate the intensity and duration substantially.
      Should you wish to investigate or dispute any of these claims, I can provide you with a list of peer-reviewed articles, by scientists and in journals that the most ardent AGW supporter would find acceptable.

      See all comments on this post here:
      http://joannenova.com.au/2010/10/is-the-western-climate-establishment-corrupt-part-8-do-most-western-climate-scientists-believe-global-warming-is-man-made/#comments

    • THREAD on 29/10/2010 at 9:43 am said:

      An Update on Radiative Transfer Model Development at Atmospheric and Environmental Research, Inc.

      J. S. Delamere, S. A. Clough, E. J. Mlawer, Sid-Ahmed Boukabara, K. Cady-Pereira, and M. Shepard, Atmospheric and Environmental Research, Inc., Lexington, Massachusetts

      Introduction
      Over the last decade, a suite of radiative transfer models has been developed at Atmospheric and Environmental Research, Inc. (AER) with support from the Atmospheric Radiation Measurement (ARM) Program. These models span the full spectral regime from the microwave to the ultraviolet, and range from monochromatic to band calculations. Each model combines the latest spectroscopic advancements with radiative transfer algorithms to efficiently compute radiances, fluxes, and cooling rates. These models have been extensively validated against high-resolution spectral measurements and broadband irradiance measurements. Several of these models are part of the broadband heating rate profile value-added product (BBHRP VAP), currently being established at the ARM Southern Great Plains (SGP) site.

      A Web site has been established to host the AER radiative transfer models (http://rtweb.aer.com). The Web site facilitates access to the models and is a convenient platform on which to provide model updates.

      Also see – “Atmospheric Thermodynamics and Heat”

      Radiative Transfer Climate Models – Google Search

    • Richard C (NZ) on 29/10/2010 at 10:42 am said:

      Radiative Transfer Climate Models – Google Search

    • Richard C (NZ) on 29/10/2010 at 11:24 am said:

      Effects of bias in solar radiative transfer codes on global climate model simulations – Google Scholar Search

      Note: this is a better search than:

      radiative transfer codes global climate model simulations

      and contains for example;

      “An accurate parameterization of the infrared radiative properties of cirrus clouds for climate models” Yang 1998

    • Richard C (NZ) on 30/10/2010 at 1:33 pm said:

      See – “Climate Models”

      NON IPCC and Natural Forcings ONLY

      Cloud Resolving Model (CRM)

      Superparameterization

  4. THREAD on 16/10/2010 at 8:19 pm said:

    CLIMATE SCIENCE PAPERS

  5. THREAD on 16/10/2010 at 8:21 pm said:

    CLIMATE SCIENTISTS

  6. THREAD on 16/10/2010 at 8:22 pm said:

    CLIMATE SCIENCE INFORMATION RESOURCES

  7. THREAD on 17/10/2010 at 5:23 am said:

    Climate Change: Natural Cycles, Phenomena, NH v SH and Weather

  8. THREAD on 17/10/2010 at 6:18 am said:

    The “Missing Heat”, Model Simulations, and Knox-Douglass 2010

    https://www.climateconversation.org.nz/2010/09/seventy-years-is-plenty/#comment-24668

  9. THREAD on 18/10/2010 at 6:36 pm said:

    Overconfidence in IPCC’s detection and attribution: Part I

    http://judithcurry.com/

  10. val majkus on 19/10/2010 at 9:26 pm said:

    Might I suggest all sympathetic contributors visit WUWT Announcements and wish him and his wife well; his wife has to undergo an operation and I’m sure he will read all comments at some stage and will be glad of each
    http://wattsupwiththat.com/2010/10/18/announcement/#comment-510941
    Anthony says ‘As a result, I’ll be pretty much offline this week. But there is good news in all of this.
    WUWT has a life of its own now. It will continue without my help, thanks to the tireless work of guest authors, our moderation team, and of course the tips and notes brought in by our volume of readers.’

    What dedication! And it deserves our appreciation

  11. val majkus on 01/11/2010 at 10:47 am said:

    Here’s the inimitable Dr Tim Ball on data manipulation
    http://canadafreepress.com/index.php/article/28360
    “Data collection is expensive and requires continuity – it’s a major role for government. They fail with weather data because money goes to political climate research. A positive outcome of corrupted climate science exposed by Climategate is re-examination beginning with raw data by the UK Met Office (UKMO). This is impossible because much is lost, thrown out after modification or conveniently lost, as in the case of records held by Phil Jones, the CRU director at the centre of Climategate. (Here and Here)

    Evidence of manipulation and misrepresentation of data is everywhere. Countries maintain weather stations and adjust the data before it’s submitted through the World Meteorological Organization (WMO) to the central agencies including the Global Historical Climatology Network (GHCN), the Hadley Center associated with CRU now called CRUTEM3, and NASA’s Goddard Institute for Space Studies (GISS).

    They make further adjustments before selecting stations to produce their global annual average temperature. This is why they produce different measures each year from supposedly similar data.

    There are serious concerns about data quality. The US spends more than others on weather stations, yet their condition and reliability are simply atrocious. Anthony Watts has documented the condition of US weather stations; it is one of government’s failures.” (end of quote)

    And Ken Stewart has shown how this has happened in Australia: http://kenskingdom.wordpress.com/2010/07/27/the-australian-temperature-record-part-8-the-big-picture/
    His conclusion:
    In a commendable effort to improve the state of the data, the Australian Bureau of Meteorology (BOM) has created the Australian High-Quality Climate Site Network. However, the effect of doing so has been to introduce into the temperature record a warming bias of over 40%. And their climate analyses on which this is based appear to increase this even further to around 66%.

    Has anyone done similar checking in New Zealand?
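
    For anyone taking that question up, the arithmetic behind a figure like Stewart’s “warming bias of over 40%” is just a comparison of linear trends in the raw and adjusted series. A minimal sketch follows; the two series in it are fabricated placeholders, and real raw and homogenised annual means would be substituted:

    import numpy as np

    # Trend comparison behind a "warming bias" percentage. The series are
    # fabricated placeholders; swap in real raw/adjusted annual means.
    years = np.arange(1910, 2011)
    rng = np.random.default_rng(0)
    noise = rng.normal(0.0, 0.25, years.size)          # interannual scatter
    raw = 0.0060 * (years - years[0]) + noise          # ~0.60 C/century signal
    adjusted = 0.0085 * (years - years[0]) + noise     # ~0.85 C/century signal

    def trend_per_century(series, t):
        """Least-squares linear trend in degrees C per 100 years."""
        return np.polyfit(t, series, 1)[0] * 100.0

    t_raw = trend_per_century(raw, years)
    t_adj = trend_per_century(adjusted, years)
    print(f"raw trend:      {t_raw:.2f} C/century")
    print(f"adjusted trend: {t_adj:.2f} C/century")
    print(f"adjustments add {100.0 * (t_adj - t_raw) / t_raw:.0f}% to the raw trend")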

  12. val majkus on 16/05/2011 at 11:39 pm said:

    For Australians:

    Here’s a copy of an email I received today from a group set up with Alan Jones as patron and some very eminent advisors, including Warwick Hughes, and another media personality, Andrew Bolt.
    I heartily endorse the site; it’s time to get the message that people like Warwick have been researching for so long before the public and into the media,

    so I hope you can donate and encourage this effort

    and here’s the email:

    It’s time: an opportunity to support true scientists presenting real-world climate science to inform the public.

    http://www.galileomovement.com.au

    7:10am Tuesday morning, May 17th, and Wednesday, May 18th tune into radio 2GB, Sydney: http://www.2gb.com

    16 May, 2011
    Hi,
    We’re delighted to be able to introduce you to the web site of the newly-formed Galileo Movement (www.galileomovement.com.au). The principal aim of this Movement is to first win the battle against the currently threatened tax on carbon dioxide and then to win the war against any drive for ever putting a price on it.
    From recent public polls it is obvious that the majority of Australians are opposed to this tax and we believe you are probably part of that majority.
    Our efforts are non-political as there are politicians in all parties who need to be convinced of the futility of taxing a beneficial trace gas in the atmosphere. So many people still think that carbon dioxide is a harmful pollutant; we need your help to educate these people that this is not so! We have strong connections to excellent scientific and other advisers worldwide. We’re ready.
    How can you help? Well, on the web site there is a printable flyer which you can print and distribute into letter boxes, at bus stops, shopping centres and railway stations; it is at http://www.galileomovement.com.au/threat_freedom.php#O and click on “flyer”.
    More importantly, we intend to run a major, national, professionally-managed campaign to gain access to significant members of the mass media (print journalists, radio and TV personalities) who need to understand that the Intergovernmental Panel on Climate Change (IPCC) has been spreading truly false information. This campaign will be costly, so we are appealing to you and all like-minded people to make a contribution towards this effort. You will note from our web site that virtually all our efforts are voluntary; there are some minor costs in maintaining formal financial records and in producing and maintaining the web site. Once we have won the war, any remaining funds are destined to go to the Royal Flying Doctor Service.
    Please distribute this email widely to make people aware of http://www.galileomovement.com.au as this may be our last opportunity to unite to defeat this negative, economic and socially life-changing legislation. It is time to act: “The triumph of evil requires only for good men (and women) to do nothing”.
    Thank you,

    John Smeed, Director
    Case Smit, Director
    Malcolm Roberts, Project Manager
    P.S. Our apologies if you receive more than one copy of this email.
    P.P.S. Please let us have your suggestions for enhancing our web site.
    P.P.P.S. Listen to Australia’s most influential radio personality Alan Jones of Radio 2GB (www.2gb.com), and Patron of the Galileo Movement, who will be interviewing distinguished American meteorologist Professor Lindzen at 7:10am on Tuesday (May 17th) and, at 7:10am on Wednesday, the Galileo Movement’s voluntary Project Manager, Malcolm Roberts.

    Malcolm Roberts
    BE (Hons), MBA (Chicago)
    Fellow AICD, MAIM, MAusIMM, MAME (USA), MIMM (UK), Fellow ASQ (USA, Aust)

    http://www.conscious.com.au

    My personal declaration of interests is at:
    http://www.conscious.com.au/__documents/additional%20material/Personal%20declaration%20of%20interests.pdf
    (manually go to http://www.conscious.com.au and look for ‘Summaries’ and then click on ‘Aims, background and declaration of interests …’)

    180 Haven Road
    Pullenvale QLD 4069
    Phone:
    Home 07 3374 3374
    Mobile 04 1964 2379
    E-mail: catalyst@eis.net.au

    Please note: Apart from suburb and state, my contact details are not for publication nor broadcasting and are provided only for your own personal use to respond.

    For care to be effective, care needs to be informed

  13. THREAD on 16/11/2011 at 7:59 am said:

    Climate Extremes and Extreme Weather

    • Richard C (NZ) on 16/11/2011 at 8:06 am said:

      Climate Extremes and Global Warming

      By ANDREW C. REVKIN – Dot Earth

      […]

      Passions are heightened by extraordinary recent climate-related disasters and concerns in poor countries that they’re already being affected by a greenhouse-gas buildup mainly caused (so far) by rich countries.

      But despite decades of work on greenhouse-driven warming, research aimed at clarifying how greenhouse-driven global warming will affect the rarest categories of raging floods, searing droughts and potent storms is both limited and laden with uncertainty.

      […]

      See Q and A with Chris Field, a leader of the panel’s [IPCC] Working Group 2 “Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation (SREX)”

      >>>>>>>>

      http://dotearth.blogs.nytimes.com/2011/11/15/closeup-climate-extremes-and-global-warming/

    • Richard C (NZ) on 16/11/2011 at 8:19 am said:

      Details and reaction to the IPCC Special Report:

      “Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation (SREX)”

      can be found in this (off-topic) thread under the “‘Monster’ increase in emissions” post starting here:-

      https://www.climateconversation.org.nz/2011/11/monster-increase-in-emissions/#comment-70844

    • Richard C (NZ) on 18/11/2011 at 7:42 am said:

      Review fails to support climate change link

      * by: Graham Lloyd, Environment editor
      * From: The Australian
      * November 18, 2011 12:00AM

      WIDELY-HELD assumptions that climate change is responsible for an upsurge in extreme drought, flood and storm events are not supported by a landmark review of the science.

      And a clear climate change signal would not be evident for decades because of natural weather variability.

      Despite the uncertainties, politicians – including US President Barack Obama in his address to federal parliament yesterday – continue to link major weather events directly to climate change.

      Greens leader Bob Brown yesterday highlighted Mr Obama’s climate change comments and said the extreme weather impacts were “not just coming, they are happening”.

      But rather than bolster claims of a climate change link, the scientific review prepared by the world’s leading climate scientists for the UN’s Intergovernmental Panel on Climate Change highlights the level of uncertainty. After a week of debate, the IPCC will tonight release a summary of the report in Kampala, Uganda, as a prelude to the year’s biggest climate change conference, being held in Durban, South Africa.

      The full report will not be released for several months, but a leaked copy of the draft summary, details of which have been published by the BBC and a French news agency, has provided a good indication of what it found.

      While the human and financial toll of extreme weather events has certainly risen, the rise has been due mostly to increased human settlement rather than worse weather.

      >>>>>>>>

      http://www.theaustralian.com.au/national-affairs/climate/review-fails-to-support-climate-change-link/story-e6frg6xf-1226198360121

    • Richard C (NZ) on 18/11/2011 at 6:44 pm said:

      IPCC scientists test the Exit doors

      RE: Mixed messages on climate ‘vulnerability’. Richard Black, BBC.

      AND UPDATED: The Australian reports the leaked IPCC review, AND a radio station just announced it as “IPCC says we don’t know if there is a reason for the carbon tax”. See more below.
      ———————————-
      This is another big tipping point on the slide out of the Great Global Scam. IPCC scientists — facing the travesty of predictions-gone-wrong — are trying to salvage some face, and plant some escape-clause seeds for later. But people are not stupid.

      A conveniently leaked IPCC draft is testing the ground. What excuses can they get away with? Hidden underneath some pat lines about how anthropogenic global warming is “likely” to influence… ah cold days and warm days, is the get-out-of-jail clause that’s really a bombshell:

      “Uncertainty in the sign of projected changes in climate extremes over the coming two to three decades is relatively large because climate change signals are expected to be relatively small compared to natural climate variability”.

      http://joannenova.com.au/2011/11/ipcc-scientists-test-the-exit-doors/
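
      That quoted sentence is a signal-to-noise statement, and its force is easy to see with a back-of-envelope simulation: when the forced trend is small against year-to-year variability, even the sign of a fitted 30-year trend is uncertain. A minimal sketch; the trend and noise values are assumed purely for illustration:

      import numpy as np

      # How often does natural variability flip the sign of a 30-yr trend?
      # The trend/noise numbers are illustrative assumptions, not IPCC values.
      rng = np.random.default_rng(42)
      trend = 0.01         # assumed forced trend in some extremes index, per yr
      sigma = 0.30         # assumed interannual standard deviation, same units
      years = np.arange(30)

      n_trials = 10_000
      wrong_sign = 0
      for _ in range(n_trials):
          series = trend * years + rng.normal(0.0, sigma, years.size)
          if np.polyfit(years, series, 1)[0] < 0:    # fitted slope negative?
              wrong_sign += 1

      print(f"30-yr signal / interannual noise: {trend * 30 / sigma:.1f}")
      print(f"windows with wrong-sign trend:    {wrong_sign / n_trials:.1%}")

      With these assumed numbers roughly one 30-year window in twenty yields a trend of the wrong sign, which is exactly the “uncertainty in the sign” the draft concedes.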

    • Richard C (NZ) on 07/12/2011 at 9:09 am said:

      Common link in extreme weather events found – and no, it isn’t AGW

      Posted on December 5, 2011 by Anthony Watts

      From the University of Wisconsin-Madison something you’ll never see posted on Climate Progress or mentioned by weepy Bill McKibben because it mellows their harshness

      Global winds could explain record rains, tornadoes

      MADISON – Two talks at a scientific conference this week will propose a common root for an enormous deluge in western Tennessee in May 2010, and a historic outbreak of tornadoes centered on Alabama in April 2011.

      Both events seem to be linked to a relatively rare coupling between the polar and the subtropical jet streams, says Jonathan Martin, a University of Wisconsin-Madison professor of atmospheric and oceanic sciences.

      But the fascinating part is that the change originates in the western Pacific, about 9,000 miles away from the intense storms in the U.S. midsection, Martin says.

      The mechanism that causes the storms originates during spring or fall when organized complexes of tropical thunderstorms over Indonesia push the subtropical jet stream north, causing it to merge with the polar jet stream.

      The subtropical jet stream is a high-altitude band of wind that is normally located around 30 degrees north latitude. The polar jet stream is normally hundreds of miles to the north.

      Martin calls the resulting band of wind a “superjet.”

      >>>>>>>>

      http://wattsupwiththat.com/2011/12/05/common-link-in-extreme-weather-events-found-and-no-it-isnt-agw/

    • Richard C (NZ) on 26/12/2011 at 7:30 am said:

      Looks like alarm will continue to focus on extreme weather in 2012 – a great new greenfield opportunity for climate science funding. Here’s something to watch for mid-year (a heads-up, if you will):-

      Harsh Political Reality Slows Climate Studies Despite Extreme Year

      By JUSTIN GILLIS
      Published: December 24, 2011

      At the end of one of the most bizarre weather years in American history, climate research stands at a crossroads.

      But for many reasons, efforts to put out prompt reports on the causes of extreme weather are essentially languishing. Chief among the difficulties that scientists face: the political environment for new climate-science initiatives has turned hostile, and with the federal budget crisis, money is tight.

      […page 2]

      Some steps are being taken. Peter A. Stott, a leading climate scientist in Britain, has been pressing colleagues on both sides of the Atlantic to develop a robust capability to analyze weather extremes in real time. He is part of a group that expects to publish, next summer, the first complete analysis of a full year of extremes, focusing on 2011.

      In an interview, Dr. Stott said the goal was to get to a point where “the methodologies are robust enough that you can do it in a kind of handle-turning way.”

      But he added that it was important to start slowly and establish a solid scientific foundation for this type of work. That might mean that some of the early analyses would not be especially satisfying to the public.

      “In some cases, we would say we have a confident result,” Dr. Stott said. “We may in some cases have to say, with the current state of the science, it’s not possible to make a reliable attribution statement at this point.”

      http://www.nytimes.com/2011/12/25/science/earth/climate-scientists-hampered-in-study-of-2011-extremes.html?pagewanted=1&_r=1
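
      For context on what an “attribution statement” means operationally: the standard metric in this line of work, used by Stott and colleagues for the 2003 European heatwave, is the fraction of attributable risk, FAR = 1 - P0/P1, where P0 and P1 are the probabilities of exceeding the observed extreme in modelled worlds without and with human forcing. A minimal sketch with toy Gaussian ensembles standing in for real model output:

      import numpy as np

      # Fraction of attributable risk (FAR) from two toy ensembles. The
      # distribution parameters and the threshold are assumptions; real
      # studies draw these samples from large "natural-only" and
      # "all-forcings" climate model ensembles.
      rng = np.random.default_rng(1)
      natural = rng.normal(14.0, 0.8, 100_000)   # e.g. summer-mean T, natural world
      forced = rng.normal(14.7, 0.8, 100_000)    # same quantity, forced world
      threshold = 16.0                           # the observed extreme

      p0 = np.mean(natural > threshold)          # exceedance prob., natural
      p1 = np.mean(forced > threshold)           # exceedance prob., forced
      print(f"P0 = {p0:.4f}, P1 = {p1:.4f}, FAR = {1.0 - p0 / p1:.2f}")

      With these toy numbers the FAR comes out near 0.9; the substantive argument is always over how far the two ensembles can be trusted, which is Stott’s point about establishing a solid foundation first.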

    • Richard C (NZ) on 26/12/2011 at 7:37 am said:

      Tom Nelson headlines the article a little differently:-

      Another lie from your New York Times: “the weather becomes more erratic by the year”

      http://tomnelson.blogspot.com/2011/12/another-lie-from-your-new-york-times.html

    • Richard C (NZ) on 08/01/2012 at 11:42 am said:

      Useful tutorials from Steven Goddard at Real Science

      Learning To Distinguish Between Low CO2 And High CO2 Droughts | Real Science

      Droughts on the left side of the blue line were below 350 ppm CO2, and droughts on the right side were above 350 ppm. Before you can distinguish between them, you need to accurately determine how many angels can dance on the head of a pin. Then you have to tell the press that droughts seem like they are getting worse.

      Learning To Distinguish Between Low CO2 Tornadoes And High CO2 Tornadoes | Real Science

      Tornadoes to the left of the pink line are below 350 PPM tornadoes, and tornadoes to the right are high CO2 (supercharged) tornadoes. Can you spot the difference?

      http://tomnelson.blogspot.com/2012/01/will-droughts-and-floods-eventually.html
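
      Goddard’s sarcasm has a testable core: if “high CO2” events really differed from “low CO2” ones, annual counts either side of the 350 ppm crossing (around 1988) should differ statistically. A minimal two-sample sketch; the counts are fabricated placeholders, and real drought or tornado series would replace them:

      import numpy as np
      from scipy import stats

      # Split an annual event-count series at the year CO2 passed 350 ppm
      # and test whether the two samples differ. Counts are fabricated
      # placeholders drawn from the same distribution.
      rng = np.random.default_rng(7)
      pre = rng.poisson(lam=12, size=40)     # annual counts, pre-350 ppm
      post = rng.poisson(lam=12, size=25)    # annual counts, post-350 ppm

      t_stat, p_val = stats.ttest_ind(pre, post, equal_var=False)  # Welch test
      print(f"means: pre {pre.mean():.1f}, post {post.mean():.1f}")
      print(f"Welch t = {t_stat:.2f}, p = {p_val:.2f}")

      A small p-value would indicate a detectable difference between the two eras; applied to real series, that is the test the satire is daring anyone to pass.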

  14. Richard C (NZ) on 06/04/2013 at 11:35 am said:

    Failed winter climate predictions

    (The first 33 concern mostly Germany and Central Europe)

    1. “Due to global warming, the coming winters in the local regions will become milder.”
    Stefan Rahmstorf, Potsdam Institute for Climate Impact Research, University of Potsdam, 8 Feb 2006

    ***

    2. “Milder winters, drier summers: Climate study shows a need to adapt in Saxony Anhalt.”
    Potsdam Institute for Climate Impact Research, Press Release, 10 Jan 2010.

    ****

    3. “More heat waves, no snow in the winter” … “Climate models… over 20 times more precise than the UN IPCC global models. In no other country do we have more precise calculations of climate consequences. They should form the basis for political planning. … Temperatures in the wintertime will rise the most … there will be less cold air coming to Central Europe from the east. … In the Alps winters will be 2°C warmer already between 2021 and 2050.”
    Max Planck Institute for Meteorology, Hamburg, 2 Sept 2008.

    ****
    More >>>>>>>> [59 so far]

    http://notrickszone.com/2013/04/04/climate-science-humiliated-earlier-model-prognoses-of-warmer-winters-now-todays-laughingstocks/

    • Richard C (NZ) on 06/04/2013 at 12:18 pm said:

      Matt Ridley’s diary: My undiscovered island, and the Met Office’s computer problem

      […]

      At least somebody’s happy about the cold. Gary Lydiate runs one of Northumberland’s export success stories, Kilfrost, which manufactures 60 per cent of Europe’s and a big chunk of the world’s aircraft de-icing fluid, so he puts his money where his mouth is, deciding how much fluid to send to various airports each winter. Back in January, when I bumped into him in a restaurant, he was beaming: ‘Joe says this cold weather’s going to last three months,’ he said. Joe is Joe Bastardi, a private weather forecaster, who does not let global warming cloud his judgment. Based on jetstreams, el Niños and ocean oscillations, Bastardi said the winter of 2011–12 would be cold only in eastern Europe, which it was, but the winter of 2012–13 would be cold in western Europe too, which it was. He’s now predicting ‘warming by mid month’ of April for the UK.

      More >>>>>

      http://www.spectator.co.uk/the-week/diary/8880591/diary-603/
