If you thought our cities were getting warmer, you’re right.
Bureau of Meteorology researchers have found that daytime temperatures in our cities are warming more rapidly than those of the surrounding countryside and that this is due to the cities themselves.
Bureau climate scientist Belinda Campbell said: “we’ve known for a while that city night time temperatures have been warmer because the heat’s retained after sunset just that much longer than the countryside, and that city daytime temperatures have been warming too.”
“But what we didn’t know was whether city day time temperatures were also warmer because of the urbanisation or whether it was due to the overall warming of the planet associated with the enhanced greenhouse effect.”
‘Their algorithm creates an imaginary hot spot at the North Pole…They appear to be using at least two steps of extrapolation/interpolation – which compounds error. In other words, their entire 21st century warming story is based on a defective interpretation of the Arctic.’
Much of the reported warming from the 20th century was due to upwards adjustments of recent temperatures, and downwards adjustments of older temperatures.
This worked pretty well until satellites showed up and provided some badly needed checks and balances on the adjusters. Since 1990 it has been tougher to justify any further upwards adjustments to the data.
In the debate over the accuracy of the global temperature record, nothing is more evident than errors in the location data for stations in the GHCN inventory. That inventory is the primary source for all the temperature series. One question is: “do these mistakes make a difference?” If one believes, as I do, that the record is largely correct, then it’s obvious that these mistakes cannot make a huge difference. If one believes, as some do, that the record is flawed, then it’s obvious that these mistakes could be part of the problem. Up until now, that is where the two sides of the debate stand: believers convinced that the small mistakes cannot make a difference, and disbelievers holding that these mistakes could in fact contribute to a bias in the record. Before I get to the question of whether or not these mistakes make a difference, I need to establish the mistakes, show how some of them originate, correct them where I can, and then do some simple evaluations of their impact. This is not a simple process. Throughout it, I think we can say two things that are unassailable: 1. the mistakes are real; 2. we simply don’t know if they make a difference. Some believe they cannot (but they haven’t demonstrated that) and some believe they will (but they haven’t demonstrated that either). Demonstrating either position requires real work. Up to now no one has done this work.
This matters primarily because, to settle the matter of UHI, stations must be categorized as urban or rural. That entails collecting some information about the character of the station, say its population or the characteristics of the land surface. So, location matters. Consider nightlights, which Hansen 2010 uses to categorize stations into urban and rural. That determination is made by looking up the value of a pixel in an image. If it is bright, the site is urban. If it is dark (as when a station is mislocated in the ocean), the site is rural.
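The lookup described above can be sketched in a few lines. Everything in this sketch is invented for illustration — the raster, the brightness threshold, and the coordinates; Hansen 2010 uses real satellite nightlights imagery, not a toy dictionary.

```python
# Minimal sketch of a nightlights lookup: classify a station as urban
# or rural from the brightness of the pixel at its recorded coordinates.
# Raster, threshold, and coordinates are all invented.

BRIGHTNESS = {                 # toy raster: (row, col) -> radiance
    (0, 0): 120,               # city-centre pixel, bright
    (0, 1): 4,                 # ocean pixel just offshore, dark
}
URBAN_THRESHOLD = 32           # invented cutoff

def classify(pixel):
    """'urban' if the nightlights pixel is bright, else 'rural'."""
    return "urban" if BRIGHTNESS[pixel] > URBAN_THRESHOLD else "rural"

# With correct metadata the station sits on the bright city pixel:
print(classify((0, 0)))        # urban

# A sign error in the longitude drops the same station onto the dark
# ocean pixel, and the lookup silently returns 'rural':
print(classify((0, 1)))        # rural
```

The point of the sketch is that the classification is only as good as the coordinates: a mislocated station is misclassified with no error raised.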
[Sophisticated analysis of GHCN and UHI at this Blog]
New Retreat from Global Warming Data by Australian Gov Bureau
Article by John O’Sullivan and Val Majkus (via email from John O’Sullivan)
Wednesday, November 10, 2010
Global warmers in full retreat as Aussie experts admit growing doubts about their own methods as new study shows one third of temperatures not reliable.
The Australian Bureau of Meteorology (BOM) admits it was wrong about urban heating effects, as a professional statistical analysis by Andrew Barnham exposes the BOM claim that “since 1960 the mean temperature in Australia has increased by about 0.7 °C”; the assertion, Barnham finds, has no empirical scientific basis.
Barnham, who spent 8 years working in emerging South Asian economies building high volume transaction processing systems, applied a high-tech statistical technique very different from an earlier well-publicized probe by fellow Aussie, Ken Stewart on his blog, Ken’s Kingdom.
Stewart grabbed headlines in what became known as the Australiagate controversy after his findings were featured on popular science blog, Watts Up With That. Stewart exposed dubious BOM adjustments to temperature data that bore little or no resemblance to actual or raw past temperatures.
Like Stewart, Barnham paid particular attention to BOM’s methodology in addressing what is known as the Urban Heat Island Effect (UHI), a proven phenomenon whereby thermometers measuring temperatures in towns and cities become unduly influenced by extra ‘background’ heating from buildings, road surfaces, machinery, etc. It’s in the UHI adjustments that the greatest discrepancies appear to lie.
Interesting Guest post by Ed Thurstan of Sydney, Australia
Synopsis
This study shows that the NOAA-maintained GHCN V2 database contains errors in calculating a mean temperature from a maximum and a minimum. 144 years of data from 36 Australian stations are affected.
Means are published when the underlying Maximums and/or Minimums have been rejected.
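The two error types the synopsis names are easy to test for mechanically. A minimal sketch of such a consistency check follows; the station IDs, values, and the 0.05 °C tolerance are invented, not taken from GHCN V2.

```python
# Minimal sketch of the consistency checks the synopsis implies:
# (a) a published mean should equal (max + min) / 2, and (b) no mean
# should be published when its max or min was rejected (None here).
# Station IDs, values, and the tolerance are invented.

records = [
    {"station": "ASN001", "tmax": 31.2, "tmin": 17.0, "tmean": 24.1},  # consistent
    {"station": "ASN002", "tmax": 28.0, "tmin": 14.0, "tmean": 22.0},  # mean is wrong
    {"station": "ASN003", "tmax": None, "tmin": 12.5, "tmean": 18.3},  # orphan mean
]

def check(rec, tol=0.05):
    """Flag the two error types the study describes."""
    if rec["tmax"] is None or rec["tmin"] is None:
        return "mean published although max/min was rejected"
    expected = (rec["tmax"] + rec["tmin"]) / 2
    if abs(rec["tmean"] - expected) > tol:
        return f"mean {rec['tmean']} != (max+min)/2 = {expected}"
    return "ok"

for rec in records:
    print(rec["station"], check(rec))
```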
During October, RSS showed a large drop of 0.232 °C from September. It appeared that the battle for 2010 as hottest year ever was doomed: 2010 is turning out much cooler than 1998, with no hope of catching up.
But just when the battle appeared lost, the fighters at GISS got their second wind. Instead of a large drop in October temperature anomalies, they found a 0.08 °C rise! This keeps 2010 well ahead of their hottest year ever – 2005.
There’s an interesting post at Jo Nova’s http://joannenova.com.au/2010/10/bom-giss-have-record-setting-bugs-affecting-a-million-square-miles/
Chris Gillham
Western Australia (WA) covers 2.5 million square kilometers (1 million square miles, about a third as big as the USA). The average of all WA stations over one month last year was adjusted up by as much as a gobsmacking 0.5 degrees due to a database “bug” – which contributed to August 2009 being the hottest August on record?! That’s one heck of a bug!
posted on October 27 2010
Chris also says
While researching the GISS adjustments, I noticed yet another odd data shift that left me wondering about the reliability of temperature recordings. I had listed the 2009 monthly mean temperatures on October 4, 2010, for Kalgoorlie-Boulder, but when I returned to the GISS website database the following day, October 5, I found that every month in 2009 for that location had been shifted up by 0.1 °C.
This means the newly adjusted GISS record shows Kalgoorlie-Boulder’s average mean for Spring 2009 was 20.6 °C, not 20.5 °C anymore, so this historic mining town’s seasonal temperature record is now 1.2 °C higher than the reality of the BoM records.
Chris has done a huge amount of work, and his blog, which relates to Western Australia, is well worth a visit: http://www.waclimate.net/
He’s also examined pre-1910 temperatures and compared them with recent data:
Although the pre-1910 temperature recordings in each location examined below are considered by the Bureau of Meteorology to be unreliable compared to the regulated post-1910 data, it is still worth comparing these earliest temperatures with the most recent “corrected” data from 1979-2008 in the 20 locations.
Minima
For all 20 locations, the average mean minimum rose 0.44 °C from ~1900 to 1979-2008
For 10 coastal locations, the average mean minimum rose 0.29 °C from ~1900 to 1979-2008
For 10 inland locations, the average mean minimum rose 0.58 °C from ~1900 to 1979-2008
Maxima
For all 20 locations, the average mean maximum rose 0.48 °C from ~1900 to 1979-2008
For 10 coastal locations, the average mean maximum rose 0.86 °C from ~1900 to 1979-2008
For 10 inland locations, the average mean maximum rose 0.10 °C from ~1900 to 1979-2008
I know you must be sick of me and temperatures, but I seem to be stuck on that today. [Not at all. You always have something interesting to say. -RT]
more on UHI http://www.warwickhughes.com/hoyt/uhi.htm
and the essay about halfway down the page:
Since satellite measurements began in 1979, the world’s population has approximately doubled, leading to a UHI signal of 0.67 °C over land and 0.19 °C globally. The observed surface warming is about 0.36 °C over the same time period, so a substantial portion may be just uncorrected UHI effects. Other effects include land use changes, increased darkness of vegetation, direct heat from fossil fuel burning, a brighter sun, changes in cosmic ray intensity, soot on snow, more soot in the atmosphere, and greenhouse gases (and this list is not exhaustive). There are many competing theories for the recent warming and some of them do a better job at explaining the observations than do greenhouse gases.
The land surface stations were designed to provide local climatology. They were not designed to detect climate change. Quality control of the surface network is inadequate.
UAH and UHI
Posted on December 16, 2010 by Anthony Watts
Note: clearly satellites can see urban heat, as demonstrated by this recent paper unveiled at the 2010 AGU meeting by NASA. See: Satellites Image the Urban Heat Islands in the Northeast. It can also be demonstrated that the UHI biases the thermometers upwards. As cities grow, so does the bias. In that paper NASA says:
The compact city of Providence, R.I., for example, has surface temperatures that are about 12.2 °C (21.9 °F) warmer than the surrounding countryside…
The Urban Heat Island Effect Distorts Global Temperatures
Written by Dr. Tim Ball | May 17 2011
How much do calculations of global temperatures represent the real temperature of the Earth? Every day, new stories appear about temperature records with errors or deliberate omissions. Essex, McKitrick, and Andresen’s article suggests that no such single “real” global temperature even exists. An important part of the debate is something called the urban heat island effect (UHIE).
New paper finds urban heat islands cause up to 44% of total recorded warming
A paper published today in the Journal of Geophysical Research finds that urban heat islands account for up to 44% of the recorded warming in cities in east China over the period of 1981-2007.
JOURNAL OF GEOPHYSICAL RESEARCH, VOL. 116, D14113, 12 PP., 2011
Observed surface warming induced by urbanization in east China
Key Points:
The rapid urbanization has significant impacts on temperature over east China
A new method was developed to dynamically classify urban and rural stations
Comparison of the trends of UHI effects by using OMR and UMR approaches
Xuchao Yang (Shanghai Typhoon Institute, China Meteorological Administration, Shanghai; Institute of Meteorological Sciences, Zhejiang Meteorological Bureau, Hangzhou), Yiling Hou (Shanghai Climate Center, Shanghai), and Baode Chen (Shanghai Typhoon Institute, China Meteorological Administration, Shanghai)
Monthly mean surface air temperature data from 463 meteorological stations, including those from the 1981–2007 ordinary and national basic reference surface stations in east China and from the National Centers for Environmental Prediction and National Center for Atmospheric Research (NCEP/NCAR) Reanalysis, are used to investigate the effect of rapid urbanization on temperature change.
New paper: Urban Heat Island effect accounts for 56% of warming in urban areas over past 55 years
A paper published last week in the journal Atmospheric Environment finds that the urban heat island effect accounts for 56% of the total temperature increase in South Korean city stations over the past 55 years. Warmists constantly downplay the urban heat island (UHI) effect as insignificant, but as this paper and several others show, the UHI does cause very significant biases with stations reporting higher or rising temperatures because they are poorly sited.
Quantitative Estimates of Warming by Urbanization in South Korea over the Past 55 Years (1954-2008)
Unadjusted data of long-period stations in GISS show a virtually flat century-scale trend
Posted on October 24, 2011 by Anthony Watts
Temperature averages of continuously reporting stations from the GISS dataset
Guest post by Michael Palmer, University of Waterloo, Canada
Abstract
The GISS dataset includes more than 600 stations within the U.S. that have been in operation continuously throughout the 20th century. This brief report looks at the average temperatures reported by those stations. The unadjusted data of both rural and non-rural stations show a virtually flat trend across the century.
[…]
Figure 3: Temperature trends and station counts for all US stations in GISS reporting continuously, that is containing at least one monthly data point for each year from 1900 to 2000. The slope for the rural stations (335 total) is -0.00073 deg/year, and for the other stations (278 total) -0.00069 deg/year. The monthly data point coverage is above 90% throughout except for the very first few years.
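The slopes quoted in the caption are ordinary least-squares trends. As a minimal sketch of how such a slope is computed — on synthetic data with a built-in trend, not the GISS station data — one might write:

```python
# Minimal sketch of the ordinary least-squares trend fit behind slopes
# like the -0.00073 deg/year quoted above. The series here is synthetic,
# constructed with a known slope of 0.001 deg/year.

def ols_slope(years, temps):
    """Least-squares slope of temps against years, in deg/year."""
    n = len(years)
    mean_y = sum(years) / n
    mean_t = sum(temps) / n
    num = sum((y - mean_y) * (t - mean_t) for y, t in zip(years, temps))
    den = sum((y - mean_y) ** 2 for y in years)
    return num / den

years = list(range(1900, 2001))
temps = [12.0 + 0.001 * (y - 1900) for y in years]  # built-in 0.001 deg/yr trend
print(round(ols_slope(years, temps), 6))  # 0.001
```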
Posted on October 24, 2011 by Willis Eschenbach
[…] Disagreement with satellite observations
[…]
Remember what we would expect to find if all of the ground records were correct. They’d all lie on or near the same line, and the satellite temperatures would be rising faster than the ground temperatures. Here are the actual results, showing BEST, satellite, GISS, CRUTEM, and GHCN land temperatures:
Figure 3. BEST, average satellite, and other estimates of the global land temperature over the satellite era. Anomaly period 1979-1984 = 0.
In Figure 3, we find the opposite of what we expected. The land temperatures are rising faster than the atmospheric temperatures, contrary to theory. In addition, the BEST data is the worst of the lot in this regard.
Disagreement with other ground-based records.
The disagreement between the four ground-based results also begs for an explanation. Note that the records diverge at the rate of about 0.2°C in thirty years, which is 0.7° per century. Since this is the approximate amount of the last century’s warming, this is by no means a trivial difference.
A ‘Q&A’ With A Renowned Climate Expert Regarding The BEST Research
Below, we posed some very basic questions to Dr. Ball. Our questions are in bold and Dr. Ball’s responses italicized:
1. What actually did BEST analyze and measure? Global temperatures or a subset of global temperatures?
2. Did BEST use different climate station data than that used by NASA/GISS, NOAA/NCDC and HadCRUT?
3. Did BEST use only the best climate stations’ data or did they use all stations’ data?
4. Did BEST use the actual raw temperature data, or an “improved” raw dataset, or adjusted temperatures for their analysis?
5. If adjusted, did BEST perform the adjustments or another agency (3rd party)?
6. How were the adjustments done?
7. Has BEST made public their calculated monthly anomalies and monthly baseline means used to calculate the anomalies for their new temperature series?
8. From your preliminary review of the BEST research, what do you like best of their methodology? What are the shortcomings of their methodology?
The scientific finding that settles the climate-change debate
By Eugene Robinson, Published: October 25
For the clueless or cynical diehards who deny global warming, it’s getting awfully cold out there.
The latest icy blast of reality comes from an eminent scientist whom the climate-change skeptics once lauded as one of their own. Richard Muller, a respected physicist at the University of California, Berkeley, used to dismiss alarmist climate research as being “polluted by political and activist frenzy.” Frustrated at what he considered shoddy science, Muller launched his own comprehensive study to set the record straight. Instead, the record set him straight.
Thanks for the link RC. A very pithy reply with some excellent points.
I particularly enjoyed the punch-line:
One last word: In their scientific paper, submitted for peer review, the Berkeley scientists disclaim knowing the cause of the temperature increase reported by their project. However, their research paper comments: “The human component of global warming may be somewhat overestimated.” I commend them for their honesty and skepticism.
BEST project rescues us from thousands of lying global thermometers
Lucky the BEST project is here to save us from the lying thermometers of the past. Apparently people in the 1960s and 1970s were clever enough to get man on the moon, but too stupid to measure the temperature. Millions of people were fooled into thinking the world was cooling for three decades by erroneous thermometer readings. Who would have guessed?
A survey completed last year by Dr. Murray Mitchell of the National Oceanic and Atmospheric Administration reveals a drop of half a degree in average ground temperatures in the Northern Hemisphere between 1945 and 1968
If that’s below ground, not above ground, then half a degree becomes all the more significant. It goes on:-
And a study released last month by two NOAA scientists notes that the amount of sunshine reaching the ground in the continental U.S. diminished by 1.3% between 1964 and 1972
“Sunshine reaching the ground” – global warming in a nutshell.
Explaining Muller vs. Muller: is BEST blissfully unaware of cosmic-ray-cloud theory?
Posted on October 28, 2011 by Alec Rawls
Here is the puzzle, as noted by Nigel Calder and others: how can BEST insist that a modicum of additional evidence of late 20th century warming should put skepticism of the CO2-warming theory to rest, while at the same time admitting that they never even tried to examine the possible causes of warming?
Rule #1 in science is that you don’t blatt your results until they are either presented at a peer-screened conference or published in the peer-reviewed literature. The price of this is often out-of-hand rejection of otherwise decent work.
But once again, climate researchers have blundered into science by press release instead of science by the rules. Why?
For whatever reason, the Berkeley Earth Surface Temperature (BEST) research team diverted a decent amount of the $600,000 they were granted to analyze surface temperature histories into a coordinated and extensive press release of its findings on October 21. At the same time they submitted four manuscripts to American Geophysical Union journals for peer-review.
The earth-shattering news is that the average land surface temperature of the planet is higher than it was 200 years ago. I know of no credentialed climate scientist who does not know this.
[…]
Given the fact that they have gone public with their science, I’ll go public with my review.
I have a license to do so. The first citation in the literature review portion of the manuscript on systematic urban heating bias is a 2007 paper in the Journal of Geophysical Research by the University of Guelph’s Ross McKitrick, and yours truly.
Scientist who said climate change sceptics had been proved wrong accused of hiding truth by colleague
By David Rose – Daily Mail, UK (MailOnline)
It was hailed as the scientific study that ended the global warming debate once and for all – the research that, in the words of its director, ‘proved you should not be a sceptic, at least not any longer’.
Professor Richard Muller, of the University of California, Berkeley, and his colleagues from the Berkeley Earth Surface Temperatures project team (BEST) claimed to have shown that the planet has warmed by almost a degree centigrade since 1950 and is warming continually.
[…]
But today The Mail on Sunday can reveal that a leading member of Prof Muller’s team has accused him of trying to mislead the public by hiding the fact that BEST’s research shows global warming has stopped.
Prof Judith Curry, who chairs the Department of Earth and Atmospheric Sciences at America’s prestigious Georgia Institute of Technology, said that Prof Muller’s claim that he has proven global warming sceptics wrong was also a ‘huge mistake’, with no scientific basis.
Prof Curry is a distinguished climate researcher with more than 30 years’ experience and the second named co-author of the BEST project’s four research papers.
[…]
In fact, Prof Curry said, the project’s research data show there has been no increase in world temperatures since the end of the Nineties – a fact confirmed by a new analysis that The Mail on Sunday has obtained.
‘There is no scientific basis for saying that warming hasn’t stopped,’ she said. ‘To say that there is detracts from the credibility of the data, which is very unfortunate.’
However, Prof Muller denied warming was at a standstill.
‘We see no evidence of it [global warming] having slowed down,’ he told BBC Radio 4’s Today programme. There was, he added, ‘no levelling off’.
A graph issued by the BEST project also suggests a continuing steep increase.
[See plots: “THE GRAPH THAT FOOLED THE WORLD” and “THE INCONVENIENT TRUTH”]
But a report to be published today by the Global Warming Policy Foundation includes a graph of world average temperatures over the past ten years, drawn from the BEST project’s data and revealed on its website.
This graph shows that the trend of the last decade is absolutely flat, with no increase at all – though the levels of carbon dioxide in the atmosphere have carried on rising relentlessly.
‘This is nowhere near what the climate models were predicting,’ Prof Curry said. ‘Whatever it is that’s going on here, it doesn’t look like it’s being dominated by CO2.’
[…]
[Prof Curry] added that, in the wake of the unexpected global warming standstill, many climate scientists who had previously rejected sceptics’ arguments were now taking them much more seriously.
They were finally addressing questions such as the influence of clouds, natural temperature cycles and solar radiation – as they should have done, she said, a long time ago.
[…]
In Prof Curry’s view, two of the papers were not ready to be published, in part because they did not properly address the arguments of climate sceptics.
As for the graph disseminated to the media, she said: ‘This is “hide the decline” stuff. Our data show the pause, just as the other sets of data do. Muller is hiding the decline.
‘To say this is the end of scepticism is misleading, as is the statement that warming hasn’t paused. It is also misleading to say, as he has, that the issue of heat islands has been settled.’
The GWPF source for the MailOnline article reference:-
Best Confirms Global Temperature Standstill
Dr. David Whitehouse – GWPF
[…]
When asked by the BBC’s Today programme Professor Richard Muller, leader of the initiative, said that the global temperature standstill of the past decade was not present in their data.
“In our data, which is only on the land, we see no evidence of it having slowed down. Now the evidence which shows that it has stopped is a combination of land and ocean data. The oceans do not heat as much as the land because they absorb more of the heat, and when the ocean data are combined with the land data, the other groups have shown that it does seem to be leveling off. We have not seen that in the land data.”
My first thought would be that it would be remarkable if it was…
[…]
Fig 1 shows the past ten years plotted from the monthly data in BEST’s archives.
It is a statistically perfect straight line of zero gradient. Indeed, most of the largest variations in it can be attributed to ENSO (El Niño and La Niña) effects. It is impossible to reconcile this with Professor Muller’s statement. Could it really be the case that Professor Muller has not looked at the data in an appropriate way to see the last ten years clearly?
Indeed BEST seems to have worked hard to obscure it. They present almost 200 years of data with a compressed x-axis and a stretched y-axis to accentuate the increase. The data are then smoothed using a ten-year average, which is ideally suited to removing the last five years of the past decade and mixing the earlier standstill years with years when there was an increase. This is an ideal formula for suppressing the past decade’s data.
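The mechanics of that suppression are easy to sketch: a centred ten-year mean needs a full decade of data around each point, so the most recent years yield no smoothed value at all. The twenty “annual anomalies” below are invented stand-ins, not BEST data.

```python
# Sketch of how a centred ten-year smooth discards the ends of a series:
# each smoothed point needs a complete ten-year window, so the last
# several annual values produce no smoothed point at all.

def centred_mean(series, window=10):
    """Centred moving average; None where a full window is unavailable."""
    half = window // 2
    out = []
    for i in range(len(series)):
        if i < half or i + half > len(series):
            out.append(None)                 # window runs off an end
        else:
            out.append(sum(series[i - half:i + half]) / window)
    return out

annual = [0.1 * i for i in range(20)]        # 20 invented annual anomalies
smoothed = centred_mean(annual)
print([i for i, v in enumerate(smoothed) if v is None])
# with an even window the loss is slightly asymmetric:
# [0, 1, 2, 3, 4, 16, 17, 18, 19]
```

Nine of the twenty years (including the most recent four) simply vanish from the smoothed curve, which is the behaviour complained of above.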
[…]
Only a few years ago many scientists and commentators would not acknowledge the global temperature standstill of the past decade. Now that it has become unarguable, there have emerged more explanations for it than can possibly all be correct.
Uh oh: It was the BEST of times, it was the worst of times
Posted on October 29, 2011 by Anthony Watts
Alternate title: Something wonky this way comes
I try to get away to work on my paper and the climate world explodes, pulling me back in. Strange things are happening related to the BEST data and co-authors Richard Muller and Judith Curry. Implosion might be a good word.
Popcorn futures are soaring. BEST Co-author Judith Curry drops a bombshell:
I’ve been looking through Curry’s posts at Climate Etc. She seems to be in rapid catch-up mode on all the sceptic angles that Jo Nova or Alan Cheetham (Global Warming Science), for example, have already raked over extensively. For instance, she’s posted:-
“Tropospheric and surface temperatures” by Donald Rapp (impressive Biosketch)
He says the following in respect of Christy et al. (2010), which deals with that topic:-
There are two aspects of this result that are particularly important. One is simply that two long periods without a statistically significant increase in TLT would seem to contradict the view that continuously rising CO2 is continuously driving up TLT. The second aspect deals with the scaling ratio of the trend of TLT to the trend of TS in the tropics. Climate models consistently predict this ratio to be ~1.4; the tropospheric temperature is expected to rise faster than the surface temperature. However, as Christy et al. (2010) pointed out, the observed linear trend for TLT (0.09°C/decade) is only about 80% of the observed linear trend for TS, so the observed scaling ratio is roughly 0.8, not the predicted value of 1.4. These results cast doubt on the veracity of climate models, and also suggest that a linear rate of temperature rise does not necessarily result from a linear increase in CO2 concentration.
These issues are hardly news in the sceptic domain. I pointed much of this out to MftE CC in my Climate Metrics case but Santer trumps Christy apparently. “The issues have been resolved” is approximately what I received back (although negotiations are continuing).
BEST statistics show hot air doesn’t rise off concrete!
BREAKING: Steven McIntyre reports that “649 Berkeley stations lack information on latitude and longitude, including 145 BOGUS stations. 453 stations lack not only latitude and longitude, but even a name. Many such stations are located in the country “[Missing]“, but a large fraction are located in “United States”. Steve says: “I’m pondering how one goes about calculating spatial autocorrelation between two BOGUS stations with unknown locations.”
The BEST media hit continues to pump PR around the world. The Australian repeats the old fake news “Climate sceptic Muller won over by warming”. This oh-so-manufactured media blitz shows how desperate and shameless the pro-scare team is in its sliding descent. There are no scientists switching from skeptical to alarmist, though thousands are switching the other way.
The sad fact that so many news publications fell for the fakery, without doing a ten-minute google, says something about the quality of our news. How is it headline material when someone who was never a skeptic pretends to be “converted” by a result that told us something we all knew anyway (oh look, the world is warming)?
The [six] points every skeptic needs to know about the BEST saga [abbreviated]:
1. Muller was never a skeptic
2. BEST broke basic textbook rules of statistics
3. The BEST results are very “adjusted” and not the same as the original thermometer readings
4. Obviously hot air doesn’t rise off concrete
5. The BEST data show the world hasn’t warmed for 13 years.
6. The BEST group have their own vested conflicts
And as Jo reports, “Judith Curry by the way has just threatened to quit”.
One of the authors of a scientific study billed as the ‘end of scepticism’ about climate change yesterday threatened to quit after she said the project leader underplayed the fact there has been no global warming for 13 years.
The Berkeley Earth team uses a statistical tool it calls a scalpel to cut out data when there is a gap. It uses a process called kriging, borrowed from geoscientists and surveyors, to fill in gaps in the data – they use it, for example, to estimate the elevation at a point B lying between points A and C where the altitude is known. And it weights data from reliable stations more heavily than data from stations that produce erratic numbers.
The result is a statistical model of what the temperature was at any given moment at any given location. Actual thermometer readings, when they are available, are used only as fodder for the statistical model.
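Kriging proper fits a spatial covariance model (a variogram) to weight neighbouring observations. As a far simpler stand-in for the A/B/C elevation example above, here is inverse-distance weighting; every position and elevation in this sketch is invented, and this is not the Berkeley Earth implementation.

```python
# Inverse-distance weighting as a simplified stand-in for kriging:
# estimate a value at x from known (position, value) pairs, with nearer
# points weighted more heavily. All numbers are invented.

def idw(x, known, power=2):
    """Inverse-distance-weighted estimate at x from (position, value) pairs."""
    num = den = 0.0
    for pos, val in known:
        if pos == x:
            return val                # exact hit: return the observation
        w = 1.0 / abs(pos - x) ** power
        num += w * val
        den += w
    return num / den

A = (0.0, 100.0)    # (position, known elevation)
C = (10.0, 300.0)
print(idw(5.0, [A, C]))   # halfway between A and C: equal weights, 200.0
```

Kriging additionally estimates how values co-vary with distance before choosing the weights, which is what lets it also report an uncertainty for each filled-in point.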
The Berkeley scientists “have a very complicated model,” says William Briggs, a member of the probability and statistics committee of the American Meteorology Society. “They reported on the setting of one of those dials” in the model. “That is not the actual temperature.”
This work needs to be brought to a wider audience because it paints a very different climate picture to the global land datasets based on minimum and maximum temperatures – GISS, HadCRUT and the recent BEST analysis.
I’ll present his discoveries in three parts, as there appear to be three significant elements in his work.
1. Using a minimum and maximum temperature dataset exaggerates the increase in the global average land surface temperature over the last 60 years by approximately 45%
2. Almost all the warming over the last 60 years occurred between 6am and 12 noon
3. Warming is strongly correlated with decreasing cloud cover during the daytime and is therefore caused by increased solar insolation
I’ll then add a part 4 covering some additional analysis of mine
4. Reduced anthropogenic aerosols (and clouds seeded by anthropogenic aerosols) are the cause of most of the observed warming over the last 60 years
[…]
GISS, HadCRUT and BEST wrongly assume the mean (Tmin+Tmax)/2 represents the average daily (24-hour) temperature.
Note at this point that any daytime warming which does not persist through the following night is irrelevant to climate warming, as it is not heat retained in the climate system for more than 24 hours. So even a correctly derived average temperature is a poor indicator of the amount of warming.
In summary, calculating the average temperature from the 3-hourly data instead of by the (Tmin+Tmax)/2 method results in ~45% less warming since 1950.
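The distinction is easy to demonstrate: (Tmin+Tmax)/2 is the midrange, which coincides with the true daily mean only when the diurnal cycle is symmetric. The eight 3-hourly readings below are invented for illustration, not drawn from any of the datasets discussed.

```python
# (Tmin + Tmax) / 2 is the midrange, not the daily mean; for an
# asymmetric diurnal cycle the two differ. Eight invented 3-hourly
# readings (00h, 03h, ..., 21h):

three_hourly = [10.0, 9.0, 8.5, 12.0, 20.0, 18.0, 14.0, 11.0]

tmin, tmax = min(three_hourly), max(three_hourly)
midrange = (tmin + tmax) / 2                        # the Tmin/Tmax method
true_mean = sum(three_hourly) / len(three_hourly)   # the 3-hourly method

print(midrange)    # 14.25
print(true_mean)   # 12.8125
```

Here a brief afternoon spike pulls the midrange well above the true 24-hour mean, which is the kind of discrepancy the analysis above is quantifying.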
For those tracking the daily global temperature updates at the Discover website, you might have noticed the continuing drop this month in global temperatures. The mid-tropospheric AMSU channels are showing even cooler temperatures than we had at this date during the last (2008) La Niña. The following screen shot is for AMSU channel 6:
Warwick Hughes http://www.warwickhughes.com/blog/?cat=1
has a post called “Interesting nuances in IPCC groups’ treatment of New Zealand temperature data”
It links to a blog article of his from 2007 pointing out the degree of warming over different timespans for two selections of grid cells covering New Zealand – for CRUT2, CRUT3 (UKMO) and NIWA. The article contains two graphics comparing the “old” CRUTem2 data of Jones with the new Hadley Centre CRUT3, and he says: ‘Looking at 140 years of New Zealand land based data we now see huge extra warming in the Hadley Centre CRUT3 gridded data.’
Warwick Hughes is a New Zealand-born graduate in geology from Auckland University who has worked in Australia for many years and has carried out pioneering research on surface temperature measurement. (bio from Dr Vincent Gray at http://www.warwickhughes.com/gray04/nzct48.htm)
If you use the ‘search’ for NIWA you find a couple of very interesting articles which are of course relevant to NZ
He said that New Zealand temperature trends as measured by the Jones et al group at the University of East Anglia in Norwich for the IPCC (United Nations Intergovernmental Panel on Climate Change) show warming of 0.63 degrees C over the 27 years 1979 to 2005, which is 0.38 degrees more than the seven-station NIWA anomaly trend, which warms by only 0.25 degrees C.
“When we have the IPCC believing that New Zealand is warming at more than double the rate that NIWA publishes, then this is a sign that the science is far from settled,” said Mr Hughes.
Richard, I just sent an email to Warwick Hughes. I recall I mentioned him in replies to BOM as well as in this thread, but if I do a search as you suggested it doesn't come up with comments in which he's mentioned. Can that be fixed, or is it too much hassle?
Thanks Richard, much appreciated. My error was that I used quotes and the whole name. Anyway, I've sent that on to Warwick, as I like to keep people informed when I say something about them.
Here’s the inimitable Dr Tim Ball on data manipulation http://canadafreepress.com/index.php/article/28360
Data collection is expensive and requires continuity; it's a major role for government. They fail with weather data because money goes to political climate research. A positive outcome of the corrupted climate science exposed by Climategate is a re-examination, beginning with raw data, by the UK Met Office (UKMO). This is impossible because much of the data has been lost, thrown out after modification, or conveniently "lost", as in the case of records held by Phil Jones, the CRU director at the centre of Climategate.
Evidence of manipulation and misrepresentation of data is everywhere. Countries maintain weather stations and adjust the data before it’s submitted through the World Meteorological Organization (WMO) to the central agencies including the Global Historical Climatology Network (GHCN), the Hadley Center associated with CRU now called CRUTEM3, and NASA’s Goddard Institute for Space Studies (GISS).
They make further adjustments before selecting stations to produce their global annual average temperature. This is why they produce different measures each year from supposedly similar data.
There are serious concerns about data quality. The US spends more than others on weather stations, yet their condition and reliability are simply atrocious. Anthony Watts has documented the condition of US weather stations; it is one of government's failures." (end of quote)
and Ken Stewart has shown how this has happened in Australia http://kenskingdom.wordpress.com/2010/07/27/the-australian-temperature-record-part-8-the-big-picture/
his conclusion?
In a commendable effort to improve the state of the data, the Australian Bureau of Meteorology (BOM) has created the Australian High-Quality Climate Site Network. However, the effect of doing so has been to introduce into the temperature record a warming bias of over 40%. And their climate analyses on which this is based appear to increase this even further to around 66%.
Richard, are the results of the NZCSET data check publicly available? If so, could you point me to the site? I'll just check their site to make sure it's not there. I know Ken Stewart will be interested.
Hi Val,
The NZ Climate Science Education Trust has been set up to make legal representation easier in our case against NIWA. The case is an Application for Judicial Review — basically asking a judge to look at something we disagree with. As you know, the Coalition (NZCSET) has lodged a Statement of Claim and in response, NIWA’s lawyers have lodged a Statement of Defence. The next move is a conference to decide whether the parties agree. If not, there could be a day in court.
There’s a lot of interest, both here and around the world, but there’s no news yet, sorry!
Dr Fred Singer has an opinion piece in The American Thinker today http://www.americanthinker.com/2010/11/the_french_academy_lays_an_egg.html
quoting one para
Even more important, weather satellite data, which furnish the best global temperature data for the atmosphere, show essentially no warming between 1979 and 1997. Now, according to well-established theories of the atmosphere as expounded by textbooks, the surface warming must be smaller than the atmospheric trend by roughly a factor of two. But one half of zero is still zero. It suggests that the surface warming reported by the IPCC, based on weather-station data that had been processed by the Climate Research Unit of East Anglia University (CRU-EAU) may not exist. How could this have come about? We will get the answer once we learn how the CRU selected particular weather stations (from some thousands worldwide) to use for their global product and how they then corrected the actual data (to remove urban influences and other effects). So far, none of the several investigations of “Climategate” has delved into these all-important details. Nor have they established the exact nature of the “trick” used by the CRU and fellow conspirators to “hide the decline” (of temperature) — referred to in the leaked Climategate e-mails.
The “trick to hide the decline” refers to the splicing of the tree ring proxies onto the temperature records. The decline refers specifically to the decrease in the apparent recent temperature shown in the proxies, and is commonly referred to as the “divergence problem”.
It does not refer to a decline in measured land temperatures in this context.
here’s another article by Dr S Fred Singer http://www.americanthinker.com/2010/11/the_green_bubble_is_about_to_b.html
There is a revolution coming that is likely to burst the green global warming bubble: the temperature trend used by the IPCC (the U.N.’s Intergovernmental Panel on Climate Change) to support their conclusion about anthropogenic global warming (AGW) is likely to turn out to be fake. The situation will become clear once Virginia’s attorney general, Kenneth Cuccinelli, obtains information now buried in e-mails at the University of Virginia. Or Hearings on Climategate by the U.S. Congress may uncover the “smoking gun” that demonstrates that the warming trend used by the IPCC does not really exist.
and more on the Case http://legaltimes.typepad.com/blt/2010/11/global-warming-foia-suit-against-nasa-heats-up-again.html#more
Global Warming FOIA Suit Against NASA Heats Up Again
In court documents filed last night, the Competitive Enterprise Institute argues that NASA has gone out of its way to avoid turning over records that show the agency reverse-engineered temperature data to better make the case that the planet is becoming warmer.
I’ve just posted this at Jo Nova’s blog but worth repeating here
Looking at BOM and temperatures I was directed from WUWT to this comment by Warwick Hughes http://www.warwickhughes.com/climate/gissbom.htm
At http://www.warwickhughes.com/climate/bom.htm Warwick says ‘Internationally the BoM does not have things all its own way, NASA / GISS publish on their web site (see www links) UHI adjusted trends for Australian cities and also have for download SE Australian rural records that show no warming since late 19 C, which the BoM say are incorrect’
the comment at http://www.warwickhughes.com/climate/gissbom.htm
is posted 27/9/2000 and Warwick says ‘This page presents three Australian State Capital temperature records compared to nearby rural or more rural neighbours.
This is a significant area for study because south eastern Australia has probably the best set of long term rural temperature records in the southern hemisphere.
GISS / NASA adjustments for urban warming are presented in the case of the State Capitals, contrasting with the Australian Bureau of Meteorology (BoM) adjustments.’
The difference is stark, with BoM finding much warming over SE Australia since the late 19th century while NASA finds little trend.
At http://www.warwickhughes.com/climate/ (26 June 2005) Warwick has a site which, in his words, exposes the errors and distortions in temperature records used by the IPCC as evidence of “global warming”. The central contention of these pages is that for over a decade the IPCC has published global temperature trends distorted by purely local warmth from Urban Heat Islands (UHIs). There is simply no systematic compensation for urban warming in the Jones dataset. Occasionally there is a slight adjustment in a record for a site change or other anomaly, but the majority of records are used “raw”.
Warwick gives some history of this amazing work at http://www.warwickhughes.com/
For those who do not know, this is his blog, and you can learn about him via the ‘about’ button http://www.warwickhughes.com/blog/
He was also the recipient of the famous Jones letter which you can read about on WUWT http://wattsupwiththat.com/2009/09/23/taking-a-bite-out-of-climate-data/ ‘The Dog ate Global Warming’ That article informs us that ‘Warwick Hughes, an Australian scientist, wondered where that “+/–” came from, so he politely wrote Phil Jones in early 2005, asking for the original data. Jones’s response to a fellow scientist attempting to replicate his work was, “We have 25 years or so invested in the work. Why should I make the data available to you, when your aim is to try and find something wrong with it?”
WUWT has an interesting post up today A J Strata http://wattsupwiththat.com/
Bottom Line – Using two back-of-the-envelope tests for significance against the CRU global temperature data I have discovered:
■ 75% of the globe has not seen significant peak warming or cooling changes between the period prior to 1960 and the 2000s which rise above a 0.5°C threshold, which is well within the CRU’s own stated measurement uncertainties of +/- 1°C or worse.
■ Assuming a peak-to-peak change (pre-1960 vs the 2000s) should represent a change greater than 20% of the measured temperature range (i.e., if the measured temp range is 10° then a peak-to-peak change of greater than 2° would be considered ‘significant’), 87% of the Earth has not experienced significant temperature changes between the pre-1960 period and the 2000s.
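A hedged sketch of how these two back-of-the-envelope tests might look for one grid-cell series (the data here is synthetic, and A J Strata's actual code and station data are not shown in the post):

```python
import numpy as np

# Illustrative only: one synthetic grid-cell temperature series.
rng = np.random.default_rng(0)
years = np.arange(1900, 2010)
temps = (10 + 0.3 * np.sin(2 * np.pi * (years - 1900) / 66)
         + rng.normal(0, 0.4, years.size))

pre1960 = temps[years < 1960]
recent = temps[years >= 2000]
peak_change = abs(recent.max() - pre1960.max())

# Test 1: does the change in peak values exceed a 0.5 deg C threshold?
significant_1 = peak_change > 0.5

# Test 2: does the peak-to-peak change exceed 20% of the measured range?
measured_range = temps.max() - temps.min()
significant_2 = peak_change > 0.2 * measured_range

print(peak_change, significant_1, significant_2)
```

Applied cell by cell over the globe, counting how many cells fail each test gives percentages like the 75% and 87% figures quoted above.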
GISS shows temperatures rising sharply since July. We have been having a record cold La Niña since then, and everyone else shows temperatures plummeting.
GISS also showed a huge spike in March which nobody else saw. Does this have anything to do with Hansen’s constant claims of 2010 as the hottest year ever? He shows peak La Niña temperatures almost as warm as peak El Niño temperatures. That is simply ridiculous.
Yes, Richard I saw that this morning and left a comment on Warwick Hughes’ blog; I’m hoping he will make a comment;
Dr Ball has an article here http://www.canadafreepress.com/index.php/article/20528
he says How agencies like the Goddard Institute for Space Studies (GISS) or the Climatic Research Unit (CRU) adjust for the UHIE is the crux of the problem.
the article contains excerpts from Warwick Hughes’ research which finished in 1991 and was continued by Long
Dr Ball ends by saying ‘Now you see why the CRU and IPCC limited the number of stations they were using and restricted them to mostly urban stations to get the result they wanted. You also understand why Tom Wigley told Jones in a leaked email of November 6, 2009 that, “We probably need to say more about this (Difference between land and ocean temperatures). Land warming since 1980 has been twice the ocean warming and skeptics might claim that this proves that urban warming is real and important.” Exactly Tom! ‘
I had an idea that Warwick had a 1995 peer reviewed paper on the topic but I can’t find it
NIWA today released a report reviewing its seven station temperature series, which adds to its analysis of New Zealand’s temperature trends over the past 100 years.
The report was independently peer reviewed by Australia’s Bureau of Meteorology to ensure the ideas, methods, and conclusions stood up in terms of scientific accuracy, logic, and consistency.
NIWA CEO John Morgan confirmed that the scientists from the Bureau’s National Climate Centre concluded that the results and underlying methodology used by NIWA were sound.
“We asked the Australian Bureau of Meteorology to conduct the peer review to ensure a thorough examination by an independent, internationally respected, climate science organisation”, said NIWA CEO John Morgan.
NIWA’s seven station temperature series comprises temperature records from Auckland, Wellington, Masterton, Nelson, Hokitika, Lincoln, and Dunedin. The seven locations were chosen because they have robust and well documented temperature records that go back 100 years or more, and they represent a good geographical spread across New Zealand.
Temperature data from the seven locations were first examined 30 years ago by leading New Zealand climatologist, Dr Jim Salinger. After making some adjustments for changes in measurement sites, Dr Salinger concluded that the average New Zealand temperature had warmed significantly during the 20th Century.
The series from the seven stations were reviewed in 1992, and then updated annually. They indicated a warming of about 0.9C over the 100 years up to 2009.
In 2010, NIWA re-analysed the Hokitika station temperature series and published the results to demonstrate the methodology applied in creating a temperature series. Because of the public interest in climate data, the NIWA Board and the Minister of Research, Science & Technology, Dr Wayne Mapp, asked that a full review of each of the seven sites be undertaken by NIWA. That review has been completed, independently peer reviewed, and the report released today represents the results of that work.
“I am not surprised that this internationally peer reviewed 2010 report of the seven station temperature series has confirmed that NIWA’s science was sound. It adds to the scientific knowledge that shows that New Zealand’s temperature has risen by about 0.9 degrees over the past 100 years” Mr Morgan said.
The key result of the re-analysis is that the New Zealand-wide warming trend from the seven-station series is almost the same in the 2010 revised series as in the previous series. That is, the previous NIWA result is robust. In terms of the detail for individual sites, the 100-year trend has increased at some sites, and decreased at others, but it is still within the margin of error, and confirms the temperature rise of about 0.9 degrees over the last 100 years.
What is the seven station temperature series?
The seven station series is an analysis of temperature trends from climate station data at seven locations which are geographically representative of the country, and for which there is a long time series of climate records (over 100 years): Auckland, Wellington, Masterton, Nelson, Hokitika, Lincoln (near Christchurch), and Dunedin.
The concept of the seven-station temperature series was originally developed by Dr Jim Salinger in 1981. He recognised that, although the absolute temperatures varied markedly from point to point across the New Zealand landscape, the variations from year to year were much more uniform, and only a few locations were actually required to form a robust estimate of the national temperature trend.
The seven-station series was revised and updated in 1992, and again in 2010.
How is the seven station series constructed?
At each of the seven locations there have been changes in specific climate station location over time. When you create a long time series by adding information from each of these station locations together, you have to make adjustments to account for these changes.
There have been various changes in location and exposure over time at each of the seven locations making up the seven station series. In order to create a long time series at each location, it is necessary to merge temperature records from a number of specific sites. When merging different temperature records like this, it is necessary to adjust for climatic differences from place to place, or significant biases would be introduced. Adjustments may also be needed even if the site does not physically move, because changes in exposure or instrumentation at a given site can also bias the temperature measurements.
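A minimal sketch of one common way such a merge adjustment can be estimated, assuming an overlap period between the old and new sites (this is an illustration with made-up numbers, not NIWA's actual procedure):

```python
import numpy as np

# Hypothetical annual mean temperatures. When a station moves, the overlap
# years (measured at both sites) give an estimate of the site-to-site offset,
# which is then used to shift the old record onto the new site's level.
old_site = np.array([11.2, 11.5, 11.1, 11.4, 11.3])  # last years at old site
new_site = np.array([11.8, 12.1, 11.7])              # overlap years at new site
overlap_old = old_site[-3:]                          # same years, old site

offset = new_site.mean() - overlap_old.mean()        # estimated offset
adjusted_old = old_site + offset                     # merged onto new site's level

print(f"estimated offset = {offset:.2f} deg C")
# prints: estimated offset = 0.60 deg C
```

The raw records stay untouched; only the merged long series carries the adjustment, which is why the answer below stresses that the raw data remain on the NIWA climate database.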
Why do you need to adjust the raw data?
Long time series of climate data often have artificial discontinuities – caused, for example, by station relocations, changes in instrumentation, or changes in observing practice – which can distort, or even hide, the true climatic signal. Therefore, all climatic studies and data sets should be based on homogeneity-adjusted data.
That is what NIWA climatologists have done in the seven station series, and the seven individual station review documents outline the adjustments.
The raw (original) climate station data have not been changed and are freely available on the NIWA climate database, which means that the NIWA seven station series can be easily reproduced.
How does the NZ temperature rise compare with the global temperature rise?
The 2010 NIWA review of long-term trends in New Zealand’s annual mean temperature showed a rise of about 0.9 degrees Celsius over the last 100 years (0.91 +/- 0.3 degrees C per century). The IPCC estimate of the 100-year global temperature trend from 1906 to 2005 is 0.74 degrees Celsius. The difference between the IPCC global trend and NIWA’s New Zealand number is less than the margin of error of the trend estimates. The IPCC estimate is a global average – different parts of the world will show different rates of warming. New Zealand is strongly influenced by its oceanic environment.
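The comparison in that answer reduces to simple interval arithmetic:

```python
# Numbers taken from the NIWA FAQ text above.
nz_trend, nz_err = 0.91, 0.30   # NZ seven-station: deg C per century, +/- margin
ipcc_trend = 0.74               # IPCC 1906-2005 global estimate

difference = abs(nz_trend - ipcc_trend)
print(f"difference = {difference:.2f}, within margin: {difference < nz_err}")
# prints: difference = 0.17, within margin: True
```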
What is the peer review process?
In this scientific context, peer review constitutes a thorough evaluation of the NIWA documents by individuals with appropriate qualifications in the same field of science (climatology/meteorology). The documents were written by highly qualified NIWA scientists, internally reviewed by other similarly qualified NIWA scientists, and the revised documents were then externally reviewed by Australian Bureau of Meteorology scientists. The external review examined the ideas, methods, and conclusions for scientific accuracy, clarity, and logic.
The NIWA authors then addressed the comments made by the reviewers at the Bureau of Meteorology, and the subsequently revised documents have now been published on the NIWA website.
The 2010 NIWA review of long-term trends in New Zealand’s annual mean temperature showed a rise of about 0.9 degrees Celsius over the last 100 years (0.91 +/- 0.3 degrees C per century). The IPCC estimate of the 100-year global temperature trend from 1906 to 2005 is 0.74 degrees Celsius. The difference between the IPCC global trend and NIWA’s New Zealand number is less than the margin of error of the trend estimates. The IPCC estimate is a global average
But how does it compare with the Southern Hemisphere (SH) trend?
And you’ve even found more warming than the global trend, NIWA.
The review does not constitute a reanalysis of the New Zealand ‘seven station’ temperature record. Such a reanalysis would be required to independently determine the sensitivity of, for example, New Zealand temperature trends to the choice of the underlying network, or the analysis methodology. Such a task would require full access to the raw and modified temperature data and metadata, and would be a major scientific undertaking.
Well spotted, Andy. They’re saying that what they accused the NZCSC of “already knowing”, back in November 2009, is actually “a major scientific undertaking”.
The other main change is the starting year of the published time series. The early temperature records are less reliable, and there are very few comparison sites pre-1910 to confidently determine site adjustments. We have not used data prior to 1900, and produced the seven-station average from 1909 onwards.
Page 7
These straight-line trends are meaningless; it’s a sine curve.
SST trend +0.71C
7SS trend +0.91C
But atmosphere warming significantly faster than ocean ???
Both of those show cooling over the last decade – Ha!
Page 13
The review does not constitute a reanalysis of the New Zealand ‘seven station’ temperature record. Such a reanalysis would be required to independently determine the sensitivity of, for example, New Zealand temperature trends to the choice of the underlying network, or the analysis methodology. Such a task would require full access to the raw and modified temperature data and metadata, and would be a major scientific undertaking. As such, the review will constrain itself to comment on the appropriateness of the methods used to undertake the ‘seven station’ temperature analysis, in accordance with the level of the information supplied.
Page 25
They found even MORE warming at Auckland – it’s worse than we thought.
(and Masterton, and Wellington)
Basically NIWA gave BOM its methodology and BOM confirmed that NIWA used it – which seems reasonable, given it’s NOT A REANALYSIS.
The University of Virginia (U.Va.) had stalled since last year in handing over its records relating to accusations against a former academic employee implicated in the Climategate affair of November 2009.
The researcher at the center of the underlying controversy is global warming doomsayer Professor Michael Mann, who now works at Penn State University. Mann, a Lead Author for the United Nations Intergovernmental Panel on Climate Change (IPCC), has been under increased scrutiny since the climate fraud scandal hit the headlines over a year ago.
The latest story appears on the SPPI website which reports, “Court records reveal that counsel for the University has indicated instead that the Mann-related records do in fact exist, on a backup server. To avoid University delay or claims for huge search fees, today’s request specifically directs the school to search that server.”
Many people have noticed that the global mean temperature was increasing in the first third of the 20th century (33 years), slightly decreasing in the second third of the 20th century (33 years), and increasing in the last third of the 20th century (33 years or so).
If this evolution is extrapolated in the obvious way, there is a 66-year cycle in the temperatures. The 20th-century warming is biased because the century included two warming half-periods but only one cooling half-period; that’s why the warming trend in the 20th century could be much higher than the long-term one.
Some of the warmest years – at least in the U.S. – appeared approximately 66 years before the warm years such as 1998.
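The claimed bias can be illustrated with assumed numbers: a 66-year sine cycle on top of a small underlying trend, sampled over roughly a century that contains two warming half-periods and one cooling half-period.

```python
import numpy as np

# Illustrative sketch, all numbers assumed: underlying trend 0.5 C/century,
# 66-year cycle of amplitude 0.3 C. Sampling ~1.5 cycles (trough to peak)
# makes the fitted straight line steeper than the underlying trend.
years = np.arange(1900, 1999)                            # ~1.5 cycles of 66 years
underlying = 0.005 * (years - 1900)                      # true 0.5 C/century
cycle = -0.3 * np.cos(2 * np.pi * (years - 1900) / 66)   # warming-cooling-warming
temps = underlying + cycle

fitted = np.polyfit(years, temps, 1)[0] * 100            # fitted deg C per century
print(f"underlying 0.50 C/century, fitted {fitted:.2f} C/century")
```

The straight-line fit over this window overstates the underlying trend, which is the arithmetic point the comment is making.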
Examining the past 15 years of monthly global temperature anomalies, the per-century change from a warming trend to a cooling trend becomes clear.
Since 2001 the per-century trends have conclusively switched from a global warming direction to a global cooling direction. In addition, the early 2011 temperature anomalies confirm what has actually been taking place since 2001. If the May 2011 10-year trend continues, the global temperature by 2100 will have decreased by 0.67°C.
This warming to cooling reversal has happened in the face of “business as usual” increases in atmospheric CO2 levels.
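The extrapolation arithmetic behind a figure like 0.67°C per century can be sketched on synthetic anomalies (the slope and noise below are assumptions, not the actual HadCRUT data):

```python
import numpy as np

# Fit a linear trend to 10 years of monthly anomalies, then express it per
# century. A per-month slope of about -0.00056 scales to roughly -0.67 C/century.
rng = np.random.default_rng(1)
months = np.arange(120)                                     # 10 years of months
anoms = 0.25 - 0.00056 * months + rng.normal(0, 0.1, 120)   # slight cooling + noise

slope_per_month = np.polyfit(months, anoms, 1)[0]
per_century = slope_per_month * 12 * 100                    # months -> century
print(f"10-year trend extrapolated: {per_century:+.2f} C per century")
```

As the later comment concedes, extrapolating a noisy 10-year trend a full century forward is an arithmetic exercise, not a forecast.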
Jones was speaking of the 15 years starting in 1996 and ending in 2010. Although the warming has become “statistically significant” in his opinion (not others), the actual level of warming is literally immaterial when put into the context of catastrophic warming (from 5 to 10 degrees Celsius by 2100) pushed by government payroll scientists enthralled (enriched?) with alarmism.
To better understand the level of immaterial warming that has happened, look at the adjacent chart. The red curve is a plot of monthly HadCRUT anomalies for the 15 years (180 months) ending May 2011. The light blue curve is a 2nd-order curve fit of the anomalies. The black dots represent monthly atmospheric CO2 levels and the gray curve the 2nd-order fit to those CO2 levels. (Charts and stats done in Excel.)
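The same kind of 2nd-order curve fit can be reproduced outside Excel; here is a sketch with made-up anomaly values standing in for the HadCRUT series:

```python
import numpy as np

# Synthetic stand-in for 180 monthly anomalies ending May 2011.
months = np.arange(180)
anoms = 0.3 + 0.002 * months - 0.00001 * months ** 2   # illustrative values

coeffs = np.polyfit(months, anoms, 2)   # quadratic fit: [a, b, c]
fitted = np.polyval(coeffs, months)     # the smooth "2nd order" curve

# The sign of the leading coefficient shows whether the smoothed curve is
# bending downward (a cooling turn) or upward over the window.
print("leading coefficient:", coeffs[0])
```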
[…]
Below are charts for other major temperature datasets for the last 15 years ending May 2011. Regardless of the dataset, global temperatures are in a slight cooling phase presently and they could continue to go down, and then again, they may not. That’s what natural climate change is about.
Hansen found the region north of 80N pretty darn hot during September, even though he doesn’t have any thermometers there. DMI does have thermometers there, and they found that September was normal or below normal. Whatever Hansen is doing bears no resemblance to science.
[See plot]
With 250km smoothing, it is clear that the hot area north of 80N is all “missing data.”
[See plot]
DMI shows normal to below, using the same baseline period.
[See plot]
By making up hot temperatures in the Arctic, Hansen is able to drag the global temperature anomaly up significantly.
GISS has invented some new corrections to heat the Arctic. Until a few weeks ago, Reykjavik, Iceland looked like the graph below, with the 1930s and 1940s warmer than the present :
[See plot]
Now it looks like this, with the 1930s and 1940s much cooler than the present.
[See plot]
The new technique is called “removing suspicious records” – i.e. any record which doesn’t support his theory.
Hansen seems to have adopted a general strategy of warming the present by cooling the 1930s and 1940s.
[See blink comparators and 1940 Arctic warming documentation]
Ole Heinrich points out that Hansen is cooling the past all over the Arctic. Compare the current Nuuk, with the one from a few weeks ago.
[See blink comparator]
Hansen must have spoken to his witch doctor, who told him that all temperatures before 1980 were exactly one degree too high, and that all temperatures since 1990 were a little too low.
[See adjustments plot]
Richard Muller (the skeptic) says that GISS data is all golden.
Following on from yesterday’s Icelandic Saga, I now have the real temperature records for Reykjavik from the Iceland Met Office. You will recall that GISS had recently altered their temperature records for several stations in Iceland and Greenland – full story here (linked). Information on the Iceland Met Office site did not seem to support Hansen’s revised figures, but now I have the full temperature record for Reykjavik from them, which shows how much GISS have adjusted their temperatures upwards.
Yesterday while finishing off my post “NOAA Don’t Believe The Iceland Met Office”, by chance, I took another look at the latest GISS graph for Reykjavik, which is shown above. It showed, just as it had last week, that the temperature in 2003 was much higher than 1939 and 1941, which as we now know from the Iceland Met Office is not true. However something else looked wrong, that I could not put my finger on.
Fortunately, however, I had kept a printout of last week’s dataset and eventually I realised something else had changed. Effectively the scale had been shifted down, so that every single year became about a degree cooler.
Footnote
I have contacted the Iceland Met Office to get their view on the GHCN adjustments. Their reply could not be clearer :-
a) They were not aware these adjustments were being made.
b) They do not know why they have been made, but are asking!
c) They do not accept the “corrections” and have no intention of amending their own records.
How GISS Has Totally Corrupted Reykjavik’s Temperatures
By Paul Homewood January 25, 2012
Now that GHCN have created a false warming trend in Iceland and Greenland, and GISS have amended every single temperature record in their database for Reykjavik going back to 1901 (except for 2010 and 2011), we should have a look at the overall effect.
[See plot]
The red line reflects the actual temperature records provided by the Iceland Met Office and shows quite clearly a period around 1940, followed by another 20 years later, both of which were much warmer than the 1970s. GISS, as the blue line shows, have magically made this warm period disappear, by reducing the real temperatures by up to nearly 2 degrees.
Meanwhile the Iceland Met Office say that “The GHCN “corrections” are grossly in error in the case of Reykjavik”.
Google Warming: Google Sponsors Student To Fabricate “Global Warming” Temperatures For NASA
Corruption of climate science takes all sorts of forms – one is to fabricate global warming temperatures after the fact, using “correcting” algorithms that NASA / GISS favors, which it now appears to have been outsourced to a Google-funded effort – aka ‘Google Warming’
[…]
The adjustments made to historical temperatures during 2011 provide further evidence that climate data corruption is alive and well within the climate science community. But the big surprise is who actually performed the magical global warming of Arctic regions….
“To isolate these “abrupt shifts”, they use an algorithm. And it was changes to this algorithm in July 2011 by a Google Summer Student [add’l info here]…that suddenly produced this swathe of anomalous adjustments in Greenland, Iceland and Siberia. The Icelandic Met have confirmed that there have been no station moves or other non-climatic factors, which would have created the need for the adjustments in Iceland, and of course the algorithms in use previously in GHCN V2 and V3 did not spot anything unusual in the temperature data.”
Voila, we can now add the term ‘Google Warming’ to the climate debate – perhaps understood to mean the following: “to fabricate global warming.”
Just out:
Hausfather et al
The Impact of Urbanization on Land Temperature Trends
While urban warming is a real phenomenon, it is overweighted in land temperature reconstructions due to the oversampling of urban areas relative to their global land coverage. Rapid urbanization over the past three decades has likely contributed to a modest warm bias in unhomogenized global land temperature reconstructions, with urban stations warming about ten percent faster than rural stations in the period from 1979 to 2010. Urban stations are warming faster than rural stations on average across all urbanity proxies, cutoffs, and spatial resolutions examined, though the underlying data is noisy and there are many individual cases of urban cooling. Our estimate for the bias due to UHI in the land record is on the order of 0.03 C per decade for urban stations. This result is consistent with both the expected sign of the effect and regional estimates covering the same time period (Zhou et al 2004) and differs from some recent work suggesting zero or negative UHI bias.
Also, the “GISS” series from 1978/12 – 2011/11 looks nothing like the corresponding GISTEMP series. Gareth’s “GISS” series was at 0 anomaly 1978/12 and does not go below that after but GISTEMP is at or below 0 six times after 1978/12, the last time being 1993/12.
Are “GISS” and GISTEMP two totally different series?
“GISS” might be land-only because the JunkScience source for GISTEMP is land+SST:-
I had two more comments posted under the Dom Salinger article #75 and #76 addressed to RW, #75 is:-
******************************************************************************************************
Nonentity #75 05:14 pm Jan 13 2012
RW #73, I see the leader of your local groupthink circle (Gareth Renowden at Hot Topic) has just used GISTEMP to crow about a 0.51 C Nov 2011 anomaly.
Good thing he didn’t end his series with the Dec 2011 anomaly (0, first time since 1993) otherwise he’d have to explain an exactly average anomaly.
Tricky, and a bit boring for the circle I would imagine.
But what has Gareth plotted? (“the graph I created”)
First of all, the +0.51 C is the D-N Ann Mean anomaly, not “Jan – Nov 2011” as he states and not the +0.48 Nov anomaly that I thought was +0.51 (my error).
He’s plotted the D-N Ann Means for each year – not the monthly anomalies – so there are obviously far fewer datapoints. There’s a negative monthly anomaly at 1994/2 that does not occur on “the graph I created”. His first negative anomaly is the 1976 Ann Mean.
The 10 C line is when “everything shuts down” apparently. That the current soil temperatures are close to the 10 C line so soon after summer (due to cloudiness) is not good from what I can gather.
As found at Hydro Services after watching ‘Rob’s Country’ on Central TV. Click on ‘AS SEEN ON CTV VIEW Previous Week’s Soil Temperatures’.
Central and CTV are different outfits. ‘Rob’s Country’ is a CTV programme that was based in the CTV building that collapsed in the earthquake. Rob was not in the building at the time and, I think, operated out of Central in the Waikato for a while afterwards.
What will these temps do to the SI-biased NZT7 2012 figure (12.8 2011, 13.1 2010)?
Extreme June cold snap breaks records
The extreme cold snap which hit much of the country in early June brought the lowest daytime maximum temperatures on record to parts of Canterbury, and the lowest June maximums to some other areas.
In its June climate summary, Niwa said the temperature at Christchurch airport reached only 0.4degC on June 6, as heavy snow fell throughout the day.
In Waipara West, 60km north of Christchurch, the temperature reached 2.3degC, and in Cheviot it got to 1.3degC. In all three places the temperatures were the lowest daily maximums on record.
Places which recorded their lowest June maximums on that day included Lincoln with 0.7degC – its second-lowest for any month, Arthurs Pass with -1.2degC, Blenheim with 5.6degC, and Greymouth with 5.2degC – also its second-lowest on record.
Later in the month, Lake Rotoiti had its lowest-recorded June maximum at 2.1degC and Milford Sound had its lowest June maximum of 2.5degC.
Arthurs Pass had its lowest daily minimum June temperature on record, with -11.2degC, Culverden had its lowest with -10.2degC, and Le Bons Bay on Banks Peninsula had its equal-lowest June minimum with a flat zero. Le Bons Bay also had its lowest June monthly-average minimum air temperature at 4.4degC.
I took last Monday afternoon off to go skiing with my son. It was minus 4 degrees C at 11am as we left the garage. It stayed at that temp until 10 km out of town, when it rose to a whole 1 degree C.
My concern for your dog arises from the Oymyakon doco. The inhabitants brought their horses (meat source apparently) in regularly to remove accumulated ice from their backs.
Interesting development. Judith Curry features 3 papers from OUTSIDE of climate science and asks:-
JC comments: Each of these papers provides a fresh perspective on interpreting surface temperature data trends. The authors of all these papers have academic training and expertise that lies well outside of the field of atmospheric science and climatology. I have long said that fresh perspectives from outside the field of climate science are needed. Congrats to Ross for getting his paper published in Climate Dynamics, which is a high impact mainstream climate journal. My main question is whether the lead authors of the relevant IPCC AR5 chapter will pay attention to these papers; they should (and the papers meet the publication deadline for inclusion in the AR5).
Chances of appearing in AR5 may be diminished by acknowledgment of 2 prominent sceptics:-
Acknowledgements: I have benefited greatly from discussions with William Kininmonth and in particular, David Stockwell who introduced me to the Chow Break Test.
Conclusion
There is a strong set of coincident events at or around 2000 that suggest the onset of a cool phase of the Pacific Decadal Oscillation. This is supported by the decreasing humidity in the Northern Pacific Ocean after the break in 2000 (Figure 6) where the probability of the straight line fit showing no decrease is 3%. However for the global surface temperature this analysis has not established whether the cool phase of the Pacific Decadal Oscillation dominates the warm phase of the North Atlantic Decadal Oscillation.
The variations in global temperature, atmospheric CO2, water vapour and atmospheric methane all indicate the importance of the Atlantic and Pacific Decadal Oscillations. This is not easily taken into account in General Circulation Models and until there is a better understanding of the long term behaviour of the oceans, it must be a significant difficulty in projecting future temperatures.
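For readers unfamiliar with the Chow Break Test mentioned in the acknowledgements, it compares the fit of one straight line over a whole series against separate lines fitted either side of a candidate break year; a large F-statistic means the break is probably real. A toy Python sketch on synthetic data (chow_f is my own simplified version, not Stockwell’s code):

```python
import numpy as np

def rss(x, y):
    """Residual sum of squares from a straight-line fit."""
    coeffs = np.polyfit(x, y, 1)
    return float(np.sum((y - np.polyval(coeffs, x)) ** 2))

def chow_f(x, y, break_idx, k=2):
    """Chow test F-statistic for a structural break at break_idx.
    k = 2 parameters (slope, intercept) per fitted line."""
    rss_pooled = rss(x, y)
    rss_split = rss(x[:break_idx], y[:break_idx]) + rss(x[break_idx:], y[break_idx:])
    n = len(x)
    return ((rss_pooled - rss_split) / k) / (rss_split / (n - 2 * k))

# Synthetic series with an obvious 1.0 step change at index 15.
rng = np.random.default_rng(0)
x = np.arange(30, dtype=float)
y = np.where(x < 15, 0.0, 1.0) + 0.05 * rng.standard_normal(30)

print(chow_f(x, y, 15) > 10)  # True: two lines fit far better than one
```

The “probability of the straight line fit showing no decrease is 3%” style of statement in the conclusion comes from turning such an F-statistic into a p-value.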
NOAA Conducts Large-Scale Experiment And Proves Global Warming Skeptics Correct
So, what has this NOAA experiment found? The bottom image (Image #3) tells that story – when compared to measurements from the old, inaccurate, non-pristine network, temperature “warming” in the U.S. is being overstated anywhere from +0.5°C on average, up to almost +4.0°C (+0.9°F to +7.2°F) in some locations during the summer months.
To clarify, this range of overstatement depends on the given new and old stations being compared. However, when the new network versus old network results are examined in total, for the recent summer heat wave in the U.S., the old stations were reporting bogus warming during July that amounted to some +2.1°F higher than the actual temperatures.
What does this mean? Within the climate science realm, the old climate/weather station system had long been considered the best and most complete measurement network in the world. But when pitted against a brand new climate measurement system that has the best qualities that science can provide, we find that the traditional U.S. methodology is significantly overstating the “global warming” phenomenon. This means that if other countries replaced their own low quality network with NOAA’s greatest and latest technology, with the best location site standards applied, we would discover that world-wide temperature increases have been wildly overstated also.
So, since 1998, GISS have found an extra 0.33C of warming that RSS cannot find. This is an astonishing figure, which effectively doubles the amount of warming found in total since 1979.
Comparison of GISS with HADCRUT and UAH shows a similar divergence.
As Doctor McCoy might have said “It’s temperature, Jim, but not as we know it”.
“Compared to seasonal norms, over the past month the coldest area on the globe was east central Russia near the town of Nyagan, where temperatures for the month averaged as much as 2.51 C (about 4.5 degrees Fahrenheit) cooler than seasonal norms. Compared to seasonal norms, the “warmest” area on the globe in January was the Norwegian arctic archipelago of Svalbard, which is north of Norway and east of Greenland. Temperatures there averaged 4.1 C (about 7.4 degrees Fahrenheit) warmer than seasonal norms for January.”
The Australian heat wave was just a minor contributor in this geo-spatial plot:-
There must have been very sudden tropospheric warming in polar/sub-polar regions for unexplained reasons from what I can gather. But that is just what the warmists predict of course.
Big drop in global surface temperature in February, ocean temps flat
By Dr. Roy Spencer
UAH Global Temperature Update for February, 2013: +0.18 deg. C
Our Version 5.5 global average lower tropospheric temperature (LT) anomaly for February, 2013 is +0.18 deg. C, a large decrease from January’s +0.50 deg. C.
Global Microwave Sea Surface Temperature Update for Feb. 2013: -0.01 deg. C
The global average sea surface temperature (SST) update for Feb. 2013 is -0.01 deg. C, relative to the 2003-2006 average:
Hi all. I’m wondering if anyone can direct me to some info on the disparities between the night and daytime global temperatures. Is the night warming faster than the day & if so does this prove AGW?
Comments On “The Shifting Probability Distribution Of Global Daytime And Night-Time Temperatures” By Donat and Alexander 2012 – A Not Ready For Prime Time Study
“The reconfirmation now of a strong sun-temperature relation based specifically upon the daytime temperature maxima adds strong and independent scientific weight to the reality of the sun-temperature connection.
The close relationships between the abrupt ups and downs of solar activity and similar changes in temperature that we have identified occur locally in coastal Greenland; regionally in the Arctic Pacific and north Atlantic; and hemispherically for the whole circum-Arctic region. This suggests strongly that changes in solar radiation drive temperature variations on at least a hemispheric scale.
Close correlations like these simply do not exist for temperature and changing atmospheric CO2 concentration. In particular, there is no coincidence between the measured steady rise in global atmospheric CO2 concentration and the often dramatic multi-decadal (and shorter) ups and downs of surface temperature that occur all around the world.”
GISS Figures Out For February | NOT A LOT OF PEOPLE KNOW THAT
By Paul Homewood
While we are waiting for HADCRUT, a quick look at GISS numbers for February, which are back down to 0.49C, from 0.60C in January.
The average for 2012 was 0.56C.
I really wanted, though, to show the GISS map for winter temperatures, i.e. Dec – Feb. The map shows the anomalies against the 1981-2010 baseline.
[…]
The line at the top of the map shows that there is an anomaly of 0.10C – in other words, the Dec 2012 – Feb 2013 temperature is a mere 0.10C higher than the 1981-2010 baseline.
Yes, a mere 0.10C. And this during a winter when ENSO conditions have been neutral, so alarmists cannot even hide behind La Nina excuses.
Both UAH and RSS are virtually unchanged. UAH is exactly the same at 0.18C, and RSS is up from 0.19 to 0.20C.
Both are around or below the level they were in Nov/Dec.
I thought we’d have a look at longer term trends today though. Using HADCRUT4 numbers, I have plotted the monthly numbers since 1979, with a 10-year running average.
Within the last decade, temperatures have definitely flatlined. For instance, the latest 12-month average is 0.47C, and this compares with a figure of 0.49C for 2002.
But, intriguingly, the longer term, 10-year average also stopped rising at the end of 2010, and has actually started to fall back. This is not clear on the scale of the above graph, but zooming in below illustrates it well.
The amounts may be small, but the evidence is clear – the world has been getting colder in the last few years. This is not down to a single year’s figures, nor ENSO variations, as the 10-year span will even out such fluctuations.
Perhaps we really should be worrying about global cooling again.
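For anyone wanting to reproduce Homewood’s 10-year smoothing, it is simply a trailing 120-month moving average of the monthly anomalies. A minimal Python sketch on synthetic numbers (not the actual HADCRUT4 series):

```python
import numpy as np

def running_mean(series, window=120):
    """Trailing running mean over `window` consecutive monthly values."""
    kernel = np.ones(window) / window
    return np.convolve(series, kernel, mode="valid")

# Synthetic monthly anomalies, 1979-2012 (408 months): a warming trend
# that flatlines after month 276 (~2002), plus noise.
rng = np.random.default_rng(1)
months = np.arange(408)
trend = np.minimum(months, 276) * 0.0015
anoms = trend + 0.1 * rng.standard_normal(408)

smooth = running_mean(anoms)          # the 10-year (120-month) average
print(len(smooth) == 408 - 120 + 1)   # True: one value per complete window
```

Note the smoothed series is 119 points shorter than the raw one, which is why any turn-down in the 10-year average only becomes visible well after the raw data flattens.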
A new study by Stockwell and Stewart, “Biases in the Australian high quality temperature network” has found that warming in Australia over the last century could have been overestimated by 31%.
ABSTRACT
Various reports identify global warming over the last century as around 0.7°C, but warming in Australia at around 0.9°C, suggesting Australia may be warming faster than the rest of the world. This study evaluates potential biases in the High Quality Network (HQN) compiled from 100 rural surface temperature series from 1910 due to: (1) homogeneity adjustments used to correct for changes in location and instrumentation, and (2) the discrimination of urban and rural sites. The approach was to compare the HQN with a new network compiled from raw data using the minimal adjustments necessary to produce contiguous series, called the Minimal Adjustment Network (MAN). The average temperature trend of the MAN stations was 31% lower than the HQN, and by a number of measures, the trend of the Australian MAN is consistent with the global trend. This suggests that biases from these sources have exaggerated apparent Australian warming. Additional problems with the HQN include failure of homogenization procedures to properly identify errors, individual sites adjusted more than the magnitude of putative warming last century, and some sites of such poor quality they should not be used, especially under a “High Quality” banner.
‘Australian Met Office Accused Of Manipulating Temperature Records’
Posted on August 23, 2014 by Anthony Watts
There’s quite a row developing after a scathing article in the Australian, some news clips follow. h/t to Dr. Benny Peiser at The GWPF
The [Australian] Bureau of Meteorology has been accused of manipulating historic temperature records to fit a predetermined view of global warming. Researcher Jennifer Marohasy claims the adjusted records resemble “propaganda” rather than science. Dr Marohasy has analysed the raw data from dozens of locations across Australia and matched it against the new data used by BOM showing that temperatures were progressively warming. In many cases, Dr Marohasy said, temperature trends had changed from slight cooling to dramatic warming over 100 years. –Graham Lloyd, The Australian, 23 August 2014
The escalating row goes to heart of the climate change debate — in particular, whether computer models are better than real data and whether temperature records are being manipulated in a bid to make each year hotter than the last. Marohasy’s research has put her in dispute with BoM over a paper she published with John Abbot at Central Queensland University in the journal Atmospheric Research concerning the best data to use for rainfall forecasting. BoM challenged the findings of the Marohasy-Abbot paper, but the international journal rejected the BoM rebuttal, which had been prepared by some of the bureau’s top scientists. This has led to an escalating dispute over the way in which Australia’s historical temperature records are “improved” through homogenisation, which is proving more difficult to resolve. –Graham Lloyd, The Australian, 23 August 2014
‘Australian Met Office Accused Of Manipulating Temperature Records’
Date: 23/08/14
Graham Lloyd, The Australian
The [Australian] Bureau of Meteorology has been accused of manipulating historic temperature records to fit a predetermined view of global warming.
Researcher Jennifer Marohasy claims the adjusted records resemble “propaganda” rather than science.
Dr Marohasy has analysed the raw data from dozens of locations across Australia and matched it against the new data used by BOM showing that temperatures were progressively warming.
In many cases, Dr Marohasy said, temperature trends had changed from slight cooling to dramatic warming over 100 years.
BOM has rejected Dr Marohasy’s claims and said the agency had used world’s best practice and a peer reviewed process to modify the physical temperature records that had been recorded at weather stations across the country.
It said data from a selection of weather stations underwent a process known as “homogenisation” to correct for anomalies. It was “very unlikely” that data homogenisation impacted on the empirical outlooks.
In a statement to The Weekend Australian BOM said the bulk of the scientific literature did not support the view that data homogenisation resulted in “diminished physical veracity in any particular climate data set’’.
Historical data was homogenised to account for a wide range of non-climate related influences such as the type of instrument used, choice of calibration or enclosure and where it was located.
“All of these elements are subject to change over a period of 100 years, and such non-climate related changes need to be accounted for in the data for reliable analysis and monitoring of trends,’’ BOM said.
Account is also taken of temperature recordings from nearby stations. It took “a great deal of care with the climate record, and understands the importance of scientific integrity”.
Dr Marohasy said she had found examples where there had been no change in instrumentation or siting and no inconsistency with nearby stations but there had been a dramatic change in temperature trend towards warming after homogenisation.
She said that at Amberley in Queensland, homogenisation had resulted in a change in the temperature trend from one of cooling to dramatic warming.
She calculated homogenisation had changed a cooling trend in the minimum temperature of 1C per century at Amberley into a warming trend of 2.5C. This was despite there being no change in location or instrumentation.
”The heat is on. Bureau of Meteorology ‘altering climate figures’ — The Australian’
Joanne Nova, August 23rd, 2014
Congratulations to The Australian again for taking the hard road and reporting controversial, hot, documented problems that few in the Australian media dare to investigate.
How accurate are our national climate datasets when some adjustments turn entire long stable records from cooling trends to warming ones (or vice versa)? Do the headlines of “hottest ever record” (reported to a tenth of a degree) mean much if thermometer data sometimes needs to be dramatically changed 60 years after being recorded?
One of the most extreme examples is a thermometer station in Amberley, Queensland where a cooling trend in minima of 1C per century has been homogenized and become a warming trend of 2.5C per century. This is a station at an airforce base that has no recorded move since 1941, nor had a change in instrumentation. It is a well-maintained site near a perimeter fence, yet the homogenisation process produces a remarkable transformation of the original records, and rather begs the question of how accurately we know Australian trends at all when the thermometers are seemingly so bad at recording the real temperature of an area.
‘How Australia’s Bureau of Meteorology is Turning Up The Heat’
Written by Graham Lloyd, Australian on 24 August 2014.
When raging floodwaters swept through Brisbane in January 2011 they submerged a much-loved red Corvette sports car in the basement car park of a unit in the riverside suburb of St Lucia.
On the scale of the billions of dollars worth of damage done to the nation’s third largest city in the man-made flood, the loss of a sports car may not seem like much.
But the loss has been the catalyst for an escalating row that raises questions about the competence and integrity of Australia’s premier weather agency, the Bureau of Meteorology, stretching well beyond the summer storms.
It goes to heart of the climate change debate — in particular, whether computer models are better than real data and whether temperature records are being manipulated in a bid to make each year hotter than the last.
With farmer parents, researcher Jennifer Marohasy says she has always had a fascination with rainfall and drought-flood cycles. So, in a show of solidarity with her husband and his sodden Corvette, Marohasy began researching the temperature records noted in historic logs that date back through the Federation drought of the late 19th century.
Specifically, she was keen to try forecasting Brisbane floods using historical data and the latest statistical modelling techniques.
Marohasy’s research has put her in dispute with BoM over a paper she published with John Abbot at Central Queensland University in the journal Atmospheric Research concerning the best data to use for rainfall forecasting. (She is a biologist and a sceptic of the thesis that human activity is bringing about global warming.) BoM challenged the findings of the Marohasy-Abbot paper, but the international journal rejected the BoM rebuttal, which had been prepared by some of the bureau’s top scientists.
‘Heat is on over weather bureau ‘homogenising temperature records’
Written by Dr Jennifer Marohasy on 24 August 2014.
EARLIER this year Tim Flannery said “the pause” in global warming was a myth, leading medical scientists called for stronger action on climate change, and the Australian Bureau of Meteorology declared 2013 the hottest year on record. All of this was reported without any discussion of the actual temperature data. It has been assumed that there is basically one temperature series and that it’s genuine.
But I’m hoping that after today, with both a feature (page 20) and a news piece (page 9) in The Weekend Australia things have changed forever.
I’m hoping that next time Professor Flannery is interviewed he will be asked by journalists which data series he is relying on: the actual recorded temperatures or the homogenized, remodeled series. Because as many skeptics have known for a long time, and as Graham Lloyd reports today for News Ltd, for any one site across this wide brown land of Australia, while the raw data may show a pause, or even cooling, the truncated and homogenized data often shows dramatic warming.
When I first sent Graham Lloyd some examples of the remodeling of the temperature series I think he may have been somewhat skeptical. I know he on-forwarded this information to the Bureau for comment, including three charts showing the homogenization of the minimum temperature series for Amberley.
Mr Lloyd is the Environment Editor for The Australian newspaper and he may have been concerned I got the numbers wrong. He sought comment and clarification from the Bureau, not just for Amberley but also for my numbers pertaining to Rutherglen and Bourke.
I understand that by way of response to Mr Lloyd, the Bureau has not disputed these calculations.
This is significant. The Bureau now admits that it changes the temperature series and quite dramatically through the process of homogenization.
I repeat the Bureau has not disputed the figures. The Bureau admits that the data is remodeled.
What the Bureau has done, however, is try and justify the changes. In particular, for Amberley the Bureau is claiming to Mr Lloyd that there is very little available documentation for Amberley before 1990 and that information before this time may be “classified”: as in top secret. That’s right, there is apparently a reason for jumping-up the minimum temperatures for Amberley but it just can’t provide Mr Lloyd with the supporting meta-data at this point in time.
‘Australian Bureau of Meteorology accused of Criminally Adjusted Global Warming’
Written by James Delingpole, Breitbart London on 25 August 2014.
The Australian Bureau of Meteorology has been caught red-handed manipulating temperature data to show “global warming” where none actually exists.
At Amberley, Queensland, for example, the data at a weather station showing 1 degree Celsius cooling per century was “homogenized” (adjusted) by the Bureau so that it instead showed a 2.5 degrees warming per century.
At Rutherglen, Victoria, a cooling trend of -0.35 degrees C per century was magically transformed at the stroke of an Australian meteorologist’s pen into a warming trend of 1.73 degrees C per century.
Last year, the Australian Bureau of Meteorology made headlines in the liberal media by claiming that 2013 was Australia’s hottest year on record. This prompted Australia’s alarmist-in-chief Tim Flannery – an English literature graduate who later went on to earn his scientific credentials with a PhD in palaeontology, digging up ancient kangaroo bones – to observe that global warming in Australia was “like climate change on steroids.”
But we now know, thanks to research by Australian scientist Jennifer Marohasy, that the hysteria this story generated was based on fabrications and lies.
‘BOM finally explains! Cooling changed to warming trends because stations “might” have moved!’
by Joanne Nova, August 26th, 2014
It’s the news you’ve been waiting years to hear! Finally we find out the exact details of why the BOM changed two of their best long term sites from cooling trends to warming trends. The massive inexplicable adjustments like these have been discussed on blogs for years. But it was only when Graham Lloyd advised the BOM he would be reporting on this that they finally found time to write three paragraphs on specific stations.
Who knew it would be so hard to get answers. We put in a Senate request for an audit of the BOM datasets in 2011. Ken Stewart, Geoff Sherrington, Des Moore, Bill Johnston, and Jennifer Marohasy have also separately been asking the BOM for details about adjustments on specific BOM sites. (I bet Warwick Hughes has too). The BOM has ignored or circumvented all these, refusing to explain why individual stations were adjusted in detail.
The two provocative articles Lloyd put together last week were Heat is on over weather bureau and Bureau of Meteorology ‘altering climate figures’, which I covered here. This is the power of the press at its best. The absence of articles like these is why I have said the media IS the problem — as long as the media ignore the BOM failure to supply their full methods and reasons, the BOM mostly get away with it. It’s an excellent development that The Australian is starting to hold the BOM to account. (No sign of curiosity or investigation at the ABC and Fairfax, who are happy to parrot BOM press releases unquestioned like sacred scripts.)
‘Who’s going to be sacked for making-up global warming at Rutherglen?’
By Jennifer Marohasy on August 27, 2014
HEADS need to start rolling at the Australian Bureau of Meteorology. The senior management have tried to cover-up serious tampering that has occurred with the temperatures at an experimental farm near Rutherglen in Victoria. Retired scientist Dr Bill Johnston used to run experiments there. He, and many others, can vouch for the fact that the weather station at Rutherglen, providing data to the Bureau of Meteorology since November 1912, has never been moved.
Senior management at the Bureau are claiming the weather station could have been moved in 1966 and/or 1974 and that this could be a justification for artificially dropping the temperatures by 1.8 degree Celsius back in 1913.
Surely it’s time for heads to roll!
[See Rutherglen graph]
Some background: Near Rutherglen, a small town in a wine-growing region of NE Victoria, temperatures have been measured at a research station since November 1912. There are no documented site moves. An automatic weather station was installed on 29th January 1998.
Temperatures measured at the weather station form part of the ACORN-SAT network, so the information from this station is checked for discontinuities before inclusion into the official record that is used to calculate temperature trends for Victoria, Australia, and also the United Nations’ Intergovernmental Panel on Climate Change (IPCC).
The unhomogenized/raw mean annual minimum temperature trend for Rutherglen for the 100-year period from January 1913 through to December 2013 shows a slight cooling trend of 0.35 degree C per 100 years. After homogenization there is a warming trend of 1.73 degree C per 100 years. This warming trend is essentially achieved by progressively dropping down the temperatures from 1973 back through to 1913. For the year of 1913 the difference between the raw temperature and the ACORN-SAT temperature is a massive 1.8 degree C.
There is absolutely no justification for doing this.
This cooling of past temperatures is a new trick* that the mainstream climate science community has endorsed over recent years to ensure next year is always hotter than last year – at least for Australia.
There is an extensive literature that provides reasons why homogenization is sometimes necessary, for example, to create continuous records when weather stations move locations within the same general area i.e. from a post office to an airport. But the way the method has been implemented at Rutherglen is not consistent with the original principle which is that changes should only be made to correct for non-climatic factors.
In the case of Rutherglen the Bureau has just let the algorithms keep jumping down the temperatures from 1973. To repeat, the biggest change between the raw and the new values is in 1913, when the temperature has been jumped down a massive 1.8 degree C.
In doing this homogenization a warming trend is created when none previously existed.
The Bureau has tried to justify all of this to Graham Lloyd at The Australian newspaper by stating that there must have been a site move, flagging the years 1966 and 1974. But the biggest adjustment was made in 1913! In fact, as Bill Johnston explains in today’s newspaper [see below], the site has never moved.
Surely someone should be sacked for this blatant corruption of what was a perfectly good temperature record.
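The trend arithmetic in the Rutherglen example is easy to check. A Python sketch using synthetic numbers shaped like the ones quoted above (a 0.35 degree C/century cooling, with pre-1973 values progressively lowered by up to 1.8 degrees) shows how such adjustments flip the sign of the trend; the series and the -0.03/year adjustment factor are my own illustration, not the ACORN-SAT data:

```python
import numpy as np

def trend_per_century(years, temps):
    """Least-squares linear trend, expressed in deg C per 100 years."""
    slope, _intercept = np.polyfit(years, temps, 1)
    return slope * 100.0

# Illustrative annual minima, 1913-2013, cooling at 0.35 C/century.
years = np.arange(1913, 2014)
raw = 10.0 - 0.0035 * (years - 1913)

# Progressively lower the pre-1973 values, reaching -1.8 C at 1913
# (the kind of stepped-down adjustment described above).
offsets = np.where(years < 1973, -0.03 * (1973 - years), 0.0)
adjusted = raw + offsets

print(round(trend_per_century(years, raw), 2))     # -0.35
print(trend_per_century(years, adjusted) > 1.0)    # True: now strongly warming
```

The point of the sketch: nothing about the post-1973 data changes at all, yet lowering the early record alone manufactures a steep warming trend.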
‘Climate records contradict Bureau of Meteorology’ [paywalled]
by Graham Lloyd, The Australian. 27th August 2014
THE official catalogue of weather stations contradicts the Bureau of Meteorology’s explanation that the relocation of the thermometer in the Victorian winegrowing district of Rutherglen has turned a cooling into a warming trend. The Bureau appears to have invented a move in the official thermometer site at the Rutherglen weather station to help justify a dramatic revision of historic temperatures to produce a warming trend.
‘Big adjustments? BOM says Rutherglen site shifted, former workers there say “No” ‘
by Joanne Nova, August 27th, 2014
The hot questions for the Australian Bureau of Meteorology (BOM) mount up. Rutherglen was one of the temperature recording stations that was subject to large somewhat mysterious adjustments which turned a slight cooling trend into a strongly warming one. Yet the official notes showed that the site did not move and was a continuous record. On paper, Rutherglen appeared to be ideal — a rare long rural temperature record where measurements had come from the same place since 1913.
The original cooling trend of -0.35C was transformed into a +1.73C warming after “homogenisation” by the BOM. To justify that, the BOM claims there may have been an unrecorded shift, one “consistent” with the old station starting further up the slope before it moved down to the hollow.
Today retired scientist Bill Johnston got in touch with Jennifer Marohasy, with me and with Graham Lloyd of The Australian to say that he worked at times at Rutherglen and the official thermometer had not moved. It was always placed where it is now at the bottom of the hollow. That information has already made it into print in The Australian.
‘Uninformed climate amateurs ask professionals to explain their data revision’
by Joanne Nova, August 28th, 2014
David Karoly knew he had to defend the BOM with regard to the hot questions about adjustments to Amberley, Bourke, and Rutherglen data. What he didn’t have were photos of historic equipment, maps of thermometer sites, or quotes from people who took observations. Instead he wielded the magic wand of “peer review” — whereupon questions asked in English are rendered invalid if they are printed in a newspaper instead of a trade-magazine.
Prof David Karoly, Climate Professional, called people who ask for explanations poorly informed amateurs [hotlink, paywall, see below]. In response, we Poorly Informed Climate Amateurs wonder what it takes to get Climate Professionals to inform us. Instead of hiding behind ‘peer review’, vague complex methods, and the glow of their academic aura, the professionals could act professionally and explain exactly what they did to the data?
[…]
The articles by Graham Lloyd on Jennifer Marohasy’s analysis are generating debate.
Letters to The Australian [hotlink, see below] 28th August 2014
Bill Johnston, former NSW natural resources research scientist, Cook, ACT…….
Michael Asten, School of Earth Atmosphere and Environment, Monash University, Melbourne. Vic……
Greg Buchanan, Niagara Park, NSW…….
‘Amateurs’ challenging Bureau of Meteorology climate figures [paywalled]
The Australian, August 26, 2014
CONCERNS about the accuracy of the Bureau of Meteorology’s historical data are being raised by “poorly informed amateurs”, one of Australia’s leading climate scientists has said. David Karoly of Melbourne University’s School of Earth Sciences, said claims BOM had introduced a warming trend by homogenising historical temperature data should be submitted for peer review.
‘Rewriting the History of Bourke: Part 2, Adjusting Maximum Temperatures Both Down and Up, and Then Changing Them Altogether’
By Jennifer Marohasy on April 6, 2014
“Anyone who doesn’t take truth seriously in small matters cannot be trusted in large ones either.” Albert Einstein.
[…] In a report entitled ‘Techniques involved in developing the Australian Climate Observation Reference Network – Surface Air Temperature (ACORN-SAT) dataset’ (CAWCR Technical Report No. 049), Blair Trewin explains that up to 40 neighbouring weather stations can be used for detecting inhomogeneities and up to 10 can be used for adjustments. What this means is that temperatures, ever so diligently recorded in the olden days at Bourke by the postmaster, can be changed on the basis that it wasn’t so hot at a nearby station that may in fact be many hundreds of kilometres away, even in a different climate zone.
Consider the recorded versus adjusted values for January 1939, Table 1. The recorded values have been changed. And every time the postmaster recorded 40 degrees, Dr Trewin has seen fit to change this value to 39.1 degrees Celsius. Why?
NEAR Rutherglen, a small town in a wine-growing region of northeastern Victoria, temperatures have been measured at a research station since November 1912. There are no documented site moves. An automatic weather station was installed on 29th January 1998.
Temperatures measured at the weather station form part of the ACORN-SAT network, so the information from this station is homogenized before inclusion into the official record that is used to calculate temperature trends for Victoria and also Australia.
The unhomogenized/raw mean annual minimum temperature trend for Rutherglen for the 100-year period from January 1913 through to December 2013 shows a slight cooling trend of 0.35 degrees C per 100 years, see Figure 1. After homogenization there is a warming trend of 1.73 degrees C per 100 years. This warming trend is essentially achieved by progressively dropping down the temperatures from 1973 back through to 1913. For the year of 1913 the difference between the raw temperature and the ACORN-SAT temperature is a massive 1.8 degrees C.
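The quoted trends are ordinary least-squares slopes scaled to degrees per century. A minimal sketch of that calculation, using synthetic data (the actual Rutherglen minima are not reproduced here, so a made-up linear series with a known slope is used to illustrate):

```python
# Least-squares trend in degrees C per 100 years, as quoted for Rutherglen.
# The data below are synthetic, not the actual Rutherglen minima.

def trend_per_century(years, temps):
    n = len(years)
    mean_y = sum(years) / n
    mean_t = sum(temps) / n
    cov = sum((y - mean_y) * (t - mean_t) for y, t in zip(years, temps))
    var = sum((y - mean_y) ** 2 for y in years)
    return cov / var * 100  # slope in deg C per year, scaled to per century

years = list(range(1913, 2014))
temps = [5.0 + 0.0173 * (y - 1913) for y in years]  # synthetic 1.73 C/century warming
print(round(trend_per_century(years, temps), 2))  # -> 1.73
```

Applied once to the raw series and once to the homogenised one, this is all that is needed to reproduce the −0.35 versus +1.73 comparison.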
‘Hiding something? BOM throws out Bourke’s hot historic data, changes long cooling trend to warming’
JoNova, August 30th, 2014
Hello Soviet-style weather service? On January 3, 1909, an extremely hot 51.7C (125F) was recorded at Bourke. It’s possibly the hottest temperature ever recorded in a Stevenson Screen in Australia, but the BOM has removed it as a clerical error. There are legitimate questions about the accuracy of records done so long ago — standards were different. But there are very legitimate questions about the BOM’s treatment of this historic data. The BOM has also removed the 40 years of weather recorded before 1910, which includes some very hot times. Now we find out the handwritten original notes from 62 years of the mid 20th Century were supposed to be dumped in 1996 as well. Luckily, these historic documents were saved from the dustbin and quietly kept in private hands instead.
[…] I went and checked not only the old newspapers but also the book in the national archive, because, guess what? The Bureau of Meteorology is claiming it was all a clerical error. They have scratched this record made on 3rd January 1909 from the official record for Bourke, which means it’s also scratched from the NSW and national temperature record.
Yep. It never happened. No heatwave back in 1909.
They have also wiped the heatwave of January 1896. This was probably the hottest January on record, not just for Bourke, but Australia-wide. Yet according to the rules dictated by the Bureau, if it was recorded before 1910, it doesn’t count.
‘1953 Headline: Melbourne’s weather is changing! Summers getting colder and wetter’
By Joanne Nova
Once upon a time — before the Great Politicization of Climate Science — CSIRO was able to analyze trends from 1880 to 1910. In 1953 CSIRO scientists were making a case that large parts of Australia had been hotter in the 1880s and around the turn of last century.
‘Bureau of Meteorology warms to transparency over adjusted records’
Graham Lloyd, The Australian
September 11, 2014 12:00AM [Paywall]
THE Bureau of Meteorology has been forced to publish details of all changes made to historic temperature records as part of its homogenisation process to establish the nation’s climate change trend. Publication of the reasons for all data adjustments was a key recommendation of the bureau’s independent peer review panel which approved the bureau’s ACORN SAT methodology.
‘Scientists should know better: the truth was out there’
Graham Lloyd, The Australian
September 11, 2014 12:00AM [Paywall]
IT reflects poorly on key members of Australia’s climate science establishment that tribal loyalty is more important than genuine inquiry. Openness, not ad hominem histrionics, was always the answer for lingering concerns about what happened to some of the nation’s temperature records under the Bureau of Meteorology’s process of homogenisation.
The Pairwise Homogenization Algorithm [hotlink #1 – see below] was designed as an automated method of detecting and correcting localized temperature biases due to station moves, instrument changes, microsite changes, and meso-scale changes like urban heat islands.
The algorithm (whose code can be downloaded here [hotlink]) is conceptually simple: it assumes that climate change forced by external factors tends to happen regionally rather than locally. If one station is warming rapidly over a period of a decade a few kilometers from a number of stations that are cooling over the same period, the warming station is likely responding to localized effects (instrument changes, station moves, microsite changes, etc.) rather than a real climate signal.
To detect localized biases, the PHA iteratively goes through all the stations in the network and compares each of them to their surrounding neighbors. It calculates difference series between each station and their neighbors (separately for min and max) and looks for breakpoints that show up in the record of one station but none of the surrounding stations. These breakpoints can take the form of both abrupt step-changes and gradual trend inhomogeneities that move a station’s record further away from its neighbors.
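The pairwise screening described above can be illustrated in miniature. The following is only a toy sketch of the difference-series idea, not the downloadable PHA code, which uses standardized test statistics, many neighbours at once, and significance thresholds:

```python
# Toy illustration of the pairwise idea: form target-minus-neighbour
# difference series and flag the split point that maximises the
# between-segment mean shift. Not the actual PHA implementation.

def difference_series(target, neighbour):
    return [t - n for t, n in zip(target, neighbour)]

def best_breakpoint(diff):
    """Return (index, shift) where splitting the difference series
    gives the largest change in segment means."""
    best = (None, 0.0)
    for c in range(2, len(diff) - 2):
        left = sum(diff[:c]) / c
        right = sum(diff[c:]) / (len(diff) - c)
        if abs(right - left) > abs(best[1]):
            best = (c, right - left)
    return best

# A target with a +1.0 step at index 10, against a stable neighbour:
neighbour = [0.0] * 20
target = [0.0] * 10 + [1.0] * 10
idx, shift = best_breakpoint(difference_series(target, neighbour))
print(idx, round(shift, 2))  # -> 10 1.0
```

A step that appears in the target but in none of its neighbours shows up as a breakpoint in every difference series, which is the signature the PHA looks for.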
‘Homogenization of Temperature Series via Pairwise Comparisons’
MATTHEW J. MENNE AND CLAUDE N. WILLIAMS JR. [MW09]
NOAA/National Climatic Data Center, Asheville, North Carolina
(Manuscript received 2 October 2007, in final form 2 September 2008)
Page 4 pdf,
a. Selection of neighbors and formulation of difference series
Next, time series of differences Dt are formed between all target–neighbor monthly temperature series.
To illustrate this, take two monthly series Xt and Yt, that is, a target and one of its correlated neighbors.
Following Lund et al. (2007), these two series can be represented as
X_{mT+v} = μ_v^x + β^x (mT+v) + S_{mT+v}^x + ε_{mT+v}^x (1)
and
Y_{mT+v} = μ_v^y + β^y (mT+v) + S_{mT+v}^y + ε_{mT+v}^y (2)
where
μ_v represents the monthly mean of the specific series,
β represents the background trend,
T = 12 is the number of months in the annual cycle,
v = (1, . . . , 12) is the monthly index,
m is the year (or annual cycle) number,
the S terms represent shift factors caused by station changes, which are thought to be step functions,
and the ε terms denote mean-zero error terms at time t for the two series.
This seems to be the 95th Percentile Matching (PM-95) method that BOM uses for ACORN-SAT, except it’s not an X to Y neighbour comparison as in BOM’s method.
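For what it’s worth, the percentile-matching idea can be sketched as differencing the empirical quantiles of two overlapping segments. This is an illustrative quantile-difference sketch only, not BOM’s actual PM-95 implementation, and the percentile points chosen here are an assumption:

```python
# Illustrative quantile-matching sketch (not BOM's actual PM-95 code):
# estimate an adjustment at selected percentiles up to the 95th by
# differencing the empirical quantiles of two overlapping segments.

def quantile(xs, q):
    # Linear interpolation between order statistics.
    xs = sorted(xs)
    pos = q * (len(xs) - 1)
    lo, hi = int(pos), min(int(pos) + 1, len(xs) - 1)
    frac = pos - lo
    return xs[lo] * (1 - frac) + xs[hi] * frac

def percentile_adjustments(before, after, qs=(0.05, 0.25, 0.5, 0.75, 0.95)):
    return {q: quantile(after, q) - quantile(before, q) for q in qs}

before = [10, 12, 14, 16, 18, 20]
after = [x + 0.5 for x in before]   # a uniform +0.5 shift across the distribution
adj = percentile_adjustments(before, after)
print(adj[0.5])  # -> 0.5
```

With a uniform shift every percentile returns the same adjustment; the point of matching many percentiles is that real station changes often shift the tails differently from the middle of the distribution.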
2. An Fmax test statistic
We start with the simple two-phase linear regression scheme for a climatic series {Xt} considered by Solow (1987), Easterling and Peterson (1995), and Vincent (1998; among others). This model can be written in the form:
Xt = μ1 + α1·t + εt, for 1 ≤ t ≤ c
Xt = μ2 + α2·t + εt, for c < t ≤ n (2.1)
where {εt} is mean-zero independent random error with a constant variance.
The model in (2.1) is viewed as a classic simple linear regression that allows for two phases. This allows for both step- (μ1 ≠ μ2) and trend- (α1 ≠ α2) type changepoints. Specifically, the time c is called a changepoint in (2.1) if μ1 ≠ μ2 and/or α1 ≠ α2. In most cases, there will be a discontinuity in the mean series values at the changepoint time c, but this need not always be so (Fig. 10 in section 5 gives a quadratic-based example where the changepoint represents more of a slowing of the rate of increase than a discontinuity).
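The two-phase scheme amounts to comparing a single-line fit against the best two-line fit over all admissible changepoints c. Here is a sketch of just the search step; the published test converts the improvement in fit into an F statistic and compares it against Fmax critical values, which is omitted:

```python
# Sketch of the two-phase-regression changepoint search: fit one line
# to each candidate pair of segments and keep the split that most
# reduces the total residual sum of squares.

def sse_of_fit(ts, xs):
    """Residual sum of squares of a least-squares line through (ts, xs)."""
    n = len(ts)
    mt, mx = sum(ts) / n, sum(xs) / n
    var = sum((t - mt) ** 2 for t in ts)
    slope = sum((t - mt) * (x - mx) for t, x in zip(ts, xs)) / var
    icept = mx - slope * mt
    return sum((x - (icept + slope * t)) ** 2 for t, x in zip(ts, xs))

def best_changepoint(xs):
    ts = list(range(len(xs)))
    best_c, best_sse = None, float("inf")
    for c in range(3, len(xs) - 3):          # keep a few points in each phase
        sse = sse_of_fit(ts[:c], xs[:c]) + sse_of_fit(ts[c:], xs[c:])
        if sse < best_sse:
            best_c, best_sse = c, sse
    return best_c

# Step change of +2.0 at t = 12 in an otherwise flat series:
series = [5.0] * 12 + [7.0] * 12
print(best_changepoint(series))  # -> 12
```

The commenter’s question below about effective series length is visible here: when c sits near one end, one of the two fits is made over only a handful of points.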
# # #
Fmax test statistic series lengths (n) range from 10 (e.g. 10 months, RS93 k=0.4) to 5000 in Table 1, but I can’t see any recommendation for length n with respect to temperature series. Nothing is said about n in equation 2.2, for example. What happens when a break (c) occurs at month 5 of n = 100 months, say? Isn’t this just effectively n = 10?
‘Massive Tampering With Temperatures In South America’
By Paul Homewood, January 20, 2015
One of the regions that has contributed to GISS’ “hottest ever year” is South America, particularly Brazil, Paraguay and the northern part of Argentina. In reality, much of this is fabricated, as they have no stations anywhere near much of this area, as NOAA show below.
Nevertheless, there does appear to be a warm patch covering Paraguay and its close environs. However, when we look more closely, we find things are not quite as they seem.
[See data coverage graph]
There are just three genuinely rural stations in Paraguay that are currently operating – Puerto Casado, Mariscal and San Juan. They all show a clear and steady upward trend since the 1950’s, with 2014 at the top, for instance at Puerto Casado: [graph]
It could not be more clearcut, could it? However, it all looks a bit too convenient, so I thought I would check out the raw data (which is only available up to 2011 on the GISS site, so the last three years cannot be compared). Lo and behold! [graph]
As we so often see, the past has been cooled.
GHCN show the extent to which they have adjusted temperatures, the best part of 2 degrees centigrade. [graphs]
Of course, there may be a genuine problem with Puerto Casado’s record, except that we see exactly the same thing happening at the other two Paraguayan sites. [Raw – Adjusted gif comparisons]
So we find that a large chunk of Gavin’s hottest year is centred around a large chunk of South America, where there is little actual data, and where the data that does exist has been adjusted out of all relation to reality.
Even by GHCN standards, this tampering takes some beating.
From the above post, Paul Homewood’s recent temperature record posts can be accessed:
Recent Posts
How GHCN Keep Rewriting Reykjavik History
Cooling The Past In Bolivia
Cooling The Past In San Diego
Greene Hypocrisy
Higher Snowfalls Due to Change In Measurement, Not Global Warming.
Temperature Adjustments Around The World
Shub Niggurath On The Paraguayan Adjustments
It is often claimed that these adjustments are needed to “correct” errors in the historic record, or compensate for station moves. All of which makes the adjustment at Valentia Observatory even more nonsensical [old vs new].
Valentia Observatory, situated in SW Ireland, is regarded as one of the highest quality meteorological stations in the world, located at the same site since 1892, well away from any urban or other non climatic biases. The Irish Met Office say this:-
“Since the setting up of the Irish Meteorological Service, the work programme of the Observatory has greatly expanded and it has always been equipped with the most technologically advanced equipment and instrumentation. The Observatory is well known and very highly regarded by the scientific community. As well as fulfilling its national and international role within Met Éireann it is involved in many projects with other scientific bodies both in Ireland and abroad.”
If we cannot get accurate temperature trends in Valentia, we cannot get them anywhere. Yet the GHCN algorithm decides that the actual temperatures measured there do not look right, and lops 0.4C off temperatures before 1967.
Worse still, the algorithm uses a bunch of hopelessly unreliable urban sites, as far away as Paris, to decide upon the “correct temperature”, as Ronan Connolly illustrated.
[…] We saw previously how the temperature history for Paraguay, and a large slice of the surrounding region, had been altered as a result of temperature adjustments, which had significantly reduced historic temperatures and changed a cooling trend into a warming one.
I can now confirm that similar “cooling the past” adjustments have been carried out in the Arctic region, and that the scale and geographic range of these is breathtaking. Nearly every current station from Greenland, in the west, to the heart of Siberia (87E), in the east, has been altered in this way. The effect has been to remove a large part of the 1940’s spike, and as a consequence removed much of the drop in temperatures during the subsequent cold decades.
‘Another Smoking Gun That NCDC Temperature Adjustments Are Incorrect’
Posted on February 5, 2015 by Steven Goddard
Yesterday I showed how 100% of US warming since 1990 is due to NCDC filling in fake temperatures for missing data. The actual measured data shows no warming.
See gif: Measured vs No Underlying Data
See graph: Percent Of USHCN Final Temperatures Which Are “Estimated” [over 50%]
The next step is to look at the correlation between how much infilling is being done and the divergence between estimated and measured temperatures.
See graph
There is a very good correlation between infilling and inflated temperatures. The more fake data they generate, the larger the divergence between estimated and measured temperatures. This is likely due to loss of data at rural stations, which are now being doubly contaminated by gridded and infilled urban temperatures.
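The comparison being described is a straightforward correlation between the infilling fraction and the estimated-minus-measured divergence. A sketch with illustrative numbers (these are invented for the example, not the actual USHCN values):

```python
# Pearson correlation between the fraction of estimated ("infilled")
# values and the final-minus-raw temperature divergence.
# Both series below are illustrative, not USHCN data.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

infill_fraction = [0.10, 0.20, 0.30, 0.40, 0.50]   # illustrative
divergence_c   = [0.02, 0.05, 0.09, 0.12, 0.16]    # illustrative, deg C
print(round(pearson(infill_fraction, divergence_c), 3))  # -> 0.999
```

A high correlation alone does not establish which way the causation runs, which is why the post goes on to argue from the station data itself.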
In case you thought widespread temperature adjustments were confined to the Arctic and South America, consider again. Apparently, New Zealand has caught Paraguayan fever!
There are five stations currently operational under GHCN in New Zealand and surrounding islands. It will come as no great surprise now to learn that GHCN warming adjustments have been added to every single one. (Full set of graphs below).
In all cases, other than Hokitika, the adjustment has been made in the mid 1970’s.
This adjustment has been triggered by a drop in temperatures in 1976, as we can see with Gisborne, below. (The algorithm did not spot that temperatures recovered to previous levels two years later!)
See graph: Raw Data, Gisborne Aero
Was this temperature drop due to some local, non-climatic factor at Gisborne? Apparently not, because the same drop occurred at all eleven of the other NZ stations operating at that time.
Below is a comparison of the unadjusted annual temperatures for 1975 and 1976.
As a result of the adjustment, Gisborne’s temperatures for 1974 and earlier have been lowered by 0.7C. Similar sized adjustments seem to have been made at the other stations.
As the algorithm cannot have arrived at the adjustment by comparing NZ stations with each other, it must have used stations further away, presumably in Australia.
But can we really compare the two? Once again, the evidence points strongly to the adjustments being incorrect, and reacting to a genuine drop in temperature.
It is often claimed that, overall, temperature adjustments up and down largely cancel each other out. But, while we keep coming across warming adjustments that are questionable, I don’t see cooling ones similarly criticised. Maybe most of these are justifiable.
If this is the case, and many of the warming ones are not, then the overall effect would be much greater than suggested.
On the other hand, if many cooling adjustments are also incorrect, it does not inspire much confidence in the process.
APPENDIX
NCDC, NOAA, GHCN v3 station plots of above list.
>”As a result of the adjustment, Gisborne’s temperatures for 1974 and earlier have been lowered by 0.7C. Similar sized adjustments seem to have been made at the other stations.”
This is not an adjustment that NIWA make in the NZ 7SS compilation used by HadCRUT4. Which in turn has triple the trend of the alternative 7SS using Rhoades & Salinger (1993) methodology:
There is no reason for a break between 1974 and 1979. But 1975/76 is only a 0.1 adjustment. There is a progressive cumulative adjustment in 0.1 increments adding to 0.7.
See metANN column at far right of the data sheets.
At 1963 the cumulative adjustment is 0.7
At 1968 the cumulative adjustment is 0.6
At 1972 the cumulative adjustment is 0.5
At 1975 the cumulative adjustment is 0.4
At 1980 the cumulative adjustment is 0.3
At 1982 the cumulative adjustment is 0.2
At 1986 the cumulative adjustment is 0.1
At 2001 the cumulative adjustment is 0.1
At 2002 the cumulative adjustment is 0.0
For some reason the GISS Gisborne Aero data sheet URLs return “Not found”, sorry. Deleting “http://” works for the adjusted data, but for the raw data start at this page:
Select “after removing suspicious records” and then “Download monthly data as text”
I would point out that over the period of the GISS Gisborne Aero dataset adjustments, of which there appear to be about 7 in total over 1962 – 2002, BEST make no adjustments whatsoever.
The GISS Gisborne Aero 1973 cumulative adjustment is 0.5
1973 monthly raw (top) vs adjusted (bottom)
19.4 18.5 16.2 14.6 12.7 10.0 8.6 10.5 12.3 14.2 17.2 17.2
18.9 18.0 15.7 14.1 12.2 9.5 8.1 10.0 11.8 13.7 16.7 16.8
0.5 difference for every month
The 1974 – 1977 range of common cumulative adjustment is 0.4
1974 monthly raw (top) vs adjusted (bottom)
17.7 20.6 15.1 14.8 11.2 10.1 10.1 8.9 12.1 13.6 15.5 17.8
17.3 20.2 14.7 14.4 10.8 9.7 9.7 8.5 11.7 13.2 15.1 17.4
0.4 difference for every month
1977 monthly raw (top) vs adjusted (bottom)
18.4 18.9 17.8 14.5 10.9 10.1 9.4 10.4 10.2 13.4 14.9 17.5
18.0 18.5 17.4 14.1 10.5 9.7 9.0 10.0 9.8 13.0 14.5 17.2
0.4 difference for every month
The 1978 cumulative adjustment is 0.3
1978 monthly raw (top) vs adjusted (bottom)
19.2 19.5 17.6 16.4 12.0 10.0 9.7 10.3 11.3 12.0 16.0 18.0
18.9 19.2 17.3 16.1 11.7 9.7 9.4 10.0 11.0 11.7 15.7 17.7
0.3 difference for every month
Apparently, according to GISS (but not BEST), there were 2 distinct 0.1 steps from 1978 to 1977 and from 1974 to 1973. Similarly for the other ranges of common cumulative adjustments.
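The constant-offset claim is easy to check mechanically against the quoted rows; for example, the 1974 raw and adjusted values:

```python
# Check of the claim above, using the 1974 rows quoted: each adjusted
# month should equal the raw value minus one constant offset.

raw_1974 = [17.7, 20.6, 15.1, 14.8, 11.2, 10.1, 10.1, 8.9, 12.1, 13.6, 15.5, 17.8]
adj_1974 = [17.3, 20.2, 14.7, 14.4, 10.8, 9.7, 9.7, 8.5, 11.7, 13.2, 15.1, 17.4]

# Collect the distinct monthly offsets; one element means one constant.
offsets = {round(r - a, 1) for r, a in zip(raw_1974, adj_1974)}
print(offsets)  # -> {0.4}
```

A single-element set confirms the uniform 0.4 cumulative adjustment for that year; running the same check on each quoted year reproduces the step structure listed above.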
Covers GISS and BEST adjustments at Puerto Casado, Paraguay and Gisborne Aero, New Zealand along with links to the Climate Etc discussion and a bunch of other stuff.
Using the excellent web platform provided by NASA GISS it is possible to access GHCN V2 and GHCN v3 records, compare charts and download the data. It does not take long to find V3 records that appear totally different to V2 and I wanted to investigate this further. At this point I was advised that the way homogenisation works is to adjust records in such a way that a warming trend added in one station is compensated by cooling added to another. This didn’t sound remotely scientific to me but I clicked on Alice Springs in the middle of Australia and recovered 30 V2 and V3 records in a 1000 km radius and set about a systematic comparison of the two. The results are described in detail below.
In summary I found that while individual stations are subject to large and what often appears to be arbitrary and robotic adjustments in V3, the average outcome across all 30 stations is effectively zero. At the regional level, homogenisation does not appear to be responsible for adding warming in Australia. But the thing that truly astonished me was the fact that the mean temperature trend for these 30 stations, 1880 to 2011, was a completely flat line. There has been no recorded warming across a very large portion of the Australian continent.
# In Alice Springs the raw record is flat and has no sign of warming. In the adjusted record, homogenisation has added warming by significantly cooling the past. Five other stations inside the 1000 km ring have similarly long and similarly flat records – Boulia, Cloncurry, Farina, Burketown and Donors Hill. There can be no conceivable reason to presume that the flat raw Alice Springs record is somehow false and in need of adjustment.
# Six records show a significant mid-1970s cooling of about 3˚C (Alice Springs, Barrow Creek, Brunette Downs, Camooweal, Boulia and Windorah) that owing to its consistency appears to be a real signal. Homogenisation has tended to remove this real temperature history.
# The average raw temperature record for all 30 stations is completely flat from 1906 (no area weighting applied). There has been no measurable warming across the greater part of Australia. The main discontinuity in the record, pre-1906, arises from there being only 3 operating stations that do not provide representative cover.
# Homogenisation appears to have added warming or cooling to records where neither existed. Homogenisation may also have removed real climate signal.
# I find zero warming over such a large part of the Australian continent to be a surprise result that is consistent with Roger Andrews’ observation of little to no warming in the southern hemisphere, an observation that still requires more rigorous testing.
euanmearns | March 17, 2015 at 12:33
By way of a little further background. This post is a bit rough around the edges, in part because it is a huge amount of work to clean the V3 data, where large numbers of records are deleted and many are “created”. I was also feeling my way trying to make sense of how to treat the results. I have since moved on to look at Southern Africa and I hope these results will also be posted here. Excluding urban records that show warming trends, southern Africa looks like central Australia.
One thing I want to try and nail is how the likes of BEST manage to create warming from temperature records that are flat. I ventured on to Real Climate a few weeks ago and was told repeatedly that what GHCN and GISS were doing must be correct since BEST shows the same trends.
I have completed analysis of southern South America and Antarctica that has yet to be published. All this pretty well confirms Roger Andrews’ observation that there is little warming in the southern hemisphere, which I find a real puzzle.
‘New paper finds a large warming bias in Northern Hemisphere temperatures from ‘non-valid’ station data’
The Hockey Shtick, May 28, 2015
A new paper published in the Journal of Atmospheric and Solar-Terrestrial Physics finds that the quality of Northern Hemisphere temperature data has significantly & monotonically decreased since the year 1969, and that the continued use of ‘non-valid’ weather stations in calculating Northern Hemisphere average temperatures has created a ‘positive bias’ and “overestimation of temperatures after including non-valid stations.”
The paper appears to affirm a number of criticisms of skeptics that station losses, fabricated/infilled data, and positively-biased ‘adjustments’ to temperature data have created a positive skew to the data and overestimation of warming during the 20th and 21st centuries.
Graphs from the paper below show that use of both valid and ‘non-valid’ station data results in a mean annual Northern Hemisphere temperature over 1C warmer at the end of the record in 2013 as compared to use of ‘valid’ weather station data exclusively.
[snip]
Extraction of the global absolute temperature for Northern Hemisphere using a set of 6190 meteorological stations from 1800 to 2013
Demetris T. Christopoulos
Highlights
• Introduce the concept of a valid station and use for computations.
• Define indices for data quality and seasonal bias and use for data evaluation.
• Compute averages for mean and five point summary plus standard deviations.
• Indicate a monotonically decreasing data quality after the year 1969.
• Observe an overestimation of temperature after including non-valid stations.
In most parts of Iceland, last year was the coldest since 2000, in marked contrast to 2014.
The Iceland Temperature Series below is built up from the seven following sites, which have long running, high quality temperature records back to 1931 and earlier. They also present a reasonable geographic distribution. It is also important to note that the temperature data has been carefully homogenised over the years by the Iceland Met Office, to adjust for station moves, equipment changes etc. (For more detail, see here).
The warm years of the 1930’s and 40’s, and much colder ones that followed clearly correlate with the AMO cycle. Although the warmth has been more persistent in the last decade, only one year, 2014, has been warmer than those earlier years.
There is no evidence to suggest that temperature trends will increase in the next few years, and much to suggest that Iceland will suffer from a return of a very cold climate once the AMO turns cold again.
We can see the same patterns in the individual stations below:
[…see graphs…]
Needless to say, the adjusted GISS versions below bear no comparison.
Below is plotted all of the currently active individual USHCN stations in Michigan. There is only one station which shows net warming since the 1950s (Mt. Pleasant) and that one appears to have a discontinuity at 1995 causing the problem.
The NOAA hockey stick in Michigan is completely fake. It has no basis in reality.
The rate of change of near-surface land air temperature as estimated in the Berkeley “BEST” dataset closely matches the sea surface temperature record in form, except that it shows twice the rate of change.
Sea water has a specific heat capacity about 4 times that of rock. This means that rock will change in temperature four times as much as water for the same change in thermal energy, for example from incoming solar radiation.
Since soil, in general, is a mix of fine particles of rock and organic material with a significant water content, the two temperature records are consistent with the notion of considering land as ‘moist rock’. This also partly explains the much larger temperature swings in desert regions: the temperature of dry sand will change four times faster than ocean water and be twice as volatile as non-desert land regions.
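The factor of four follows directly from ΔT = Q/(m·c). A worked sketch using round illustrative specific heats (real values vary with rock type and salinity; these numbers are assumptions for the arithmetic only):

```python
# Temperature rise for the same thermal energy input into equal masses
# of sea water and rock. Specific heats are round illustrative values,
# in J/(kg.K): sea water ~4000, rock ~1000 (about a quarter of water's).

c_water = 4000.0   # J/(kg.K), assumed round value
c_rock = 1000.0    # J/(kg.K), assumed round value

energy = 4.0e6     # J, same thermal input to each material
mass = 1000.0      # kg of each material

dT_water = energy / (mass * c_water)   # deg C rise of the water
dT_rock = energy / (mass * c_rock)     # deg C rise of the rock
print(dT_water, dT_rock, dT_rock / dT_water)  # -> 1.0 4.0 4.0
```

The same energy flux therefore moves a land-surface thermometer roughly four times as far as a sea-surface one, which is the basis of the ‘moist rock’ argument above.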
This also underlines why it is inappropriate to average land and sea temperatures, as is done in several recognised global temperature records such as HadCRUT4 (a bastard mix of HadSST3 and CRUTem4) as well as GISS-LOTI and the new BEST land and sea averages.
It is a classic case of ‘apples and oranges’. If you take the average of an apple and an orange, the answer is a fruit salad. It is not a useful quantity for physics-based calculations such as the earth’s energy budget and the impact of radiative “forcings”.
The difference in heat capacity will skew the data in favour of the land air temperatures, which vary more rapidly, and will thus give an erroneous basis for making energy-based calculations.
In his comment to How much Estimation is too much Estimation?, Anthony Watts suggested I create a scatter plot showing station distribution with latitude/longitude. It turned out not to be the ordeal I thought it might be, so I have posted some of the results in this thread. I started with 1885 and created a plot every 20 years, ending in 2005. I deliberately ended with 2005 because this is the final year in the GHCN record prior to the US station die-off of 2006.
Every dot on a plot represents a station, not a scribal record. Stations may comprise multiple records. A blue dot represents a station with an annual average that was fully calculated from existing monthly averages. A red dot represents a station that had missing monthly averages for that year, so the annual average had to be estimated. Stations that had insufficient data to estimate an annual average are not shown.
In the case where multiple scribal records exist for a station in the given year, I assigned a blue dot if all records were fully calculated from existing averages, a red dot if at least one record was estimated, and no dot if none of the records could produce an estimate. I believe this errs in the direction of assigning more blue dots than is deserved. Hansen’s bias method mathematically forces estimation to occur during the period of scribal record overlap.
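The dot-colouring rule can be written out explicitly. Note that the minimum number of months needed before an annual average is estimated is an assumption here, chosen for illustration; the post does not state the threshold:

```python
# Sketch of the dot-colouring rule described above: a station-year is
# "blue" when all 12 monthly means exist, "red" when some are missing
# but enough remain to estimate an annual mean, and unplotted otherwise.
# The 9-month threshold is an assumption, not taken from the post.

def classify(monthly):
    """monthly: 12 values, None marking a missing monthly mean."""
    present = [m for m in monthly if m is not None]
    if len(present) == 12:
        return "blue"              # annual mean fully calculated
    if len(present) >= 9:          # assumed minimum for an estimate
        return "red"               # annual mean estimated
    return None                    # too sparse to plot

complete = [10.0] * 12
gappy = [10.0] * 10 + [None, None]
sparse = [10.0] * 5 + [None] * 7
print(classify(complete), classify(gappy), classify(sparse))  # -> blue red None
```

For multi-record stations the text above applies this per record and takes the most optimistic colour, which is why the author notes the plots may show more blue than is deserved.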
The first plot shows coverage in 1885, five years into the GHCN record.
[see chart]
1905 shows improved coverage across the continental US, Japan and parts of Australia. A few stations have appeared in Africa.
[see chart]
1925 shows increased density in the western US, southern Canada, and the coast of Australia.
[see chart]
At the end of WWII, not a lot of change is noticeable other than improved coverage in Africa and South America as well as central China and Siberia.
[see chart]
In 1965 we see considerable increases in China, parts of Europe, Turkey, Africa and South America.
[see chart]
A decline in quality seems to be apparent in 1985, as many more stations show as red, indicating their averages are estimated due to missing monthly data.
[see chart]
A huge drop in stations is visible in the 2005 plot, notably Australia, China, and Canada. 2005 was the warmest year in over a century. Not surprising, as the Earth hadn’t seen station coverage like that in over a century.
[see chart]
The final plot illustrates the world-wide station coverage used to tell us “2006 Was Earth’s Fifth Warmest Year“.
It’s like watching the lights go out over the West. Sinan Unur has mapped the surface stations into a beautiful animation. His is 4 minutes long and spans from 1701-2010. I’ve taken some of his snapshots and strung them into a 10 second animation.
You can see as development spreads across the world that more and more places are reporting temperatures. It’s obvious how well documented temperatures were (once) in the US. The decay of the system in the last 20 years is stark.
Mears and Wentz’s paper was rejected by the first journal where it was submitted.
Spencer:
“The paper is for MT, not LT…but I think we can assume that changes in one will be reflected in the other when Mears completes their analysis.
From what little we have looked at so far, it appears that they did not correct for spurious warming in NOAA-14 MSU relative to NOAA-15 AMSU…see their Fig. 7c. They just leave it in.
Since this spurious warming is near the middle of the whole time period, this shifts the second half of the satellite record warmer when NOAA-14 MSU (the last in the MSU series) is handed off to NOAA-15 AMSU (the first in the AMSU series).
Why do we think NOAA-14 MSU is at fault?
1) AMSU is supposed to have a “Cadillac” calibration design (that’s the term a NASA engineer, Jim Shiue, used when describing to me the AMSU design, which he was involved in).
2) NOAA-14 MSU requires a large correction for the calibrated TB increasing with instrument temperature as the satellite drifts into a different orbit. The NOAA-15 AMSU requires no such correction…and it wasn’t drifting during the period in question anyway.
So, it looks like they decided to force good data to match bad data. Sound familiar?
‘Comments on New RSS v4 Pause-Busting Global Temperature Dataset’
March 4th, 2016 by Roy W. Spencer, Ph. D.
Now that John Christy and I have had a little more time to digest the new paper by Carl Mears and Frank Wentz (“Sensitivity of satellite derived tropospheric temperature trends to the diurnal cycle adjustment”, paywalled here), our conclusion has remained mostly the same as originally stated in Anthony Watts’ post.
While the title of their article implies that their new diurnal drift adjustment to the satellite data has caused the large increase in the global warming trend, it is actually their inclusion of what the evidence will suggest is a spurious warming (calibration drift) in the NOAA-14 MSU instrument that leads to most (maybe 2/3) of the change. I will provide more details of why we believe that satellite is to blame, below.
Also, we provide new radiosonde validation results, supporting the UAH v6 data over the new RSS v4 data.
[…]
Conclusion
The evidence suggests that the new RSS v4 MT dataset has spurious warming due to a lack of correction for calibration drift in the NOAA-14 MSU instrument. Somewhat smaller increases in their warming trend are due to their use of a climate model for diurnal drift adjustment, compared to our use of an empirical approach that relies upon observed diurnal drift from the satellite data themselves. While the difference in diurnal drift correction methodology is a more legitimate point of contention, in the final analysis independent validation with radiosonde data and most reanalysis datasets suggests better agreement with the UAH product than the RSS product.
Measured minimum temperatures in Boulder, CO show a strong cooling trend over the last 60 years, but NOAA massively tampers with the data to turn it into a warming trend.
‘Can Both GISS and HadCRUT4 be Correct? (Now Includes April and May Data)’
justthefactswuwt / June 16, 2016
Guest Post by Werner Brozek, Excerpted from Professor Robert Brown from Duke University, Conclusion by Walter Dnes and Edited by Just The Facts:
[see graph: HadCRUT4 vs GISTEMP 1960 – 1979]
[…]
The two anomalies match up almost perfectly from the right hand edge to the present. They do not match up well from 1920 to 1960, except for a brief stretch of four years or so in early World War II, but for most of this interval they maintain a fairly constant, and identical, slope to their (offset) linear trend! They match up better (too well!) – with again a very similar linear trend but yet another offset – across the range from 1880 to 1920. But across the range from 1960 to 1979, Ouch! That’s gotta hurt. Across 20 years, HadCRUT4 cools Earth by around 0.08 C, while GISS warms it by around 0.07 C.
[…]
Finally, there is the ongoing problem with using anomalies in the first place rather than computing global average temperatures. Somewhere in there, one has to perform a subtraction. The number you subtract is in some sense arbitrary, but any particular number you subtract comes with an error estimate of its own. And here is the rub:
The place where the two global anomalies develop their irreducible split is square inside the mutually overlapping part of their reference periods!
That is, the one place they most need to be in agreement, at least in the sense that they reproduce the same linear trends, that is, the same anomalies is the very place where they most greatly differ. Indeed, their agreement is suspiciously good — as far as linear trend is concerned – everywhere else, in particular in the most recent present where one has to presume that the anomaly is most accurately being computed and the most remote past where one expects to get very different linear trends but instead get almost identical ones!
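The point about baselines can be made concrete with a toy sketch (illustrative only, not any agency's actual code): converting absolute temperatures to anomalies only subtracts a constant, so the choice of baseline period changes the offset between two series but can never change either one's trend. That is exactly why a trend disagreement inside the overlapping reference period is the one thing baseline choice cannot explain away.

```python
# Toy sketch: anomaly computation subtracts the baseline-period mean, so two
# baseline choices produce series that differ by a constant offset only.
from statistics import mean

def anomalies(series, years, base_start, base_end):
    """Subtract the mean over [base_start, base_end] from every value."""
    base = mean(t for y, t in zip(years, series) if base_start <= y <= base_end)
    return [t - base for t in series]

years = list(range(1880, 2016))
absolute = [14.0 + 0.005 * (y - 1880) for y in years]   # toy series, 0.5 C/century

a_5180 = anomalies(absolute, years, 1951, 1980)  # GISS-style baseline
a_6190 = anomalies(absolute, years, 1961, 1990)  # HadCRUT-style baseline

offset = a_5180[0] - a_6190[0]
# The two anomaly series differ by the same constant everywhere; the trend
# is untouched by the baseline choice.
assert all(abs((x - y) - offset) < 1e-9 for x, y in zip(a_5180, a_6190))
```

With this toy series the offset works out to 0.05 C, reflecting nothing but the difference between the two baseline-period means.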
Schmidt predicts a 1.3 C anomaly for 2016 (plus or minus) but his graph uses a “Pre-industrial baseline (1880-1889)”. This is NOT the GISTEMP LOTI baseline.
Schmidt’s graph has 2015 at 1.1 C. GISS LOTI data sheet has 0.87 C for 2015. Schmidt’s 2016 prediction is 0.2 C higher than 2015. So from the LOTI data sheets 0.87 + 0.2 = 1.07 C for 2016. This is only a fraction below the YTD mean (1.095) with 6 months of data still to come in and La Nina cooling.
Year  Jan  Feb  Mar  Apr  May  Jun
2016  114  133  129  109   93   79
(monthly LOTI anomalies in hundredths of a degree C)
0.54 C drop in 4 months, Feb to Jun. Only another 0.23 C drop in 6 months and the 2016 mean would equal the 2015 mean (0.87). ENSO-neutral is about 0.65, and conditions will cross from ENSO-positive to ENSO-negative this year, i.e. pass through neutral at some point before the end of the year.
In the top graph (scary) he uses a 1980–2015 baseline, which I assume (he doesn’t say) is computed with respect to each specific month rather than all months (i.e. a different baseline for each month). July anomaly is over 2 C (scary).
But in the next LOTI map the anomaly, from normal 1951-1980 all months mean baseline, is only 0.83 C (yawn).
July isn’t even the hottest LOTI month in 2016, February was.
With LOTI July now in at 0.83 C, the YTD LOTI mean only has to fall another 0.189 C over the next 5 months and Schmidt’s “> 99% chance” of a new record is toast.
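For what it's worth, the arithmetic checks out. A quick sketch using the monthly values quoted above (converted from hundredths of a degree):

```python
# Verify the year-to-date arithmetic from the 2016 LOTI monthly anomalies
# quoted above (values in degrees C).
loti_2016 = {"Jan": 1.14, "Feb": 1.33, "Mar": 1.29, "Apr": 1.09,
             "May": 0.93, "Jun": 0.79, "Jul": 0.83}
record_2015 = 0.87  # 2015 annual mean per the LOTI data sheet

ytd_mean = sum(loti_2016.values()) / len(loti_2016)  # Jan-Jul running mean
# What the remaining 5 months must average for 2016 merely to TIE 2015:
needed = (12 * record_2015 - sum(loti_2016.values())) / 5

print(round(ytd_mean, 3))   # 1.057 -- falling to ~0.87 is the 0.189 C drop above
print(round(needed, 3))     # 0.608 -- what Aug-Dec must average just to tie 2015
```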
But he’ll just point to one of his other graphs (the one that suits his story best, like in the Wash Post above) with different baseline criteria that provides him with the greatest anomaly departure.
Remains to be seen but enough cooler months will foil that plan too. In terms of LOTI, I think Schmidt must be sweating.
Neither Cox nor Roberts addressed the IPCC’s primary climate change criteria. (I did at #17 – 10 “likes”; 19 “likes” at #12.1.2 re the CO2 disconnect). I doubt either of them has even the vaguest idea.
But in terms of temperature series, a useful resource to call up in the future. Way down at #30.1.1 I placed a comment re GISS “adjustments” to Gisborne Aero NZ. That comment, and others nearby, cross-links to Judith Curry’s Climate Etc, Paul Homewood’s notalotofpeopleknowthat, Euan Mearns’ Energy Matters, Climate4you, and back to ‘Temperature Records’ here at CCG from Paul Homewood’s post. Further down are other GISS and BEST “adjustment” case studies, e.g. Puerto Casado. Zeke Hausfather did a runner on that one.
GISTEMP is rubbish and provably so but at least it’s a devil we know. And the data after the El Nino peak isn’t on Schmidt’s “long-term trend” anymore despite his frantic Tweeting. I think Schmidt’s on a hiding to nothing over the next 2, 3, 4 years.
He doesn’t appear to realize that NASA provides space platforms for temperature monitoring that contradict GISTEMP (or that NASA stands for National Aeronautics and Space Administration).
‘Rock star-scientist Brian Cox confused on more than global temperatures’
By Jennifer Marohasy – posted Thursday, 18 August 2016
[…. 3 pages….click “ALL” at bottom……]
Cox may not care too much for facts. He is not only a celebrity scientist, but also a rock star. Just the other day I was watching a YouTube video of him playing keyboard as the lead-singer of the band screamed, “We don’t need a reason”.
There was once a clear distinction between science – that was about reason and evidence – and art that could venture into the make-believe including through the re-interpretation of facts. This line is increasingly blurred in climate science where data is now routinely remodeled to make it more consistent with global warming theory.
For example, I’m currently working on a 61-page exposé of the situation at Rutherglen. Since November 1912, air temperatures have been measured at an agricultural research station near Rutherglen in northern Victoria, Australia. The data is of high quality, so there is no scientific reason to apply adjustments in order to calculate temperature trends and extremes. Mean annual temperatures oscillate between 13.4°C and 15.8°C. The hottest years are 1914 and 2007; there is no overall warming trend. The hottest summer was in 1938–1939, when Victoria experienced the Black Friday bushfire disaster. This 1938–39 summer was 3°C hotter than the average maximum summer temperature at Rutherglen for the entire period: December 1912 to February 2016. Minimum annual temperatures also show significant inter-annual variability.
In short, this temperature data, like most of the temperature series from the 112 sites used by the Australian Bureau of Meteorology to concoct the historical temperature record, does not accord with global warming theory.
So, adjustments are made by the Australian Bureau of Meteorology to these temperature series before they are incorporated into the Australian Climate Observations Reference Network – Surface Air Temperature (ACORN-SAT); and also the UK Met Office’s HadCRUT dataset, which informs IPCC deliberations.
The temperature spike in 1938–1939 is erroneously identified as a statistical error, and all temperatures before 1938 are adjusted down by 0.62°C. The most significant changes are to the temperature minima, with all temperatures before 1974 adjusted down by 0.61°C and all before 1966 by 0.72°C. For the year 1913, there is a 1.3°C difference between the annual raw minimum value as measured at Rutherglen and the remodelled value.
The net effect of the remodelling is to create statistically significant warming of 0.7 °C in the ACORN-SAT mean temperature series for Rutherglen, in general agreement with anthropogenic global warming theory.
NASA applies a very similar technique to the thousands of stations used to reproduce the chart that Cox held up on Monday night during the Q&A program. I discussed these changes back in 2014 with Gavin Schmidt, who oversees the production of these charts at NASA. I was specifically complaining about how they remodel the data for Amberley, a military base near where I live in Queensland.
Back in 2014, the unadjusted mean annual maximum temperatures for Amberley – since recordings were first made in 1941 – showed temperatures trending up from a low of about 25.5°C in 1950 to a peak of almost 28.5°C in 2002. The minimum temperature series for Amberley showed cooling from about 1970. Of course this does not accord with anthropogenic global warming theory. To quote Karl Braganza from the Bureau, as published by the online magazine The Conversation: “Patterns of temperature change that are uniquely associated with the enhanced greenhouse effect, and which have been observed in the real world include… Greater warming in winter compared with summer… Greater warming of night time temperatures than daytime temperatures”.
The Bureau has “corrected” this inconvenient truth at Amberley by jumping-up the minimum temperatures twice through the homogenization process: once around 1980 and then around 1996 to achieve a combined temperature increase of over 1.5°C.
This is obviously a very large step-change, remembering that the entire temperature increase associated with global warming over the 20th century is generally considered to be in the order of 0.9°C.
According to various peer-reviewed papers, and technical reports, homogenization as practiced in climate science is a technique that enables non-climatic factors to be eliminated from temperature series – by making various adjustments.
It is often done when there is a site change (for example from a post office to an airport), or an equipment change (from a Glaisher stand to a Stevenson screen). But at Amberley neither of these criteria can be applied. The temperatures have been recorded at the same well-maintained site within the perimeter of the air force base since 1941. Through the homogenization process the Bureau have changed what was a cooling trend in the minimum temperature of 1.0°C per century into a warming trend of 2.5°C per century.
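To see how step adjustments of the size described can flip a trend, here is a toy illustration. This is not the Bureau's actual ACORN-SAT algorithm; the series and breakpoint years are invented to mimic the Amberley description, and the point is only the mechanism: lowering everything before a breakpoint (equivalently, raising everything after it) adds a warming component to the fitted trend.

```python
# Toy illustration (not the Bureau's algorithm): step adjustments applied at
# breakpoints change the ordinary least-squares trend of an unchanged series.
def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
           sum((a - mx) ** 2 for a in x)

years = list(range(1941, 2013))
raw = [20.0 - 0.010 * (y - 1941) for y in years]   # toy cooling: -1.0 C/century

# Two hypothetical step-ups applied to later data (0.7 + 0.8 = 1.5 C total,
# echoing the "two jumps ... over 1.5 C" description above):
adj = [t + (0.7 if y >= 1980 else 0.0) + (0.8 if y >= 1996 else 0.0)
       for y, t in zip(years, raw)]

print(round(ols_slope(years, raw) * 100, 2))   # -1.0  (C/century, cooling)
print(round(ols_slope(years, adj) * 100, 2))   # 1.65  (C/century, now warming)
```

The underlying year-to-year data is identical; only the two step offsets differ, yet the fitted trend changes sign.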
Homogenization – the temperature adjusting done by the Bureau – has not resulted in some small change to the temperatures as measured at Amberley, but rather a change in the temperature trend from one of cooling to dramatic warming, as was done to the series for Rutherglen.
NASA’s Goddard Institute for Space Studies (GISS) based in New York also applies a jump-up to the Amberley series in 1980, and makes other changes, so that the annual average temperature for Amberley increases from 1941 to 2012 by about 2°C.
The new Director of GISS, Gavin Schmidt, explained to me on Twitter back in 2014 that: “@jennmarohasy There is an inhomogenity detected (~1980) and based on continuity w/nearby stations it is corrected. #notrocketscience”.
When I sought clarification regarding what was meant by “nearby” stations I was provided with a link to a list of 310 localities used by climate scientists at Berkeley when homogenizing the Amberley data.
The inclusion of Berkeley scientists was perhaps to make the point that all the key institutions working on temperature series (the Australian Bureau, NASA, and also scientists at Berkeley) appreciated the need to adjust-up the temperatures at Amberley. So, rock star scientists can claim an absolute consensus?
But these 310 “nearby” stations stretch out to a radius of 974 kilometres and include Frederick Reef in the Coral Sea, Quilpie post office and even Bourke post office. Considering the unadjusted data for the six nearest stations with long and continuous records (old Brisbane aero, Cape Moreton Lighthouse, Gayndah post office, Bundaberg post office, Miles post office and Yamba pilot station), the Bureau’s jump-up for Amberley creates an increase in the official temperature trend of 0.75°C per century.
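Readers wanting to check the "nearby" claim themselves can do so with a standard great-circle distance calculation. The coordinates below are approximate and purely illustrative, not an official station list:

```python
# Sketch of a "stations within radius R" lookup using the haversine formula.
# Coordinates are rough, illustrative values only.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi, dlmb = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))   # mean Earth radius ~6371 km

amberley = (-27.63, 152.71)           # approximate
stations = {                          # rough illustrative coordinates
    "Brisbane aero":  (-27.42, 153.09),
    "Cape Moreton":   (-27.03, 153.47),
    "Bourke PO":      (-30.09, 145.94),
    "Frederick Reef": (-20.94, 154.38),
}
for name, (lat, lon) in stations.items():
    print(f"{name}: {haversine_km(*amberley, lat, lon):.0f} km")
# A 974 km radius sweeps in all of these, from a suburban airfield a few tens
# of km away to a reef in the Coral Sea hundreds of km offshore.
```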
Temperatures at old Brisbane aero, the closest of these stations, also show a long-term cooling trend. Indeed, perhaps the cooling at Amberley is real. Why not consider this, particularly in the absence of real physical evidence to the contrary? In the Twitter conversation with Schmidt I suggested it was nonsense to use temperature data from radically different climatic zones to homogenize Amberley, and repeated my original question asking why it was necessary to change the original temperature record in the first place. Schmidt replied, “@jennmarohasy Your question is ill-posed. No-one changed the trend directly. Instead procedures correct for a detected jump around ~1980.”
If Twitter was around at the time George Orwell was writing the dystopian fiction Nineteen Eighty-Four, I wonder whether he might have borrowed some text from Schmidt’s tweets, particularly when words like, “procedures correct” refer to mathematical algorithms reaching out to “nearby” locations that are across the Coral Sea and beyond the Great Dividing Range to change what was a mild cooling-trend, into dramatic warming, for an otherwise perfectly politically-incorrect temperature series.
Apart from the estimate being merely “based on” data (i.e. not actual data), and that the 1998 El Nino relativity has been erased, there is a further eye teaser concerning the LOWESS smoothing (red line) at the end point 2015. In other words, the red line guides the eye but not the brain.
2004 to 2011 is the “hiatus/pause” in the red line. After 2011 the red line hikes up abruptly due to the El Nino spike. 2013 (0.66) was ENSO-neutral.
GISS appear to be assuming, given Gavin Schmidt’s 2016 prediction and reasoning (see Tweets), that the red line will continue upwards to a new record high (anomaly 0.87+) as per 1998 in the graph rather than return to neutral (anomaly 0.66 ish) and a continued “hiatus”.
I’m guessing, given the monthly data trajectory, that instead of continuing upwards what we will see in a couple of years is just a spike in the red line smoothing, but time will tell.
Basically, what I’m getting at is that the LOWESS smoothing (red line) of the annual mean datapoints 2011–2015 is no indication of the series trajectory from 2015 onwards. But GISS and their fan base like Cox, and his fans (“fanbois” – Delingpole), are convinced the red line is, actually, the data trajectory.
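The endpoint behaviour of LOWESS can be demonstrated with a toy smoother. The sketch below is a minimal tricube-weighted local linear fit in plain Python (not GISS's actual smoothing code), with invented anomaly values mimicking a flat run plus a final spike. Adding one new post-spike year materially changes the smoothed value at the old endpoint, which is exactly why the red line's tail is a guide for the eye, not the brain:

```python
# Minimal LOWESS-style smoother: tricube-weighted local linear regression
# evaluated at a single point. Toy data, for illustration only.
def lowess_point(xs, ys, x0, frac=0.5):
    """Local linear fit at x0, tricube weights over the nearest frac of points."""
    n = max(2, int(round(frac * len(xs))))
    d = sorted(abs(x - x0) for x in xs)[n - 1] or 1.0   # bandwidth
    w = [(1 - min(abs(x - x0) / d, 1.0) ** 3) ** 3 for x in xs]
    sw = sum(w)
    swx = sum(wi * x for wi, x in zip(w, xs))
    swy = sum(wi * y for wi, y in zip(w, ys))
    swxx = sum(wi * x * x for wi, x in zip(w, xs))
    swxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    b = (sw * swxy - swx * swy) / (sw * swxx - swx * swx)
    a = (swy - b * swx) / sw
    return a + b * x0

years = list(range(2001, 2016))
anoms = [0.66] * 14 + [0.87]          # flat "pause", then an El Nino spike

end_now = lowess_point(years, anoms, 2015)
# Suppose 2016 comes in back near neutral:
end_later = lowess_point(years + [2016], anoms + [0.66], 2015)
print(round(end_now, 3), round(end_later, 3))
# The smoothed value AT 2015 drops once the post-spike year arrives.
```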
This series progression will be make or break between now and about 2018 for Schmidt and GISS given their posturing. We could see some “adjustments” of course. But then they still have the satellites to contend with.
As I reported earlier, NOAA shows Angola as having its hottest month ever in August. [see chart]
Amazing, since NOAA doesn’t actually have any thermometer readings in Angola – which is almost twice the size of Texas. [see chart]
The RSS TLT August anomaly for central Angola was 0.53, slightly above average and nowhere near a record. There has been almost no trend in Angola temperatures since the start of records in 1978 [see chart]
NOAA and NASA are defrauding the American people and the world with their junk science, which is used as the basis for policy.
I am looking for an old televised climate change debate, but I cannot find it. I thought it was about 10 years old and that John Campbell was the moderator…….but nothing is popping up in Google so I must have this wrong.
From memory it was a debate between two experts/scientists. An early exchange was an expert showed a graph of rising temperatures. The opposing expert then explained that recent data was purposely excluded because it did not follow the trend.
Is there any chance someone knows the debate I’m thinking of, or who was in it?
I’ve been advised that the climate change debate I’m looking for happened more like 20 years ago, and was between David Wratt and Chris de Freitas on TV1. If someone knows what show that was on, or has a copy of the debate, that would be great. Cheers.
Seventy years is plenty
https://www.climateconversation.org.nz/2010/09/seventy-years-is-plenty/
Temperature adjustments science or art?
https://www.climateconversation.org.nz/2010/09/temperature-adjustments-science-or-art/
Land Temperature Records UHI
MEDIA RELEASE
Hot cities
If you thought our cities are getting warmer, you’re right.
Bureau of Meteorology researchers have found that daytime temperatures in our cities are warming more rapidly than those of the surrounding countryside and that this is due to the cities themselves.
Bureau climate scientist, Belinda Campbell, said “we’ve known for a while that city night time temperatures have been warmer because the heat’s retained after sunset just that much longer than the countryside, and that city daytime temperatures have been warming too.”
“But what we didn’t know was whether city day time temperatures were also warmer because of the urbanisation or whether it was due to the overall warming of the planet associated with the enhanced greenhouse effect.”
http://www.bom.gov.au/announcements/media_releases/ho/20101013.shtml
h/t Ian Wishart and WUWT
The Great Dying of Thermometers
Why GISS Temperatures Are Too High
‘GISS is the only temp index which shows significant warming since year 2000’
‘Their algorithm creates an imaginary hot spot at the North Pole…They appear to be using at least two steps of extrapolation/interpolation – which compounds error. In other words, their entire 21st century warming story is based on a defective interpretation of the Arctic.’
Where Will GISS Find Future Warming?
Posted on October 24, 2010 by stevengoddard
Much of the reported warming from the 20th century was due to upwards adjustments of recent temperatures, and downwards adjustments of older temperatures.
This worked pretty well until satellites showed up and provided some badly needed checks and balances on the adjusters. Since 1990 it has been tougher to justify any further upwards adjustments to the data.
Steven Mosher’s Blog
In the debate over the accuracy of the global temperature nothing is more evident than errors in the location data for stations in the GHCN inventory. That inventory is the primary source for all the temperature series. One question is “do these mistakes make a difference?” If one believes, as I do, that the record is largely correct, then it’s obvious that these mistakes cannot make a huge difference. If one believes, as some do, that the record is flawed, then it’s obvious that these mistakes could be part of the problem. Up until now, that is where these two sides of the debate stand. Believers convinced that the small mistakes cannot make a difference; and disbelievers holding that these mistakes could in fact contribute to the bias in the record. Before I get to the question of whether or not these mistakes make a difference, I need to establish the mistakes, show how some of them originate, correct them where I can and then do some simple evaluations of the impact of the mistakes. This is not a simple process. Throughout this process I think we can say two things that are unassailable: 1. the mistakes are real. 2. we simply don’t know if they make a difference. Some believe they cannot (but they haven’t demonstrated that) and some believe they will (but they haven’t demonstrated that). The demonstration of either position requires real work. Up to now no one has done this work.
This matters primarily because, to settle the matter of UHI, stations must be categorized as urban or rural. That entails collecting some information about the character of the station, say its population or the characteristics of the land surface. So, location matters. Consider Nightlights, which Hansen2010 uses to categorize stations into urban and rural. That determination is made by looking up the value of a pixel in an image. If it’s bright, the site is urban. If it’s dark (say, mis-located in the ocean), the site is rural.
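The nightlights lookup Mosher describes can be sketched in a few lines. The grid and brightness threshold below are entirely hypothetical (GISTEMP's real inputs are satellite radiance rasters), but the sketch shows the mechanism: a coordinate error that lands a station on the wrong pixel silently flips its urban/rural classification.

```python
# Toy nightlights classification (illustrative only, not GISTEMP's code):
# a station is classed urban/rural by the brightness of the pixel its
# recorded coordinates fall in.
BRIGHT_THRESHOLD = 32  # hypothetical radiance cutoff

# Tiny fake brightness grid: rows = latitude bins, columns = longitude bins.
nightlights = [
    [0,  0,  2, 0],    # dark: ocean / countryside
    [1, 45, 60, 3],    # a bright city in the middle
    [0,  4,  5, 0],
]

def classify(lat_idx, lon_idx):
    """Urban if the pixel under the station is bright, else rural."""
    return "urban" if nightlights[lat_idx][lon_idx] >= BRIGHT_THRESHOLD else "rural"

true_pos = (1, 1)    # station actually sits in the city
wrong_pos = (0, 0)   # a coordinate error drops it in a dark cell offshore

print(classify(*true_pos))    # urban
print(classify(*wrong_pos))   # rural: the location error changed the answer
```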
[Sophisticated analysis of GHCN and UHI at this Blog]
Phil Jones and the Chinese weather station corruption
November 7, 2010
by Anthony Watts
**BREAKING NEWS**
New Retreat from Global Warming Data by Australian Gov Bureau
Article by John O’Sullivan and Val Majkus (via email from John O’Sullivan)
Wednesday, November 10, 2010
Global warmers in full retreat as Aussie experts admit growing doubts about their own methods as new study shows one third of temperatures not reliable.
The Australian Bureau of Meteorology (BOM) admits it was wrong about urban heating effects as a professional statistical analysis by Andrew Barnham exposes a BOM claim that “since 1960 the mean temperature in Australia has increased by about 0.7 °C”; the BOM assertion has no empirical scientific basis.
Barnham, who spent 8 years working in emerging South Asian economies building high volume transaction processing systems, applied a high-tech statistical technique very different from an earlier well-publicized probe by fellow Aussie, Ken Stewart on his blog, Ken’s Kingdom.
Stewart grabbed headlines in what became known as the Australiagate controversy after his findings were featured on popular science blog, Watts Up With That. Stewart exposed dubious BOM adjustments to temperature data that bore little or no resemblance to actual or raw past temperatures.
Like Stewart, Barnham paid particular attention to BOM’s methodology in addressing what is known as the Urban Heat Island Effect (UHI), a proven phenomenon whereby thermometers measuring temperatures in towns and cities become unduly influenced by extra ‘background’ heating from buildings, road surfaces, machinery, etc. It’s in the UHI adjustments that the greatest discrepancies appear to lie.
Continues….
val majkus says:
November 11, 2010 at 10:43 am
Interesting Guest post by Ed Thurstan of Sydney, Australia
Synopsis
This study shows that the NOAA maintained GHCN V2 database contains errors in calculating a Mean temperature from a Maximum and a Minimum. 144 years of data from 36 Australian stations are affected.
Means are published when the underlying Maximums and/or Minimums have been rejected.
http://wattsupwiththat.com/2010/11/10/gross-data-errors-in-ghcn-v2-for-australia/#more-27653
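A consistency check of the kind Thurstan's synopsis describes is simple to express. The records below are invented for illustration; the two failure modes flagged are the ones named in the synopsis (a published mean that disagrees with (max + min) / 2, and a mean published although the underlying max or min was rejected):

```python
# Sketch of a GHCN-style mean consistency check. Records are hypothetical:
# (station, year, month, tmax, tmin, published_mean); None = rejected value.
records = [
    ("A", 1975, 1, 31.2, 18.4, 24.8),   # consistent: (31.2 + 18.4) / 2 = 24.8
    ("A", 1975, 2, 30.0, 17.0, 24.9),   # mean disagrees with (max + min) / 2
    ("B", 1975, 3, None, 16.0, 21.0),   # mean published, max was rejected
]

def flag_errors(rows, tol=0.05):
    """Return (station, year, month, reason) for every inconsistent row."""
    bad = []
    for stn, yr, mo, tmax, tmin, mean in rows:
        if tmax is None or tmin is None:
            bad.append((stn, yr, mo, "mean published without max/min"))
        elif abs((tmax + tmin) / 2 - mean) > tol:
            bad.append((stn, yr, mo, "mean inconsistent with (max+min)/2"))
    return bad

for err in flag_errors(records):
    print(err)   # flags the second and third rows
```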
GISS : Fighting For #1
Posted on November 13, 2010 by stevengoddard
During October, RSS showed a large drop of 0.232 from September. It appeared that the battle for 2010 as hottest year ever was doomed. 2010 is turning out much cooler than 1998, with no hope of catching up.
But just when the battle appeared lost, the fighters at GISS got their second wind. Instead of a large drop in October temperature anomalies, they found a 0.08 rise! This keeps 2010 well ahead of their hottest year ever – 2005.
There’s an interesting post at Jo Nova’s http://joannenova.com.au/2010/10/bom-giss-have-record-setting-bugs-affecting-a-million-square-miles/
Chris Gillham
Western Australia (WA) covers 2.5 million square kilometers (1 million square miles, about a third as big as the USA). The average of all WA stations over one month last year was adjusted up by as much as a gobsmacking 0.5 degrees due to a database “bug” – which contributed to August 2009 being the hottest August on record?! That’s one heck of a bug!
posted on October 27 2010
Chris also says
While researching the GISS adjustments, I noticed yet another odd data shift that left me wondering about the reliability of temperature recordings. I had listed the 2009 monthly mean temperatures on October 4, 2010, for Kalgoorlie-Boulder, but when I returned to the GISS website database the following day, October 5, I found that every month in 2009 for that location had been shifted up by .1 C.
This means the newly adjusted GISS record shows Kalgoorlie-Boulder’s average mean for Spring 2009 was 20.6 C, not 20.5 C anymore, so this historic mining town’s seasonal temperature record is now 1.2 degrees higher than the reality of the BoM records.
Chris has done a huge amount of work and his blog which relates to Western Aust is well worth a visit
http://www.waclimate.net/
He’s also examined pre-1910 temps and compared them with recent data:
Although the pre-1910 temperature recordings in each location examined below are considered by the Bureau of Meteorology to be unreliable compared to the regulated post-1910 data, it is still worth comparing these earliest temperatures with the most recent “corrected” data from 1979-2008 in the 20 locations.
Minima
For all 20 locations, the average mean minimum rose 0.44 degrees C from ~1900 to 1979-2008
For 10 coastal locations, the average mean minimum rose 0.29 degrees C from ~1900 to 1979-2008
For 10 inland locations, the average mean minimum rose 0.58 degrees C from ~1900 to 1979-2008
Maxima
For all 20 locations, the average mean maximum rose 0.48 degrees C from ~1900 to 1979-2008
For 10 coastal locations, the average mean maximum rose 0.86 degrees C from ~1900 to 1979-2008
For 10 inland locations, the average mean maximum rose 0.10 degrees C from ~1900 to 1979-2008
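As a sanity check, since the coastal and inland subgroups are the same size (10 each), the all-20 figures should simply be the average of the two subgroup means, and to rounding they are:

```python
# Check that the all-20 figures equal the average of the equal-sized
# coastal and inland subgroup means (values quoted above, in degrees C).
minima = {"coastal": 0.29, "inland": 0.58}
maxima = {"coastal": 0.86, "inland": 0.10}

all20_min = (minima["coastal"] + minima["inland"]) / 2
all20_max = (maxima["coastal"] + maxima["inland"]) / 2

print(round(all20_min, 3))   # 0.435 -- reported above (rounded) as 0.44
print(round(all20_max, 2))   # 0.48  -- matches the reported figure exactly
```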
and check out Chris’ Analysis and Forecast page for some great links
http://www.waclimate.net/bureau.html
including BOM links and ‘Contrary Evidence’
I know you must be sick of me and temperatures but I seem to be stuck on that today. [Not at all. You always have something interesting to say. – RT]
More on UHI:
http://www.warwickhughes.com/hoyt/uhi.htm
and the essay about half way down the page:
Since satellite measurements began in 1979, the world’s population has approximately doubled, leading to a UHI signal of 0.67 C over land and 0.19 C globally. The observed surface warming is about 0.36 C over the same time period, so a substantial portion may be just uncorrected UHI effects. Other effects include land use changes, increased darkness of vegetation, direct heat from fossil fuel burning, a brighter sun, changes in cosmic ray intensity, soot on snow, more soot in the atmosphere, and greenhouse gases (and this list is not exhaustive). There are many competing theories for the recent warming, and some of them do a better job at explaining the observations than do greenhouse gases.
The land surface stations were designed to provide local climatology. They were not designed to detect climate change. Quality control of the surface network is inadequate.
And ain’t that right!
UAH and UHI
Posted on December 16, 2010 by Anthony Watts
Note: clearly satellites can see urban heat, as demonstrated by this recent paper unveiled at the 2010 AGU meeting by NASA. See: Satellites Image the Urban Heat Islands in the Northeast. It can also be demonstrated that the UHI biases the thermometers upwards. As cities grow, so does the increased bias. In that paper NASA says:
The compact city of Providence, R.I., for example, has surface temperatures that are about 12.2 °C (21.9 °F) warmer than the surrounding countryside…
http://wattsupwiththat.com/2010/12/16/uah-and-uhi/
The Urban Heat Island Effect Distorts Global Temperatures
Written by Dr. Tim Ball | May 17 2011
How much do calculations of global temperatures represent the real temperature of the Earth? Every day, new stories appear about temperature records with errors or deliberate omissions. Essex, McKitrick, and Andresen’s article suggests that such a creature doesn’t exist. An important part of the debate is something called the urban heat island effect (UHIE).
History of the UHIE Problem
Physical Cause
Evidence
http://climatechangedispatch.com/home/8996-the-urban-heat-island-effect-distorts-global-temperatures
New paper finds urban heat islands cause up to 44% of total recorded warming
A paper published today in the Journal of Geophysical Research finds that urban heat islands account for up to 44% of the recorded warming in cities in east China over the period of 1981-2007.
JOURNAL OF GEOPHYSICAL RESEARCH, VOL. 116, D14113, 12 PP., 2011
Observed surface warming induced by urbanization in east China
Key Points:
The rapid urbanization has significant impacts on temperature over east China
A new method was developed to dynamically classify urban and rural stations
Comparison of the trends of UHI effects by using OMR and UMR approaches
Xuchao Yang et al
http://hockeyschtick.blogspot.com/2011/07/new-paper-finds-urban-heat-islands.html
New paper: UHI, alive and well in China
Posted on July 28, 2011 by Anthony Watts
http://www.agu.org/journals/jd/jd1114/2010JD015452/2010jd015452-op04-tn-350x.jpg
JOURNAL OF GEOPHYSICAL RESEARCH, VOL. 116, D14113, 12 PP., 2011
doi:10.1029/2010JD015452
Observed surface warming induced by urbanization in east China
Key Points
* The rapid urbanization has significant impacts on temperature over east China
* A new method was developed to dynamically classify urban and rural stations
* Comparison of the trends of UHI effects by using OMR and UMR approaches
Xuchao Yang, Shanghai Typhoon Institute of China Meteorological Administration, Shanghai, China Institute of Meteorological Sciences, Zhejiang Meteorological Bureau, Hangzhou, China Yiling Hou, Shanghai Climate Center, Shanghai, China, Baode Chen, Shanghai Typhoon Institute of China Meteorological Administration, Shanghai, China
Monthly mean surface air temperature data from 463 meteorological stations, including those from the 1981–2007 ordinary and national basic reference surface stations in east China and from the National Centers for Environmental Prediction and National Center for Atmospheric Research (NCEP/NCAR) Reanalysis, are used to investigate the effect of rapid urbanization on temperature change.
http://wattsupwiththat.com/2011/07/28/new-paper-uhi-alive-and-well-in-china/
New paper: Urban Heat Island effect accounts for 56% of warming in urban areas over past 55 years
A paper published last week in the journal Atmospheric Environment finds that the urban heat island effect accounts for 56% of the total temperature increase in South Korean city stations over the past 55 years. Warmists constantly downplay the urban heat island (UHI) effect as insignificant, but as this paper and several others show, the UHI does cause very significant biases with stations reporting higher or rising temperatures because they are poorly sited.
Quantitative Estimates of Warming by Urbanization in South Korea over the Past 55 Years (1954-2008)
Maeng-Ki Kim and Seonae Kim
http://hockeyschtick.blogspot.com/2011/08/new-paper-urban-heat-island-effect.html
Unadjusted data of long period stations in GISS show a virtually flat century scale trend
Posted on October 24, 2011 by Anthony Watts
Temperature averages of continuously reporting stations from the GISS dataset
Guest post by Michael Palmer, University of Waterloo, Canada
Abstract
The GISS dataset includes more than 600 stations within the U.S. that have been in operation continuously throughout the 20th century. This brief report looks at the average temperatures reported by those stations. The unadjusted data of both rural and non-rural stations show a virtually flat trend across the century.
[…]
Figure 3: Temperature trends and station counts for all US stations in GISS reporting continuously, that is containing at least one monthly data point for each year from 1900 to 2000. The slope for the rural stations (335 total) is -0.00073 deg/year, and for the other stations (278 total) -0.00069 deg/year. The monthly data point coverage is above 90% throughout except for the very first few years.
>>>>>>>>>
http://wattsupwiththat.com/2011/10/24/unadjusted-data-of-long-period-stations-in-giss-show-a-virtually-flat-century-scale-trend/
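A crude way to reproduce the spirit of Palmer's Figure 3 comparison, without fitting anything: compare the first- and last-decade means of each station group's annual averages. The series below are invented near-flat toys standing in for the GISS rural and non-rural averages; only the method is the point, not the reported -0.00073/-0.00069 deg/year values.

```python
# Decade-mean comparison for two station groups (toy data, illustrative only).
years = list(range(1900, 2001))
rural = [11.0 - 0.0007 * (y - 1900) for y in years]   # near-flat toy series
other = [11.5 - 0.0007 * (y - 1900) for y in years]   # same shape, fixed offset

def decade_mean(series, years, start):
    """Mean of the 10 values with start <= year < start + 10."""
    return sum(t for y, t in zip(years, series) if start <= y < start + 10) / 10

for name, s in (("rural", rural), ("other", other)):
    change = decade_mean(s, years, 1991) - decade_mean(s, years, 1900)
    print(f"{name}: {change:+.3f} C over the century")   # both: -0.064 C
```

Both groups show the same tiny century-scale change despite the constant offset between them, mirroring the near-identical rural and non-rural slopes in the post.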
What the BEST data actually says
Posted on October 24, 2011 by Willis Eschenbach
[…]
Disagreement with satellite observations
[…]
Remember what we would expect to find if all of the ground records were correct. They’d all lie on or near the same line, and the satellite temperatures would be rising faster than the ground temperatures. Here are the actual results, showing BEST, satellite, GISS, CRUTEM, and GHCN land temperatures:
Figure 3. BEST, average satellite, and other estimates of the global land temperature over the satellite era. Anomaly period 1979-1984 = 0.
In Figure 3, we find the opposite of what we expected. The land temperatures are rising faster than the atmospheric temperatures, contrary to theory. In addition, the BEST data is the worst of the lot in this regard.
Disagreement with other ground-based records
The disagreement between the four ground-based results also begs for an explanation. Note that the records diverge at the rate of about 0.2°C in thirty years, which is 0.7° per century. Since this is the approximate amount of the last century’s warming, this is by no means a trivial difference.
>>>>>>>>>>
http://wattsupwiththat.com/2011/10/24/what-the-best-data-actually-says/#more-49905
Rewriting The Past At The Ministry Of Truth
In 1975, NCAR generated this graph of global cooling. Temperatures plummeted from 1945 to at least 1970.
Note that 1970 used to be colder than 1900.
In 2011, Richard Muller published this graph, showing that it never happened.
Below is an overlay at the same scale. The cooling after 1950 has disappeared. Winston Smith would be proud!
[See plots]
>>>>>>>>>>>
http://www.real-science.com/rewriting-ministry-truth
A ‘Q&A’ With A Renowned Climate Expert Regarding The BEST Research
Below, we posed some very basic questions to Dr. Ball. Our questions are in bold and Dr. Ball’s responses italicized:
1. What actually did BEST analyze and measure? Global temperatures or a subset of global temperatures?
2. Did BEST use different climate station data than that used by NASA/GISS, NOAA/NCDC and HadCRUT?
3. Did BEST use only the best climate stations’ data or did they use all stations’ data?
4. Did BEST use the actual raw temperature data, or an “improved” raw dataset, or adjusted temperatures for their analysis?
5. If adjusted, did BEST perform the adjustments or another agency (3rd party)?
6. How were the adjustments done?
7. Has BEST made public their calculated monthly anomalies and monthly baseline means used to calculate the anomalies for their new temperature series?
8. From your preliminary review of the BEST research, what do you like best of their methodology? What are the shortcomings of their methodology?
http://www.c3headlines.com/2011/10/a-qa-with-a-renowned-climate-expert-regarding-the-best-research.html
Singer’s letter to WaPo on BEST
The scientific finding that does not settle the climate-change debate
S. Fred Singer Letter to WashPost Oct 25, 2011
http://wattsupwiththat.com/2011/10/25/singers-letter-to-wapo-on-best/
Responding to:-
The scientific finding that settles the climate-change debate
By Eugene Robinson, Published: October 25
For the clueless or cynical diehards who deny global warming, it’s getting awfully cold out there.
The latest icy blast of reality comes from an eminent scientist whom the climate-change skeptics once lauded as one of their own. Richard Muller, a respected physicist at the University of California, Berkeley, used to dismiss alarmist climate research as being “polluted by political and activist frenzy.” Frustrated at what he considered shoddy science, Muller launched his own comprehensive study to set the record straight. Instead, the record set him straight.
>>>>>>>>>
http://www.washingtonpost.com/opinions/the-scientific-finding-that-settles-the-climate-change-debate/2011/03/01/gIQAd6QfDM_story.html?wpisrc=emailtoafriend
Thanks for the link RC. A very pithy reply with some excellent points.
I particularly enjoyed the punch-line:
BEST project rescues us from thousands of lying global thermometers
Lucky the BEST project is here to save us from the lying thermometers of the past. Apparently people in the 1960′s and 1970′s were clever enough to get man on the moon, but too stupid to measure the temperature. Millions of people were fooled into thinking the world was cooling for three decades by erroneous thermometer readings. Who would have guessed?
>>>>>>>>>>
http://joannenova.com.au/2011/10/thank-god-best-project-rescues-us-from-thousands-of-lying-global-thermometers/
I note this 1975 Newsweek quote:-
If that’s below ground not above ground then half a degree becomes all the more significant. It goes on:-
“Sunshine reaching the ground” – global warming in a nutshell.
Australis brought some well needed perspective to comments under Gareth’s “You’re (not) the BEST thing” HT post I see.
Gareth has since moved on to extreme weather, prompting angst from Thomas needless to say.
Explaining Muller vs. Muller: is BEST blissfully unaware of cosmic-ray-cloud theory?
Posted on October 28, 2011 by Alec Rawls
Here is the puzzle, as noted by Nigel Calder and others: how can BEST insist that a modicum of additional evidence of late 20th century warming should put skepticism of the CO2-warming theory to rest, while at the same time admitting that they never even tried to examine the possible causes of warming?
>>>>>>>>>
http://wattsupwiththat.com/2011/10/28/explaining-muller-vs-muller-is-best-blissfully-unaware-of-cosmic-ray-cloud-theory/
Patrick Michaels on BEST: The Earth Is Round!!!
Rule #1 in science is that you don’t blatt your results until they are either presented at a peer-screened conference or published in the peer-reviewed literature. The price of this is often out-of-hand rejection of otherwise decent work.
But once again, climate researchers have blundered into science by press release instead of science by the rules. Why?
For whatever reason, the Berkeley Earth Surface Temperature (BEST) research team diverted a decent amount of the $600,000 they were granted to analyze surface temperature histories into a coordinated and extensive press release of its findings on October 21. At the same time they submitted four manuscripts to American Geophysical Union journals for peer-review.
The earth-shattering news is that the average land surface temperature of the planet is higher than it was 200 years ago. I know of no credentialed climate scientist who does not know this.
[…]
Given the fact that they have gone public with their science, I’ll go public with my review.
I have a license to do so. The first citation in the literature review portion of the manuscript on systematic urban heating bias is a 2007 paper in the Journal of Geophysical Research by University of Guelph’s Ross McKitrick, and yours truly.
>>>>>>>>>>>>
http://www.forbes.com/sites/patrickmichaels/2011/10/28/keep-this-bit-of-news-on-the-qt-the-earth-is-round/
Ouch:-
Scientist who said climate change sceptics had been proved wrong accused of hiding truth by colleague
By David Rose – Daily Mail, UK (MailOnline)
It was hailed as the scientific study that ended the global warming debate once and for all – the research that, in the words of its director, ‘proved you should not be a sceptic, at least not any longer’.
Professor Richard Muller, of Berkeley University in California, and his colleagues from the Berkeley Earth Surface Temperatures project team (BEST) claimed to have shown that the planet has warmed by almost a degree centigrade since 1950 and is warming continually.
[…]
But today The Mail on Sunday can reveal that a leading member of Prof Muller’s team has accused him of trying to mislead the public by hiding the fact that BEST’s research shows global warming has stopped.
Prof Judith Curry, who chairs the Department of Earth and Atmospheric Sciences at America’s prestigious Georgia Institute of Technology, said that Prof Muller’s claim that he has proven global warming sceptics wrong was also a ‘huge mistake’, with no scientific basis.
Prof Curry is a distinguished climate researcher with more than 30 years experience and the second named co-author of the BEST project’s four research papers.
[…]
In fact, Prof Curry said, the project’s research data show there has been no increase in world temperatures since the end of the Nineties – a fact confirmed by a new analysis that The Mail on Sunday has obtained.
‘There is no scientific basis for saying that warming hasn’t stopped,’ she said. ‘To say that there is detracts from the credibility of the data, which is very unfortunate.’
However, Prof Muller denied warming was at a standstill.
‘We see no evidence of it [global warming] having slowed down,’ he told BBC Radio 4’s Today programme. There was, he added, ‘no levelling off’.
A graph issued by the BEST project also suggests a continuing steep increase.
[See plots: “THE GRAPH THAT FOOLED THE WORLD” and “THE INCONVENIENT TRUTH”]
But a report to be published today by the Global Warming Policy Foundation includes a graph of world average temperatures over the past ten years, drawn from the BEST project’s data and revealed on its website.
This graph shows that the trend of the last decade is absolutely flat, with no increase at all – though the levels of carbon dioxide in the atmosphere have carried on rising relentlessly.
‘This is nowhere near what the climate models were predicting,’ Prof Curry said. ‘Whatever it is that’s going on here, it doesn’t look like it’s being dominated by CO2.’
[…]
[Prof Curry] added, in the wake of the unexpected global warming standstill, many climate scientists who had previously rejected sceptics’ arguments were now taking them much more seriously.
They were finally addressing questions such as the influence of clouds, natural temperature cycles and solar radiation – as they should have done, she said, a long time ago.
[…]
In Prof Curry’s view, two of the papers were not ready to be published, in part because they did not properly address the arguments of climate sceptics.
As for the graph disseminated to the media, she said: ‘This is “hide the decline” stuff. Our data show the pause, just as the other sets of data do. Muller is hiding the decline.
‘To say this is the end of scepticism is misleading, as is the statement that warming hasn’t paused. It is also misleading to say, as he has, that the issue of heat islands has been settled.’
>>>>>>>>>
Read more: http://www.dailymail.co.uk/sciencetech/article-2055191/Scientists-said-climate-change-sceptics-proved-wrong-accused-hiding-truth-colleague.html#ixzz1cF6MpJCh
The GWPF source for the MailOnline article reference:-
Best Confirms Global Temperature Standstill
Dr. David Whitehouse – GWPF
[…]
When asked by the BBC’s Today programme Professor Richard Muller, leader of the initiative, said that the global temperature standstill of the past decade was not present in their data.
“In our data, which is only on the land we see no evidence of it having slowed down. Now the evidence which shows that it has been stopped is a combination of land and ocean data. The oceans do not heat as much as the land because it absorbs more of the heat and when the data are combined with the land data then the other groups have shown that when it does seem to be leveling off. We have not seen that in the land data.”
My first thought would be that it would be remarkable if it was……….
[…]
Fig 1 shows the past ten years plotted from the monthly data from Best’s archives. Click on the image to enlarge.
It is a statistically perfect straight line of zero gradient. Indeed, most of the largest variations in it can be attributed to ENSO and La Niña effects. It is impossible to reconcile this with Professor Muller’s statement. Could it really be the case that Professor Muller has not looked at the data in an appropriate way to see the last ten years clearly?
Indeed, Best seems to have worked hard to obscure it. They present data covering almost 200 years with a short x-axis and a stretched y-axis to accentuate the increase. The data are then smoothed using a ten-year average, which is ideally suited to removing the past five years of the past decade and mixing the earlier standstill years with years when there was an increase. This is an ideal formula for suppressing the past decade’s data.
[…]
Only a few years ago many scientists and commentators would not acknowledge the global temperature standstill of the past decade. Now that it has become unarguable, more explanations for it have emerged than can all possibly be true.
>>>>>>>>>>
http://www.thegwpf.org/the-observatory/4230-best-confirms-global-temperature-standstill.html
Uh oh: It was the BEST of times, it was the worst of times
Posted on October 29, 2011 by Anthony Watts
Alternate title: Something wonky this way comes
I try to get away to work on my paper and the climate world explodes, pulling me back in. Strange things are happening related to the BEST data and co-authors Richard Muller and Judith Curry. Implosion might be a good word.
Popcorn futures are soaring. BEST Co-author Judith Curry drops a bombshell:
>>>>>>>>>>
http://wattsupwiththat.com/2011/10/29/uh-oh-it-was-the-best-of-times-it-was-the-worst-of-times/#more-50286
Check out Los Angeles: BEST vs GISS
In an exclusive interview with “The Mail on Sunday”, Dr Judith Curry….
Somehow, I don’t think this is going to end well…
Like watching a train wreck by installments.
I’ve been looking through Curry’s posts at Climate Etc. She seems to be in rapid catch-up mode on all the sceptic angles that Jo Nova or Alan Cheetham (Global Warming Science), for example, have already raked over extensively. E.g., she’s posted:-
“Tropospheric and surface temperatures” by Donald Rapp (impressive Biosketch)
http://judithcurry.com/2011/10/29/tropospheric-and-surface-temperatures/
He says, in respect of Christy et al. (2010), dealing with that topic:-
These issues are hardly news in the sceptic domain. I pointed much of this out to MftE CC in my Climate Metrics case but Santer trumps Christy apparently. “The issues have been resolved” is approximately what I received back (although negotiations are continuing).
Stuff are a bit behind the curve on this one
http://www.stuff.co.nz/world/americas/5880287/Sceptic-finds-he-now-agrees-global-warming-is-real
Delingpole isn’t
Lying, cheating climate scientists caught lying, cheating again
http://blogs.telegraph.co.uk/news/jamesdelingpole/100114292/lying-cheating-climate-scientists-caught-lying-cheating-again/
Stuff’s stuff up was to reprint a climate article from AP – bad bad Stuff.
Deltoid jokes are turning sour
http://scienceblogs.com/deltoid/2011/10/the_berkeley_earth_surface_tem.php?utm_source=sbhomepage&utm_medium=link&utm_content=channellink
Can’t wait to see what they make of Delingpole’s “Lying, cheating climate scientists”
#170 is a classic:-
“I didn’t take that Daily Mail article seriously” – of course not.
“look at that hideous GWPF graph” – a warmist’s worst nightmare.
“where is the 2008 La Niña?” – that’s BEST data, ask Muller.
Daily Mail article is a “distasteful hit piece on Richard Muller by David Rose” according to one MichaelEMann on Twitter.
Watts is wound up – “The BEST whopper ever”
http://wattsupwiththat.com/2011/10/30/the-best-whopper-ever/
Afterthought – appropriate that ethically challenged Mann would attempt to steal the moral high-ground.
H/T to Tom Nelson for the Mann Twitter http://tomnelson.blogspot.com/2011/10/occupy-wall-street-snow-day-update.html#links
NZ Herald has followed Stuff’s lead and printed Seth Borenstein’s AP article “Sceptic forced to admit globe really is warming” http://www.nzherald.co.nz/science/news/article.cfm?c_id=82&objectid=10762969
Duh – I’ve seen smarter sheep than these editors.
BEST statistics show hot air doesn’t rise off concrete!
The BEST media hit continues to pump-PR around the world. The Australian repeats the old-fake-news “Climate sceptic Muller won over by warming”. This o-so-manufactured media blitz shows how desperate and shameless the pro-scare team is in their sliding descent. There are no scientists switching from skeptical to alarmist, though thousands are switching the other way.
The sad fact that so many news publications fell for the fakery, without doing a ten minute google, says something about the quality of our news. How is it headline material when someone who was never a skeptic pretends to be “converted” by a result that told us something we all knew anyway (o-look the world is warming)?
The [six] points every skeptic needs to know about the BEST saga [abbreviated]:
1. Muller was never a skeptic
2. BEST broke basic text-book rules of statistics
3. The BEST results are very “adjusted” and not the same as the original thermometer readings
4. Obviously hot air doesn’t rise off concrete
5. The BEST data show the world hasn’t warmed for 13 years.
6. The BEST group have their own vested conflicts
A Lesson for Skeptics
A field of strawmen
>>>>>>>>>>>>
http://joannenova.com.au/2011/11/best-statistics-show-hot-air-doesnt-rise-off-concrete/
And as Jo reports, “Judith Curry by the way has just threatened to quit”.
Global temperatures: All over the map
WSJ.com 11/5/11
[…]
The Berkeley Earth team uses a statistical tool it calls a scalpel to cut out data when there is a gap. To fill gaps in the data it uses a process called kriging, borrowed from geoscientists and surveyors, who use it, for example, to estimate the elevation at a point B lying between points A and C where the altitude is known. And it weights data from reliable stations more heavily than data from stations that produce erratic numbers.
The result is a statistical model of what the temperature was at any given moment at any given location. Actual thermometer readings, when they are available, are used only as fodder for the statistical model.
The Berkeley scientists “have a very complicated model,” says William Briggs, a member of the probability and statistics committee of the American Meteorology Society. “They reported on the setting of one of those dials” in the model. “That is not the actual temperature.”
>>>>>>>>
http://hockeyschtick.blogspot.com/2011/11/global-temperatures-all-over-map.html
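The surveying analogy in the WSJ piece (estimating point B from known points A and C) can be sketched with a toy interpolator. Real kriging derives its weights from a fitted covariance (variogram) model, and BEST’s actual implementation is far more elaborate; this illustration uses simple inverse-distance weights only to show the gap-filling idea that nearby known points influence the estimate most:

```python
import numpy as np

# Toy stand-in for the gap-filling idea described above. This is NOT kriging
# proper (no variogram model) -- just inverse-distance weighting, which
# captures the flavour: nearer known points get larger weights.
def idw_estimate(x_known, y_known, x_target, power=2.0):
    d = np.abs(x_known - x_target)
    if np.any(d == 0):                       # exact hit: return the known value
        return y_known[np.argmin(d)]
    w = 1.0 / d**power
    return np.sum(w * y_known) / np.sum(w)

# Elevations known at A (x=0) and C (x=10); estimate point B between them
x = np.array([0.0, 10.0])
elev = np.array([100.0, 200.0])
print(idw_estimate(x, elev, 5.0))   # midpoint -> 150.0
```

Weighting reliable stations more heavily, as the article describes, would amount to multiplying these distance weights by a per-station quality factor.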
Nice synopsis of the BEST project in this article.
Australian temperatures
guest post by Philip Bradley
This work needs to be brought to a wider audience because it paints a very different climate picture to the global land datasets based on minimum and maximum temperatures – GISS, HadCRUT and the recent BEST analysis.
I’ll present his discoveries in three parts, as there appear to be three significant elements in his work.
1. Using a minimum and maximum temperature dataset exaggerates the increase in the global average land surface temperature over the last 60 years by approximately 45%
2. Almost all the warming over the last 60 years occurred between 6am and 12 noon
3. Warming is strongly correlated with decreasing cloud cover during the daytime and is therefore caused by increased solar insolation
I’ll then add a part 4 covering some additional analysis of mine
4. Reduced anthropogenic aerosols (and clouds seeded by anthropogenic aerosols) are the cause of most the observed warming over the last 60 years
[…]
GISS, HadCRUT and BEST wrongly assume that the mean of Tmin and Tmax, (Tmin+Tmax)/2, represents the average daily (24-hour) temperature.
Note at this point, that any day time warming that does not persist through the following night time is irrelevant to climate warming, as it is not heat that is retained in the climate system for more than 24 hours. So even a correctly derived average temperature is a poor indicator of the amount of warming.
In summary, calculating average temperature from the 3-hourly data instead of by the (Tmin+Tmax)/2 method results in ~45% less warming since 1950.
>>>>>>>>>>>
http://www.bishop-hill.net/blog/2011/11/4/australian-temperatures.html
H/T Jonathan Lowe, Philip Bradley, Bishop Hill and Andy.
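Bradley’s point about (Tmin+Tmax)/2 versus sub-daily averaging is easy to illustrate with made-up numbers: a day with a short, sharp afternoon peak has a min/max midpoint well above its 3-hourly mean. (The figures below are invented for illustration, not drawn from the Australian dataset.)

```python
import numpy as np

# Eight synthetic 3-hourly readings for one day (00:00, 03:00, ..., 21:00),
# with a brief afternoon spike. Invented numbers, purely illustrative.
temps = np.array([10, 9, 8, 12, 22, 14, 12, 11], dtype=float)

minmax_mean = (temps.min() + temps.max()) / 2   # the (Tmin+Tmax)/2 method
true_mean = temps.mean()                        # mean of the 3-hourly data
print(minmax_mean, true_mean)                   # 15.0 vs 12.25
```

A brief daytime spike moves Tmax (and hence the min/max midpoint) a lot while barely moving the 24-hour average, which is the asymmetry Bradley argues inflates the warming trend.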
Satellite Systems
UAH lower troposphere temperature colder than 2008
Posted on October 30, 2011 by Anthony Watts
Brrr…the Troposphere Is Ignoring Your SUV
by Roy W. Spencer, Ph. D.
For those tracking the daily global temperature updates at the Discover website, you might have noticed the continuing drop this month in global temperatures. The mid-tropospheric AMSU channels are showing even cooler temperatures than we had at this date with the last (2008) La Nina. The following screen shot is for AMSU channel 6
>>>>>>>>>
http://wattsupwiththat.com/2011/10/30/uah-lower-troposphere-temperature-colder-than-2008/
Warwick Hughes http://www.warwickhughes.com/blog/?cat=1
has a post called Interesting nuances in IPCC groups treatment of New Zealand temperature data
It links to a blog article of his from 2007 pointing out the degree of warming over different timespans for two selections of grid cells covering New Zealand – for CRUT2, CRUT3 (UKMO) and NIWA. The article contains two graphics comparing the “old” CRUTem2 data of Jones with the new Hadley Centre CRUT3, and he says: ‘Looking at 140 years of New Zealand land based data we now see huge extra warming in the Hadley Centre CRUT3 gridded data.’
Warwick Hughes is a New Zealand-born graduate in geology from Auckland University who has worked in Australia for many years and has carried out pioneering research on surface temperature measurement. (bio from Dr Vincent Gray at http://www.warwickhughes.com/gray04/nzct48.htm)
If you use the ‘search’ for NIWA you find a couple of very interesting articles which are of course relevant to NZ
http://www.warwickhughes.com/blog/?p=53#more-53
He said that New Zealand temperature trends as measured by the Jones et al group at the University of East Anglia for the IPCC (United Nations Intergovernmental Panel on Climate Change) show warming of 0.63 degrees C over the 27 years 1979 to 2005, which is 0.38 degrees more than the seven-station NIWA anomaly trend of 0.25 degrees C.
“When we have the IPCC believing that New Zealand is warming at more than double the rate that NIWA publishes, then this is a sign that the science is far from settled,” said Mr Hughes.
Richard; I just sent an email to Warwick Hughes; I recall I mentioned him in replies to BOM as well as in this thread, but if I do a search at your search request it doesn’t come up with comments in which he’s mentioned; can that be fixed, or is that too much hassle?
Val, I just tested it and it works for me. Search for “warwick” (no quotes) and this is the first link in the list that appears:
https://www.climateconversation.org.nz/open-threads/climate/climate-science/temperature-records/
It’s this very page. Try again?
thanks Richard; much appreciated; my error was that I used quotes and the whole name; anyway I’ve sent that on to Warwick, as I like to keep informed those people about whom I say something
BOM Tips from Ken Stewart
https://www.climateconversation.org.nz/2010/10/world-of-sceptical-questions-unfolds%E2%80%A6/#comment-25708
Australia’s High Quality Data: 12-year-sites used for “long term” trends
Analysing Australian Temperature Trends
By Andrew Barnham
val majkus says:
November 1, 2010 at 10:47 am
Here’s the inimitable Dr Tim Ball on data manipulation
http://canadafreepress.com/index.php/article/28360
Data collection is expensive and requires continuity – it’s a major role for government. They fail with weather data because money goes to political climate research. A positive outcome of corrupted climate science exposed by Climategate, is re-examination beginning with raw data by the UK Met Office (UKMO). This is impossible because much is lost, thrown out after modification or conveniently lost, as in the case of records held by Phil Jones, director of Climategate. (Here and Here)
Evidence of manipulation and misrepresentation of data is everywhere. Countries maintain weather stations and adjust the data before it’s submitted through the World Meteorological Organization (WMO) to the central agencies including the Global Historical Climatology Network (GHCN), the Hadley Center associated with CRU now called CRUTEM3, and NASA’s Goddard Institute for Space Studies (GISS).
They make further adjustments before selecting stations to produce their global annual average temperature. This is why they produce different measures each year from supposedly similar data.
There are serious concerns about data quality. The US spends more than others on weather stations, yet their condition and reliability are simply atrocious. Anthony Watts has documented the condition of US weather stations; it is one of government’s failures. (end of quote)
and Ken Stewart has shown how this has happened in Australia http://kenskingdom.wordpress.com/2010/07/27/the-australian-temperature-record-part-8-the-big-picture/
his conclusion?
In a commendable effort to improve the state of the data, the Australian Bureau of Meteorology (BOM) has created the Australian High-Quality Climate Site Network. However, the effect of doing so has been to introduce into the temperature record a warming bias of over 40%. And their climate analyses on which this is based appear to increase this even further to around 66%.
Has anyone done similar checking in New Zealand?
Has anyone done similar checking in New Zealand?
Definitely – the NZCSC.
NZCSCET v NIWA is the culmination of that.
I saw Anthony Watts comment here a while ago asking Richard T for some info on an Auckland site, so there is overseas interest.
BTW – did you see my “Open Threads as promised” tip in “Controversy and scandal” ?
Richard; Are the results of NZCSCET data check publicly available, if so could you point me to the site; I’ll just check their site to make sure it’s not there; I know Ken Stewart will be interested
Just hit the Tag “NZ temperature records” or select the Category “Air temperature” in the right hand column (or both).
Thanks Richard; I sent Ken a link to the Quadrant Online article which is very informative
Richard, when is the result of the BOM auditing of NIWA’s records due?
I wish I knew. We expect an announcement shortly, but who knows?
Hi Val,
The NZ Climate Science Education Trust has been set up to make legal representation easier in our case against NIWA. The case is an Application for Judicial Review — basically asking a judge to look at something we disagree with. As you know, the Coalition (NZCSET) has lodged a Statement of Claim and in response, NIWA’s lawyers have lodged a Statement of Defence. The next move is a conference to decide whether the parties agree. If not, there could be a day in court.
There’s a lot of interest, both here and around the world, but there’s no news yet, sorry!
Thanks Richard; as Warwick Hughes said about the BOM auditing ‘it’s a bit like asking Dracula to audit the Blood Bank’
Dr Fred Singer has an opinion piece in The American Thinker today http://www.americanthinker.com/2010/11/the_french_academy_lays_an_egg.html
quoting one para
Even more important, weather satellite data, which furnish the best global temperature data for the atmosphere, show essentially no warming between 1979 and 1997. Now, according to well-established theories of the atmosphere as expounded by textbooks, the surface warming must be smaller than the atmospheric trend by roughly a factor of two. But one half of zero is still zero. It suggests that the surface warming reported by the IPCC, based on weather-station data that had been processed by the Climate Research Unit of East Anglia University (CRU-EAU) may not exist. How could this have come about? We will get the answer once we learn how the CRU selected particular weather stations (from some thousands worldwide) to use for their global product and how they then corrected the actual data (to remove urban influences and other effects). So far, none of the several investigations of “Climategate” has delved into these all-important details. Nor have they established the exact nature of the “trick” used by the CRU and fellow conspirators to “hide the decline” (of temperature) — referred to in the leaked Climategate e-mails.
The “trick to hide the decline” refers to the splicing of the tree ring proxies to the temperature records. The decline refers specifically to the decrease in the apparent recent temperature shown in the proxies, and is commonly referred to as the “divergence problem”.
It does not refer to a decline in measured land temperatures in this context.
here’s another article by Dr S Fred Singer
http://www.americanthinker.com/2010/11/the_green_bubble_is_about_to_b.html
There is a revolution coming that is likely to burst the green global warming bubble: the temperature trend used by the IPCC (the U.N.’s Intergovernmental Panel on Climate Change) to support their conclusion about anthropogenic global warming (AGW) is likely to turn out to be fake. The situation will become clear once Virginia’s attorney general, Kenneth Cuccinelli, obtains information now buried in e-mails at the University of Virginia. Or Hearings on Climategate by the U.S. Congress may uncover the “smoking gun” that demonstrates that the warming trend used by the IPCC does not really exist.
and more on the Case
http://legaltimes.typepad.com/blt/2010/11/global-warming-foia-suit-against-nasa-heats-up-again.html#more
Global Warming FOIA Suit Against NASA Heats Up Again
In court documents filed last night, the Competitive Enterprise Institute argues that NASA has gone out of its way to avoid turning over records that show the agency reverse-engineered temperature data to better make the case that the planet is becoming warmer
I’ve just posted this at Jo Nova’s blog but worth repeating here
Looking at BOM and temperatures I was directed from WUWT to this comment by Warwick Hughes
http://www.warwickhughes.com/climate/gissbom.htm
At http://www.warwickhughes.com/climate/bom.htm Warwick says ‘Internationally the BoM does not have things all its own way, NASA / GISS publish on their web site (see www links) UHI adjusted trends for Australian cities and also have for download SE Australian rural records that show no warming since late 19 C, which the BoM say are incorrect’
the comment at http://www.warwickhughes.com/climate/gissbom.htm
is posted 27/9/2000 and Warwick says ‘This page presents three Australian State Capital temperature records compared to nearby rural or more rural neighbours.
This is a significant area for study because south eastern Australia has probably the best set of long term rural temperature records in the southern hemisphere.
GISS / NASA adjustments for urban warming are presented in the case of the State Capitals, contrasting with the Australian Bureau of Meteorology (BoM) adjustments.’
The difference is stark, with BoM finding much warming over SE Australia since the late 19th century while NASA finds little trend.
At http://www.warwickhughes.com/climate/ 26, June, 2005, Warwick has a site which in his words exposes the errors and distortions in temperature records used by the IPCC as evidence of “global warming”, Warwick says the central contention of these pages is that for over a decade the IPCC has published global temperature trends distorted by purely local warmth from Urban Heat Islands (UHI’s). There is simply no systematic compensation for urban warming in the Jones dataset. Occasionally there is a slight adjustment in a record for a site change or other anomaly but the majority of records are used “raw”.
Warwick gives some history of this amazing work at http://www.warwickhughes.com/
For those who do not know this is his blog and you can learn about him as the ‘about’ button http://www.warwickhughes.com/blog/
He was also the recipient of the famous Jones letter which you can read about on WUWT http://wattsupwiththat.com/2009/09/23/taking-a-bite-out-of-climate-data/ ‘The Dog ate Global Warming’ That article informs us that ‘Warwick Hughes, an Australian scientist, wondered where that “+/–” came from, so he politely wrote Phil Jones in early 2005, asking for the original data. Jones’s response to a fellow scientist attempting to replicate his work was, “We have 25 years or so invested in the work. Why should I make the data available to you, when your aim is to try and find something wrong with it?”
WUWT has an interesting post up today A J Strata
http://wattsupwiththat.com/
Bottom Line – Using two back-of-the-envelope tests for significance against the CRU global temperature data I have discovered:
■75% of the globe has not seen significant peak warming or cooling changes between the period prior to 1960 and the 2000′s which rise above a 0.5°C threshold, which is well within the CRU’s own stated measurement uncertainties o +/- 1°C or worse.
■Assuming a peak-to-peak change (pre-1960 vs 2000s) should represent a change greater than 20% of the measured temperature range (i.e., if the measured temp range is 10° then a peak-to-peak change of greater than 2° would be considered ‘significant’), 87% of the Earth has not experienced significant temperature changes between the pre-1960 period and the 2000s.
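The two back-of-the-envelope tests quoted above reduce to simple arithmetic. A minimal sketch of that arithmetic, assuming per-location peak values; the function name, parameter defaults, and toy numbers are mine, not A J Strata's:

```python
def significant_change(pre1960_peak, post2000_peak, measured_range,
                       abs_threshold=0.5, range_fraction=0.2):
    """Sketch of the two significance tests described above.

    Test 1: does the peak-to-peak change exceed an absolute threshold
            (0.5 C, sitting inside the stated +/- 1 C uncertainty)?
    Test 2: does the change exceed 20% of the location's measured
            temperature range (e.g. > 2 C for a 10 C range)?
    Returns a (test1, test2) pair of booleans.
    """
    delta = abs(post2000_peak - pre1960_peak)
    return delta > abs_threshold, delta > range_fraction * measured_range

# Hypothetical grid cell: pre-1960 peak 14.2 C, 2000s peak 14.5 C, 10 C range
print(significant_change(14.2, 14.5, 10.0))  # -> (False, False)
```

On this reading, the 75% and 87% figures are just the fraction of grid cells failing test 1 and test 2 respectively.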
GISS Temperatures Out Of Line With The Rest Of The World
Posted on December 11, 2010 by stevengoddard
GISS shows temperatures rising sharply since July. We have been having a record cold La Niña since then, and everyone else shows temperatures plummeting.
GISS also showed a huge spike in March which nobody else saw. Does this have anything to do with Hansen’s constant claims of 2010 as the hottest year ever? He shows peak La Niña temperatures almost as warm as peak El Niño temperatures. That is simply ridiculous.
See Plots……
Yes, Richard I saw that this morning and left a comment on Warwick Hughes’ blog; I’m hoping he will make a comment;
Dr Ball has an article here http://www.canadafreepress.com/index.php/article/20528
he says How agencies like the Goddard Institute for Space Studies (GISS) or the Climatic Research Unit (CRU) adjust for the UHIE is the crux of the problem.
the article contains excerpts from Warwick Hughes’ research which finished in 1991 and was continued by Long
Dr Ball ends by saying ‘Now you see why the CRU and IPCC limited the number of stations they were using and restricted them to mostly urban stations to get the result they wanted. You also understand why Tom Wigley told Jones in a leaked email of November 6, 2009 that, “We probably need to say more about this (Difference between land and ocean temperatures). Land warming since 1980 has been twice the ocean warming and skeptics might claim that this proves that urban warming is real and important.” Exactly Tom! ‘
I had an idea that Warwick had a 1995 peer reviewed paper on the topic but I can’t find it
there is this excellent paper by the late John Daly relating to the radio sonde record
http://www.john-daly.com/sonde.htm
That radio sonde comparison shows the blatant misinformation (science?) promulgated (by Santer, in this case).
No wonder the warmists cheered when John Daly passed away.
NIWA Releases Review Of NZ Temperature Trends
Thursday, 16 December, 2010 – 18:14 – Voxy
NIWA today released a report reviewing its seven station temperature series, which adds to its analysis of New Zealand’s temperature trends over the past 100 years.
The report was independently peer reviewed by Australia’s Bureau of Meteorology to ensure the ideas, methods, and conclusions stood up in terms of scientific accuracy, logic, and consistency.
NIWA CEO John Morgan confirmed that the scientists from the Bureau’s National Climate Centre concluded that the results and underlying methodology used by NIWA were sound.
“We asked the Australian Bureau of Meteorology to conduct the peer review to ensure a thorough examination by an independent, internationally respected, climate science organisation”, said NIWA CEO John Morgan.
NIWA’s seven station temperature series comprises temperature records from Auckland, Wellington, Masterton, Nelson, Hokitika, Lincoln, and Dunedin. The seven locations were chosen because they have robust and well documented temperature records that go back 100 years or more, and they represent a good geographical spread across New Zealand.
Temperature data from the seven locations were first examined 30 years ago by leading New Zealand climatologist, Dr Jim Salinger. After making some adjustments for changes in measurement sites, Dr Salinger concluded that the average New Zealand temperature had warmed significantly during the 20th Century.
The series from the seven stations were reviewed in 1992, and then updated annually. They indicated a warming of about 0.9C over the 100 years up to 2009.
In 2010, NIWA re-analysed the Hokitika station temperature series and published the results to demonstrate the methodology applied in creating a temperature series. Because of the public interest in climate data, the NIWA Board and the Minister of Research, Science & Technology, Dr Wayne Mapp, asked that a full review of each of the seven sites be undertaken by NIWA. That review has been completed, independently peer reviewed, and the report released today represents the results of that work.
“I am not surprised that this internationally peer reviewed 2010 report of the seven station temperature series has confirmed that NIWA’s science was sound. It adds to the scientific knowledge that shows that New Zealand’s temperature has risen by about 0.9 degrees over the past 100 years” Mr Morgan said.
7 station series review
December 16, 2010
You can also download the full report. (169 page PDF, 7.6 MB)
[Snip]
Questions and Answers
What does the re-analysis show?
The key result of the re-analysis is that the New Zealand-wide warming trend from the seven-station series is almost the same in the 2010 revised series as in the previous series. That is, the previous NIWA result is robust. In terms of the detail for individual sites, the 100-year trend has increased at some sites, and decreased at others, but it is still within the margin of error, and confirms the temperature rise of about 0.9 degrees over the last 100 years.
What is the seven station temperature series?
The seven station series is an analysis of temperature trends from climate station data at seven locations which are geographically representative of the country, and for which there is a long time series of climate records (over 100 years): Auckland, Wellington, Masterton, Nelson, Hokitika, Lincoln (near Christchurch), and Dunedin.
The concept of the seven-station temperature series was originally developed by Dr Jim Salinger in 1981. He recognised that, although the absolute temperatures varied markedly from point to point across the New Zealand landscape, the variations from year to year were much more uniform, and only a few locations were actually required to form a robust estimate of the national temperature trend.
The seven-station series was revised and updated in 1992, and again in 2010.
How is the seven station series constructed?
At each of the seven locations there have been changes in specific climate station location over time. When you create a long time series by adding information from each of these station locations together, you have to make adjustments to account for these changes.
There have been various changes in location etc. over time at each of the seven locations making up the seven station series. In order to create a long time series at each location, it is necessary to merge temperature records from a number of specific sites. When merging different temperature records like this, it is necessary to adjust for climatic differences from place to place, or significant biases would be introduced. Adjustments may also be needed even if the site does not physically move, because changes in exposure or instrumentation at a given site can also bias the temperature measurements.
Why do you need to adjust the raw data?
Long time series of climate data often have artificial discontinuities – caused, for example, by station relocations, changes in instrumentation, or changes in observing practice – which can distort, or even hide, the true climatic signal. Therefore, all climatic studies and data sets should be based on homogeneity-adjusted data.
That is what NIWA climatologists have done in the seven station series, and the seven individual station review documents outline the adjustments.
The raw (original) climate station data have not been changed and are freely available on the NIWA climate database, which means that the NIWA seven station series can be easily reproduced.
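The splicing-with-adjustment process described in this Q&A can be sketched in a few lines. This is a generic illustration of overlap-based homogenisation, not NIWA's actual method; the function name and toy temperatures are mine:

```python
def merge_with_offset(old_segment, new_segment, overlap_years):
    """Splice two overlapping site records into one homogeneous series.

    A common simple approach (a sketch, not NIWA's exact procedure):
    estimate the climatic difference between the two sites from their
    overlap period, then shift the older record by that offset before
    joining. Segments are dicts mapping year -> annual mean temperature.
    """
    # Mean difference (new site minus old site) over the overlap years.
    diffs = [new_segment[y] - old_segment[y] for y in overlap_years]
    offset = sum(diffs) / len(diffs)
    merged = {y: t + offset for y, t in old_segment.items() if y not in new_segment}
    merged.update(new_segment)
    return merged, offset

# Hypothetical pair of sites with a 0.5 C difference over two overlap years
old = {1950: 12.0, 1951: 12.2, 1952: 12.1}
new = {1951: 12.7, 1952: 12.6, 1953: 12.8}
merged, offset = merge_with_offset(old, new, [1951, 1952])
print(round(offset, 2))  # -> 0.5
```

The whole seven-station dispute is, in effect, an argument about how such offsets are estimated and which site changes justify applying one.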
How does the NZ temperature rise compare with the global temperature rise?
The 2010 NIWA review of long-term trends in New Zealand’s annual mean temperature showed a rise of about 0.9 degrees Celsius over the last 100 years (0.91 +/- 0.3 degrees C per century). The IPCC estimate of the 100 year global temperature trend from 1906 to 2005 is 0.74 degrees Celsius. The difference between the IPCC global trend and NIWA’s New Zealand number is less than the margin of error of the trend estimates. The IPCC estimate is a global average – different parts of the world will show different rates of warming. New Zealand is strongly influenced by its oceanic environment.
What is the peer review process?
In this scientific context, peer review constitutes a thorough evaluation of the NIWA documents by individuals with appropriate qualifications in the same field of science (climatology/meteorology). The documents were written by highly qualified NIWA scientists, internally reviewed by other similarly qualified NIWA scientists, and the revised documents were then externally reviewed by Australian Bureau of Meteorology scientists. The external review examined the ideas, methods, and conclusions for scientific accuracy, clarity, and logic.
The NIWA authors then addressed the comments made by the reviewers at the Bureau of Meteorology, and the subsequently revised documents have now been published on the NIWA website.
But how does it compare with the Southern Hemisphere (SH) trend?
And you’ve even found more warming than the global trend, NIWA.
From page 13 of the pdf report
The review does not constitute a reanalysis of the New Zealand ‘seven station’ temperature
record. Such a reanalysis would be required to independently determine the sensitivity of, for
example, New Zealand temperature trends to the choice of the underlying network, or the
analysis methodology. Such a task would require full access to the raw and modified
temperature data and metadata, and would be a major scientific undertaking
Well spotted, Andy. They’re saying that what they accused the NZCSC of “already knowing”, back in November 2009, is actually “a major scientific undertaking”.
Page 8
Page 7
These straight line trends are meaningless – it’s a sine curve.
SST trend +0.71C
7SS trend +0.91C
But atmosphere warming significantly faster than ocean ???
Both of those show cooling over the last decade – Ha!
Page 13
Page 25
They found even MORE warming at Auckland – it’s worse than we thought.
(and Masterton, and Wellington)
Basically NIWA gave BOM its methodology and BOM confirmed that NIWA used it – seems reasonable given it’s NOT A REANALYSIS.
Court Orders University to Surrender Global Warming Records
http://www.suite101.com/content/court-orders-university-to-surrender-global-warming-records-a328888
The University of Virginia (U.Va.) had stalled since last year in handing over its records relating to accusations against a former academic employee implicated in the Climategate affair of November 2009.
The researcher at the center of the underlying controversy is global warming doomsayer Professor Michael Mann, who now works at Penn State University. Mann, a Lead Author for the United Nations Intergovernmental Panel on Climate Change (IPCC), has been under increased scrutiny since the climate fraud scandal hit the headlines over a year ago.
The latest story appears on the SPPI website which reports, “Court records reveal that counsel for the University has indicated instead that the Mann-related records do in fact exist, on a backup server. To avoid University delay or claims for huge search fees, today’s request specifically directs the school to search that server.”
Wonder if they will find anything?
Lubos Motl: Is there a 66-year cycle in temperatures?
Thursday, March 3rd 2011, 2:24 PM EST – Co2sceptic
Many people have noticed that the global mean temperature was increasing in the first third of the 20th century (33 years), slightly decreasing in the second third of the 20th century (33 years), and increasing in the last third of the 20th century (33 years or so).
If this evolution may be extrapolated in the obvious way, there is a 66-year cycle in the temperatures. The 20th century warming is biased because the century included two warming half-periods but only one cooling half-period: that’s why the warming trend in the 20th century could be much higher than the long-term one.
Some of the warmest years – at least in the U.S. – appeared approximately 66 years before the warm years such as 1998.
Continues………
Examining the past 15 years of monthly global temperature anomalies, the per century change from a warming trend to a cooling trend becomes clear.
Since 2001 the per century trends have conclusively switched from a global warming direction to a global cooling direction. In addition, the early 2011 temperature anomalies confirm what has actually been taking place since 2001. If the May 2011 10-year trend continues, the global temperature by 2100 will have decreased by 0.67°C.
This warming to cooling reversal has happened in the face of “business as usual” increases in atmospheric CO2 levels.
http://www.c3headlines.com/2011/07/latest-global-temperature-data-confirms-that-unequivocal-global-cooling-is-accelerating.html
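The "if the 10-year trend continues" projection quoted above is just a least-squares slope extended forward. A sketch of that arithmetic on made-up data (the function name and toy series are mine; the -0.67°C figure is the site's own, computed from real anomalies):

```python
def linear_trend_per_decade(times, anomalies):
    """Ordinary least-squares slope of anomaly vs. time (decimal years),
    returned in degrees per decade -- the quantity behind simple
    'if this trend continues to 2100' extrapolations."""
    n = len(times)
    mt = sum(times) / n
    ma = sum(anomalies) / n
    slope = (sum((t - mt) * (a - ma) for t, a in zip(times, anomalies))
             / sum((t - mt) ** 2 for t in times))  # degrees per year
    return slope * 10.0

# Hypothetical monthly series cooling at exactly 0.1 C per decade
years = [2001 + i / 12 for i in range(120)]
anoms = [0.5 - 0.01 * (y - 2001) for y in years]
trend = linear_trend_per_decade(years, anoms)
print(round(trend, 3))  # -> -0.1
# Extrapolating this toy -0.1 C/decade trend over ~90 years gives roughly
# -0.9 C by 2100; the quoted -0.67 C claim is the same arithmetic applied
# to the actual May 2011 10-year trend.
```

Worth noting the obvious caveat, cutting both ways in this debate: a 10-year OLS slope extrapolated 90 years forward carries no statistical weight on its own.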
—————————————————————————————————————————-
Phil Jones recently proclaimed that global warming finally has become “statistically significant” since 1995.
Jones was speaking of the 15 years starting in 1996 and ending in 2010. Although the warming has become “statistically significant” in his opinion (not others), the actual level of warming is literally immaterial when put into the context of catastrophic warming (from 5 to 10 degrees Celsius by 2100) pushed by government payroll scientists enthralled (enriched?) with alarmism.
To better understand the level of immaterial warming that has happened, look at the chart above (adjacent). The red curve is a plot of monthly HadCRUT anomalies for the 15 years (180 months) ending May 2011. The light blue curve is a 2nd order curve fit of the anomalies. The black dots represent monthly atmospheric CO2 levels and gray curve the 2nd order fitting to those CO2 levels. (charts and stats done in Excel)
[…]
Below are charts for other major temperature datasets for the last 15 years ending May 2011. Regardless of the dataset, global temperatures are in a slight cooling phase presently and they could continue to go down, and then again, they may not. That’s what natural climate change is about.
http://www.c3headlines.com/2011/06/phil-jones-global-warming-over-last-15-years-is-insignificant-immaterial-irrelevant-and-inco.html
Hansen And His Purple Crayon
by Steven Goddard
Hansen found the region north of 80N pretty darn hot during September, even though he doesn’t have any thermometers there. DMI does have thermometers there, and they found that September was normal or below normal. Whatever Hansen is doing bears no resemblance to science.
[See plot]
With 250km smoothing, it is clear that the hot area north of 80N is all “missing data.”
[See plot]
DMI shows normal to below, using the same baseline period.
[See plot]
By making up hot temperatures in the Arctic, Hansen is able to drag the global temperature anomaly up significantly.
>>>>>>>>>>
http://www.real-science.com/hansen-purple-crayon
New GISS Data Set Heating Up The Arctic
Posted on January 15, 2012 by Steven Goddard
GISS has invented some new corrections to heat the Arctic. Until a few weeks ago, Reykjavik, Iceland looked like the graph below, with the 1930s and 1940s warmer than the present:
[See plot]
Now it looks like this, with the 1930s and 1940s much cooler than the present.
[See plot]
The new technique is called “removing suspicious records” – i.e. any record which doesn’t support his theory.
Hansen seems to have adopted a general strategy of warming the present by cooling the 1930s and 1940s.
[See blink comparators and 1940 Arctic warming documentation]
http://www.real-science.com/new-giss-data-set-heating-arctic?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Real-Science%2Ffeed+%28Real+Science%29
Cooling The Past In Nuuk
Posted on January 15, 2012 by Steven Goddard
Ole Heinrich points out that Hansen is cooling the past all over the Arctic. Compare the current Nuuk, with the one from a few weeks ago.
[See blink comparator]
Hansen must have spoken to his witch doctor, who told him that all temperatures before 1980 were exactly one degree too high, and that all temperatures since 1990 were a little too low.
[See adjustments plot]
Richard Muller (the skeptic) says that GISS data is all golden.
http://www.real-science.com/cooling-nuuk
Hansen Warming Things Up In Reykjavik
By Paul Homewood January 17, 2012
Following on from yesterday’s Icelandic Saga, I now have the real temperature records for Reykjavik from the Iceland Met Office. You will recall that GISS had recently altered their temperature records for several stations in Iceland and Greenland – full story here [linked]. Information on the Iceland Met Office site did not seem to support Hansen’s revised figures, but now I have the full temperature record for Reykjavik from them, which shows how much GISS have adjusted their temperatures up by.
>>>>>>>>>>>>
http://notalotofpeopleknowthat.wordpress.com/2012/01/17/hansen-warming-things-up-in-reykjavik/
GISS Make The Past Colder In Reykjavik
By Paul Homewood January 24, 2012
Yesterday, while finishing off my post “NOAA Don’t Believe The Iceland Met Office”, by chance I took another look at the latest GISS graph for Reykjavik, which is shown above. It showed, just as it had last week, that the temperature in 2003 was much higher than in 1939 and 1941, which as we now know from the Iceland Met Office is not true. However, something else looked wrong that I could not put my finger on.
Fortunately, however, I had kept a printout of last week’s dataset and eventually I realised something else had changed. Effectively the scale had been shifted down, so that every single year became about a degree cooler.
>>>>>>>>>
Footnote
I have contacted the Iceland Met Office to get their view on the GHCN adjustments. Their reply could not be clearer :-
a) They were not aware these adjustments were being made.
b) They do not know why they have been made, but are asking!
c) They do not accept the “corrections” and have no intention of amending their own records.
http://notalotofpeopleknowthat.wordpress.com/2012/01/24/giss-make-the-past-colder-in-reykjavik/
How GISS Has Totally Corrupted Reykjavik’s Temperatures
By Paul Homewood January 25, 2012
Now that GHCN have created a false warming trend in Iceland and Greenland, and GISS have amended every single temperature record on their database for Reykjavik going back to 1901 (except for 2010 and 2011), we should have a look at the overall effect.
[See plot]
The red line reflects the actual temperature records provided by the Iceland Met Office and shows quite clearly a period around 1940, followed by another 20 years later, which were much warmer than the 1970’s. GISS, as the blue line shows, have magically made this warm period disappear, by reducing the real temperatures by up to nearly 2 degrees.
Meanwhile the Iceland Met Office say that “The GHCN “corrections” are grossly in error in the case of Reykjavik”.
http://notalotofpeopleknowthat.wordpress.com/2012/01/25/how-giss-has-totally-corrupted-reykjaviks-temperatures/
Google Warming: Google Sponsors Student To Fabricate “Global Warming” Temperatures For NASA
Corruption of climate science takes all sorts of forms – one is to fabricate global warming temperatures after the fact, using “correcting” algorithms that NASA / GISS favors, which it now appears to have been outsourced to a Google-funded effort – aka ‘Google Warming’
[…]
The adjustments done to historical temperatures during 2011 provides further evidence that climate data corruption is alive and well within the climate science community. But the big surprise is who actually performed the magical global warming of Arctic regions….
Voila, we can now add the term ‘Google Warming’ to the climate debate – perhaps understood to mean the following?: “to fabricate global warming.”
http://www.c3headlines.com/2012/03/google-warming-google-sponsors-student-to-fabricate-global-warming-temperatures-for-nasa.html
Doing The Dirty Deed In Dublin
Posted on January 18, 2012 by Steven Goddard
Hansen just whacked a degree off all pre-1990 temperatures in Dublin.
[See blink comparator]
http://www.real-science.com/dirty-deed-dublin
Just out:
Hausfather et al
The Impact of Urbanization on Land Temperature Trends
While urban warming is a real phenomenon, it is overweighted in land temperature reconstructions due to the oversampling of urban areas relative to their global land coverage. Rapid urbanization over the past three decades has likely contributed to a modest warm bias in unhomogenized global land temperature reconstructions, with urban stations warming about ten percent faster than rural stations in the period from 1979 to 2010. Urban stations are warming faster than rural stations on average across all urbanity proxies, cutoffs, and spatial resolutions examined, though the underlying data is noisy and there are many individual cases of urban cooling. Our estimate for the bias due to UHI in the land record is on the order of 0.03C per decade for urban stations. This result is consistent with both the expected sign of the effect and regional estimates covering the same time period (Zhou et al 2004) and differs from some recent work suggesting zero or negative UHI bias.
http://eposters.agu.org/files/2011/12/Hausfather-et-al-Urban-Trends.pdf
Anthony Watts does a summary of this work here
http://wattsupwiththat.com/2011/12/05/the-impact-of-urbanization-on-land-temperature-trends/
HadCRUT Continues To Plummet
Posted on December 26, 2011 by Steven Goddard
The HadCRUT November anomaly was the fourth lowest of the millennium.
http://www.real-science.com/hadcrut-continues-plummet?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Real-Science%2Ffeed+%28Real+Science%29
HadCRUT3v is the global land+ocean series (as opposed to other land-only series e.g. BEST)
GISTEMP anomaly is at 0 for the first time since 1993.
http://junksciencearchive.com/MSU_Temps/All_Comp.png
December 2011 datapoint. Baseline 30-year mean 1981-2010
Renowden may be crowing in ‘McLean’s folly 2: the reckoning’ but he’s ended his series at Nov, anomaly +0.51 C.
If he’d added Dec, the anomaly would be at 0 C
Might not be crowing in 2012.
I’m at a loss to explain Gareth’s “GISS” plot. It shows a +0.51 C Nov 2011 anomaly from the “1880-2011 average”.
http://hot-topic.co.nz/wp-content/uploads/2012/01/Maclean1956outcome.png
But the current GISTEMP anomaly from the 1981-2010 30-year mean baseline also had +0.51 C (or close).
http://junksciencearchive.com/MSU_Temps/All_Comp.png
Also, the “GISS” series from 1978/12 – 2011/11 looks nothing like the corresponding GISTEMP series. Gareth’s “GISS” series was at 0 anomaly in 1978/12 and does not go below that thereafter, but GISTEMP is at or below 0 six times after 1978/12, the last time being 1993/12.
Are “GISS” and GISTEMP two totally different series?
“GISS” might be land-only because the JunkScience source for GISTEMP is land+SST:-
http://junksciencearchive.com/MSU_Temps/GISSglobal.png
I had two more comments posted under the Dom Salinger article #75 and #76 addressed to RW, #75 is:-
******************************************************************************************************
Nonentity #75 05:14 pm Jan 13 2012
RW #73, I see the leader of your local groupthink circle (Gareth Renowden at Hot Topic) has just used GISTEMP to crow about a 0.51 C Nov 2011 anomaly.
Good thing he didn’t end his series with the Dec 2011 anomaly (0, first time since 1993) otherwise he’d have to explain an exactly average anomaly.
Tricky, and a bit boring for the circle I would imagine.
http://www.stuff.co.nz/dominion-post/comment/6229217/Extreme-is-the-new-normal
******************************************************************************************************
It will be fun if RW responds and is printed (braces for causticism).
Answering my own question, Gareth’s data source is here:-
http://data.giss.nasa.gov/gistemp/tabledata_v3/GLB.Ts+dSST.txt
Land-Ocean, base period: 1951-1980. I must be wrong about the “1981-2010 30-year mean baseline” for the JunkScience Monthly Mean plot.
Exactly the same data source as JunkScience for GISTEMP:-
http://junksciencearchive.com/MSU_Temps/GISSglobal.png
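The baseline confusion here comes down to simple re-referencing: the same anomaly series can be expressed against a different base period by subtracting its mean over that new period, which shifts every point by a constant. A minimal sketch (function name and toy numbers are mine):

```python
def rebaseline(series, new_base_years):
    """Convert anomalies from one base period to another by subtracting
    the series' mean anomaly over the new base period. This is why two
    plots of identical data can sit at visibly different levels.
    series: dict mapping year -> anomaly on the old base period."""
    offset = sum(series[y] for y in new_base_years) / len(new_base_years)
    return {y: a - offset for y, a in series.items()}

# Hypothetical anomalies on a 1951-1980 base, re-expressed on a warmer
# recent base period -- the modern values drop toward zero.
series = {1951: 0.0, 1980: 0.0, 2010: 0.5, 2011: 0.5}
rebased = rebaseline(series, [2010, 2011])
print(rebased[2011])  # -> 0.0
```

So a +0.51 anomaly on a 1951-1980 base and a near-zero anomaly on a 1981-2010 base can describe exactly the same month.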
But what has Gareth plotted? (“the graph I created”)
First of all, the +0.51 C is the D-N Ann Mean anomaly, not “Jan – Nov 2011” as he states, and not the +0.48 Nov anomaly that I thought was +0.51 (my error).
He’s plotted the D-N Ann Means for each year – not the monthly anomalies – so there are obviously a lot fewer datapoints. There’s a negative monthly anomaly at 1994/2 that does not occur on “the graph I created”. His first negative anomaly is the 1976 Ann Mean.
http://hot-topic.co.nz/mcleans-folly-2-the-reckoning/
http://hot-topic.co.nz/wp-content/uploads/2012/01/Maclean1956outcome.png
2011 still much higher than 1956 though but he could be in for a rude shock when he plots the D-N Ann Mean at the end of this year.
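For readers unfamiliar with the "D-N" convention discussed above (December of the previous year through November of the stated year, as used in the GISTEMP annual-mean column), it can be sketched as follows; the function name and uniform toy anomalies are mine:

```python
def dn_annual_mean(monthly, year):
    """'D-N' annual mean for a given year: December of the previous year
    through November of the given year, as in the GISTEMP table columns
    discussed above. monthly maps (year, month) -> anomaly."""
    months = [(year - 1, 12)] + [(year, m) for m in range(1, 12)]
    return sum(monthly[ym] for ym in months) / 12.0

# Hypothetical data: every month Dec 2010 - Nov 2011 at exactly +0.51
monthly = {(2010, 12): 0.51}
monthly.update({(2011, m): 0.51 for m in range(1, 12)})
print(round(dn_annual_mean(monthly, 2011), 2))  # -> 0.51
```

Plotting these 12-month means rather than individual monthly anomalies smooths out single cold months like 1994/2, which is the discrepancy noted above.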
Here’s the REAL land temps currently in Canterbury (forget NZT7 and Stevenson’s Screens)
10cm Soil Temperature degrees Celsius
http://www.hydroservices.co.nz/images/pdf/soil.pdf
The 10 C line is when “everything shuts down” apparently. That the current soil temperatures are close to the 10 C line so soon after summer (due to cloudiness) is not good from what I can gather.
As found at Hydro Services after watching ‘Rob’s Country’ on Central TV. Click on ‘AS SEEN ON CTV VIEW Previous Week’s Soil Temperatures’.
http://www.hydroservices.co.nz/
Central and CTV are different outfits, ‘Rob’s Country’ is a CTV programme that was based in the CTV building that collapsed in the earthquake. Rob was not in the building at the time and he operated out of Central in the Waikato for a while after I think.
What will these temps do to the SI-biased NZT7 2012 figure (12.8 2011, 13.1 2010)?
Lincoln is the only actual NZT7 location mentioned but some of the other SI locations are close by to Nelson, Hokitika and Dunedin at a stretch.
From our correspondent in South Canterbury:
I took last Monday afternoon off to go skiing with my son. It was minus 4 degrees C at 11am as we left the garage. It stayed at that temp until 10 km out of town, when it rose to a whole 1 degree C.
I’m guessing that it got a whole lot colder than -4 during the night. The 2cm of ice on the dog’s drinking bowl might be a clue.
Unfortunately for ski lovers like myself, our local skifield (Dobson) is one of the few not open, due to lack of snow.
The high winds a week ago blew the snow away and there hasn’t been any snow since because of the high pressure system over the country.
Did you de-ice the dog?
No snow here in Tga either, sunny with schoolie girls out parading in miniskirts.
Our proxy for this being fog-bound Auckland Airport.
My concern for your dog arises from the Oymyakon doco. The inhabitants brought their horses (meat source apparently) in regularly to remove accumulated ice from their backs.
My dog gets to sleep inside. Mind you, I have seen the huskies sleeping outside in the snow in Norway. They breed them tough over there.
Interesting development. Judith Curry features 3 papers from OUTSIDE of climate science and asks:-
JC comments: Each of these papers provides a fresh perspective on interpreting surface temperature data trends. The authors of all these papers have academic training and expertise that lies well outside of the field of atmospheric science and climatology. I have long said that fresh perspectives from outside the field of climate science are needed. Congrats to Ross for getting his paper published in Climate Dynamics, which is a high impact mainstream climate journal. My main question is whether the lead authors of the relevant IPCC AR5 chapter will pay attention to these papers; they should (and the papers meet the publication deadline for inclusion in the AR5).
http://judithcurry.com/2012/06/21/three-new-papers-on-interpreting-temperature-trends/
One of those papers is:-
Did the global temperature trend change at the end of the 1990s?
Tom Quirk
Institute of Public Affairs
Melbourne
Australia
http://ipa.org.au/library/publication/1339463007_document_break_paper_apjas_ipa.pdf
Chances of appearing in AR5 may be diminished by acknowledgment of 2 prominent sceptics:-
Acknowledgements: I have benefited greatly from discussions with William Kininmonth and in particular, David Stockwell who introduced me to the Chow Break Test.
Conclusion
There is a strong set of coincident events at or around 2000 that suggest the onset of a cool phase of the Pacific Decadal Oscillation. This is supported by the decreasing humidity in the Northern Pacific Ocean after the break in 2000 (Figure 6) where the probability of the straight line fit showing no decrease is 3%. However for the global surface temperature this analysis has not established whether the cool phase of the Pacific Decadal Oscillation dominates the warm phase of the North Atlantic Decadal Oscillation.
The variations in global temperature, atmospheric CO2, water vapour and atmospheric methane all indicate the importance of the Atlantic and Pacific Decadal Oscillations. This is not easily taken into account in General Circulation Models and until there is a better understanding of the long term behaviour of the oceans, it must be a significant difficulty in projecting future temperatures.
NOTE: the Quirk paper is a “break point” analysis similar to what BOM used for ACORN-SAT and NZCSET used for the ‘Statistical Audit’ of NIWA’s 7SS.
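The Chow break test mentioned in the Quirk acknowledgements compares one regression over the whole series against separate regressions before and after a candidate break point. A textbook sketch (my own function and toy data, not the paper's or BOM's code):

```python
def chow_statistic(x, y, break_index):
    """Chow break-test F statistic: one pooled OLS line vs. separate
    lines fitted before and after break_index. A large F indicates a
    structural break at that point."""
    def rss(xs, ys):
        # Residual sum of squares from a simple OLS line through (xs, ys).
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        b = (sum((v - mx) * (w - my) for v, w in zip(xs, ys))
             / sum((v - mx) ** 2 for v in xs))
        a = my - b * mx
        return sum((w - (a + b * v)) ** 2 for v, w in zip(xs, ys))

    n, k = len(x), 2  # k = parameters per fitted line (intercept + slope)
    rss_pooled = rss(x, y)
    rss_split = (rss(x[:break_index], y[:break_index])
                 + rss(x[break_index:], y[break_index:]))
    return ((rss_pooled - rss_split) / k) / (rss_split / (n - 2 * k))

# Hypothetical series: small wiggle around 0 for ten points, then around 1.
x = list(range(20))
y = [((-1) ** i) * 0.05 + (1.0 if i >= 10 else 0.0) for i in range(20)]
f = chow_statistic(x, y, 10)  # large F => evidence of a break at index 10
```

Applied to temperature series, the test asks whether a shift (such as the one Quirk locates around 2000) fits the data better than a single continuous trend.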
NOAA Conducts Large-Scale Experiment And Proves Global Warming Skeptics Correct
So, what has this NOAA experiment found? The bottom image (Image #3) tells that story – when compared to measurements from the old, inaccurate, non-pristine network, temperature “warming” in the U.S. is being overstated anywhere from +0.5°C on average, up to almost +4.0°C (+0.9°F to +7.2°F) in some locations during the summer months.
To clarify, this range of overstatement depends on the given new and old stations being compared. However, when the new network versus old network results are examined in total, for the recent summer heat wave in the U.S., the old stations were reporting bogus warming during July that amounted to some +2.1°F higher than the actual temperatures.
What does this mean? Within the climate science realm, the old climate/weather station system had long been considered the best and most complete measurement network in the world. But when pitted against a brand new climate measurement system that has the best qualities that science can provide, we find that the traditional U.S. methodology is significantly overstating the “global warming” phenomenon. This means that if other countries replaced their own low quality network with NOAA’s greatest and latest technology, with the best location site standards applied, we would discover that world-wide temperature increases have been wildly overstated also.
>>>>>>>
http://www.c3headlines.com/2012/09/noaa-conducts-large-scale-experiment-and-proves-global-warming-skeptics-correct.html
How GISS Temperatures Diverge From RSS
By Paul Homewood
http://notalotofpeopleknowthat.wordpress.com/2012/11/26/how-giss-temperatures-diverge-from-rss/
Quoting:-
So, since 1998, GISS have found an extra 0.33C of warming that RSS cannot find. This is an astonishing figure, which effectively doubles the amount of warming found in total since 1979.
Comparison of GISS with HADCRUT and UAH shows a similar divergence.
As Doctor McCoy might have said “It’s temperature, Jim, but not as we know it”.
Have to say I’m astounded by the UAH and RSS January anomalies (0.3 C and more above Dec). I would have lost money betting on it.
http://wattsupwiththat.com/2013/02/06/from-uah-global-temperature-report-january-2013/
“Compared to seasonal norms, over the past month the coldest area on the globe was east central Russia near the town of Nyagan, where temperatures for the month averaged as much as 2.51 C (about 4.5 degrees Fahrenheit) cooler than seasonal norms. Compared to seasonal norms, the “warmest” area on the globe in January was the Norwegian arctic archipelago of Svalbard, which is north of Norway and east of Greenland. Temperatures there averaged 4.1 C (about 7.4 degrees Fahrenheit) warmer than seasonal norms for January.”
The Australian heat wave was just a minor contributor in this geo-spatial plot:-
http://wattsupwiththat.files.wordpress.com/2013/02/january-2013.png?w=960&h=594
There must have been very sudden tropospheric warming in polar/sub-polar regions for unexplained reasons from what I can gather. But that is just what the warmists predict of course.
Ryan Maue – February 2013 global temperature anomaly compared to 1981-2010 mean: -0.001°C or 1/1000th of a degree below avg.
https://twitter.com/RyanMaue/status/307309843356725249
This after a January spike up.
Big drop in global surface temperature in February, ocean temps flat
By Dr. Roy Spencer
UAH Global Temperature Update for February, 2013: +0.18 deg. C
Our Version 5.5 global average lower tropospheric temperature (LT) anomaly for February, 2013 is +0.18 deg. C, a large decrease from January’s +0.50 deg. C.
Global Microwave Sea Surface Temperature Update for Feb. 2013: -0.01 deg. C
The global average sea surface temperature (SST) update for Feb. 2013 is -0.01 deg. C, relative to the 2003-2006 average:
http://wattsupwiththat.com/2013/03/04/big-drop-in-global-surface-temperature-in-february-ocean-temps-flat/#more-81350
HadAT2 (radiosonde balloons) has been updated through December 31, 2012
http://www.c3headlines.com/2013/02/ipccs-global-warming-hypothesis-fails-ultimate-test-no-tropical-hotspot-after-17-years-of-immense-co.html
No hotspot
http://c3headlines.typepad.com/.a/6a010536b58035970c017ee8a84864970d-pi
Boring.
Hi all. I’m wondering if anyone can direct me to some info on the disparities between the night and daytime global temperatures. Is the night warming faster than the day & if so does this prove AGW?
Cheers.
Long way to go on that Magoo. Some background:-
Fall et al. 2011: What We Learned About the Climate
http://blog.chron.com/climateabyss/2011/05/fall-et-al-2011-what-we-learned-about-the-climate/
According to the best-sited stations, the diurnal temperature range in the lower 48 states has no century-scale trend.
http://wattsupwiththat.com/2011/05/19/according-to-the-best-sited-stations-the-diurnal-temperature-range-in-the-lower-48-states-has-no-century-scale-trend/
Comments On “The Shifting Probability Distribution Of Global Daytime And Night-Time Temperatures” By Donat and Alexander 2012 – A Not Ready For Prime Time Study
http://pielkeclimatesci.wordpress.com/2012/08/21/comments-on-the-shifting-probability-distribution-of-global-daytime-and-night-time-temperatures-by-donat-and-alexander-2012-a-not-ready-for-prime-time-study/
Thanks Richard.
Magoo. Daytime Max temperature actually supports a sun-temperature connection – AGW not so much:-
Bob Carter: William M. Briggs: Willie Soon: ‘Changing Sun, changing climate’
http://climaterealists.com/index.php?id=11282
“The reconfirmation now of a strong sun-temperature relation based specifically upon the daytime temperature maxima adds strong and independent scientific weight to the reality of the sun-temperature connection.
The close relationships between the abrupt ups and downs of solar activity and similar changes in temperature that we have identified occur locally in coastal Greenland; regionally in the Arctic Pacific and north Atlantic; and hemispherically for the whole circum-Arctic region. This suggests strongly that changes in solar radiation drive temperature variations on at least a hemispheric scale.
Close correlations like these simply do not exist for temperature and changing atmospheric CO2 concentration. In particular, there is no coincidence between the measured steady rise in global atmospheric CO2 concentration and the often dramatic multi-decadal (and shorter) ups and downs of surface temperature that occur all around the world.”
GISS Figures Out For February | NOT A LOT OF PEOPLE KNOW THAT
By Paul Homewood
While we are waiting for HADCRUT, a quick look at GISS numbers for February, which are back down to 0.49C from 0.60C in January.
The average for 2012 was 0.56C.
I really wanted, though, to show the GISS map for winter temperatures, i.e. Dec – Feb. The map shows the anomalies against the 1981-2010 baseline.
[…]
The line at the top of the map shows that there is an anomaly of 0.10C – in other words, the Dec 2012 – Feb 2013 temperature is a mere 0.10C higher than the 1981-2010 baseline.
Yes, a mere 0.10C. And this during a winter when ENSO conditions have been neutral, so alarmists cannot even hide behind La Nina excuses.
http://notalotofpeopleknowthat.wordpress.com/2013/03/22/giss-figures-out-for-february/
Latest UAH & RSS Numbers
April 8, 2013
By Paul Homewood
Both UAH and RSS are virtually unchanged. UAH is exactly the same at 0.18C, and RSS is up from 0.19 to 0.20C.
Both are around or below the level they were in Nov/Dec.
I thought we’d have a look at longer term trends today though. Using HADCRUT4 numbers, I have plotted the monthly numbers since 1979, with a 10-year running average.
http://notalotofpeopleknowthat.files.wordpress.com/2013/04/image_thumb15.png?w=1008&h=616
Within the last decade, temperatures have definitely flatlined. For instance, the latest 12-month average is 0.47C, and this compares with a figure of 0.49C for 2002.
But, intriguingly, the longer term, 10-year average also stopped rising at the end of 2010, and has actually started to fall back. This is not clear on the scale of the above graph, but zooming in below illustrates it well.
http://notalotofpeopleknowthat.files.wordpress.com/2013/04/image_thumb16.png?w=1008&h=596
The amounts may be small, but the evidence is clear – the world has been getting colder in the last few years. This is not down to a single year’s figures, nor ENSO variations, as the 10 year span will even out such fluctuations.
Perhaps we really should be worrying about global cooling again.
http://notalotofpeopleknowthat.wordpress.com/2013/04/08/latest-uah-rss-numbers/
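Homewood's point rests on the 10-year (120-month) running average smoothing out single-year and ENSO-scale wiggles. A quick Python sketch of a trailing 120-month mean — the anomaly series here is synthetic, invented only to show the smoothing effect, not real HADCRUT4 data:

```python
# Minimal sketch of a trailing 120-month running mean, as used in the
# excerpt above. The synthetic series is invented purely to show that a
# 10-year window averages out short-term ups and downs.
import math

def running_mean(series, window):
    """Trailing moving average; returns one value per complete window."""
    return [sum(series[i - window:i]) / window
            for i in range(window, len(series) + 1)]

# Synthetic monthly anomalies: a flat 0.45 C level with a +/-0.2 C
# oscillation standing in for ENSO-scale fluctuations.
monthly = [0.45 + 0.2 * math.sin(2 * math.pi * m / 40) for m in range(360)]

smoothed = running_mean(monthly, 120)
# The 120-month mean hugs the underlying 0.45 C level despite the wiggles.
print(min(smoothed), max(smoothed))
```

Because the window spans whole cycles of the oscillation, the smoothed series sits essentially flat at the underlying level — which is why a turn in the 10-year average is harder to dismiss as noise.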
# # #
So much for the Foster and Rahmstorf script too.
‘Australian Warming Overestimated By A Third’
By Paul Homewood h/t NIPCC
A new study by Stockwell and Stewart, “Biases in the Australian high quality temperature network” has found that warming in Australia over the last century could have been overestimated by 31%.
ABSTRACT
Various reports identify global warming over the last century as around 0.7°C, but warming in Australia at around 0.9°C, suggesting Australia may be warming faster than the rest of the world. This study evaluates potential biases in the High Quality Network (HQN) compiled from 100 rural surface temperature series from 1910 due to: (1) homogeneity adjustments used to correct for changes in location and instrumentation, and (2) the discrimination of urban and rural sites. The approach was to compare the HQN with a new network compiled from raw data using the minimal adjustments necessary to produce contiguous series, called the Minimal Adjustment Network (MAN). The average temperature trend of the MAN stations was 31% lower than the HQN, and by a number of measures, the trend of the Australian MAN is consistent with the global trend. This suggests that biases from these sources have exaggerated apparent Australian warming. Additional problems with the HQN include failure of homogenization procedures to properly identify errors, individual sites adjusted more than the magnitude of putative warming last century, and some sites of such poor quality they should not be used, especially under a “High Quality” banner.
http://landshape.org/wp-content/uploads/2013/02/06-Stockwell%5B1%5D.pdf
Which all raises the question, how reliable are temperatures from other parts of the world?
http://notalotofpeopleknowthat.wordpress.com/2013/06/19/australian-warming-overestimated-by-a-third/
Wave Buoy – Bay of Plenty Regional Council
http://monitoring.boprc.govt.nz/MonitoredSites/cgi-bin/hydwebserver.cgi/sites/details?site=241&treecatchment=23
Located 13 km off Pukehina Beach in a central position within the curve of the Bay of Plenty.
Sea surface temperature is measured from a sensor mounted near the base of the wave buoy (approximately 0.5 m below the sea surface).
Water Temperature 15.9 deg. C 23-Jun-2013 11:00
Click on a sensor name to view graphs and samples. Water Temperature since August 2012:
http://monitoring.boprc.govt.nz/MonitoredSites/cgi-bin/hydwebserver.cgi/points/plot?point=834&aggregate=0&Sdt=23/06/2012%2013%3A30&Fdt=23/06/2013%2013%3A30&trType=2&trParam=0&UUID=15879.06333039352
14.5 deg. C Aug 2012,
21 – 21.5 Feb/Mar 2013,
5 deg. C rise Dec.
3 deg. C fall then rise Jan
5 deg. C fall Apr – Jun.
Schnapper are now biting – apparently.
‘HADCET By Month Last 10 Years’
By sunshinehours
HADCET (Central England Temperature) over the last 10 years has been cooling at a rate of -1.2C per decade.
I thought I would graph each month with the seasons grouped together.
http://sunshinehours.files.wordpress.com/2013/10/hadcet-2003-to-2012.png?w=1024&h=511
I find it fascinating to see that some months (such as October’s -0.012C/decade) are barely cooling while January is cooling at -2.6C per decade.
http://sunshinehours.wordpress.com/2013/10/26/hadcet-by-month-last-10-years/
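The per-month figures quoted above are just ordinary least-squares slopes fitted to one calendar month's values across the 10 years, scaled to degrees per decade. A hedged sketch of that calculation — the January values below are invented for illustration, not real HADCET data:

```python
# Sketch of the per-month trend calculation described above: fit an OLS
# line to one calendar month's values across years and report the slope
# in deg C per decade. The January series is hypothetical, not HADCET.

def trend_per_decade(years, temps):
    """OLS slope of temps vs years, scaled to deg C per decade."""
    n = len(years)
    my = sum(years) / n
    mt = sum(temps) / n
    slope = (sum((y - my) * (t - mt) for y, t in zip(years, temps))
             / sum((y - my) ** 2 for y in years))   # deg C per year
    return slope * 10

years = list(range(2003, 2013))                    # ten Januaries
jan = [4.5, 4.2, 4.0, 3.8, 3.5, 3.2, 2.9, 2.6, 2.3, 2.0]  # invented values
print(f"January trend: {trend_per_decade(years, jan):+.2f} C/decade")
```

Repeating the fit for each of the twelve months gives the month-by-month spread the post graphs.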
‘Australian Met Office Accused Of Manipulating Temperature Records’
Posted on August 23, 2014 by Anthony Watts
There’s quite a row developing after a scathing article in the Australian, some news clips follow. h/t to Dr. Benny Peiser at The GWPF
The [Australian] Bureau of Meteorology has been accused of manipulating historic temperature records to fit a predetermined view of global warming. Researcher Jennifer Marohasy claims the adjusted records resemble “propaganda” rather than science. Dr Marohasy has analysed the raw data from dozens of locations across Australia and matched it against the new data used by BOM showing that temperatures were progressively warming. In many cases, Dr Marohasy said, temperature trends had changed from slight cooling to dramatic warming over 100 years. –Graham Lloyd, The Australian, 23 August 2014
The escalating row goes to heart of the climate change debate — in particular, whether computer models are better than real data and whether temperature records are being manipulated in a bid to make each year hotter than the last. Marohasy’s research has put her in dispute with BoM over a paper she published with John Abbot at Central Queensland University in the journal Atmospheric Research concerning the best data to use for rainfall forecasting. BoM challenged the findings of the Marohasy-Abbot paper, but the international journal rejected the BoM rebuttal, which had been prepared by some of the bureau’s top scientists. This has led to an escalating dispute over the way in which Australia’s historical temperature records are “improved” through homogenisation, which is proving more difficult to resolve. –Graham Lloyd, The Australian, 23 August 2014
More>>>>>>>
http://wattsupwiththat.com/2014/08/23/australian-met-office-accused-of-manipulating-temperature-records/#more-114854
‘Australian Met Office Accused Of Manipulating Temperature Records’
Date: 23/08/14
Graham Lloyd, The Australian
The [Australian] Bureau of Meteorology has been accused of manipulating historic temperature records to fit a predetermined view of global warming.
Researcher Jennifer Marohasy claims the adjusted records resemble “propaganda” rather than science.
Dr Marohasy has analysed the raw data from dozens of locations across Australia and matched it against the new data used by BOM showing that temperatures were progressively warming.
In many cases, Dr Marohasy said, temperature trends had changed from slight cooling to dramatic warming over 100 years.
BOM has rejected Dr Marohasy’s claims and said the agency had used world’s best practice and a peer reviewed process to modify the physical temperature records that had been recorded at weather stations across the country.
It said data from a selection of weather stations underwent a process known as “homogenisation” to correct for anomalies. It was “very unlikely” that data homogenisation impacted on the empirical outlooks.
In a statement to The Weekend Australian BOM said the bulk of the scientific literature did not support the view that data homogenisation resulted in “diminished physical veracity in any particular climate data set’’.
Historical data was homogenised to account for a wide range of non-climate related influences such as the type of instrument used, choice of calibration or enclosure and where it was located.
“All of these elements are subject to change over a period of 100 years, and such non-climate related changes need to be accounted for in the data for reliable analysis and monitoring of trends,’’ BOM said.
Account is also taken of temperature recordings from nearby stations. It took “a great deal of care with the climate record, and understands the importance of scientific integrity”.
Dr Marohasy said she had found examples where there had been no change in instrumentation or siting and no inconsistency with nearby stations but there had been a dramatic change in temperature trend towards warming after homogenisation.
She said that at Amberley in Queensland, homogenisation had resulted in a change in the temperature trend from one of cooling to dramatic warming.
She calculated homogenisation had changed a cooling trend in the minimum temperature of 1C per century at Amberley into a warming trend of 2.5C. This was despite there being no change in location or instrumentation.
http://www.thegwpf.org/australian-met-office-accused-of-manipulating-temperature-records/
”The heat is on. Bureau of Meteorology ‘altering climate figures’ — The Australian’
Joanne Nova, August 23rd, 2014
Congratulations to The Australian again for taking the hard road and reporting controversial, hot, documented problems, that few in the Australian media dare to investigate.
How accurate are our national climate datasets when some adjustments turn entire long stable records from cooling trends to warming ones (or vice versa)? Do the headlines of “hottest ever record” (reported to a tenth of a degree) mean much if thermometer data sometimes needs to be dramatically changed 60 years after being recorded?
One of the most extreme examples is a thermometer station in Amberley, Queensland where a cooling trend in minima of 1C per century has been homogenized and become a warming trend of 2.5C per century. This is a station at an airforce base that has no recorded move since 1941, nor had a change in instrumentation. It is a well-maintained site near a perimeter fence, yet the homogenisation process produces a remarkable transformation of the original records, and rather begs the question of how accurately we know Australian trends at all when the thermometers are seemingly so bad at recording the real temperature of an area.
More (much more)>>>>>>>>
http://joannenova.com.au/2014/08/the-heat-is-on-bureau-of-meteorology-altering-climate-figures-the-australian/
“Star pick” comment at JoNova above:
EternalOptimist #5
August 23, 2014 at 4:53 am · Reply
star comment
Years ago, I used to look at my thermometer to see how warm it was. Now I have to wait 60 years to find out. thats not progress
http://joannenova.com.au/2014/08/the-heat-is-on-bureau-of-meteorology-altering-climate-figures-the-australian/#comment-1545646
‘How Australia’s Bureau of Meteorology is Turning Up The Heat’
Written by Graham Lloyd, Australian on 24 August 2014.
When raging floodwaters swept through Brisbane in January 2011 they submerged a much-loved red Corvette sports car in the basement car park of a unit in the riverside suburb of St Lucia.
On the scale of the billions of dollars worth of damage done to the nation’s third largest city in the man-made flood, the loss of a sports car may not seem like much.
But the loss has been the catalyst for an escalating row that raises questions about the competence and integrity of Australia’s premier weather agency, the Bureau of Meteorology, stretching well beyond the summer storms.
It goes to heart of the climate change debate — in particular, whether computer models are better than real data and whether temperature records are being manipulated in a bid to make each year hotter than the last.
With farmer parents, researcher Jennifer Marohasy says she has always had a fascination with rainfall and drought-flood cycles. So, in a show of solidarity with her husband and his sodden Corvette, Marohasy began researching the temperature records noted in historic logs that date back through the Federation drought of the late 19th century.
Specifically, she was keen to try forecasting Brisbane floods using historical data and the latest statistical modelling techniques.
Marohasy’s research has put her in dispute with BoM over a paper she published with John Abbot at Central Queensland University in the journal Atmospheric Research concerning the best data to use for rainfall forecasting. (She is a biologist and a sceptic of the thesis that human activity is bringing about global warming.) BoM challenged the findings of the Marohasy-Abbot paper, but the international journal rejected the BoM rebuttal, which had been prepared by some of the bureau’s top scientists.
Continues>>>>>>>
http://www.climatechangedispatch.com/how-australias-bom-is-turning-up-the-heat.html
‘Heat is on over weather bureau ‘homogenising temperature records’
Written by Dr Jennifer Marohasy on 24 August 2014.
EARLIER this year Tim Flannery said “the pause” in global warming was a myth, leading medical scientists called for stronger action on climate change, and the Australian Bureau of Meteorology declared 2013 the hottest year on record. All of this was reported without any discussion of the actual temperature data. It has been assumed that there is basically one temperature series and that it’s genuine.
But I’m hoping that after today, with both a feature (page 20) and a news piece (page 9) in The Weekend Australian, things have changed forever.
I’m hoping that next time Professor Flannery is interviewed he will be asked by journalists which data series he is relying on: the actual recorded temperatures or the homogenized remodeled series. Because as many skeptics have known for a long time, and as Graham Lloyd reports today for News Ltd, for any one site across this wide-brown land Australia, while the raw data may show a pause, or even cooling, the truncated and homogenized data often shows dramatic warming.
When I first sent Graham Lloyd some examples of the remodeling of the temperature series I think he may have been somewhat skeptical. I know he on-forwarded this information to the Bureau for comment, including three charts showing the homogenization of the minimum temperature series for Amberley.
Mr Lloyd is the Environment Editor for The Australian newspaper and he may have been concerned I got the numbers wrong. He sought comment and clarification from the Bureau, not just for Amberley but also for my numbers pertaining to Rutherglen and Bourke.
I understand that by way of response to Mr Lloyd, the Bureau has not disputed these calculations.
This is significant. The Bureau now admits that it changes the temperature series and quite dramatically through the process of homogenization.
I repeat the Bureau has not disputed the figures. The Bureau admits that the data is remodeled.
What the Bureau has done, however, is try and justify the changes. In particular, for Amberley the Bureau is claiming to Mr Lloyd that there is very little available documentation for Amberley before 1990 and that information before this time may be “classified”: as in top secret. That’s right, there is apparently a reason for jumping-up the minimum temperatures for Amberley but it just can’t provide Mr Lloyd with the supporting meta-data at this point in time.
[See Amberley graph]
http://www.climatechangedispatch.com/heat-is-on-over-weather-bureau-homogenising-temperature-records.html
http://jennifermarohasy.com/2014/08/heat-is-on-over-weather-bureau-homogenising-temperature-records/
‘Australian Bureau of Meteorology accused of Criminally Adjusted Global Warming’
Written by James Delingpole, Breitbart London on 25 August 2014.
The Australian Bureau of Meteorology has been caught red-handed manipulating temperature data to show “global warming” where none actually exists.
At Amberley, Queensland, for example, the data at a weather station showing 1 degree Celsius cooling per century was “homogenized” (adjusted) by the Bureau so that it instead showed a 2.5 degrees warming per century.
At Rutherglen, Victoria, a cooling trend of -0.35 degrees C per century was magically transformed at the stroke of an Australian meteorologist’s pen into a warming trend of 1.73 degrees C per century.
Last year, the Australian Bureau of Meteorology made headlines in the liberal media by claiming that 2013 was Australia’s hottest year on record. This prompted Australia’s alarmist-in-chief Tim Flannery – an English literature graduate who later went on to earn his scientific credentials with a PhD in palaeontology, digging up ancient kangaroo bones – to observe that global warming in Australia was “like climate change on steroids.”
But we now know, thanks to research by Australian scientist Jennifer Marohasy, that the hysteria this story generated was based on fabrications and lies.
Continues>>>>>>
http://www.climatechangedispatch.com/australian-bureau-of-meteorology-accused-of-criminally-adjusted-global-warming.html
http://www.breitbart.com/Breitbart-London/2014/08/25/Australian-Bureau-of-Meteorology-accused-of-Criminally-Adjusted-Global-Warming
‘BOM finally explains! Cooling changed to warming trends because stations “might” have moved!’
by Joanne Nova, August 26th, 2014
It’s the news you’ve been waiting years to hear! Finally we find out the exact details of why the BOM changed two of their best long term sites from cooling trends to warming trends. The massive inexplicable adjustments like these have been discussed on blogs for years. But it was only when Graham Lloyd advised the BOM he would be reporting on this that they finally found time to write three paragraphs on specific stations.
Who knew it would be so hard to get answers. We put in a Senate request for an audit of the BOM datasets in 2011. Ken Stewart, Geoff Sherrington, Des Moore, Bill Johnston, and Jennifer Marohasy have also separately been asking the BOM for details about adjustments on specific BOM sites. (I bet Warwick Hughes has too). The BOM has ignored or circumvented all these, refusing to explain why individual stations were adjusted in detail.
The two provocative articles Lloyd put together last week were ‘Heat is on over weather bureau’ and ‘Bureau of Meteorology altering climate figures’, which I covered here. This is the power of the press at its best. The absence of articles like these is why I have said the media IS the problem: as long as the media ignore the BOM’s failure to supply its full methods and reasons, the BOM mostly gets away with it. It’s an excellent development that The Australian is starting to hold the BOM to account. (No sign of curiosity or investigation at the ABC and Fairfax, who are happy to parrot BOM press releases unquestioned like sacred scripts.)
Continues>>>>>>>
http://joannenova.com.au/2014/08/bom-finally-explains-cooling-changed-to-warming-trends-because-stations-might-have-moved/
‘Who’s going to be sacked for making-up global warming at Rutherglen?’
By Jennifer Marohasy on August 27, 2014
HEADS need to start rolling at the Australian Bureau of Meteorology. The senior management have tried to cover-up serious tampering that has occurred with the temperatures at an experimental farm near Rutherglen in Victoria. Retired scientist Dr Bill Johnston used to run experiments there. He, and many others, can vouch for the fact that the weather station at Rutherglen, providing data to the Bureau of Meteorology since November 1912, has never been moved.
Senior management at the Bureau are claiming the weather station could have been moved in 1966 and/or 1974 and that this could be a justification for artificially dropping the temperatures by 1.8 degree Celsius back in 1913.
Surely it’s time for heads to roll!
[See Rutherglen graph]
Some background: Near Rutherglen, a small town in a wine-growing region of NE Victoria, temperatures have been measured at a research station since November 1912. There are no documented site moves. An automatic weather station was installed on 29th January 1998.
Temperatures measured at the weather station form part of the ACORN-SAT network, so the information from this station is checked for discontinuities before inclusion into the official record that is used to calculate temperature trends for Victoria, Australia, and also the United Nation’s Intergovernmental Panel on Climate Change (IPCC).
The unhomogenized/raw mean annual minimum temperature trend for Rutherglen for the 100-year period from January 1913 through to December 2013 shows a slight cooling trend of 0.35 degree C per 100 years. After homogenization there is a warming trend of 1.73 degree C per 100 years. This warming trend is essentially achieved by progressively dropping down the temperatures from 1973 back through to 1913. For the year of 1913 the difference between the raw temperature and the ACORN-SAT temperature is a massive 1.8 degree C.
There is absolutely no justification for doing this.
This cooling of past temperatures is a new trick* that the mainstream climate science community has endorsed over recent years to ensure next year is always hotter than last year – at least for Australia.
There is an extensive literature that provides reasons why homogenization is sometimes necessary, for example, to create continuous records when weather stations move locations within the same general area i.e. from a post office to an airport. But the way the method has been implemented at Rutherglen is not consistent with the original principle which is that changes should only be made to correct for non-climatic factors.
In the case of Rutherglen the Bureau has just let the algorithms keep jumping down the temperatures from 1973. To repeat: the biggest change between the raw and the new values is in 1913, when the temperature has been jumped down a massive 1.8 degree C.
In doing this homogenization a warming trend is created when none previously existed.
The Bureau has tried to justify all of this to Graham Lloyd at The Australian newspaper by stating that there must have been a site move, flagging the years 1966 and 1974. But the biggest adjustment was made in 1913! In fact, as Bill Johnston explains in today’s newspaper [see below], the site has never moved.
Surely someone should be sacked for this blatant corruption of what was a perfectly good temperature record.
http://jennifermarohasy.com/2014/08/whos-going-to-be-sacked-for-making-up-global-warming-at-rutherglen/
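The mechanics of the claim above — that stepwise downward offsets applied to the early part of a record can flip a fitted trend from cooling to warming — are easy to demonstrate with toy numbers. This Python sketch uses invented data and invented offsets (chosen only to echo the raw -0.35C/century figure); it is not the actual ACORN-SAT adjustment:

```python
# Illustrative arithmetic only (invented series and offsets, not the actual
# ACORN-SAT adjustments): subtracting progressively larger offsets from the
# early part of a record can flip a fitted trend from cooling to warming.

def trend_per_century(years, temps):
    """OLS slope of temps vs years, scaled to deg C per century."""
    n = len(years)
    my, mt = sum(years) / n, sum(temps) / n
    slope = (sum((y - my) * (t - mt) for y, t in zip(years, temps))
             / sum((y - my) ** 2 for y in years))
    return slope * 100

# Synthetic "raw" annual minima, 1913-2012: a faint cooling drift that
# yields -0.35 C/century, matching the raw Rutherglen figure quoted above.
years = list(range(1913, 2013))
raw = [9.0 - 0.0035 * (y - 1913) for y in years]

# Step adjustments at two hypothetical breakpoints: everything before 1974
# shifted down 0.6 C, and everything before 1966 down a further 0.6 C.
adjusted = [t - (0.6 if y < 1974 else 0.0) - (0.6 if y < 1966 else 0.0)
            for y, t in zip(years, raw)]

print(f"raw trend:      {trend_per_century(years, raw):+.2f} C/century")
print(f"adjusted trend: {trend_per_century(years, adjusted):+.2f} C/century")
```

The adjusted series contains exactly the same year-to-year variations as the raw one; only the step offsets differ, yet the sign of the century trend reverses.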
‘Climate records contradict Bureau of Meteorology’ [paywalled]
by Graham Lloyd, The Australian. 27th August 2014
THE official catalogue of weather stations contradicts the Bureau of Meteorology’s explanation that the relocation of the thermometer in the Victorian winegrowing district of Rutherglen has turned a cooling into a warming trend.

THE Bureau of Meteorology appears to have invented a move in the official thermometer site at the Rutherglen weather station in Victoria to help justify a dramatic revision of historic temperatures to produce a warming trend.
http://www.theaustralian.com.au/national-affairs/climate/climate-records-contradict-bureau-of-meteorology/story-e6frg6xf-1227037936046
‘Big adjustments? BOM says Rutherglen site shifted, former workers there say “No” ‘
by Joanne Nova, August 27th, 2014
The hot questions for the Australian Bureau of Meteorology (BOM) mount up. Rutherglen was one of the temperature recording stations that was subject to large somewhat mysterious adjustments which turned a slight cooling trend into a strongly warming one. Yet the official notes showed that the site did not move and was a continuous record. On paper, Rutherglen appeared to be ideal — a rare long rural temperature record where measurements had come from the same place since 1913.
The original cooling trend of -0.35C was transformed into a +1.73C warming after “homogenisation” by the BOM. To justify that, the BOM claims there may have been an unrecorded shift, and that this was “consistent” with the old station starting further up the slope before it moved down to the hollow.
Today retired scientist Bill Johnston got in touch with Jennifer Marohasy, with me and with Graham Lloyd of The Australian to say that he worked at times at Rutherglen and the official thermometer had not moved. It was always placed where it is now at the bottom of the hollow. That information has already made it into print in The Australian.
More>>>>>>>
http://joannenova.com.au/2014/08/bom-claims-rutherglen-data-was-adjusted-because-of-site-move-but-it-didnt-happen/#more-37863
‘Uninformed climate amateurs ask professionals to explain their data revision’
by Joanne Nova, August 28th, 2014
David Karoly knew he had to defend the BOM with regard to the hot questions about adjustments to Amberley, Bourke, and Rutherglen data. What he didn’t have were photos of historic equipment, maps of thermometer sites, or quotes from people who took observations. Instead he wielded the magic wand of “peer review” — whereupon questions asked in English are rendered invalid if they are printed in a newspaper instead of a trade-magazine.
Prof David Karoly, Climate Professional, called people who ask for explanations poorly informed amateurs [hotlink, paywall, see below]. In response, we Poorly Informed Climate Amateurs wonder what it takes to get Climate Professionals to inform us? Instead of hiding behind ‘peer review’, vague complex methods, and the glow of their academic aura, the professionals could act professional and explain exactly what they did to the data?
[…]
The articles by Graham Lloyd on Jennifer Marohasy’s analysis are generating debate.
Letters to The Australian [hotlink, see below] 28th August 2014
Bill Johnston, former NSW natural resources research scientist, Cook, ACT…….
Michael Asten, School of Earth Atmosphere and Environment, Monash University, Melbourne. Vic……
Greg Buchanan, Niagara Park, NSW…….
http://joannenova.com.au/2014/08/uninformed-climate-amateurs-ask-professionals-to-explain-their-data-revision/
‘Amateurs’ challenging Bureau of Meteorology climate figures [paywalled]
The Australian, August 26, 2014
CONCERNS about the accuracy of the Bureau of Meteorology’s historical data are being raised by “poorly informed amateurs”, one of Australia’s leading climate scientists has said. David Karoly of Melbourne University’s School of Earth Sciences, said claims BOM had introduced a warming trend by homogenising historical temperature data should be submitted for peer review.
http://www.theaustralian.com.au/national-affairs/climate/amateurs-challenging-bureau-of-meteorology-climate-figures/story-e6frg6xf-1227036441118
‘Bureau should supply details of its data revision’
Letters, The Australian, August 28, 2014
http://www.theaustralian.com.au/opinion/letters/bureau-should-supply-details-of-its-data-revision/story-fn558imw-1227039108627
‘Rewriting the History of Bourke: Part 2, Adjusting Maximum Temperatures Both Down and Up, and Then Changing Them Altogether’
By Jennifer Marohasy on April 6, 2014
“Anyone who doesn’t take truth seriously in small matters cannot be trusted in large ones either.” Albert Einstein.
[…] In a report entitled ‘Techniques involved in developing the Australian Climate Observation Reference Network – Surface Air Temperature (ACORN-SAT) dataset’ (CAWCR Technical Report No. 049), Blair Trewin explains that up to 40 neighbouring weather stations can be used for detecting inhomogeneities and up to 10 can be used for adjustments. What this means is that temperatures, ever so diligently recorded in the olden days at Bourke by the postmaster, can be changed on the basis that it wasn’t so hot at a nearby station that may in fact be many hundreds of kilometres away, even in a different climate zone.
Consider the recorded versus adjusted values for January 1939, Table 1. The recorded values have been changed. And every time the postmaster recorded 40 degrees, Dr Trewin has seen fit to change this value to 39.1 degrees Celsius. Why?
http://jennifermarohasy.com/2014/04/rewriting-the-history-of-bourke-part-2-adjusting-maximum-temperatures-both-down-and-up-and-then-changing-them-altogether/
Rutherglen
[From ‘Temperatures’ at Jennifer Marohasy’s blog]
NEAR Rutherglen, a small town in a wine-growing region of northeastern Victoria, temperatures have been measured at a research station since November 1912. There are no documented site moves. An automatic weather station was installed on 29th January 1998.
Temperatures measured at the weather station form part of the ACORN-SAT network, so the information from this station is homogenized before inclusion into the official record that is used to calculate temperature trends for Victoria and also Australia.
The unhomogenized/raw mean annual minimum temperature trend for Rutherglen for the 100-year period from January 1913 through to December 2013 shows a slight cooling trend of 0.35 degrees C per 100 years, see Figure 1. After homogenization there is a warming trend of 1.73 degrees C per 100 years. This warming trend is essentially achieved by progressively dropping down the temperatures from 1973 back through to 1913. For the year of 1913 the difference between the raw temperature and the ACORN-SAT temperature is a massive 1.8 degrees C.
Continues >>>>>> (see graphs)
http://jennifermarohasy.com/temperatures/rutherglen/
‘Modelling Australian and Global Temperatures: What’s Wrong?’
Bourke and Amberley as Case Studies
Jennifer Marohasy, John Abbot, Ken Stewart and Dennis Jensen,
Also published in The Sydney Papers Online, Issue 26 following a presentation for the Sydney Institute on 25th June 2014
http://jennifermarohasy.com/wp-content/uploads/2011/08/Changing_Temperature_Data.pdf
‘Hiding something? BOM throws out Bourke’s hot historic data, changes long cooling trend to warming’
JoNova, August 30th, 2014
Hello Soviet style weather service? On January 3, 1909, an extremely hot 51.7C (125F) was recorded at Bourke. It’s possibly the hottest temperature ever recorded in a Stevenson Screen in Australia, but the BOM has removed it as a clerical error. There are legitimate questions about the accuracy of records done so long ago — standards were different. But there are very legitimate questions about the BOM’s treatment of this historic data. The BOM has also removed the 40 years of weather recorded before 1910, which includes some very hot times. Now we find out the handwritten original notes from 62 years of the mid 20th Century were supposed to be dumped in 1996 as well. Luckily, these historic documents were saved from the dustbin and quietly kept in private hands instead.
More>>>>>>>
http://joannenova.com.au/2014/08/hiding-something-bom-throws-out-bourkes-hot-historic-data-changes-long-cooling-trend-to-warming/
‘Newspapers as the guardians of hot history’
By Jennifer Marohasy
[…] I went and checked not only the old newspapers but also the book in the national archive, because, guess what? The Bureau of Meteorology is claiming it was all a clerical error. They have scratched this record made on 3rd January 1909 from the official record for Bourke, which means it’s also scratched from the NSW and national temperature record.
Yep. It never happened. No heatwave back in 1909.
They have also wiped the heatwave of January 1896. This was probably the hottest January on record, not just for Bourke, but Australia-wide. Yet according to the rules dictated by the Bureau, if it was recorded before 1910, it doesn’t count.
http://jennifermarohasy.com/2014/09/newspapers-as-the-guardians-of-hot-history/
‘1953 Headline: Melbourne’s weather is changing! Summers getting colder and wetter’
By Joanne Nova
Once upon a time — before the Great Politicization of Climate Science — CSIRO was able to analyze trends from 1880 to 1910. In 1953 CSIRO scientists were making a case that large parts of Australia had been hotter in the 1880s and around the turn of last century. >>>>>
http://joannenova.com.au/2014/09/1953-headline-melbournes-weather-is-changing-summers-getting-colder-and-wetter/#more-38262
‘Bureau of Meteorology warms to transparency over adjusted records’
Graham Lloyd, The Australian
September 11, 2014 12:00AM [Paywall]
THE Bureau of Meteorology has been forced to publish details of all changes made to historic temperature records as part of its homogenisation process to establish the nation’s climate change trend. Publication of the reasons for all data adjustments was a key recommendation of the bureau’s independent peer review panel which approved the bureau’s ACORN SAT methodology.
http://www.theaustralian.com.au/national-affairs/climate/bureau-of-meteorology-warms-to-transparency-over-adjusted-records/story-e6frg6xf-1227054494820
‘Scientists should know better: the truth was out there’
Graham Lloyd, The Australian
September 11, 2014 12:00AM [Paywall]
IT reflects poorly on key members of Australia’s climate science establishment that tribal loyalty is more important than genuine inquiry. Openness, not ad hominem histrionics, was always the answer to lingering concerns about what happened to some of the nation’s temperature records under the Bureau of Meteorology’s process of homogenisation.
http://www.theaustralian.com.au/national-affairs/opinion/scientists-should-know-better-the-truth-was-out-there/story-e6frgd0x-1227054494459
‘Understanding adjustments to temperature data’
by Zeke Hausfather
Pairwise Homogenization Algorithm (PHA) Adjustments
The Pairwise Homogenization Algorithm [hotlink #1 – see below] was designed as an automated method of detecting and correcting localized temperature biases due to station moves, instrument changes, microsite changes, and meso-scale changes like urban heat islands.
The algorithm (whose code can be downloaded here [hotlink]) is conceptually simple: it assumes that climate change forced by external factors tends to happen regionally rather than locally. If one station is warming rapidly over a period of a decade a few kilometers from a number of stations that are cooling over the same period, the warming station is likely responding to localized effects (instrument changes, station moves, microsite changes, etc.) rather than a real climate signal.
To detect localized biases, the PHA iteratively goes through all the stations in the network and compares each of them to their surrounding neighbors. It calculates difference series between each station and its neighbors (separately for min and max) and looks for breakpoints that show up in the record of one station but none of the surrounding stations. These breakpoints can take the form of both abrupt step-changes and gradual trend inhomogeneities that move a station’s record further away from its neighbors.
http://judithcurry.com/2014/07/07/understanding-adjustments-to-temperature-data/
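# # #
The difference-series idea in the excerpt above can be sketched in a few lines. This is a toy illustration only, not the actual PHA code linked in the post (which adds significance testing, many neighbours and much else); the station data here is synthetic.

```python
import numpy as np

def difference_series(target, neighbor):
    """Difference series D_t between a target station and one neighbor.

    A localized bias (station move, instrument change) shows up as a
    step in D_t; a shared regional climate signal largely cancels out."""
    return np.asarray(target, dtype=float) - np.asarray(neighbor, dtype=float)

def best_step_change(d):
    """Find the single breakpoint that best splits the difference
    series into two constant segments (least total squared error)."""
    best_i, best_sse = None, np.inf
    for i in range(2, len(d) - 2):  # keep at least 2 points per segment
        sse = ((d[:i] - d[:i].mean()) ** 2).sum() + \
              ((d[i:] - d[i:].mean()) ** 2).sum()
        if sse < best_sse:
            best_i, best_sse = i, sse
    step = d[best_i:].mean() - d[:best_i].mean()
    return best_i, step

# Synthetic example: both stations share a regional signal, but the
# target suffers a +1.0 C shift (e.g. a station move) at month 60.
rng = np.random.default_rng(0)
regional = rng.normal(15.0, 0.3, 120)
target = regional + np.where(np.arange(120) >= 60, 1.0, 0.0)
neighbor = regional + rng.normal(0.0, 0.1, 120)

i, step = best_step_change(difference_series(target, neighbor))
print(i, round(float(step), 1))  # breakpoint near month 60, step near +1.0
```

Because the regional signal subtracts out of the difference series, the localized step stands out clearly even though it is invisible in the target series alone.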
Hotlink #1:
‘Homogenization of Temperature Series via Pairwise Comparisons’
MATTHEW J. MENNE AND CLAUDE N. WILLIAMS JR. [MW09]
NOAA/National Climatic Data Center, Asheville, North Carolina
(Manuscript received 2 October 2007, in final form 2 September 2008)
Page 4 pdf,
a. Selection of neighbors and formulation of difference series
Next, time series of differences D_t are formed between all target–neighbor monthly temperature series.
To illustrate this, take two monthly series X_t and Y_t, that is, a target and one of its correlated neighbors.
Following Lund et al. (2007), these two series can be represented as

X_(mT+ν) = μ_ν^x + β^x·(mT+ν) + S_(mT+ν)^x + ε_(mT+ν)^x    (1)

and

Y_(mT+ν) = μ_ν^y + β^y·(mT+ν) + S_(mT+ν)^y + ε_(mT+ν)^y    (2)

where
μ_ν represents the mean for month ν in the specific series,
β represents the trend,
T = 12 is the period of the annual cycle,
ν = (1, . . . , 12) is the monthly index,
m is the year (or annual cycle) number,
and the ε_t terms denote mean zero error terms at time t for the two series.
The S_t terms represent shift factors caused by station changes, which are thought to be step functions.
ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/papers/menne-williams2009.pdf
>”Following Lund et al. (2007)”
Typo, the paper is actually:
Lund, R., and J. Reeves, 2002: Detection of undocumented changepoints: A revision of the two-phase regression model. J. Climate, 15, 2547–2554.
http://rmgsc.cr.usgs.gov/outgoing/threshold_articles/Lund_Reeves2002.pdf
This seems to be the 95th Percentile Matching (PM-95) method that BOM uses for ACORN-SAT except it’s not an X to Y neighbour comparison as in BOM’s method.
2. An Fmax test statistic
We start with the simple two-phase linear regression scheme for a climatic series {X_t} considered by Solow (1987), Easterling and Peterson (1995), and Vincent (1998; among others). This model can be written in the form:

X_t = μ1 + α1·t + ε_t,  for 1 ≤ t ≤ c
X_t = μ2 + α2·t + ε_t,  for c < t ≤ n    (2.1)

where {ε_t} is mean zero independent random error with a constant variance.
The model in (2.1) is viewed as a classic simple linear regression that allows for two phases. This allows for both step- (μ1 ≠ μ2) and trend- (α1 ≠ α2) type changepoints. Specifically, the time c is called a changepoint in (2.1) if μ1 ≠ μ2 and/or α1 ≠ α2. In most cases, there will be a discontinuity in the mean series values at the changepoint time c, but this need not always be so (Fig. 10 in section 5 gives a quadratic-based example where the changepoint represents more of a slowing of rate of increase than a discontinuity).
# # #
Fmax test statistic series lengths (n) range from 10 (e.g. 10 months, RS93 k=0.4) to 5000 in Table 1, but I can’t see any recommendation for length n in respect to temperature series. There’s nothing said about n in equation 2.2 for example. What happens when a break (c) occurs at time 5 months of n = 100 months for example? Isn’t this just effectively n = 10?
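For anyone wanting to experiment with how n and the position of c interact, here is a minimal, self-contained sketch of the two-phase regression Fmax scan described in the excerpt. It is an illustration only, not the Lund & Reeves or BOM code; the candidate range, the 3-point buffer at each end, and the synthetic series are my own choices.

```python
import numpy as np

def sse_line(t, x):
    """Residual sum of squares for an OLS line x = mu + alpha * t."""
    A = np.column_stack([np.ones_like(t), t])
    coef, *_ = np.linalg.lstsq(A, x, rcond=None)
    r = x - A @ coef
    return float(r @ r)

def fmax_changepoint(x):
    """Scan all candidate changepoints c; at each, compare a two-phase
    fit (separate lines on x[:c] and x[c:]) against the one-phase fit."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    t = np.arange(n, dtype=float)
    sse0 = sse_line(t, x)                    # null model: one line
    best_c, fmax = None, -np.inf
    for c in range(3, n - 3):                # keep 3+ points per phase
        sse_c = sse_line(t[:c], x[:c]) + sse_line(t[c:], x[c:])
        f = ((sse0 - sse_c) / 2.0) / (sse_c / (n - 4))
        if f > fmax:
            best_c, fmax = c, f
    return best_c, fmax

# Synthetic 80-point series: flat at 14 C with a -1.0 C step at t = 40.
rng = np.random.default_rng(1)
x = 14.0 + rng.normal(0.0, 0.2, 80)
x[40:] -= 1.0
c, f = fmax_changepoint(x)
print(c, f > 20)  # changepoint found near t = 40 with a large Fmax
```

Playing with the break position answers the question above in spirit: a break at t = 5 of n = 100 leaves only a handful of points in the first phase, so its fitted line is poorly constrained, much as if n were small.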
F-test: http://en.wikipedia.org/wiki/F-test
Cross-reference to the 7SS k=1/2 vs k=4+ discussion here:
Critical debating points answered – Part 1
https://www.climateconversation.org.nz/2014/11/critical-debating-points-answered-part-1/#comment-1251101
‘Massive Tampering With Temperatures In South America’
By Paul Homewood, January 20, 2015
One of the regions that has contributed to GISS’ “hottest ever year” is South America, particularly Brazil, Paraguay and the northern part of Argentina. In reality, much of this is fabricated, as they have no stations anywhere near much of this area, as NOAA show below.
Nevertheless, there does appear to be a warm patch covering Paraguay and its close environs. However, when we look more closely, we find things are not quite as they seem.
[See data coverage graph]
There are just three genuinely rural stations in Paraguay that are currently operating – Puerto Casado, Mariscal and San Juan. They all show a clear and steady upward trend since the 1950’s, with 2014 at the top, for instance at Puerto Casado: [graph]
It could not be more clearcut, could it? However, it all looks a bit too convenient, so I thought I would check out the raw data (which is only available up to 2011 on the GISS site, so the last three years cannot be compared). Lo and behold! [graph]
As we so often see, the past has been cooled.
GHCN show the extent to which they have adjusted temperatures, the best part of 2 degrees centigrade. [graphs]
Of course, there may be a genuine problem with Puerto Casado’s record, except that we see exactly the same thing happening at the other two Paraguayan sites. [Raw – Adjusted gif comparisons]
So we find that a large chunk of Gavin’s hottest year is centred around a large chunk of South America, where there is little actual data, and where the data that does exist has been adjusted out of all relation to reality.
Even by GHCN standards, this tampering takes some beating.
https://notalotofpeopleknowthat.wordpress.com/2015/01/20/massive-tampering-with-temperatures-in-south-america/
Delingpole on Paraguay:
‘Forget Climategate: this ‘global warming’ scandal is much bigger’
https://wp.breitbart.com/london/2015/01/30/forget-climategate-this-global-warming-scandal-is-much-bigger/
‘Cooling The Past In Bolivia’
By Paul Homewood, January 30, 2015
https://notalotofpeopleknowthat.wordpress.com/2015/01/30/cooling-the-past-in-bolivia/
From the above post, Paul Homewood’s recent temperature record posts can be accessed:
Recent Posts
How GHCN Keep Rewriting Reykjavik History
Cooling The Past In Bolivia
Cooling The Past In San Diego
Greene Hypocrisy
Higher Snowfalls Due to Change In Measurement, Not Global Warming.
Temperature Adjustments Around The World
Shub Niggurath On The Paraguayan Adjustments
From above:
‘Temperature Adjustments Around The World’ – Paul Homewood
https://notalotofpeopleknowthat.wordpress.com/2015/01/29/temperature-adjustments-around-the-world/
It is often claimed that these adjustments are needed to “correct” errors in the historic record, or compensate for station moves. All of which makes the adjustment at Valentia Observatory even more nonsensical [old vs new].
https://notalotofpeopleknowthat.files.wordpress.com/2015/01/valentia_thumb.gif?w=750&h=507
Valentia Observatory, situated in SW Ireland, is regarded as one of the highest quality meteorological stations in the world, located at the same site since 1892, well away from any urban or other non climatic biases. The Irish Met Office say this:-
“Since the setting up of the Irish Meteorological Service, the work programme of the Observatory has greatly expanded and it has always been equipped with the most technologically advanced equipment and instrumentation. The Observatory is well known and very highly regarded by the scientific community. As well as fulfilling its national and international role within Met Éireann it is involved in many projects with other scientific bodies both in Ireland and abroad.”
If we cannot get accurate temperature trends in Valentia, we cannot get them anywhere. Yet the GHCN algorithm decides that the actual temperatures measured there do not look right, and lops 0.4C off temperatures before 1967.
Worse still, the algorithm uses a bunch of hopelessly unreliable urban sites, as far away as Paris, to decide upon the “correct temperature”, as Ronan Connolly illustrated.
https://notalotofpeopleknowthat.files.wordpress.com/2015/01/image_thumb931.png
If you wanted to find a way to fraudulently alter the global temperature record upwards, you would be hard put to find a better way than this.
‘Temperature Adjustments Transform Arctic Climate History’
By Paul Homewood, February 4, 2015
[…] We saw previously how the temperature history for Paraguay, and a large slice of the surrounding region, had been altered as a result of temperature adjustments, which had significantly reduced historic temperatures and changed a cooling trend into a warming one.
I can now confirm that similar “cooling the past” adjustments have been carried out in the Arctic region, and that the scale and geographic range of these is breathtaking. Nearly every current station from Greenland, in the west, to the heart of Siberia (87E), in the east, has been altered in this way. The effect has been to remove a large part of the 1940’s spike, and as a consequence to remove much of the drop in temperatures during the subsequent cold decades.
See graphs and comments by Trausti Jonsson, senior meteorologist at the Iceland Met Office>>>>>>>
https://notalotofpeopleknowthat.wordpress.com/2015/02/04/temperature-adjustments-transform-arctic-climate-history/
‘Another Smoking Gun That NCDC Temperature Adjustments Are Incorrect’
Posted on February 5, 2015 by Steven Goddard
Yesterday I showed how 100% of US warming since 1990 is due to NCDC filling in fake temperatures for missing data. The actual measured data shows no warming.
See gif: Measured vs No Underlying Data
See graph: Percent Of USHCN Final Temperatures Which Are “Estimated” [over 50%]
The next step is to look at the correlation between how much infilling is being done vs. the divergence between estimated temperatures and measured.
See graph
There is a very good correlation between infilling and inflated temperatures. The more fake data they generate, the larger the divergence in temperature vs. measured. This is likely due to loss of data at rural stations, which are now being doubly contaminated by gridded and infilled urban temperatures.
https://stevengoddard.wordpress.com/2015/02/05/another-smoking-gun-that-ncdc-temperature-adjustments-are-incorrect/
‘Cooling The Past In New Zealand’
By Paul Homewood, February 9, 2015
In case you thought widespread temperature adjustments were confined to the Arctic and South America, consider again. Apparently, New Zealand has caught Paraguayan fever!
There are five stations currently operational under GHCN in New Zealand and surrounding islands. It will come as no great surprise now to learn that GHCN warming adjustments have been added to every single one. (Full set of graphs below).
In all cases, other than Hokitika, the adjustment has been made in the mid 1970’s.
This adjustment has been triggered by a drop in temperatures in 1976, as we can see with Gisborne, below. (The algorithm did not spot that temperatures recovered to previous levels two years later!)
See graph: Raw Data, Gisborne Aero
Was this temperature drop due to some local, non-climatic factor at Gisborne? Apparently not, because the same drop occurred at all eleven of the other NZ stations operating at that time.
Below is a comparison of the unadjusted annual temperatures for 1975 and 1976.
Station 1975 1976 Change
Gisborne 14.37 13.37 -1.00
Napier 14.65 13.69 -0.96
New Plymouth 14.22 13.07 -1.15
Auckland 15.59 14.66 -0.93
Wellington 12.84 11.53 -1.31
Nelson 12.87 11.72 -1.15
Kaitaia 15.87 15.00 -0.87
Christchurch 11.86 10.52 -1.34
Hokitika 12.17 11.09 -1.08
Chatham I 11.58 10.36 -1.22
Invercargill 10.34 9.17 -1.17
Raoul I 19.57 19.04 -0.53
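The drops in the table above can be tallied directly. A quick sketch (my own check, not part of Homewood’s post), using the quoted figures:

```python
# (station, 1975 mean, 1976 mean) from the unadjusted figures quoted above
data = [
    ("Gisborne", 14.37, 13.37), ("Napier", 14.65, 13.69),
    ("New Plymouth", 14.22, 13.07), ("Auckland", 15.59, 14.66),
    ("Wellington", 12.84, 11.53), ("Nelson", 12.87, 11.72),
    ("Kaitaia", 15.87, 15.00), ("Christchurch", 11.86, 10.52),
    ("Hokitika", 12.17, 11.09), ("Chatham I", 11.58, 10.36),
    ("Invercargill", 10.34, 9.17), ("Raoul I", 19.57, 19.04),
]
changes = [t76 - t75 for _, t75, t76 in data]
mean_drop = sum(changes) / len(changes)
print(round(mean_drop, 2))  # -1.06: every one of the twelve stations cooled
```

The drop is negative at all twelve stations and averages about 1.06 C, which is the basis for treating 1976 as a genuine region-wide cooling rather than a local artefact.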
As a result of the adjustment, Gisborne’s temperatures for 1974 and earlier have been lowered by 0.7C. Similar sized adjustments seem to have been made at the other stations.
As the algorithm cannot have arrived at the adjustment by comparing NZ stations with each other, it must have used stations further away, presumably in Australia.
But can we really compare the two? Once again, the evidence points strongly to the adjustments being incorrect, and reacting to a genuine drop in temperature.
It is often claimed that, overall, temperature adjustments up and down largely cancel each other out. But, while we keep coming across warming adjustments that are questionable, I don’t see cooling ones similarly criticised. Maybe most of these are justifiable.
If this is the case, and many of the warming ones are not, then the overall effect would be much greater than suggested.
On the other hand, if many cooling adjustments are also incorrect, it does not inspire much confidence in the process.
APPENDIX
NCDC, NOAA, GHCN v3 station plots of above list.
https://notalotofpeopleknowthat.wordpress.com/2015/02/09/cooling-the-past-in-new-zealand/
# # #
>”As a result of the adjustment, Gisborne’s temperatures for 1974 and earlier have been lowered by 0.7C. Similar sized adjustments seem to have been made at the other stations.”
This is not an adjustment that NIWA makes in the NZ 7SS compilation used by HadCRUT4, which in turn has triple the trend of the alternative 7SS using Rhoades & Salinger (1993) methodology:
A Reanalysis of Long-Term Surface Air Temperature Trends in New Zealand – by C. R. de Freitas & M. O. Dedekind & B. E. Brill.
https://www.climateconversation.org.nz/2014/10/paper-adds-interesting-perspective-on-nz-temperature-trend/
Neither does BEST make the adjustment for the NZ stations e.g.:
GISBORNE AERODROME AWS
Breakpoint Adjusted Annual Average Comparison
http://berkeleyearth.lbl.gov/stations/157058
I’ve left this note at Paul Homewood’s post.
Gisborne Aero monthly plotted on this page (click graph to zoom in):
http://climate.unur.com/ghcn-v2/507/93292.html
There is no reason for a break between 1974 and 1979. But 1975/76 is only a 0.1 adjustment. There is a progressive cumulative adjustment in 0.1 increments adding to 0.7.
GISS raw monthly data (as plotted):
http://data.giss.nasa.gov/tmp/gistemp/STATIONS/tmp_507932920000_14_0/station.txt
GISS adj monthly data:
http://data.giss.nasa.gov/tmp/gistemp/STATIONS/tmp_507932920000_13_0/station.txt
See metANN column at far right of the data sheets.
At 1963 the cumulative adjustment is 0.7
At 1968 the cumulative adjustment is 0.6
At 1972 the cumulative adjustment is 0.5
At 1975 the cumulative adjustment is 0.4
At 1980 the cumulative adjustment is 0.3
At 1982 the cumulative adjustment is 0.2
At 1986 the cumulative adjustment is 0.1
At 2001 the cumulative adjustment is 0.1
At 2002 the cumulative adjustment is 0.0
There is no valid reason for adjustments of this nature. And there is no resemblance to the BEST adjustments: http://berkeleyearth.lbl.gov/stations/157058
Duplicated at Paul Homewood’s.
For some reason the GISS Gisborne Aero data sheet URLs return “Not found” sorry. Deleting “http://” works for the adjusted data, but for the raw data start at this page:
http://data.giss.nasa.gov/gistemp/station_data/
Select “after removing suspicious records” and then “Download monthly data as text”
I would point out that over the period of the GISS Gisborne Aero dataset adjustments, of which there appear to be about 7 in total (1962 – 2002), BEST makes no adjustments whatsoever.
The GISS Gisborne Aero 1973 cumulative adjustment is 0.5
1973 monthly raw (top) vs adjusted (bottom)
19.4 18.5 16.2 14.6 12.7 10.0 8.6 10.5 12.3 14.2 17.2 17.2
18.9 18.0 15.7 14.1 12.2 9.5 8.1 10.0 11.8 13.7 16.7 16.8
0.5 difference for every month
The 1974 – 1977 range of common cumulative adjustment is 0.4
1974 monthly raw (top) vs adjusted (bottom)
17.7 20.6 15.1 14.8 11.2 10.1 10.1 8.9 12.1 13.6 15.5 17.8
17.3 20.2 14.7 14.4 10.8 9.7 9.7 8.5 11.7 13.2 15.1 17.4
0.4 difference for every month
1977 monthly raw (top) vs adjusted (bottom)
18.4 18.9 17.8 14.5 10.9 10.1 9.4 10.4 10.2 13.4 14.9 17.5
18.0 18.5 17.4 14.1 10.5 9.7 9.0 10.0 9.8 13.0 14.5 17.2
0.4 difference for every month
The 1978 cumulative adjustment is 0.3
1978 monthly raw (top) vs adjusted (bottom)
19.2 19.5 17.6 16.4 12.0 10.0 9.7 10.3 11.3 12.0 16.0 18.0
18.9 19.2 17.3 16.1 11.7 9.7 9.4 10.0 11.0 11.7 15.7 17.7
0.3 difference for every month
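The flat per-year offsets can be verified directly from the quoted monthly values. A quick sketch using the 1974 and 1978 figures above (my own check):

```python
# Monthly raw (first list) vs GISS-adjusted (second list) values for
# Gisborne Aero in two of the years quoted above.
years = {
    1974: ([17.7, 20.6, 15.1, 14.8, 11.2, 10.1, 10.1, 8.9, 12.1, 13.6, 15.5, 17.8],
           [17.3, 20.2, 14.7, 14.4, 10.8, 9.7, 9.7, 8.5, 11.7, 13.2, 15.1, 17.4]),
    1978: ([19.2, 19.5, 17.6, 16.4, 12.0, 10.0, 9.7, 10.3, 11.3, 12.0, 16.0, 18.0],
           [18.9, 19.2, 17.3, 16.1, 11.7, 9.7, 9.4, 10.0, 11.0, 11.7, 15.7, 17.7]),
}
offsets = {}
for year, (raw, adj) in years.items():
    # set of per-month differences; a single element means one flat offset
    offsets[year] = {round(r - a, 1) for r, a in zip(raw, adj)}
    print(year, offsets[year])  # 1974 -> {0.4}, 1978 -> {0.3}
```

Each year collapses to a single offset applied uniformly to all twelve months, which is what makes the step-wise nature of the adjustment so visible.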
Apparently, according to GISS (but not BEST), there were 2 distinct 0.1 steps from 1978 to 1977 and from 1974 to 1973. Similarly for the other ranges of common cumulative adjustments.
There is no justification for these 2 steps (or the others) in view of the raw monthly data series (and no site moves): http://climate.unur.com/ghcn-v2/507/93292-zoomed.png
GISS has some explaining to do.
Cross-reference to ‘Scandal heating up’:
https://www.climateconversation.org.nz/2015/02/scandal-heating-up/
Covers GISS and BEST adjustments at Puerto Casado, Paraguay and Gisborne Aero, New Zealand along with links to the Climate Etc discussion and a bunch of other stuff.
‘Temperature adjustments in Australia’
by Euan Mearns, March 17, 2015
[…]
Using the excellent web platform provided by NASA GISS it is possible to access GHCN V2 and GHCN v3 records, compare charts and download the data. It does not take long to find V3 records that appear totally different to V2 and I wanted to investigate this further. At this point I was advised that the way homogenisation works is to adjust records in such a way that a warming trend added in one station is compensated by cooling added to another. This didn’t sound remotely scientific to me but I clicked on Alice Springs in the middle of Australia and recovered 30 V2 and V3 records in a 1000 km radius and set about a systematic comparison of the two. The results are described in detail below.
In summary I found that while individual stations are subject to large and what often appears to be arbitrary and robotic adjustments in V3, the average outcome across all 30 stations is effectively zero. At the regional level, homogenisation does not appear to be responsible for adding warming in Australia. But the thing that truly astonished me was the fact that the mean temperature trend for these 30 stations, 1880 to 2011, was a completely flat line. There has been no recorded warming across a very large portion of the Australian continent.
Continues>>>>>
http://judithcurry.com/2015/03/17/temperature-adjustments-in-australia/
Selected points:
# In Alice Springs the raw record is flat and has no sign of warming. In the adjusted record, homogenisation has added warming by significantly cooling the past. Five other stations inside the 1000 km ring have similarly long and similarly flat records – Boulia, Cloncurry, Farina, Burketown and Donors Hill. There can be no conceivable reason to presume that the flat raw Alice Springs record is somehow false and in need of adjustment.
# Six records show a significant mid-1970s cooling of about 3˚C (Alice Springs, Barrow Creek, Brunette Downs, Camooweal, Boulia and Windorah) that owing to its consistency appears to be a real signal. Homogenisation has tended to remove this real temperature history.
# The average raw temperature record for all 30 stations is completely flat from 1906 (no area weighting applied). There has been no measurable warming across the greater part of Australia. The main discontinuity in the record, pre-1906, arises from there being only 3 operating stations that do not provide representative cover.
# Homogenisation appears to have added warming or cooling to records where neither existed. Homogenisation may also have removed real climate signal.
# I find zero warming over such a large part of the Australian continent to be a surprise result that is consistent with Roger Andrews’ observation of little to no warming in the southern hemisphere, an observation that still requires more rigorous testing.
euanmearns | March 17, 2015 at 12:33
By way of a little further background. This post is a bit rough around the edges in part because it is a huge amount of work to clean the V3 data where large amounts of records are deleted and many are “created”. I was also feeling my way trying to make sense of how to treat the results. I have since moved on to look at Southern Africa and I hope these results will also be posted here. Excluding urban records that show warming trends, southern Africa looks like central Australia.
One thing I want to try and nail is how the likes of BEST manage to create warming from temperature records that are flat. I ventured on to Real Climate a few weeks ago and was told repeatedly that what GHCN and GISS were doing must be correct since BEST shows the same trends.
I have completed analysis of southern South America and Antarctica that has yet to be published. All this pretty well confirms Roger Andrews’ observation that there is little warming in the southern hemisphere, which I find a real puzzle.
‘New paper finds a large warming bias in Northern Hemisphere temperatures from ‘non-valid’ station data’
The Hockey Shtick, May 28, 2015
A new paper published in the Journal of Atmospheric and Solar-Terrestrial Physics finds that the quality of Northern Hemisphere temperature data has significantly & monotonically decreased since the year 1969, and that the continued use of ‘non-valid’ weather stations in calculating Northern Hemisphere average temperatures has created a ‘positive bias’ and “overestimation of temperatures after including non-valid stations.”
The paper appears to affirm a number of criticisms of skeptics that station losses, fabricated/infilled data, and positively-biased ‘adjustments’ to temperature data have created a positive skew to the data and overestimation of warming during the 20th and 21st centuries.
Graphs from the paper below show that use of both valid and ‘non-valid’ station data results in a mean annual Northern Hemisphere temperature over 1C warmer at the end of the record in 2013 as compared to use of ‘valid’ weather station data exclusively.
[snip]
Extraction of the global absolute temperature for Northern Hemisphere using a set of 6190 meteorological stations from 1800 to 2013
Demetris T. Christopoulos
Highlights
• Introduce the concept of a valid station and use for computations.
• Define indices for data quality and seasonal bias and use for data evaluation.
• Compute averages for mean and five point summary plus standard deviations.
• Indicate a monotonically decreasing data quality after the year 1969.
• Observe an overestimation of temperature after including non-valid stations.
http://hockeyschtick.blogspot.co.nz/2015/05/new-paper-finds-large-warming-bias-in.html
Interesting trends over regions in the new UAH Version 6.0 data:
http://www.reportingclimatescience.com/typo3temp/pics/d28b843ac4.jpg
NE USA up to NW Canada and Alaska (a little above trendless) completely different to SW USA (warming).
China and Japan only a little above trendless.
Radical warming Australia, Eastern Europe, NE Canada, Africa, Sth America. Arctic.
Vast areas of ocean trendless.
Some cooling across Antarctica.
http://www.reportingclimatescience.com/news-stories/article/new-analysis-brings-uah-temperatures-closer-to-rss.html
‘The Official Iceland Temperature Series – 2015’
January 12, 2016 by Paul Homewood
In most parts of Iceland, last year was the coldest since 2000, in marked contrast to 2014.
The Iceland Temperature Series below is built up from the seven following sites, which have long running, high quality temperature records back to 1931 and earlier. They also present a reasonable geographic distribution. It is also important to note that the temperature data has been carefully homogenised over the years by the Iceland Met Office, to adjust for station moves, equipment changes etc. (For more detail, see here).
Reykjavik
Stykkisholmur
Akureyri
Grimsstadir
Storhofoi
Teigarhorn
Haell
[…see graph…]
The warm years of the 1930’s and 40’s, and the much colder ones that followed, clearly correlate with the AMO cycle. Although the warmth has been more persistent in the last decade, only one year, 2014, has been warmer than those earlier years.
There is no evidence to suggest that temperature trends will increase in the next few years, and much to suggest that Iceland will suffer from a return of a very cold climate once the AMO turns cold again.
We can see the same patterns in the individual stations below:
[…see graphs…]
Needless to say, the adjusted GISS versions below bear no comparison.
[…see graphs…]
https://notalotofpeopleknowthat.wordpress.com/2016/01/12/iceland-temperature-series-2015/
Spectacular NOAA Temperature Fraud In Michigan
February 1, 2016 by tonyheller
Below are plotted all of the currently active individual USHCN stations in Michigan. There is only one station which shows net warming since the 1950s (Mt. Pleasant) and that one appears to have a discontinuity at 1995 causing the problem.
The NOAA hockey stick in Michigan is completely fake. It has no basis in reality.
http://realclimatescience.com/2016/02/spectacular-noaa-temperature-fraud-in-michigan/
Are land + sea temperature averages meaningful?
by Greg Goodman
Rate of change in global temperature datasets
The rate of change of near surface land air temperature as estimated in the Berkeley “BEST” dataset is very similar to the rate of change in the sea surface temperature record, except that it shows twice the rate of change.
Sea water has a specific heat capacity about 4 times that of rock. This means that rock will change in temperature four times more than water for the same change in thermal energy, for example from incoming solar radiation.
Soil, in general, is a mix of fine particles of rock and organic material with a significant water content. The two temperature records are therefore consistent with the notion of considering land as ‘moist rock’. This also partly explains the much larger temperature swings in desert regions: the temperature of dry sand will change four times faster than ocean water and be twice as volatile as non-desert land regions.
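The heat-capacity arithmetic above can be checked with a back-of-envelope sketch. The values below are illustrative assumptions only (sea water at roughly 4186 J/(kg·K), rock taken at a quarter of that, per the quoted "about 4 times" claim):

```python
# Back-of-envelope sketch of the heat-capacity argument.
# dT = Q / (m * c): for the same absorbed energy Q and mass m,
# the material with the smaller specific heat warms more.
c_water = 4186.0          # J/(kg*K), sea water (approximate)
c_rock = c_water / 4.0    # J/(kg*K), assuming the quoted 4:1 ratio

q = 1.0e6   # J of absorbed energy, identical for both materials
m = 100.0   # kg of material

dT_water = q / (m * c_water)
dT_rock = q / (m * c_rock)
print(round(dT_rock / dT_water, 6))  # rock warms ~4x as much
```

This is only the per-unit-mass comparison; real land/ocean contrasts also involve mixing depth and evaporation, which the quoted argument does not address.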
This also underlines why it is inappropriate to average land and sea temperatures, as is done in several recognised global temperature records such as HadCRUT4 (a bastard mix of HadSST3 and CRUTem4) as well as GISS-LOTI and the new BEST land and sea averages.
It is a classic case of ‘apples and oranges’. If you take the average of an apple and an orange, the answer is a fruit salad. It is not a useful quantity for physics based calculations such as earth energy budget and the impact of a radiative “forcings”.
The difference in heat capacity will skew the data in favour of the land air temperatures, which vary more rapidly, and will thus give an erroneous basis for making energy-based calculations.
https://judithcurry.com/2016/02/10/are-land-sea-temperature-averages-meaningful/
‘Historical Station Distribution’
Climate Audit, 2009
In his comment to How much Estimation is too much Estimation?, Anthony Watts suggested I create a scatter plot showing station distribution with latitude/longitude. It turned out not to be the ordeal I thought it might be, so I have posted some of the results in this thread. I started with 1885 and created a plot every 20 years, ending in 2005. I deliberately ended with 2005 because this is the final year in the GHCN record prior to the US station die-off of 2006.
Every dot on a plot represents a station, not a scribal record. Stations may be comprised of multiple records. A blue dot represents a station with an annual average that was fully calculated from existing monthly averages. A red dot represents a station that had missing monthly averages for that year, so the annual average had to be estimated. Stations that had insufficient data to estimate an annual average are not shown.
In the case where multiple scribal records exist for a station in the given year, I assigned a blue dot if all records were fully calculated from existing averages, a red dot if at least one record was estimated, and no dot if none of the records could produce an estimate. I believe this errs in the direction of assigning more blue dots than is deserved. Hansen’s bias method mathematically forces estimation to occur during the period of scribal record overlap.
The first plot shows coverage in 1885, five years into the GHCN record.
[see chart]
1905 shows improved coverage across the continental US, Japan and parts of Australia. A few stations have appeared in Africa.
[see chart]
1925 shows increased density in the western US, southern Canada, and the coast of Australia.
[see chart]
At the end of WWII, not a lot of change is noticeable other than improved coverage in Africa and South America as well as central China and Siberia.
[see chart]
In 1965 we see considerable increases in China, parts of Europe, Turkey, Africa and South America.
[see chart]
A decline in quality seems to be apparent in 1985, as many more stations show as red, indicating their averages are estimated due to missing monthly data.
[see chart]
A huge drop in stations is visible in the 2005 plot, notably Australia, China, and Canada. 2005 was the warmest year in over a century. Not surprising, as the Earth hadn’t seen station coverage like that in over a century.
[see chart]
The final plot illustrates the world-wide station coverage used to tell us “2006 Was Earth’s Fifth Warmest Year“.
[see chart]
http://climateaudit.org/2008/02/10/historical-station-distribution/
‘The Great Dying of Thermometers’
JoNova, 2010
It’s like watching the lights go out over the West. Sinan Unur has mapped the surface stations into a beautiful animation. His is 4 minutes long and spans from 1701 to 2010. I’ve taken some of his snapshots and strung them into a 10 second animation.
You can see as development spreads across the world that more and more places are reporting temperatures. It’s obvious how well documented temperatures were (once) in the US. The decay of the system in the last 20 years is stark.
Continues>>>>>
http://joannenova.com.au/2010/05/the-great-dying-of-thermometers/
‘The ‘Karlization’ of global temperature continues – this time RSS makes a massive upwards adjustment’
Anthony Watts / 7 hours ago March 2, 2016
http://wattsupwiththat.com/2016/03/02/the-karlization-of-global-temperature-continues-this-time-rss-makes-a-massive-upwards-adjustment/
# # #
Mears and Wentz’s paper was rejected by the first journal where it was submitted.
Spencer:
As Anthony Watts puts it – “Yes, yes it does”.
‘Comments on New RSS v4 Pause-Busting Global Temperature Dataset’
March 4th, 2016 by Roy W. Spencer, Ph. D.
Now that John Christy and I have had a little more time to digest the new paper by Carl Mears and Frank Wentz (“Sensitivity of satellite derived tropospheric temperature trends to the diurnal cycle adjustment”, paywalled here), our conclusion has remained mostly the same as originally stated in Anthony Watts’ post.
While the title of their article implies that their new diurnal drift adjustment to the satellite data has caused the large increase in the global warming trend, it is actually their inclusion of what the evidence will suggest is a spurious warming (calibration drift) in the NOAA-14 MSU instrument that leads to most (maybe 2/3) of the change. I will provide more details of why we believe that satellite is to blame, below.
Also, we provide new radiosonde validation results, supporting the UAH v6 data over the new RSS v4 data.
[…]
Conclusion
The evidence suggests that the new RSS v4 MT dataset has spurious warming due to a lack of correction for calibration drift in the NOAA-14 MSU instrument. Somewhat smaller increases in their warming trend are due to their use of a climate model for diurnal drift adjustment, compared to our use of an empirical approach that relies upon observed diurnal drift from the satellite data themselves. While the difference in diurnal drift correction methodology is a more legitimate point of contention, in the final analysis independent validation with radiosonde data and most reanalysis datasets suggest better agreement with the UAH product than the RSS product.
http://www.drroyspencer.com/2016/03/comments-on-new-rss-v4-pause-busting-global-temperature-dataset/
Annual Mean of the air temperature at Nuuk in West Greenland
https://notalotofpeopleknowthat.files.wordpress.com/2016/04/image61.png
From
The regime shift of the 1920s and 1930s in the North Atlantic – Drinkwater (2006)
https://notalotofpeopleknowthat.wordpress.com/2016/04/20/the-regime-shift-of-the-1920s-and-1930s-in-the-north-atlantic/
# # #
No resemblance to CO2 rise (or GISTEMP) whatsoever.
‘Steaming hot world sets temperature records’
Peter Hannam – SMH
http://www.smh.com.au/environment/climate-change/worse-things-in-store-steaming-hot-world-sets-more-temperature-records-20160419-goaf58.html
# # #
“Steaming hot” now.
Water turns into steam at 100 °C.
‘Mind Blowing Temperature Fraud In Boulder’
Posted on May 11, 2016 by tonyheller
Measured minimum temperatures in Boulder, CO show a strong cooling trend over the last 60 years, but NOAA massively tampers with the data to turn it into a warming trend.
http://realclimatescience.com/wp-content/uploads/2016/05/2016-05-11165016-1.gif
The adjustments are as much as 7.8 degrees F. [4.33 C]
http://realclimatescience.com/wp-content/uploads/2016/05/2016-05-11171342.png
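As an aside, the bracketed Celsius figure follows from converting a temperature *difference*, which uses only the 5/9 scale factor (the 32-degree offset applies to absolute temperatures, not to adjustments):

```python
# Converting a Fahrenheit temperature DIFFERENCE to Celsius:
# only the scale factor 5/9 applies, with no 32-degree offset.
def f_diff_to_c(delta_f):
    return delta_f * 5.0 / 9.0

print(round(f_diff_to_c(7.8), 2))  # -> 4.33
```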
If we look at daily temperatures for December 1957, it becomes clear how wildly fraudulent NOAA adjustments are…
Continues>>>>>>
http://realclimatescience.com/2016/05/mind-blowing-temperature-fraud-in-boulder/
# # #
Ya gotta see this. In comments ColA says:
“That means that when I was born in 1956 my body temperature was 91F not 98F?”
‘Can Both GISS and HadCRUT4 be Correct? (Now Includes April and May Data)’
justthefactswuwt / June 16, 2016
Guest Post by Werner Brozek, Excerpted from Professor Robert Brown of Duke University, Conclusion by Walter Dnes and Edited by Just The Facts:
[see graph: HadCRUT4 vs GISTEMP 1960 – 1979]
[…]
The two anomalies match up almost perfectly from the right hand edge to the present. They do not match up well from 1920 to 1960, except for a brief stretch of four years or so in early World War II, but for most of this interval they maintain a fairly constant, and identical, slope to their (offset) linear trend! They match up better (too well!), with again a very similar linear trend but yet another offset, across the range from 1880 to 1920. But across the range from 1960 to 1979, Ouch! That’s gotta hurt. Across 20 years, HadCRUT4 cools Earth by around 0.08 C, while GISS warms it by around 0.07 C.
[…]
Finally, there is the ongoing problem with using anomalies in the first place rather than computing global average temperatures. Somewhere in there, one has to perform a subtraction. The number you subtract is in some sense arbitrary, but any particular number you subtract comes with an error estimate of its own. And here is the rub:
The place where the two global anomalies develop their irreducible split is square inside the mutually overlapping part of their reference periods!
That is, the one place they most need to be in agreement, at least in the sense that they reproduce the same linear trends (that is, the same anomalies), is the very place where they most greatly differ. Indeed, their agreement is suspiciously good, as far as linear trend is concerned, everywhere else: in particular in the most recent present, where one has to presume that the anomaly is most accurately being computed, and in the most remote past, where one expects to get very different linear trends but instead gets almost identical ones!
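The baseline point can be sketched numerically with a toy series (synthetic data, not real GISS or HadCRUT values): anomalies computed from the same underlying record against two different reference periods differ only by a constant offset, so their fitted linear trends are identical. A trend mismatch inside the overlapping reference periods therefore cannot be explained by baseline choice alone.

```python
import numpy as np

# One synthetic "absolute temperature" series, 1880-2015.
rng = np.random.default_rng(0)
years = np.arange(1880, 2016)
temps = 14.0 + 0.007 * (years - 1880) + rng.normal(0, 0.1, years.size)

# Anomalies against two different reference periods (GISS- and
# HadCRUT-style base periods, used here purely for illustration).
base_a = temps[(years >= 1951) & (years <= 1980)].mean()
base_b = temps[(years >= 1961) & (years <= 1990)].mean()
anom_a = temps - base_a
anom_b = temps - base_b

# The two anomaly series differ by a constant, so trends must match.
trend_a = np.polyfit(years, anom_a, 1)[0]
trend_b = np.polyfit(years, anom_b, 1)[0]
print(np.isclose(trend_a, trend_b))  # True: offset changes, trend does not
```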
https://wattsupwiththat.com/2016/06/16/can-both-giss-and-hadcrut4-be-correct-now-includes-april-and-may-data/
# # #
>”But across the range from 1960 to 1979, Ouch! That’s gotta hurt”
Gotcha!
For the record, Gavin Schmidt’s prediction of a temperature record:
Gavin Schmidt @ClimateOfGavin
“With Apr update, 2016 still > 99% likely to be a new record (assuming historical ytd/ann patterns valid).”
Graph: Predicting the 2016 GISTEMP LOTI mean anomaly
https://pbs.twimg.com/media/CicqT6wW0AEcVuB.jpg
https://twitter.com/ClimateOfGavin/status/731599988141248512
# # #
About 1.3 C (1.15 – 1.45).
Schmidt’s “> 99% likely” 1.3 C is already dodgy. Even the 1.15 lower limit looks iffy.
LOTI Mean to end of June was 1.095 C
http://data.giss.nasa.gov/gistemp/tabledata_v3/GLB.Ts+dSST.txt
Schmidt predicts a 1.3 C anomaly for 2016 (plus or minus) but his graph uses a “Pre-industrial baseline (1880-1889)”. This is NOT the GISTEMP LOTI baseline.
Schmidt’s graph has 2015 at 1.1 C. GISS LOTI data sheet has 0.87 C for 2015. Schmidt’s 2016 prediction is 0.2 C higher than 2015. So from the LOTI data sheets 0.87 + 0.2 = 1.07 C for 2016. This is only a fraction below the YTD mean (1.095) with 6 months of data still to come in and La Nina cooling.
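The rebasing arithmetic above can be checked directly (using only the values quoted in this comment):

```python
# Schmidt's graph puts 2015 at 1.1 C against an 1880-1889 "pre-industrial"
# baseline; the GISS LOTI sheet (1951-1980 baseline) has 0.87 C for 2015.
# The implied baseline offset and the 2016 prediction in LOTI terms:
graph_2015 = 1.10   # Schmidt's graph, 1880-1889 baseline
loti_2015 = 0.87    # GISS LOTI data sheet, 1951-1980 baseline
offset = graph_2015 - loti_2015

predicted_rise = 0.2  # 2016 predicted to exceed 2015 by ~0.2 C
loti_2016 = loti_2015 + predicted_rise
print(round(offset, 2), round(loti_2016, 2))  # -> 0.23 1.07
```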
LOTI Annual
2013 0.65 ENSO-neutral
2014 0.74
2015 0.87
2016 1.07 Schmidt prediction in LOTI terms
http://data.giss.nasa.gov/gistemp/graphs/graph_data/Global_Mean_Estimates_based_on_Land_and_Ocean_Data/graph.txt
LOTI Monthly
http://data.giss.nasa.gov/gistemp/tabledata_v3/GLB.Ts+dSST.txt
Year Jan Feb Mar Apr May Jun
2016 114 133 129 109 93 79
0.54 C drop in 4 months from Feb to Jun. Only another 0.23 C drop in the YTD mean over the remaining 6 months and the 2016 mean would equal the 2015 mean (0.87). ENSO-neutral is about 0.65 and conditions will cross from ENSO-positive to ENSO-negative this year, i.e. neutral at some point before the end of the year.
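The monthly arithmetic can be verified from the LOTI table values quoted above (converted from hundredths to °C):

```python
# Jan-Jun 2016 LOTI monthly anomalies, in C (table lists hundredths).
jan_to_jun = [1.14, 1.33, 1.29, 1.09, 0.93, 0.79]
ytd_mean = sum(jan_to_jun) / len(jan_to_jun)

target_annual = 0.87  # the 2015 LOTI annual mean
# Average the remaining six months would need for 2016 to merely tie 2015:
needed = (12 * target_annual - sum(jan_to_jun)) / 6

print(round(ytd_mean, 3))                  # -> 1.095
print(round(ytd_mean - target_annual, 3))  # -> 0.225 (the "0.23 C" drop)
print(round(needed, 3))                    # -> 0.645
```

So the remaining months would need to average roughly the quoted ENSO-neutral level (~0.65) for 2016 to come in level with 2015.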
Schmidt’s playing with different anomaly baselines and baseline data has reached absurdity in this Wash Post article:
‘July was ‘absolutely’ Earth’s hottest month ever recorded’
By Jason Samenow, August 16
https://www.washingtonpost.com/news/capital-weather-gang/wp/2016/08/16/july-was-absolutely-earths-hottest-month-ever-recorded/?tid=sm_tw#comments
In the top graph (scary) he uses a 1980-2015 baseline which I assume (doesn’t say) is in respect to each specific month rather than all months (i.e. different baseline for each month – I think). July anomaly is over 2 C (scary).
But in the next LOTI map the anomaly, from normal 1951-1980 all months mean baseline, is only 0.83 C (yawn).
July isn’t even the hottest LOTI month in 2016, February was.
With LOTI July now in at 0.83 C, the YTD LOTI mean only has to fall another 0.189 C over the next 5 months and Schmidt’s “> 99% chance” of a new record is toast.
But he’ll just point to one of his other graphs (the one that suits his story best, like in the Wash Post article above) with different baseline criteria that provides him with the greatest anomaly departure.
Remains to be seen but enough cooler months will foil that plan too. In terms of LOTI, I think Schmidt must be sweating.
It might be time for you all to shift to Australia and vote One Nation:
https://twitter.com/QandA/status/765155774234439680
Yes, I saw that clip and the rather lacklustre performance from Brian Cox. He doesn’t exactly inspire me as a physicist, celebrity or otherwise.
Rock star-scientist Brian Cox confused on more than global temperatures
http://www.onlineopinion.com.au/view.asp?article=18459
>”move to Australia …..” [Re Malcolm Roberts, One Nation]
No need. Let my fingers do the walking (and talking) at JoNova. Raked over (in detail) there:
‘Malcolm Roberts on Q & A : 1 skeptic against 6 who believe the consensus’
http://joannenova.com.au/2016/08/malcolm-roberts-on-q-a-right-now/
Neither Cox nor Roberts addressed the IPCC’s primary climate change criteria. (I did at #17 – 10 “likes”. 19 ‘”likes” at #12.1.2 re CO2 disconnect). I doubt either of them have even the vaguest idea.
But in terms of temperature series, a useful resource to call up in the future. Way down at #30.1.1 I placed a comment re GISS “adjustments” to Gisborne Aero NZ. That comment, and others nearby, cross-links to Judith Curry’s Climate Etc, Paul Homewood’s notalotofpeopleknowthat, Euan Mearns’ Energy Matters, Climate4you, and back to ‘Temperature Records’ here at CCG from Paul Homewood’s post. Further down are other GISS and BEST “adjustment” case studies, e.g. Puerto Casado. Zeke Hausfather did a runner on that one.
#30.1.1
http://joannenova.com.au/2016/08/malcolm-roberts-on-q-a-right-now/#comment-1827020
Also:
‘Brian Cox thinks 17,000 employees at NASA always produce perfect graphs. NASA employees disagree. Who to believe?’
http://joannenova.com.au/2016/08/brian-cox-thinks-17000-employees-at-nasa-always-produce-perfect-graphs-nasa-employees-disagree-who-to-believe/
GISTEMP is rubbish and provably so but at least it’s a devil we know. And the data after the El Nino peak isn’t on Schmidt’s “long-term trend” anymore despite his frantic Tweeting. I think Schmidt’s on a hiding to nothing over the next 2, 3, 4 years.
NASA GISS temperature is topical (Brian Cox v Malcolm Roberts on Q & A). Suggest this Google search:
nasa giss iceland adjustments reykjavik
https://www.google.co.nz/webhp?complete=0&hl=en&gws_rd=cr&ei=2Fe1V9bXBsKx0gTO-Z6ABg#complete=0&hl=en&q=nasa+giss+iceland+adjustments+reykjavik
Start reading.
Gavin Schmidt is ‘flummoxed’ by Roberts apparently:
‘Malcolm Roberts leaves NASA ‘flummoxed’ with Q&A climate claims’
http://www.smh.com.au/environment/climate-change/malcolm-roberts-leaves-nasa-%27flummoxed%27-with-q&a-climate-claims-20160815-gqt9a4.html
Doesn’t appear to realize NASA provides space platforms for temperature monitoring that contradict GISTEMP (or that NASA is the National Aeronautics and Space Administration).
‘Rock star-scientist Brian Cox confused on more than global temperatures’
By Jennifer Marohasy – posted Thursday, 18 August 2016
[…. 3 pages….click “ALL” at bottom……]
Cox may not care too much for facts. He is not only a celebrity scientist, but also a rock star. Just the other day I was watching a YouTube video of him playing keyboard as the lead-singer of the band screamed, “We don’t need a reason”.
There was once a clear distinction between science – that was about reason and evidence – and art that could venture into the make-believe including through the re-interpretation of facts. This line is increasingly blurred in climate science where data is now routinely remodeled to make it more consistent with global warming theory.
For example, I’m currently working on a 61-page expose of the situation at Rutherglen. Since November 1912, air temperatures have been measured at an agricultural research station near Rutherglen in northern Victoria, Australia. The data is of high quality; therefore, there is no scientific reason to apply adjustments in order to calculate temperature trends and extremes. Mean annual temperatures oscillate between 15.8°C and 13.4°C. The hottest years are 1914 and 2007; there is no overall warming trend. The hottest summer was in 1938–1939, when Victoria experienced the Black Friday bushfire disaster. This 1938–39 summer was 3°C hotter than the average maximum summer temperature at Rutherglen for the entire period: December 1912 to February 2016. Minimum annual temperatures also show significant inter-annual variability.
In short, this temperature data, like most of the temperature series from the 112 sites used to concoct the historical temperature record by the Australian Bureau of Meteorology, does not accord with global warming theory.
So, adjustments are made by the Australian Bureau of Meteorology to these temperature series before they are incorporated into the Australian Climate Observations Reference Network – Surface Air Temperature (ACORN-SAT); and also the UK Met Office’s HadCRUT dataset, which informs IPCC deliberations.
The temperature spike in 1938-1939 is erroneously identified as a statistical error, and all temperatures before 1938 adjusted down by 0.62°C. The most significant change is to the temperature minima with all temperatures before 1974, and 1966, adjusted-down by 0.61°C and 0.72°C, respectively. For the year 1913, there is a 1.3°C difference between the annual raw minimum value as measured at Rutherglen and the remodelled value.
The net effect of the remodelling is to create statistically significant warming of 0.7 °C in the ACORN-SAT mean temperature series for Rutherglen, in general agreement with anthropogenic global warming theory.
NASA applies a very similar technique to the thousands of stations used to produce the chart that Cox held up on Monday night during the Q&A program. I discussed these changes back in 2014 with Gavin Schmidt, who oversees the production of these charts at NASA. I was specifically complaining about how they remodel the data for Amberley, a military base near where I live in Queensland.
Back in 2014, the un-adjusted mean annual maximum temperatures for Amberley – since recordings were first made in 1941 – show temperatures trending up from a low of about 25.5°C in 1950 to a peak of almost 28.5°C in 2002. The minimum temperature series for Amberley showed cooling from about 1970. Of course this does not accord with anthropogenic global warming theory. To quote Karl Braganza from the Bureau as published by online magazine The Conversation, “Patterns of temperature change that are uniquely associated with the enhanced greenhouse effect, and which have been observed in the real world include… Greater warming in winter compared with summer… Greater warming of night time temperatures than daytime temperatures”.
The Bureau has “corrected” this inconvenient truth at Amberley by jumping-up the minimum temperatures twice through the homogenization process: once around 1980 and then around 1996 to achieve a combined temperature increase of over 1.5°C.
This is obviously a very large step-change, remembering that the entire temperature increase associated with global warming over the 20th century is generally considered to be in the order of 0.9°C.
According to various peer-reviewed papers, and technical reports, homogenization as practiced in climate science is a technique that enables non-climatic factors to be eliminated from temperature series – by making various adjustments.
It is often done when there is a site change (for example from a post office to an airport), or equipment change (from a Glaisher Stand to a Stevenson screen). But at Amberley neither of these criteria can be applied. The temperatures have been recorded at the same well-maintained site within the perimeter of the air force base since 1941. Through the homogenization process the Bureau have changed what was a cooling trend in the minimum temperature of 1.0°C per century into a warming trend of 2.5°C per century.
Homogenization – the temperature adjusting done by the Bureau – has not resulted in some small change to the temperatures as measured at Amberley, but rather a change in the temperature trend from one of cooling to dramatic warming as was done to the series for Rutherglen.
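The trend-flipping effect described above can be illustrated with synthetic data. This is not the actual Amberley record; the slope, step size, and step year below are assumptions chosen only to mirror the magnitudes quoted in the article:

```python
import numpy as np

# Synthetic series: mild cooling of ~1.0 C/century plus noise, 1941-2012.
rng = np.random.default_rng(1)
years = np.arange(1941, 2013)
raw = 20.0 - 0.01 * (years - 1941) + rng.normal(0, 0.2, years.size)

# Apply a single homogenization-style step: lower all pre-1980 values.
adjusted = raw.copy()
adjusted[years < 1980] -= 1.5  # illustrative 1.5 C step, as quoted

# Fitted linear trends, expressed in C per century.
trend_raw = np.polyfit(years, raw, 1)[0] * 100
trend_adj = np.polyfit(years, adjusted, 1)[0] * 100
print(trend_raw < 0 < trend_adj)  # cooling before the step, warming after
```

The point of the sketch is only that a large step adjustment to one part of a series mechanically changes the fitted trend, whatever the merits of the adjustment itself.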
NASA’s Goddard Institute for Space Studies (GISS) based in New York also applies a jump-up to the Amberley series in 1980, and makes other changes, so that the annual average temperature for Amberley increases from 1941 to 2012 by about 2°C.
The new Director of GISS, Gavin Schmidt, explained to me on Twitter back in 2014 that: “@jennmarohasy There is an inhomogenity detected (~1980) and based on continuity w/nearby stations it is corrected. #notrocketscience”.
When I sought clarification regarding what was meant by “nearby” stations I was provided with a link to a list of 310 localities used by climate scientists at Berkeley when homogenizing the Amberley data.
The inclusion of Berkeley scientists was perhaps to make the point that all the key institutions working on temperature series (the Australian Bureau, NASA, and also scientists at Berkeley) appreciated the need to adjust-up the temperatures at Amberley. So, rock star scientists can claim an absolute consensus?
But these 310 “nearby” stations stretch to a radius of 974 kilometres and include Frederick Reef in the Coral Sea, Quilpie post office and even Bourke post office. Considering the un-adjusted data for the six nearest stations with long and continuous records (old Brisbane aero, Cape Moreton Lighthouse, Gayndah post office, Bundaberg post office, Miles post office and Yamba pilot station), the Bureau’s jump-up for Amberley creates an increase in the official temperature trend of 0.75°C per century.
Temperatures at old Brisbane aero, the closest of these stations, also show a long-term cooling trend. Indeed, perhaps the cooling at Amberley is real. Why not consider this, particularly in the absence of real physical evidence to the contrary? In the Twitter conversation with Schmidt I suggested it was nonsense to use temperature data from radically different climatic zones to homogenize Amberley, and repeated my original question asking why it was necessary to change the original temperature record in the first place. Schmidt replied, “@jennmarohasy Your question is ill-posed. No-one changed the trend directly. Instead procedures correct for a detected jump around ~1980.”
If Twitter was around at the time George Orwell was writing the dystopian fiction Nineteen Eighty-Four, I wonder whether he might have borrowed some text from Schmidt’s tweets, particularly when words like, “procedures correct” refer to mathematical algorithms reaching out to “nearby” locations that are across the Coral Sea and beyond the Great Dividing Range to change what was a mild cooling-trend, into dramatic warming, for an otherwise perfectly politically-incorrect temperature series.
http://www.onlineopinion.com.au/view.asp?article=18459&page=0
# # #
Warmies, including Cox, are clueless about this controversy.
Cox flashed this GISS graph:
Global Mean Estimates based on Land and Ocean Data [Mouse-over for annual anomaly values]
http://data.giss.nasa.gov/gistemp/graphs/graph_data/Global_Mean_Estimates_based_on_Land_and_Ocean_Data/graph.html
Apart from the estimate being merely “based on” data (i.e. not actual data), and that the 1998 El Nino relativity has been erased, there is a further eye teaser concerning the LOWESS smoothing (red line) at the end point 2015. In other words, the red line guides the eye but not the brain.
2004 to 2011 is the “hiatus/pause” in the red line. After 2011 the red line hikes up abruptly due to the El Nino spike. 2013 (0.66) was ENSO-neutral.
GISS appear to be assuming, given Gavin Schmidt’s 2016 prediction and reasoning (see Tweets), that the red line will continue upwards to a new record high (anomaly 0.87+) as per 1998 in the graph rather than return to neutral (anomaly 0.66 ish) and a continued “hiatus”.
I’m guessing, given the monthly data trajectory, that instead of continuing upwards what we will see in a couple of years is just a spike in the red line smoothing, but time will tell.
Basically, what I’m getting at is that the LOWESS smoothing (red line) of annual mean datapoints 2011 – 2015 is no indication of the series trajectory from 2015 onwards. But GISS and their fan base like Cox, and his fans (“fanbois” – Dellingpole) are convinced the red line is, actually, the data trajectory.
This series progression will be make or break between now and about 2018 for Schmidt and GISS given their posturing. We could see some “adjustments” of course. But then they still have the satellites to contend with.
Schmidt’s July update Tweet:
Gavin Schmidt @ClimateOfGavin
Graph: Predicting the 2016 GISTEMP LOTI mean anomaly
https://pbs.twimg.com/media/Cp6rnONW8AAEoW6.jpg
https://twitter.com/ClimateOfGavin/status/765237770839269378
‘More On The NOAA Africa Fraud’
Posted on September 21, 2016 by tonyheller
As I reported earlier, NOAA shows Angola as their hottest month ever in August. [see chart]
Amazing, since NOAA doesn’t actually have any thermometer readings in Angola – which is almost twice the size of Texas. [see chart]
The RSS TLT August anomaly for central Angola was 0.53, slightly above average and nowhere near a record. There has been almost no trend in Angola temperatures since the start of records in 1978. [see chart]
NOAA and NASA are defrauding the American people and the world with their junk science, which is used as the basis for policy.
http://realclimatescience.com/2016/09/more-on-the-noaa-africa-fraud/
‘NASA Joins In The NOAA African Fraud’
Posted on September 21, 2016 by tonyheller
NOAA claimed yesterday that Angola and Namibia had their hottest month ever last month, even though they don’t have any thermometers there.
Continues at length >>>>>>
http://realclimatescience.com/2016/09/nasa-joins-in-the-noaa-african-fraud/
‘Analysis: NASA’s ‘record heat’ in SW Africa is based on one tampered station, located next to asphalt in middle of rapidly growing city’
http://www.climatedepot.com/2016/09/21/analysis-nasas-record-heat-in-sw-africa-is-based-on-one-tampered-station-located-next-to-asphalt-in-middle-of-rapidly-growing-city/
Hello all
I am looking for an old televised climate change debate, but I cannot find it. I thought it was about 10 years old and that John Campbell was the moderator… but nothing is popping up in Google so I must have this wrong.
From memory it was a debate between two experts/scientists. An early exchange was an expert showed a graph of rising temperatures. The opposing expert then explained that recent data was purposely excluded because it did not follow the trend.
Is there any chance someone knows the debate I’m thinking of, or who was in it?
I’ve been advised that the climate change debate I’m looking for happened more like 20 years ago, and was between David Wratt and Chris de Freitas on TV1. If someone knows what show that was on, or has a copy of the debate, that would be great. Cheers.