Q & A To Understand The Temperature Record
Zeke Hausfather of Berkeley Earth (BEST) states in Understanding Adjustments to Temperature Data (http://judithcurry.com/2014/07/07/understanding-adjustments-to-temperature-data/):
……Nearly every single station in the network has been moved at least once over the last century, with many having 3 or more distinct moves. Most of the stations have changed from using liquid in glass thermometers (LiG) in Stevenson screens to electronic Minimum Maximum Temperature Systems (MMTS) or Automated Surface Observing Systems (ASOS). Observation times have shifted from afternoon to morning at most stations since 1960, as part of an effort by the National Weather Service to improve precipitation measurements.
All of these changes introduce (non-random) systemic biases into the network. For example, MMTS sensors tend to read maximum daily temperatures about 0.5 C colder than LiG thermometers at the same location. There is a very obvious cooling bias in the record associated with the conversion of most co-op stations from LiG to MMTS in the 1980s, and even folks deeply skeptical of the temperature network like Anthony Watts and his coauthors add an explicit correction for this in their paper…..
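The MMTS cooling bias Hausfather describes amounts to a step change at the changeover date. A minimal sketch of how such a step correction could be applied, assuming an illustrative 0.5 C offset and a known changeover index (this is not BEST's actual pairwise algorithm, just the basic idea):

```python
# Sketch: correcting a step bias from an instrument change.
# Assumptions (illustrative only): the station switched from LiG to
# MMTS at a known index, and MMTS reads daily maxima about 0.5 C
# colder than LiG at the same site.
MMTS_BIAS_C = 0.5          # assumed LiG-minus-MMTS offset in daily maxima
changeover = 5             # assumed index of the instrument change

raw_tmax = [30.1, 29.8, 30.4, 30.0, 29.9,   # LiG era
            29.4, 29.6, 29.3, 29.5, 29.7]   # MMTS era (reads low)

# Add the offset back to the post-changeover readings so the series
# is homogeneous with the earlier LiG record.
adjusted = [t if i < changeover else t + MMTS_BIAS_C
            for i, t in enumerate(raw_tmax)]
```

Without some correction like this, the changeover shows up as a spurious cooling step in the station's record.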
Nearly every single station in the network has been moved at least once over the last century, with many having 3 or more distinct moves.
From what I understand, they do some fancy-dancy statistics looking for ‘breaks’ in the data and use that as an excuse to adjust the data.
I can see two reasons why that is completely bogus.
#1. They are complaining about station moves of, generally speaking, less than a few miles, but they have ZERO problem smearing the temperature of 1 (ONE) data station over half the Arctic or half the Antarctic. Does Not Compute…Does Not Compute…
#2. Meteorology: A Text-book on the Weather, the Causes of Its Changes, and Weather Forecasting By Willis Isbister Milham 1918
On page 68 he says a thermometer in a Stevenson screen is correct to within a half degree. It is most in error on still days, hot or cold. “In both cases the indications of the sheltered thermometers are too conservative.” On page 70:
“The Ventilated thermometer which is the best instrument for determining the real air temperature, was invented by Assman at Berlin in 1887…will determine the real air temperature correctly to a tenth of a degree.”
He then goes on to say on page 77:
If a good continuous thermograph record for at least twenty years is available, the normal hourly temperatures for the various days of the year can be computed….
“the average temperature for a day is found by averaging the 24 values of hourly temperature observed during that day”
If the normals are based on twenty years of observations, it will be found that there is not an even transition from day to day, but jumps of even two or three degrees occur….
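Milham's procedure is plain arithmetic: a day's mean is the average of its 24 hourly readings, and a normal is that day's mean averaged over the same calendar day across the years of record. A toy sketch with fabricated numbers (three "years" instead of twenty):

```python
# Sketch of Milham's procedure (fabricated numbers, 3 "years" not 20).
# Daily mean = average of the 24 hourly readings for that day.
def daily_mean(hourly):
    assert len(hourly) == 24
    return sum(hourly) / 24.0

# One illustrative day: 10 C overnight, peaking at 22 C mid-afternoon.
day = [10 + 12 * max(0, 1 - abs(h - 15) / 9) for h in range(24)]
mean_today = daily_mean(day)

# Normal for a calendar day = that day's mean averaged across years.
# Here the "years" are just the same day shifted by -1, 0, +1 degrees.
years = [[t + offset for t in day] for offset in (-1.0, 0.0, 1.0)]
normal = sum(daily_mean(y) for y in years) / len(years)
```

Even with this averaging, Milham's point stands: real twenty-year normals still show jumps of two or three degrees between adjacent calendar days.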
I may be reading the situation wrong, but that says to me that the ‘breaks’ in the data that BEST uses may well be naturally occurring jumps in the data from a chaotic system. If BEST and GISS are doing ‘adjustments’ based only on those jumps or breaks, without any underlying testing such as running both sets of instruments side by side for a few years to determine the true bias, then all they are doing is introducing additional error into the system. Heck, I am not sure they are even determining whether the station was actually moved or had some other system change, like a replacement thermometer.
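A crude version of a break test compares the mean before and after each candidate point and flags any shift bigger than some threshold. The sketch below is not BEST's actual statistics (theirs are far more elaborate, using neighbor comparisons), but it shows why the worry is at least coherent: a bare step test flags the jump whether it came from a station move or from the weather.

```python
# Crude break detection (illustrative only -- NOT BEST's actual method):
# flag a point if the mean after it differs from the mean before it
# by more than a threshold. On its own, such a test cannot tell a
# station move from a genuine weather-driven jump: both look like a step.
def find_breaks(series, window=3, threshold=1.5):
    breaks = []
    for i in range(window, len(series) - window):
        before = sum(series[i - window:i]) / window
        after = sum(series[i:i + window]) / window
        if abs(after - before) > threshold:
            breaks.append(i)
    return breaks

# A series with a genuine 2-degree jump (no station move involved).
series = [15.0, 15.2, 14.9, 15.1, 17.1, 17.0, 17.2, 16.9]
flagged = find_breaks(series)  # flags index 4, where the jump occurs
```

This is why homogenization methods that rely on break statistics alone, without station metadata or side-by-side instrument comparisons, are open to exactly the objection raised above.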
Q: Why do you massively tamper with the temperature data, and turn a cooling trend into a warming trend?
A: You can’t use the raw surface data. It is a mess: affected by station moves, time-of-observation bias (TOBS), different types of thermometers, missing data, etc. It needs large adjustments to be meaningful.
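Of those, time-of-observation bias is the least intuitive: a max thermometer only resets when it is read, so reading it in the warm afternoon can let one hot reading count toward two "days". A toy illustration with synthetic hourly temperatures (this demonstrates the effect, it is not a real TOBS correction):

```python
# Toy illustration of time-of-observation bias (TOBS) -- not a real
# correction. A max thermometer resets only when read, so an afternoon
# reading time lets one hot afternoon bleed into the next day's record.
hot_day = [20 + 10 * max(0, 1 - abs(h - 15) / 8) for h in range(24)]
cool_day = [t - 6 for t in hot_day]
hourly = hot_day + cool_day   # 48 hours: a hot day, then a cooler day

def daily_maxima(temps, reset_hour):
    # Split the record into 24-hour observation "days" starting at
    # reset_hour and take the maximum within each.
    maxima = []
    i = reset_hour
    while i + 24 <= len(temps):
        maxima.append(max(temps[i:i + 24]))
        i += 24
    return maxima

midnight = daily_maxima(hourly, 0)    # true calendar-day maxima
afternoon = daily_maxima(hourly, 17)  # read just after the daily peak
# The afternoon-read "day", which is mostly the cool day, still records
# the hot afternoon's tail, so it reads warmer than the cool day's
# actual maximum.
```

Shifting reading times from afternoon to morning, as the National Weather Service did, therefore changes recorded averages even when nothing about the climate changed, which is why a TOBS adjustment exists at all.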
Q: Sounds bad. Why not just use modern satellite data instead? It has much better coverage.
A: Satellite data has its own problems and needs adjustments too. The surface data is our most reliable data source.