Since records of surface temperature began, national metrological institutes (that is not a typo) have revised the fixed-point standards used to define temperature scales. Assuming that all meteorological measurements through time have been made to such standards (which may be a considerable stretch), these revisions would have imparted changes to the records that are not physical in origin. As part of MeteoMet, efforts have been made to understand this effect. It is relatively small compared to other long-recognized data issues, but it is nevertheless important to consider all sources of potential bias as systematically and exhaustively as possible.
The work itself was led by Peter Pavlasek of the Slovak Institute of Metrology. His introduction is reproduced below:
Temperature is one of the main quantities measured in meteorology and plays a key role in weather forecasts and climate determination. The instrumental temperature record now spans well over a century, with some records extending back to the 17th century, and represents an invaluable tool for evaluating historic climatic trends. However, ensuring the quality of
the data records is challenging, with issues arising from the wide range
of sensors used, how the sensors were calibrated, and how the data was
recorded and written down. In particular, the very definition of the temperature scales has evolved. While they have always been based on
calibration of instruments via a series of material phase transitions
(fixed points), the evolution of sensors, measuring techniques and
revisions of the fixed points used has introduced differences that may
lead to difficulties when studying historic temperature records. The conversion program presented here deals with this issue for 20th-century
data by implementing a proposed mathematical model to allow the
conversion from historical scales to the currently adopted International
Temperature Scale of 1990 (ITS-90). This program can convert large
files of historical records to the current international temperature
scale, a feature which is intended to help in the harmonisation
processes of long historic series. This work is part of the project
“MeteoMet”, funded by EURAMET, the European Association of National Metrology Institutes, and is part of a broader effort to identify the various sources of uncertainty in climate and meteorological records.
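To make the scale-conversion idea concrete, here is a minimal sketch (not the MeteoMet program itself). It uses the widely quoted linear approximation t90 − t68 ≈ −0.00025 × t68, which holds roughly between 0 °C and 630 °C; the official conversion between IPTS-68 and ITS-90 uses higher-order polynomial fits, and records on pre-1968 scales need additional terms. The function name and the example readings are illustrative only.

```python
def t68_to_t90(t68):
    """Convert a temperature in degrees Celsius from IPTS-68 to ITS-90.

    Uses the simple linear approximation t90 - t68 ~ -0.00025 * t68,
    adequate near ambient temperatures; the official conversion is a
    higher-order polynomial fit.
    """
    return t68 - 0.00025 * t68

# Illustrative readings recorded on IPTS-68, converted to ITS-90
readings_t68 = [9.8, 15.2, 21.4]
readings_t90 = [t68_to_t90(t) for t in readings_t68]
```

At a reading of 100 °C the approximation gives a correction of −0.025 °C, close to the tabulated IPTS-68/ITS-90 difference of −0.026 °C at that temperature; near 10 °C the correction is only a few thousandths of a degree, which is why the effect on climate records is small but systematic.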
Michael de Podesta, who has served on the steering committee since ISTI's inception, reviewed the software for ISTI and had the following summary:
Assuming that new calibration procedures spread immediately throughout the world, homogenisation algorithms might conceivably detect adjustments in 1968, with smaller adjustments in 1990.
If undetected, the effect would be to create a bias in the temperature record. This is difficult to calculate since the bias is temperature dependent, but if the mean land-surface temperature is ~10 °C and temperature excursions are typically ±10 °C, then one might expect records prior to 1968 to have been systematically overestimated by about 0.005 °C, and records between 1968 and 1990 by about 0.003 °C.
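The temperature dependence of the bias can be illustrated with a short calculation. This sketch averages a simple assumed correction function (the linear IPTS-68 to ITS-90 approximation, −0.00025 × t) over a set of temperatures spanning ±10 °C about a 10 °C mean; it is not Michael's calculation, and the pre-1968 bias is larger because it also involves the 1948 scale.

```python
def correction_t68_to_t90(t):
    # Assumed linear approximation to the IPTS-68 -> ITS-90 correction,
    # in degrees Celsius; used here purely for illustration.
    return -0.00025 * t

# Temperatures spanning excursions of +/-10 C about a 10 C mean
temps = [0.0, 5.0, 10.0, 15.0, 20.0]

# Because the correction depends on temperature, the mean bias of a record
# depends on the distribution of observed temperatures, not just their mean.
mean_bias = sum(correction_t68_to_t90(t) for t in temps) / len(temps)
```

Here the mean correction comes out at about −0.0025 °C, i.e. readings on the older scale sit a few thousandths of a degree high relative to ITS-90, the same order as the ~0.003 °C figure quoted above for 1968–1990.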
Michael's full summary which includes some graphical and tabular summaries can be found here.
The code package runs on the Windows operating system. It is available here.