Science Facts - Temperature Standards

To most people, the measurement of temperature is very easy - just use a thermometer.

And for everyday purposes - where an error of one or two degrees is acceptable - it is that easy.

However, for scientific purposes, the measurement of temperature is actually very difficult.

In school, I was taught that water freezes at 32F (degrees Fahrenheit) and boils at 212F.

Later, in science classes, we used the metric scale where water freezes at 0C (degrees Centigrade) and boils at 100C.

While researching various subjects related to atmospheric physics, I learned that water freezes at 0.01C (degrees Celsius) and boils at 99.974C.
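The three freezing/boiling pairs above differ only in how each scale was calibrated - the conversion between Fahrenheit and Celsius itself is exact by definition. A minimal sketch (not from the original article):

```python
def c_to_f(c):
    """Convert degrees Celsius to degrees Fahrenheit (exact by definition)."""
    return c * 9.0 / 5.0 + 32.0

def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32.0) * 5.0 / 9.0

print(c_to_f(0))       # 32.0
print(c_to_f(100))     # 212.0
print(c_to_f(99.974))  # the modern boiling point, about 211.95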

To the casual observer, this makes no difference - water still boils when it boils, and the temperature depends more on the atmospheric pressure than on some definition. (In the 3 examples above, the atmospheric pressure must be 1013.25 hPa - by definition.) The following table shows the expected boiling points at various pressures (computed using the IAPWS saturation vapor pressure formulation [as provided via Vömel 2011]).
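Boiling points like those in the table can be computed by finding the temperature at which the saturation vapor pressure equals the ambient pressure. A sketch (not the author's water vapor programs) that inverts the IAPWS saturation-pressure correlation (the Wagner and Pruss equation, one of the formulations collected by Vömel) by bisection:

```python
import math

# IAPWS-95 saturation vapor pressure correlation (Wagner & Pruss).
TC = 647.096        # critical temperature of water, K
PC = 220640.0       # critical pressure, hPa
A = (-7.85951783, 1.84408259, -11.7866497,
     22.6807411, -15.9618719, 1.80122502)
POWERS = (1.0, 1.5, 3.0, 3.5, 4.0, 7.5)

def saturation_pressure_hPa(t_kelvin):
    """Saturation vapor pressure over liquid water, in hPa."""
    tau = 1.0 - t_kelvin / TC
    s = sum(a * tau ** p for a, p in zip(A, POWERS))
    return PC * math.exp(TC / t_kelvin * s)

def boiling_point_C(pressure_hPa):
    """Invert the saturation curve by bisection: T where e_s(T) = pressure."""
    lo, hi = 273.16, TC
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if saturation_pressure_hPa(mid) < pressure_hPa:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0 - 273.15

print(round(boiling_point_C(1013.25), 2))  # about 99.97
print(round(boiling_point_C(835.0), 1))    # roughly 95
```

At 1013.25 hPa this reproduces the 99.974C figure quoted above; 835 hPa is close to the standard-atmosphere pressure a mile up, and the resulting boiling point (roughly 95C, about 203F) is consistent with the "full 10F below expected" mentioned below for Denver.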

For the 3 higher pressures, an error of +/- 2F might not be noticed. However, at 1 mile up (Denver), water boiling a full 10F below the expected 212F means that food will need to be cooked a bit longer.

Note: Computations of this type can be made with my water vapor programs.


The temperature scales are defined by international standards. One of the earliest temperature scales was defined in 1724 by the German physicist Daniel Gabriel Fahrenheit. In 1742, Swedish astronomer Anders Celsius created a different scale with 100 divisions between the freezing point of water and the boiling point. Both of these scales had various problems that have been addressed through the years.

Without getting too deep into the physics, the following is a brief summary. (See BIPM - Thermodynamic and practical temperature scales - for details.)

Since the physical properties of water depend on its average atomic mass, it is necessary to define "standard water" as having a specific isotopic composition .. which is why VSMOW (Vienna Standard Mean Ocean Water) was needed.
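To see why the isotopic composition matters, the average molar mass of VSMOW water can be computed from its defining isotope ratios. The ratios and atomic masses below are assumptions taken from standard tables (not from this article):

```python
# Isotopic abundance ratios defining VSMOW (relative to the light isotope).
R_D   = 155.76e-6    # 2H/1H
R_17O = 379.9e-6     # 17O/16O
R_18O = 2005.20e-6   # 18O/16O

# Atomic masses in u (standard tabulated values, rounded).
M_1H, M_2H = 1.0078250, 2.0141018
M_16O, M_17O, M_18O = 15.9949146, 16.9991315, 17.9991596

mean_H = (M_1H + R_D * M_2H) / (1 + R_D)
x16 = 1 / (1 + R_17O + R_18O)           # mole fraction of 16O
mean_O = x16 * (M_16O + R_17O * M_17O + R_18O * M_18O)

molar_mass = 2 * mean_H + mean_O
print(round(molar_mass, 5))   # about 18.01527 g/mol
```

Water with a different isotope mix (e.g. heavier polar ice or lighter precipitation) has a slightly different molar mass, and therefore slightly different triple point and vapor pressure - which is exactly why the standard had to pin down one composition.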

Below the triple point, the temperature scale is defined by 2 points - zero K and the triple point using water of a specific composition (VSMOW). Above the triple point, a different calibration method is used. For IPTS-68, the boiling point of water at 100C (and 1 atm) was used. ITS-90 changed the calibration to use the melting and freezing points of a number of pure metals. As a result, the boiling point of water became a measured value instead of a defining value.

Stated another way, using atmospheric pressure to define a temperature scale is a bad idea.

The latest standard is adequate for normal usage since most thermometers are only accurate to about 0.1F and, before the digital age, were read to only the nearest half degree. But there is still an issue - basically, ITS-90 defines temperature over 4 different ranges, each with its own calibration and interpolation methods. As a result, when extending the ITS-90 zero to 273.16 K range upward, the boiling point of water is about 373.1339 K (99.9839C). However, when using the range defined by pure metals, the boiling point of water is 373.124 K (99.974C). (ref) While this difference is small, it affects the equations used in many branches of science.
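The size of that discrepancy is easy to put in perspective using the two values just quoted:

```python
bp_extended = 373.1339   # K, from extending the zero-to-273.16 K realization
bp_metals   = 373.124    # K, from the metal fixed-point realization

diff_K = bp_extended - bp_metals
print(round(diff_K, 4))            # 0.0099 K
print(round(diff_K * 9 / 5, 4))    # about 0.018 F
```

About a hundredth of a kelvin - far below what any ordinary thermometer can resolve, yet large enough to matter when two branches of science calibrate against different realizations of the scale.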

Note: The primary source, Preston-Thomas H., The International Temperature Scale of 1990 (ITS-90), is paywalled at $33 per peek!

Author: Robert Clemenzi
URL: http:// / Science_Facts / Temperature_Standards.html