From the Archives – An excerpt from a previous column, published in the Ponca City News Midweek in June 2017.
Our ability to measure temperature comes from the Zeroth Law of Thermodynamics: if object A is in thermal equilibrium with object B, and object B is in thermal equilibrium with object C, then A and C must be in thermal equilibrium with each other. Typically, object B is a thermometer, and we can then say that objects A and C are at the same temperature.
Our understanding of how hot or cold something is depends upon the scale of the thermometer we are using. Here in the United States, our common scale is the Fahrenheit scale, while most of the rest of the world uses Celsius. So, a 90-degree day in the US would be about 32 degrees C. The Celsius scale uses two points of reference to establish 0 and 100 degrees: zero is the temperature at which water freezes, while 100 degrees is the point at which water boils. There is another scale in common use, the Kelvin scale, which uses as its reference the point where molecules have no kinetic motion, i.e., absolute zero. Conversion between the Kelvin and Celsius scales is very easy: 0 degrees Celsius is 273.15 Kelvin, and 100 degrees Celsius is 373.15 Kelvin. You simply add 273.15 to go from Celsius to Kelvin, or subtract 273.15 to go from Kelvin to Celsius.
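For readers who like to see the arithmetic spelled out, the Celsius-Kelvin conversion can be sketched in a few lines of Python. (This is an illustrative sketch of the rule described above; the function names are my own, not from the column.)

```python
def celsius_to_kelvin(c):
    # The two scales differ only by a constant offset of 273.15.
    return c + 273.15

def kelvin_to_celsius(k):
    # Going the other way, subtract the same offset.
    return k - 273.15

print(celsius_to_kelvin(0))    # 273.15, the freezing point of water
print(celsius_to_kelvin(100))  # 373.15, the boiling point of water
```

Because the offset is the only difference, a one-degree change on one scale is exactly a one-degree change on the other.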
But what about the Fahrenheit scale? On this scale, water freezes at 32 degrees and boils at 212 degrees. And anyone who has ever tried to convert between the two scales quickly realizes that the conversion is not as straightforward as going between Kelvin and Celsius. To convert from Fahrenheit to Celsius, you must first subtract 32 from the temperature in Fahrenheit, then multiply the result by 5 and divide by 9. Yikes! It raises the question: how did the Fahrenheit scale become what it is?
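The subtract-then-multiply recipe above can likewise be sketched in Python (again, an illustrative example with names of my own choosing):

```python
def fahrenheit_to_celsius(f):
    # Subtract 32, then multiply the result by 5 and divide by 9.
    return (f - 32) * 5 / 9

def celsius_to_fahrenheit(c):
    # Reverse the steps: multiply by 9/5, then add 32.
    return c * 9 / 5 + 32

print(fahrenheit_to_celsius(212))  # 100.0, the boiling point of water
print(fahrenheit_to_celsius(90))   # about 32.2, the hot summer day mentioned earlier
```

Unlike the Kelvin case, there is both an offset (the 32) and a scaling factor (the 5/9), which is why the conversion feels so much more cumbersome.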
The invention of the first modern thermometer is attributed to Daniel Fahrenheit in 1714. However, his thermometer was a refined version of the prior work of others, who had developed devices that could compare temperatures but had no standardized scale and thus weren't very accurate. Olaus Roemer (also seen spelled as Romer) had developed a device with a scale built on two reference points: 7.5 degrees where ice melts and 60 degrees for the temperature of boiling water. Building upon Roemer's work, Fahrenheit opted to base his scale on three easily repeatable points: the coldest was a cool solution of water, ice, and an ammonium chloride salt (a brine); the others were the freezing point of water and human body temperature. After further experimentation, he settled upon a scale that used the brine solution to mark zero, set the freezing point of water at 32 degrees, and placed 180 increments, or degrees, between the freezing point and the boiling point of water, 212 degrees. (Author's note: while I could find no particular reference as to why Fahrenheit chose 180 degrees, I would speculate that it is related to other common "degree" references, i.e., the sum of the three angles in a triangle equals 180 degrees, and one-half of a circle is 180 degrees. There is a reference indicating that he chose 30 degrees and 90 degrees as points for ease of marking the scale on the thermometer.)
Shortly after Fahrenheit's thermometer, Anders Celsius proposed an alternative scale with 100 degrees between the freezing point of water and the boiling point. (Celsius originally set zero at the boiling point and 100 at the freezing point; the scale would be reversed later.) As we know, both scales are still in use. And there are still thermometers in existence built by Fahrenheit: in October 2012, a thermometer built by Fahrenheit sold for over $86,000 at auction. Although the conversions between the two scales may be cumbersome, there was some logic to the scale established by Fahrenheit.