What could be a more everyday scientific instrument than a thermometer? But let’s not forget that hundreds of years ago, a thermometer – a device that can measure temperature – was a cutting-edge piece of technology.
Daniel Gabriel Fahrenheit, born in May 1686, was the Dutch-German physicist who invented the mercury thermometer in 1714 – 300 years ago – and the alcohol thermometer a little before that, in 1709. As well as inventing the first modern thermometer, he also gave his name to the Fahrenheit temperature scale, which is still in use today in some countries.
The forerunner of the thermometer was the thermoscope, which we can think of as a thermometer without a scale. A thermoscope could only show differences in temperature; that is to say, it could indicate that something was getting hotter, but it could not measure the exact temperature in degrees. In 1593, the great Galileo Galilei invented a rudimentary water thermoscope which, for the first time, allowed temperature variations to be observed. It worked on the principle that the buoyancy of water changes with temperature.
Another Italian, Santorio Santorio, became the first inventor to put a numerical scale on his thermoscope, in 1612. His invention was perhaps the first clinical thermometer, but it wasn't very accurate. Then, in 1654, yet another Italian – Ferdinand II, Grand Duke of Tuscany – invented the first sealed liquid-in-glass thermometer, which removed the effect of air pressure on the expansion of the liquid column. Ferdinand used alcohol as the liquid that expanded with temperature. Nevertheless, his instrument was still inaccurate and used no standardized scale.
So how do thermometers work? At their most basic, thermometers measure temperature by using materials that change in some way when they are heated or cooled. In a mercury or alcohol thermometer the liquid expands as it is heated and contracts as it is cooled, so the column of liquid is longer or shorter depending on the temperature. Equally importantly, a thermometer has a scale that lets us read off the temperature according to the position of the column of liquid.
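As a rough illustration of that idea – assuming, purely for the sake of the sketch, that the liquid expands linearly between two calibration marks – reading the temperature amounts to a simple linear interpolation. The function name and numbers below are hypothetical, not taken from any real instrument.

```python
def read_temperature(column_mm, freeze_mark_mm, boil_mark_mm,
                     freeze_temp=0.0, boil_temp=100.0):
    """Turn a liquid-column height into a temperature reading.

    Assumes (hypothetically) that the liquid expands linearly between
    two calibration marks, here the freezing and boiling points of
    water on a Celsius-style scale.
    """
    fraction = (column_mm - freeze_mark_mm) / (boil_mark_mm - freeze_mark_mm)
    return freeze_temp + fraction * (boil_temp - freeze_temp)

# Example: the column sits 57 mm up a tube calibrated at 20 mm (0 degrees)
# and 220 mm (100 degrees), giving a reading of 18.5 degrees.
print(read_temperature(57, 20, 220))
```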
The aforementioned Mr Fahrenheit was the first to bring us the mercury thermometer – you know, the kind that gets shoved into your mouth as a kid when you have 'a temperature'. Indeed, we can still detect the medical heritage of the Fahrenheit scale: human body temperature was originally set close to 100°, at 96° on the Fahrenheit scale, although this has since been adjusted to 98.4°F to make it more accurate. The Fahrenheit scale divides the interval between the freezing and boiling points of water into 180 degrees, with 32°F as the freezing point and 212°F as the boiling point; 0°F was based on the temperature of an equal mixture of water, ice, and ammonium chloride salt – a frigorific mixture that stabilizes its own temperature automatically.
In 1742, Anders Celsius introduced his own "centigrade" scale, which, as its name suggests, divides the interval between the freezing point and the boiling point of pure water (at sea-level air pressure) into 100 degrees. The name "Celsius" was adopted in 1948 by an international conference on weights and measures.
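Because both scales pin down the same two fixed points – water freezing and boiling – converting between them is straightforward arithmetic: 100 Celsius degrees cover the same interval as 180 Fahrenheit degrees, offset so that freezing sits at 32°F. Here is a minimal sketch of that arithmetic; the function names are mine, purely for illustration.

```python
def celsius_to_fahrenheit(c):
    # 100 Celsius degrees span the same interval as 180 Fahrenheit degrees,
    # and the Fahrenheit scale places the freezing point of water at 32.
    return c * 180 / 100 + 32

def fahrenheit_to_celsius(f):
    return (f - 32) * 100 / 180

print(celsius_to_fahrenheit(100))   # 212.0 – boiling point of water
print(fahrenheit_to_celsius(98.4))  # about 36.9 – body temperature
```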
As physics advanced, scientists found that they needed to extend their scales to measure the ultimate extremes of hot and cold. As a result, the renowned British physicist Lord Kelvin extended the Celsius scale down to "absolute zero" – a theoretical state at which the enthalpy and entropy of a cooled ideal gas reach their minimum values – now agreed to be equivalent to −273.15°C. Although we can't actually achieve absolute zero, near this temperature matter exhibits quantum effects such as superconductivity and superfluidity.
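Since the kelvin scale simply shifts the Celsius scale so that absolute zero sits at 0, the conversion is a single offset. Again, a minimal sketch with a hypothetical function name.

```python
ABSOLUTE_ZERO_C = -273.15  # absolute zero expressed on the Celsius scale

def celsius_to_kelvin(c):
    # The kelvin uses the same degree size as Celsius,
    # but the scale starts counting from absolute zero.
    return c - ABSOLUTE_ZERO_C

print(celsius_to_kelvin(-273.15))  # 0.0 – absolute zero
print(celsius_to_kelvin(0))        # 273.15 – freezing point of water
```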