The degree Celsius symbol

Does °C stand for degree Celsius rather than degree centigrade?

By the middle of the 17th century, sensitive thermometers were available, but there was no serious attempt to establish a universal scale for the measurement of temperature, since there was not even a clear scientific understanding of what temperature itself was. In 1669, Father Honoré Fabry, in Leiden, calibrated his thermometers by first “putting them on ice” and then “in water heated to the maximum degree” (he did not go so far as to say boiling). Some years later the French physicist and naturalist René Antoine Ferchault de Réaumur adopted the ice and boiling-water points, but assigned them the values 0 and 80.


With the beginning of the 18th century, Daniel Gabriel Fahrenheit began his work, key to the development of thermometry, which was partly based on that of the Danish astronomer Olaus Römer, who had built a thermometer based on two fixed points. It seems that a misinterpretation of Römer’s scale led Fahrenheit to construct his own: the point that Römer described as water at blood temperature, Fahrenheit took literally, when Römer probably meant only warm water. In any case, Fahrenheit’s great merit was that he was able to build stable thermometers and a reproducible scale. He was not the first to suggest creating a scale based on fixed points, but he was the first to take full advantage of the method thanks to good thermometers.

It was in 1742 that Anders Celsius published an article in “Kungliga Swenska Wetenskaps Academiens Handlingar”, the annals of the Royal Swedish Academy of Sciences, entitled “Observations on two constant degrees in a thermometer”. This article was the origin of the Celsius temperature scale. In it, after summarizing the different ways of expressing temperature in use at the time, Celsius presented the experiments he had carried out to build his temperature scale on two fixed points: that of snow or melting ice and that of boiling water. Initially Celsius assigned 0° to the boiling point and 100° to the solidification point, but this assignment was soon reversed, probably by Daniel Ekström, who made most of the scientific instruments that Celsius used.

At the end of the 19th century, for the comparison of length standards, the International Committee for Weights and Measures adopted, as the normal thermometric scale for the International Bureau of Weights and Measures, the centigrade scale of the constant-volume hydrogen thermometer, which had as fixed points the temperature of melting ice (0 °C) and that of boiling distilled water (100 °C). In this way, the unit of temperature would be the “degree centigrade”, corresponding to one hundredth of the interval (at normal pressure) between the melting point of ice and the boiling point of water.

The degree centigrade continued to be used as the unit of temperature until 1948, when the Consultative Committee for Thermometry discussed the need to replace the ice point, whose reproducibility was of the order of one thousandth of a degree, with the triple point of water, which was better defined and whose reproducibility was, at that time, of the order of one ten-thousandth of a degree. The possibility was also studied, suggested as early as 1854 by Lord Kelvin, of defining the thermodynamic temperature by means of a single fixed point, instead of defining it by the interval from 0 °C to 100 °C. This “revolutionary” idea was finally approved, not without much discussion, and the triple point of water became the pillar of the temperature scale; it was assigned the value of 0.01 °C (or 273.16 K).

The General Conference on Weights and Measures of October 1948 accepted the principle of the single fixed point, but decided to wait before assigning it a numerical value. The same Conference decided to replace the name “degree centigrade” with “degree Celsius”, to avoid the confusion caused by the older name (in some languages, “centigrade” also denoted a centesimal unit of angle). It was not until the 10th General Conference on Weights and Measures, in 1954, that the value 273.16 K for the triple point of water was officially adopted.

This decision had a surprising result: for a few years, two different kelvins coexisted. One was defined as the fraction 1/273.16 of the thermodynamic temperature of the triple point of water, and the other, the kelvin of the 1948 centigrade scale, as one hundredth of the temperature interval between the boiling point of water and the melting point of ice. Fortunately the two were very close to each other, but it was certainly an awkward situation for the metrologists of the time.

Later, very wisely, it was decided to amend the text of the 1948 International Temperature Scale, and in 1960 a corrected edition was published by the 11th General Conference on Weights and Measures. This corrected scale was called the International Practical Temperature Scale and defined the kelvin as the fraction 1/273.16 of the thermodynamic temperature of the triple point of water. In this way, the International Temperature Scale definitively ceased to be a centigrade scale: with the triple point of water adopted as the primary fixed point, at a temperature of 0.01 °C, the interval separating it from the boiling point of water was no longer exactly 100 °C.

The internationally accepted scale today is the International Temperature Scale of 1990, which establishes that the base unit of the physical quantity thermodynamic temperature (symbol T) is the kelvin (symbol K), still defined as the fraction 1/273.16 of the thermodynamic temperature of the triple point of water. Furthermore, since expressing a temperature by its difference from 273.15 K (the melting point of ice) is common practice, the Celsius temperature (symbol t) remains in use, defined by:

t / °C = T / K − 273.15

The unit of Celsius temperature is the degree Celsius (symbol °C), which is, by definition, equal in magnitude to the kelvin.
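
The relation above is easy to check in code. Below is a minimal Python sketch (not part of the original text; the constant and function names are my own) that applies the definition in both directions and recovers the 0.01 °C value of the triple point mentioned earlier:

# A minimal sketch (not from the original article): the relation
# t/°C = T/K - 273.15 expressed as two small conversion helpers.
# Names and structure are illustrative, not an official API.

ICE_POINT_K = 273.15      # 0 °C expressed in kelvin
TRIPLE_POINT_K = 273.16   # triple point of water, assigned exactly 0.01 °C

def kelvin_to_celsius(T: float) -> float:
    """Celsius temperature t obtained from thermodynamic temperature T."""
    return T - ICE_POINT_K

def celsius_to_kelvin(t: float) -> float:
    """Thermodynamic temperature T obtained from Celsius temperature t."""
    return t + ICE_POINT_K

print(kelvin_to_celsius(TRIPLE_POINT_K))   # ~0.01 (up to floating-point rounding)
print(celsius_to_kelvin(100.0))            # 373.15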