Monthly Archives: July 2012
The accurate measurement of temperature is vital across a broad spectrum of human activities, including industrial processes (e.g. steel making), manufacturing, monitoring (in food transport and storage), and health and safety. In fact, in almost every sector, temperature is one of the key parameters to be measured.
History of Thermometry
The means of accurately measuring temperatures has long fascinated people. One of the differences between temperature and other physical concepts, such as mass or length, is that it is subjective: different people will have different perceptions of what is hot and what is cold. To make objective measurements, we must use a thermometer in which some physical property of a substance changes with temperature in a reliable and reproducible way.
Thermoscopes, the ancestors of modern thermometers, have been around since about 200 BC. The first recognisable, modern thermometers were made in the late 16th and early 17th centuries by the Italians Galileo Galilei and Santorio Santorio, a professor of medicine at Padua. The latter produced a thermometer incorporating a scale, and his writings show that he understood the importance of temperature measurement in the diagnosis of disease. The first sealed thermometer was made by the Grand Duke Ferdinand II of Tuscany in 1641; it was more accurate than its predecessors because its reading did not depend on atmospheric pressure. Later, the scientists Fahrenheit and Celsius both made glass thermometers containing mercury, and used reference points (the melting point of pure ice and the boiling point of water) to improve their accuracy.
Types of Thermometer
Liquid-in-glass, in particular mercury, thermometers have been used for almost 300 years in science, medicine, metrology and in industry. They rely on the expansion of a fluid with temperature. The fluid is contained in a sealed glass bulb and the temperature is read using a scale etched along the stem of the thermometer.
Platinum resistance thermometer
In the modern world, mercury and spirit-filled thermometers have largely given way to electrical devices, which can be digitised and automated. Platinum resistance thermometers are electrical thermometers which make use of the variation of resistance of high-purity platinum wire with temperature. This variation is predictable, enabling accurate measurements to be performed. They are sensitive and, with sophisticated equipment, measurements can routinely be made to better than a thousandth of a degree Celsius.
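As a rough illustration of how such a resistance reading is turned into a temperature, here is a minimal Python sketch using the Callendar–Van Dusen equation with the standard IEC 60751 coefficients for an industrial Pt100 element (an assumption for illustration; laboratory standard thermometers are calibrated individually against the ITS-90):

```python
from math import sqrt

# Standard IEC 60751 coefficients for an industrial Pt100 element,
# valid for temperatures from 0 degC upwards.
R0 = 100.0        # resistance in ohms at 0 degC
A = 3.9083e-3     # per degC
B = -5.775e-7     # per degC squared

def pt100_resistance(t_c):
    """Resistance of a Pt100 element at t_c degC (Callendar-Van Dusen, t >= 0)."""
    return R0 * (1 + A * t_c + B * t_c ** 2)

def pt100_temperature(r_ohm):
    """Invert the quadratic to recover temperature from a resistance reading."""
    return (-A + sqrt(A ** 2 - 4 * B * (1 - r_ohm / R0))) / (2 * B)
```

For example, a reading of about 138.5 ohms corresponds to 100 °C; in practice the instrument also has to account for lead-wire resistance and self-heating of the sensor.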
Thermocouples are the most common sensors in industrial use. They have a long history, the original paper on thermoelectricity by Seebeck being published in 1822. They consist of two dissimilar metallic conductors joined at the point of measurement. When the conductors are heated a voltage is generated in the circuit, and this can be used to determine the temperature.
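The voltage-to-temperature relationship can be sketched with a deliberately crude linear model. The 41 µV/°C figure below is the approximate Seebeck coefficient of a common type K thermocouple near room temperature (an assumption for illustration; real instruments use standard reference polynomials and cold-junction compensation rather than this shortcut):

```python
# Crude linear model of a thermocouple: the loop voltage is roughly
# proportional to the temperature difference between the measuring
# junction and the reference ("cold") junction.
SEEBECK_K = 41e-6   # volts per degC, approximate for a type K thermocouple

def thermocouple_voltage(t_hot_c, t_ref_c=0.0):
    """Approximate loop voltage for a junction at t_hot_c degC."""
    return SEEBECK_K * (t_hot_c - t_ref_c)

def thermocouple_temperature(v_volts, t_ref_c=0.0):
    """Invert the linear approximation to recover the junction temperature."""
    return t_ref_c + v_volts / SEEBECK_K
```

Note how small the signal is: a junction at 100 °C above the reference produces only about 4 mV, which is why thermocouple instrumentation needs sensitive, well-shielded amplifiers.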
Radiation thermometers (or pyrometers)
Radiation thermometers, or pyrometers, make use of the fact that all objects emit thermal radiation, as seen when looking at the bars of an electric fire or a light bulb. The amount of radiation emitted can be measured and related to temperature using the Planck law of radiation. Temperatures can be measured remotely using this technique, with the sensor situated some distance away from the object. Hence it is useful for objects that are very hot, moving or in hazardous environments.
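The relationship just described can be sketched numerically. The code below evaluates Planck's law at a single wavelength and inverts it, as an idealised single-wavelength pyrometer would (a sketch only: it assumes a perfect black body, whereas a real instrument must also correct for the emissivity of the surface it views):

```python
from math import exp, log

# Physical constants (rounded values)
h = 6.626e-34   # Planck constant, J s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def planck_radiance(wavelength_m, t_kelvin):
    """Spectral radiance of a black body (W per sr per m^3), Planck's law."""
    return (2 * h * c ** 2 / wavelength_m ** 5) / (
        exp(h * c / (wavelength_m * k * t_kelvin)) - 1)

def radiance_temperature(wavelength_m, radiance):
    """Invert Planck's law at one wavelength: an ideal pyrometer reading."""
    return (h * c / (wavelength_m * k)) / log(
        1 + 2 * h * c ** 2 / (wavelength_m ** 5 * radiance))
```

A round trip through these two functions recovers the original temperature, which is exactly the principle that lets a pyrometer stand well back from a furnace or a moving steel strip.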
The two temperature scales commonly in use today date from the eighteenth century and are named after Daniel Gabriel Fahrenheit and the Swedish astronomy professor Anders Celsius. Fahrenheit designed his scale to have two reference points that could be set up in his workshop. He originally chose the melting point of pure ice and the temperature of a normal human body, which he took as being 32° and 96° respectively. These conveniently gave positive values for all the temperatures he encountered. Later he changed to using the boiling point of water (212°) as the upper fixed point of the scale.
Celsius also used the ice and steam points, but took them to be 0°C and 100°C respectively. Although the Celsius scale has taken precedence over the Fahrenheit scale, the latter is still familiar in weather reports in the United Kingdom: a summer’s day temperature of 75°F seems much more pleasant than one of 23°C!
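The conversion between the two scales follows directly from their fixed points (0 °C = 32 °F and 100 °C = 212 °F), as a two-line sketch shows:

```python
# Converting between the Fahrenheit and Celsius scales described above.
def fahrenheit_to_celsius(t_f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (t_f - 32) * 5 / 9

def celsius_to_fahrenheit(t_c):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return t_c * 9 / 5 + 32
```

The article's weather-report example checks out: 75 °F works out at about 23.9 °C.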
A third, fundamental, temperature scale was proposed in 1854 by the Scottish physicist William Thomson, Lord Kelvin. It is based on the idea of the absolute zero, the point of no discernible energy, which is independent of any particular material substance. The Kelvin scale is widely used by physicists and engineers to determine and apply fundamental laws of thermodynamics.
Triple point of water cell – definition of the kelvin
The International Temperature Scale of 1990 (ITS-90)
Since 1954 the unit of (thermodynamic) temperature has been the kelvin, defined as the fraction 1/273.16 of the thermodynamic temperature of the triple point of water. This is the unique temperature and pressure at which the three phases of water (solid, liquid and vapour) co-exist in equilibrium. It is fractionally higher than the melting point, being 0.01°C or 273.16 K. From this single point it is possible to generate a thermodynamic temperature scale using gas thermometers and radiation thermometers, which accurately obey known laws.
Such experiments are not easy and are rarely done, but good values have been established for a series of fixed points: freezing points of pure metals at high temperatures and triple points of gases at low temperatures. These are incorporated into the International Temperature Scale so that standard platinum resistance thermometers and radiation thermometers can be calibrated with excellent reproducibility. The National Physical Laboratory maintains the temperature scale (currently the International Temperature Scale of 1990, the ITS-90) in the UK, and compares this with the ITS-90 maintained in other national laboratories. In this way temperature standards around the world can be accurately equivalent, and all manner of thermometers can be reliably calibrated for everyday use.
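The fixed numerical relationship between the Celsius and Kelvin scales can be written down in a couple of lines; note how the triple point of water at 0.01 °C comes out at exactly 273.16 K:

```python
# The Celsius scale is simply the Kelvin scale shifted by a fixed offset.
def celsius_to_kelvin(t_c):
    """Convert a Celsius temperature to kelvin."""
    return t_c + 273.15

def kelvin_to_celsius(t_k):
    """Convert a kelvin temperature to Celsius."""
    return t_k - 273.15
```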
Future of Thermometry
The international temperature community is working towards a redefinition of the kelvin. This would be based on a fundamental constant of nature known as the Boltzmann constant. The advantage of this is that the new definition would be freed from any physical artefact (i.e. the triple point of water) and would allow the use of any appropriate thermodynamic method for temperature measurement.
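The idea behind such a redefinition can be sketched in a line of arithmetic: with the Boltzmann constant fixed, a temperature maps directly onto a characteristic thermal energy, with no reference to any material artefact (the numerical value below is a rounded CODATA value, used here purely for illustration):

```python
# With the Boltzmann constant k_B fixed, temperature becomes a measure of
# thermal energy: E = k_B * T.
k_B = 1.381e-23   # Boltzmann constant, J/K (rounded value)

def thermal_energy_joules(t_kelvin):
    """Characteristic thermal energy k_B * T at absolute temperature T."""
    return k_B * t_kelvin
```

At the triple point of water (273.16 K) this gives an energy of roughly 3.8 × 10⁻²¹ J, which gives a feel for how tiny the energies are that thermometry ultimately measures.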
The Spectrum of Temperature
200 million °C — The Joint European Torus (JET) nuclear fusion project, Culham, Oxfordshire
15 million °C — Temperature of the centre of the Sun
6 000 °C — Temperature of the surface of the Sun
1 200 °C to 1 500 °C — Molten glass / steel
1 064 °C — Melting point of gold
100 °C — Boiling point of pure water at one atmosphere of pressure
0 °C — Freezing point of pure water at one atmosphere of pressure
–89.2 °C — Lowest air temperature recorded on Earth
–196 °C — Cryogenic storage in liquid nitrogen
–270 °C — Temperature of the cosmic microwave background radiation
For more information visit www.npl.co.uk
The Large Hadron Collider
Our understanding of the Universe is about to change…
The Large Hadron Collider (LHC) is a gigantic scientific instrument near Geneva, where it spans the border between Switzerland and France about 100m underground. It is a particle accelerator used by physicists to study the smallest known particles – the fundamental building blocks of all things. It will revolutionise our understanding, from the minuscule world deep within atoms to the vastness of the Universe.
Two beams of subatomic particles called “hadrons” – either protons or lead ions – travel in opposite directions inside the circular accelerator, gaining energy with every lap. Physicists use the LHC to recreate the conditions just after the Big Bang, by colliding the two beams head-on at very high energy. Teams of physicists from around the world then analyse the particles created in the collisions using special detectors in a number of experiments dedicated to the LHC.
There are many theories as to what will result from these collisions. For decades, the Standard Model of particle physics has served physicists well as a means of understanding the fundamental laws of Nature, but it does not tell the whole story. Only experimental data using the high energies reached by the LHC can push knowledge forward, challenging those who seek confirmation of established knowledge, and those who dare to dream beyond the paradigm.
How the LHC works
The LHC, the world’s largest and most powerful particle accelerator, is the latest addition to CERN’s accelerator complex. It mainly consists of a 27-kilometre ring of superconducting magnets with a number of accelerating structures to boost the energy of the particles along the way.
Inside the accelerator, two beams of particles travel at close to the speed of light with very high energies before colliding with one another. The beams travel in opposite directions in separate beam pipes – two tubes kept at ultrahigh vacuum. They are guided around the accelerator ring by a strong magnetic field, achieved using superconducting electromagnets. These are built from coils of special electric cable that operates in a superconducting state, efficiently conducting electricity without resistance or loss of energy. This requires chilling the magnets to about -271°C – a temperature colder than outer space. For this reason, much of the accelerator is connected to a distribution system of liquid helium, which cools the magnets, as well as to other supply services.
Thousands of magnets of different varieties and sizes are used to direct the beams around the accelerator. These include 1232 dipole magnets of 15m length which are used to bend the beams, and 392 quadrupole magnets, each 5–7m long, to focus the beams. Just prior to collision, another type of magnet is used to “squeeze” the particles closer together to increase the chances of collisions. The particles are so tiny that the task of making them collide is akin to firing needles from two positions 10km apart with such precision that they meet halfway!
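A standard accelerator-physics relation, magnetic rigidity, shows why the dipole fields must be so strong: the bending radius of a singly charged beam is fixed by its momentum and the field. The sketch below uses the LHC design values of 7 TeV per proton and 8.33 T dipoles (figures taken as assumptions from CERN's published specifications):

```python
# Magnetic rigidity for a singly charged particle:
#   p [GeV/c] ~= 0.2998 * B [T] * rho [m]
# Rearranged, this gives the bending radius of the beam in a dipole field.
def bending_radius_m(momentum_gev, field_tesla):
    """Radius of curvature (m) for a singly charged beam in a dipole field."""
    return momentum_gev / (0.2998 * field_tesla)
```

For 7 TeV protons in 8.33 T dipoles the bending radius comes out at roughly 2.8 km, which is why the ring needs 1232 dipoles and a 27 km circumference.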
All the controls for the accelerator, its services and technical infrastructure are housed under one roof at the CERN Control Centre. From here, the beams inside the LHC are made to collide at four locations around the accelerator ring, corresponding to the positions of the particle detectors.
Higgs within reach
Proton-proton collision in the CMS experiment producing four high-energy muons (red lines). The event shows characteristics expected from the decay of a Higgs boson but it is also consistent with background Standard Model physics processes (Image: CMS)
At a seminar on 4 July, the ATLAS and CMS experiments at CERN presented their latest results in the search for the long-sought Higgs boson. Both experiments see strong indications for the presence of a new particle, which could be the Higgs boson, in the mass region around 126 gigaelectronvolts (GeV).
Both ATLAS and CMS gave the level of significance of the result as 5 sigma on the scale that particle physicists use to describe the certainty of a discovery. A one-sigma result could easily be a random fluctuation in the data, three sigma counts as evidence, and a 5-sigma result meets the threshold for claiming a discovery. The results presented today are preliminary, as the data from 2012 is still under analysis. The complete analysis is expected to be published around the end of July.
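The sigma scale can be turned into a probability with a line of standard statistics: the convention is the one-sided probability that a background fluctuation alone would produce a result at least as extreme (a sketch of the convention only, not of the experiments' actual statistical analysis):

```python
from math import erfc, sqrt

def p_value(n_sigma):
    """One-sided tail probability of a Gaussian fluctuation of n_sigma or more."""
    return 0.5 * erfc(n_sigma / sqrt(2))
```

On this convention a 5-sigma result corresponds to a probability of about 3 in 10 million that background alone could mimic the signal, which is why it is taken as the discovery threshold.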
The Higgs field has been likened to a kind of cosmic “treacle” spread through the universe.
According to Prof Higgs’s theory, the field interacts with the tiny particles that make up atoms, and weighs them down so that they don’t just whizz around space at the speed of light.
Since then people have been trying to prove that the Higgs Field really exists.
Prof Higgs predicted that the field would have a signature particle, a massive boson, and it is this particle that the CERN experiments now appear to have identified.
Visit http://public.web.cern.ch/public/ for more details.