Dedicated to the balanced discussion of global warming
ClimatePolice.com – May 11, 2007
I think this blog entry is worthy of your attention. It discusses the differences between measuring temperature with ‘modern’ computer-based sensors and with traditional thermometers. The comments are also quite interesting; it could be argued they are more educational than the original post on the actual use of these two measuring devices and their individual biases.
I have real concerns about how we analyze temperature. I do not find it mathematically valid to calculate ‘average’ temperatures. While an argument can be made that it is valid to average temperature over a short period (probably less than an hour) at a single location, it is likely not valid to average it over longer periods (such as an entire day), and definitely not across multiple locations.
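To make the day-long averaging concern concrete, here is a small Python sketch comparing the traditional daily ‘average’ – the midpoint of the recorded high and low – against the time-weighted mean of hourly readings. The hourly numbers are made up for illustration, not real observations; the point is that because a day's warming and cooling are not symmetric, the two ‘averages’ can disagree.

```python
# Hypothetical hourly temperatures (deg F) for one asymmetric day:
# flat overnight lows, a fast morning warm-up, a slow evening cool-down.
# These numbers are illustrative only, not actual observations.
hourly = [60, 60, 60, 60, 60, 62, 66, 72, 80, 88, 92, 94,
          95, 94, 92, 90, 88, 85, 80, 75, 70, 66, 63, 61]

# Traditional daily "average": midpoint of the recorded high and low.
midrange = (max(hourly) + min(hourly)) / 2

# Time-weighted mean of all 24 readings.
true_mean = sum(hourly) / len(hourly)

print(f"midrange: {midrange:.2f}  true mean: {true_mean:.2f}")
```

With these made-up numbers the midrange comes out at 77.5 while the time-weighted mean is closer to 75.5 – the same day, two different ‘averages’.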
Temperature is an amalgamation of several physical properties of the body being measured. In the case of atmospheric temperature, it reflects radiant energy from surrounding objects, energy directly from the sun, and the energy level of the air surrounding the instrument. It also depends on how much energy that air can absorb, which is largely influenced by the amount of water in the air, its pressure, and how much it is moving.
In global warming discussions we tend to confuse temperature with the amount of heat energy stored in the atmosphere. If we measured the kilocalories or joules in the atmosphere at any location and any time, we could validly average those amounts over time at that location, or across multiple locations. But performing mathematical operations on temperatures is simply not valid, since we are adding different things together. In reality, the high temperature yesterday at my house was not just 100 deg F (we set a record); it was 100 deg F with the wind blowing a certain direction at a particular speed, a pressure of X, and a humidity of Y, resulting in Z joules per cubic foot of air. Since today is supposed to be warmer than yesterday, we could take the same measurement today and find out whether there was more heat today or yesterday – the temperature itself is actually quite irrelevant.
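As an illustration of this temperature-versus-heat distinction, here is a rough Python sketch that estimates the heat (enthalpy) content of a cubic meter of moist air from temperature, pressure, and relative humidity. The formulas are standard textbook approximations (the Magnus formula for saturation vapor pressure, the ideal gas law for density); the function name and constants are my own choices, not from the post, and this is a simplification rather than a full thermodynamic treatment.

```python
import math

def heat_content(temp_f, pressure_hpa, rel_humidity):
    """Approximate heat (enthalpy) content of moist air, in J per cubic
    meter, relative to dry air at 0 deg C. Textbook approximations only."""
    t_c = (temp_f - 32) * 5 / 9          # convert deg F to deg C
    t_k = t_c + 273.15

    # Saturation vapor pressure (hPa), Magnus approximation
    e_sat = 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))
    e = rel_humidity * e_sat             # actual vapor pressure (hPa)

    # Mixing ratio: kg of water vapor per kg of dry air
    w = 0.622 * e / (pressure_hpa - e)

    # Specific enthalpy of moist air (J/kg): sensible heat of the dry
    # air plus latent and sensible heat of the water vapor
    h = 1005 * t_c + w * (2.5e6 + 1860 * t_c)

    # Density via the ideal gas law with a virtual-temperature correction
    t_virtual = t_k * (1 + 0.61 * w)
    rho = pressure_hpa * 100 / (287.05 * t_virtual)  # kg per cubic meter

    return rho * h                       # J per cubic meter

# Same 100 deg F afternoon, two different humidities
humid = heat_content(100, 1013, 0.80)
dry = heat_content(100, 1013, 0.10)
```

Under these assumptions, the humid cubic meter at 100 deg F carries more than twice the energy of the dry one – which is exactly the point: the thermometer reading alone doesn't tell you how much heat is actually there.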
Thermometers evolved from the observation that certain liquids expand at a consistent rate with the amount of heat they are exposed to. Scientists at the time created several scales that correlated this physical expansion to a number we now call ‘degrees’. With modern electronics, we instead map the electrical behavior of a particular metal to those same degrees. We must not confuse these indirect measurements with the core of what we are trying to measure – the amount of heat in the atmosphere at that location.
Weather observations have been taken around the world for centuries. Up until the early 1980s, a majority of the temperature observations were taken with a Liquid in Glass (LIG) mercury thermometer. Special LIG thermometers, known as minimum and maximum thermometers, were used to record the daily high and low temperature.
In the 1980s, technology allowed sensors to become automated and computerized. Instead of recording the absolute maximum or minimum, the new digital thermometers use an algorithm to calculate the high and low temperatures: they sample once every 60 seconds and compute a running 5-minute average.
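The post doesn't spell out the exact algorithm, but the described behavior – one sample per minute, a running five-minute mean, and the daily high and low taken from those means rather than from the raw samples – can be sketched in Python as follows (the function names are mine, for illustration):

```python
from collections import deque

def running_five_minute_averages(samples):
    """Given one temperature sample per minute, return the running
    5-minute averages (one per minute once the window fills)."""
    window = deque(maxlen=5)
    averages = []
    for s in samples:
        window.append(s)
        if len(window) == 5:
            averages.append(sum(window) / 5)
    return averages

def daily_extremes(samples):
    """Daily high/low as an automated sensor might report them:
    the max and min of the running averages, not of the raw samples."""
    avgs = running_five_minute_averages(samples)
    return max(avgs), min(avgs)
```

For example, a single one-minute spike to 101 among steady 70-degree readings gets smoothed to 76.2 in the averages, so the reported daily high sits well below the raw peak – one way this method can diverge from a max-registering LIG thermometer, which would latch the full 101.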
A joint study conducted by the University of Nebraska and the National Climatic Data Center demonstrated that the difference between the automated sensors and LIG thermometers was between 0.15 and 0.5 degrees C.
Click through here to read the entire posting and the relevant comments.
Did you know that you can have these articles emailed to you? Click on the Subscribe to email link in the upper right corner, fill out the details, and you are set. No one will see your email address and you won’t get more spam by doing this.