“It’s not the heat, it’s the humidity.” This old adage, heard throughout much of the summer in the eastern U.S., refers to how the amount of water vapor in the air affects human comfort. Since the body’s main source of cooling is the evaporation of perspiration, the more moisture there is in the air, the less evaporation takes place and the warmer we feel. Two common ways to indicate atmospheric moisture content are relative humidity and the dew point temperature.
Relative humidity (RH) measures the actual amount of moisture in the air compared to the total amount of moisture the air can hold. It is expressed as a percentage and is commonly used in general weather reports and apps. A high RH can produce fog, and a low RH can cause rapid dehydration in both people and plants – important information for sectors such as farmers and crews fighting wildfires. But since warm air can hold more moisture than cool air, the relative humidity changes as the air temperature changes.
The dew point temperature, on the other hand, is an absolute measurement and is often the preferred metric of meteorologists. It is the temperature to which air must be cooled in order to reach saturation. In other words, when the air temperature and the dew point temperature are the same, the air is saturated and the relative humidity is 100%. If the air were to cool further, the water vapor would condense into liquid water, such as dew or precipitation.
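The relationship described above can be made concrete with a short calculation. The sketch below estimates relative humidity from the air temperature and the dew point using the Magnus approximation for saturation vapor pressure – a standard empirical formula, though the specific coefficients (and the formula itself) are an assumption not drawn from this article.

```python
import math

def saturation_vapor_pressure(temp_c):
    """Approximate saturation vapor pressure (hPa) via the Magnus formula.
    Coefficients are one common empirical choice, assumed here."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def relative_humidity(temp_c, dew_point_c):
    """RH (%) = actual vapor pressure (at the dew point) divided by
    the saturation vapor pressure (at the air temperature)."""
    return 100.0 * saturation_vapor_pressure(dew_point_c) / saturation_vapor_pressure(temp_c)

# When the air temperature equals the dew point, the air is saturated:
print(round(relative_humidity(30.0, 30.0)))   # 100% RH
# A 30°C afternoon with a 20°C dew point is noticeably less humid:
print(round(relative_humidity(30.0, 20.0)))   # roughly mid-50s % RH
```

Note how the same dew point yields a lower RH as the air temperature rises, which is exactly why meteorologists prefer the dew point as an absolute measure of moisture.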
The classic example of this phenomenon is a glass of cold liquid sitting on a table outside on a warm, muggy day. The beverage cools the air around it and beads of water form on the outside of the glass. The temperature at which the beads of water form is the dew point.
Simply put, the closer the dew point temperature is to the air temperature, the more humid it feels. In summer, when the air is warm and can hold a lot of moisture, a dew point temperature in the 50s is generally considered comfortable. Dew points in the 60s are thought of as muggy, and once they reach the 70s or higher, the air can feel oppressive. On the opposite end of the spectrum, dew points in the 40s or lower are considered dry, and dry air has its own set of comfort issues, including skin irritation.
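The comfort ranges above amount to a simple lookup. This small helper encodes the article's thresholds (50s comfortable, 60s muggy, 70s+ oppressive, 40s or below dry); the function name and labels are illustrative choices, and individual tolerance varies.

```python
def dew_point_comfort(dew_point_f):
    """Map a summer dew point (°F) to the comfort category
    described in the article."""
    if dew_point_f >= 70:
        return "oppressive"
    if dew_point_f >= 60:
        return "muggy"
    if dew_point_f >= 50:
        return "comfortable"
    return "dry"

print(dew_point_comfort(55))  # comfortable
print(dew_point_comfort(72))  # oppressive
```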
For anyone coming from the humid East Coast, one of the first things to notice upon arrival in the southwestern United States is how dry the air is. Dew points are often in the 40s while the air temperature soars into the 80s and 90s during the summer. This is why swamp coolers are popular in the region.
A swamp cooler is an evaporative cooling device. It takes hot, dry outside air and blows it across water-soaked pads. This allows the process of evaporation – the transition of liquid water to water vapor – to cool the air that is pumped into a building. It also adds some moisture to the inside air, making it more comfortable.
While the U.S. Energy Department says swamp coolers cost about one-half as much to install as central air conditioners and use about one-quarter as much energy, they do not work well everywhere. In hot, muggy climates, for example, the high relative humidity would significantly reduce the rate of evaporation. Moreover, adding extra water vapor to the air would not be considered a bonus in an already uncomfortably humid environment.
For this reason, only 3% of homes nationwide utilize swamp coolers, according to a report from the Energy Information Administration. In the arid Rocky Mountain region, however, they are found in more than 26% of all households.
Sign advertising swamp coolers in Salida, CO. Image Credit: The Weather Gamut.