sdw wrote: I've been learning about dew point from your posts. I figured I might get a whipping from you if I didn't include something showing the dew point too.
Am I correct in figuring, from your explanation, that the closer the dew point is to the temperature, the more miserable it will be? And thank you for the explanation! I'm beginning to understand it better. Also, had the dew point been 74 (same as the temp), would it have been foggy?
How close the dew point temp and air temp are is what drives the relative humidity: when the two are close, the relative humidity gets close to 100%. But 100% relative humidity at 40 degrees is very different from 100% at 75 degrees. One is humid to the feel and the other isn't. That's why the actual dew point temp is so important to how humid it feels. Cooler air can't hold as much moisture as warmer air. The dew point temperature is the temperature the air would have to cool to before it is completely saturated (100% relative humidity) and water vapor begins to condense on objects (morning dew) or actually form a cloud (fog).

That's why fog usually forms in the morning. The temperature fell during the night until it reached the dew point temperature, and the water vapor in the air condensed to form a cloud at the ground that we call fog. Once the sun rises, the air warms above the dew point temperature, the liquid water droplets that make the cloud visible evaporate back into invisible vapor, and the fog vanishes, literally into thin air (although the moisture is still there in the form of invisible water vapor).
So it is the dew point temperature that tells you how humid it feels, not the relative humidity. Again, you can have 100% relative humidity at 20 degrees (air temp of 20 and dew point of 20), yet that will NOT feel humid! At the same time, you can have a relative humidity of only 53% that feels exceptionally oppressive because the air temp is 95 degrees and the dew point is 75 degrees. It is the 75 degree dew point that tells you how humid it is. The two relative humidity values are so different--100% at 20 degrees versus 53% at 95 degrees--because warm air can hold so much more water vapor than cold air. 95 degree air can hold a massive amount of water vapor; 20 degree air cannot.
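If you want to check those numbers yourself, here's a rough Python sketch. It uses the Magnus approximation for saturation vapor pressure (the 17.625 and 243.04 constants are one common fit; other references use slightly different values), so treat it as an illustration rather than an official formula:

```python
import math

def saturation_vapor_pressure(temp_c):
    # Magnus approximation (in hPa); the constants are one common fit.
    return 6.1094 * math.exp(17.625 * temp_c / (temp_c + 243.04))

def relative_humidity(temp_f, dew_point_f):
    # RH (%) is the ratio of the actual vapor pressure (the saturation
    # pressure at the dew point) to the saturation pressure at the air temp.
    t_c = (temp_f - 32) * 5 / 9
    td_c = (dew_point_f - 32) * 5 / 9
    return 100 * saturation_vapor_pressure(td_c) / saturation_vapor_pressure(t_c)

print(round(relative_humidity(20, 20)))  # 100 -- fully saturated, yet hardly any moisture
print(round(relative_humidity(95, 75)))  # 53 -- only half saturated, yet oppressive
```

Same relative humidity math, very different "feel": the 53% case has far more actual water vapor in the air than the 100% case.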
In fact, what normally occurs in the summer is that the dew point temperature doesn't change much during the day, but obviously the air temperature does, climbing during the day and falling at night. This causes the relative humidity NUMBER to fall dramatically during the middle of the day but rise to near saturation (near 100%) at night. The hot mid-day air can hold much more moisture than the cooler night-time air, hence the lower mid-day relative humidity. Yet the amount of moisture in the air, and the "felt" humidity, isn't changing day or night. When we get up in the morning and watch the morning news, it's common to see a morning temp of 75 with a dew point of 73, which makes the relative humidity 94%. However, if you were to check mid-day, when the air temp has risen to 98 degrees, that same dew point of 73 (and same "felt" humidity) produces a much lower relative humidity of only 45%, because 98 degree air can hold so much more moisture. The "relative humidity" number is literally the percent of the maximum moisture the air could hold that it is currently holding. The "felt" humidity, on the other hand, is the amount of moisture actually in the air, and that is what the dew point temperature represents.
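That daily cycle is easy to reproduce numerically. Here's a quick Python sketch (again leaning on the Magnus approximation for saturation vapor pressure, with the commonly used 17.625/243.04 constants, so a ballpark rather than a forecast-grade calculation) that holds the dew point at 73 and varies only the air temp:

```python
import math

def saturation_vapor_pressure(temp_c):
    # Magnus approximation (hPa); constants are one common fit.
    return 6.1094 * math.exp(17.625 * temp_c / (temp_c + 243.04))

def relative_humidity(temp_f, dew_point_f):
    t_c = (temp_f - 32) * 5 / 9
    td_c = (dew_point_f - 32) * 5 / 9
    return 100 * saturation_vapor_pressure(td_c) / saturation_vapor_pressure(t_c)

# Dew point stays at 73 all day; only the air temperature changes.
for temp_f in (75, 98):
    print(temp_f, round(relative_humidity(temp_f, 73)))
# 75 -> about 94% (morning), 98 -> about 45% (mid-day)
```

The moisture in the air never changed; only the denominator (how much the air could hold) did.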