How Dew Point and Relative Humidity are calculated
Relative Humidity is the ratio of the amount of water vapor actually present in the air at a given temperature to the maximum amount the air could hold at that temperature (i.e. fully saturated); it is usually expressed as a percentage (adapted from the US FAA, Federal Aviation Administration, definition).
At higher temperatures, air can hold more water than at lower temperatures. The US NOAA (National Oceanic and Atmospheric Administration) points out that, since the amount of water that can be held by air depends on its temperature, Relative Humidity is a function of both moisture content and temperature. As such, Relative Humidity by itself does not directly indicate the actual amount of atmospheric moisture present.
Because the amount of water air can carry depends on both moisture content and temperature, Relative Humidity alone is not a convenient measure of humidity.
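A short sketch can make this concrete. The example below estimates saturation vapor pressure with the Magnus approximation (using the commonly cited coefficients 17.625 and 243.04 °C, with 6.112 hPa at 0 °C) and shows that the same actual amount of water vapor yields very different Relative Humidity values at different temperatures. The function names and coefficient choices here are illustrative assumptions, not part of the original text.

```python
import math

# Magnus approximation coefficients (one common choice; other
# coefficient sets exist and give slightly different results).
A = 17.625       # dimensionless
B = 243.04       # degrees Celsius
ES0 = 6.112      # hPa, saturation vapor pressure at 0 degrees C

def saturation_vapor_pressure(temp_c: float) -> float:
    """Approximate saturation vapor pressure (hPa) at temp_c (Celsius)."""
    return ES0 * math.exp(A * temp_c / (B + temp_c))

def relative_humidity(vapor_pressure_hpa: float, temp_c: float) -> float:
    """Relative Humidity (%) for a given actual vapor pressure and temperature."""
    return 100.0 * vapor_pressure_hpa / saturation_vapor_pressure(temp_c)

# The same moisture content (12 hPa of vapor pressure) at two temperatures:
print(relative_humidity(12.0, 10.0))  # near saturation at 10 degrees C
print(relative_humidity(12.0, 25.0))  # much drier-feeling air at 25 degrees C
```

With roughly 12 hPa of water vapor, the air is close to saturation at 10 °C but well under half-saturated at 25 °C, which is why the same Relative Humidity reading can correspond to very different amounts of actual moisture.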
Dew Point is a more convenient measure of humidity: it is the temperature to which the air must be cooled (at constant pressure) in order to reach a Relative Humidity of 100%. At a higher Dew Point the air is closer to saturation and condensation occurs more readily than at a lower Dew Point, so a high Dew Point indicates high humidity and a low Dew Point indicates low humidity.
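The definition above can be turned into a calculation. A common approach, sketched below under the assumption of the Magnus approximation (coefficients 17.625 and 243.04 °C; other coefficient sets exist), inverts the saturation vapor pressure formula to get the temperature at which the current vapor content would saturate the air. The function name and parameters are illustrative, not from the original text.

```python
import math

# Magnus approximation coefficients (one common choice).
A = 17.625   # dimensionless
B = 243.04   # degrees Celsius

def dew_point(temp_c: float, rh_percent: float) -> float:
    """Approximate Dew Point (Celsius) from air temperature and Relative Humidity.

    Inverts the Magnus saturation vapor pressure formula: the Dew Point is
    the temperature at which the current vapor pressure equals saturation.
    """
    gamma = math.log(rh_percent / 100.0) + A * temp_c / (B + temp_c)
    return B * gamma / (A - gamma)

# At 100% Relative Humidity the Dew Point equals the air temperature;
# drier air has a Dew Point well below the air temperature.
print(dew_point(20.0, 100.0))  # equals the air temperature, 20 degrees C
print(dew_point(25.0, 50.0))   # roughly 14 degrees C
```

Note that at 100% Relative Humidity the logarithm term vanishes and the Dew Point reduces to the air temperature itself, matching the definition of saturation given above.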