It has been asserted that global warming should increase the amount of water in the air disproportionately to temperature; the assumption is that relative humidity remains approximately constant at 50%, so that absolute humidity increases with temperature in accord with the Clausius-Clapeyron (CC) equation for vapor pressure.
According to Trenberth and Smith (2004), atmospheric water contributes between 2.3 and 2.6 hPa of annually varying pressure. http://www.cgd.ucar.edu/staff/trenbert/trenberth.papers/massERA40JC.pdF
If the amount of water in the atmosphere is increasing, there should be an increase in atmospheric pressure. That raises two questions: would such an increase be observable, and if so, is it observed?
From the book “A Short Course in Cloud Physics” by Rogers and Yau, the saturation vapor pressure of water at temperature T (°C) is e(T) = 6.112 hPa * exp(17.67 T / (T + 243.5)).
The average temperature of the planet is something like 16 °C.
e(16) = 18.16 hPa at 100% relative humidity (RH).
e(16) can be compared to the Trenberth and Smith fraction of pressure observed due to water, ~2.5 / 985 hPa; the apparent saturation is 2.5 / 18.16 ≈ 13.8%. This number certainly gives cause to question any assumption of 50% relative humidity. Averaging the CC equation across the full range of temperature profiles might explain part of the discrepancy.
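The arithmetic so far can be checked directly. A minimal sketch, taking 2.5 hPa as the midpoint of the Trenberth and Smith 2.3–2.6 hPa range:

```python
import math

def e(T):
    """Saturation vapor pressure (hPa) at temperature T in deg C,
    per the Rogers & Yau formula quoted above."""
    return 6.112 * math.exp(17.67 * T / (T + 243.5))

e16 = e(16.0)            # ~18.2 hPa at the ~16 C global mean
apparent_rh = 2.5 / e16  # ~0.14, i.e. the ~14% apparent saturation
```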
The formula e(T) shows that vapor pressure increases by ~6.5% per degree C near 16 °C.
6.5% of Trenberth's 2.5 hPa is an increased pressure of about 0.16 hPa per degree C of warming.
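That sensitivity can be verified numerically from the same formula; a quick sketch, again assuming the 2.5 hPa water contribution:

```python
import math

def e(T):
    """Saturation vapor pressure (hPa), Rogers & Yau formula."""
    return 6.112 * math.exp(17.67 * T / (T + 243.5))

# Fractional change in vapor pressure for 1 C of warming near 16 C
sensitivity = (e(17.0) - e(16.0)) / e(16.0)  # ~0.066, i.e. ~6.5%/C

# Applied to the ~2.5 hPa water-vapor contribution to pressure
delta_p = sensitivity * 2.5                  # ~0.16 hPa per degree C
```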
The following historical data displays do not suggest an increase in pressure accompanying the presumed increase in temperature anomaly, but a 0.16 hPa signal is certainly deep in the noise.
http://onlinelibrary.wiley.com/doi/10.1002/wea.420/pdf seems to show mostly up-and-down fluctuation, with no clear trend, from 1850 to 2000+.
Offhand, in terms of overall signal-to-noise ratio, long-term comparisons of pressure change due to added water should be comparable to long-term comparisons of temperature.