Radar Measurement of Rainfall—A Summary

Abstract
Radar can produce detailed precipitation information for large areas from a single location in real time. Although radar has been used experimentally for nearly 30 years to measure rainfall, operational implementation has been slow. Today we find that radar data are underutilized and that both confusion and misunderstanding exist about the inherent ability of radar to measure rainfall, about the factors that contribute to errors, and about the importance of careful calibration and signal processing. Areal and point rainfall estimates are often in error by a factor of two or more. Error sources reside in the measurement of the radar reflectivity factor, in evaporation and advection of precipitation before it reaches the ground, and in variations in the drop-size distribution and in vertical air motions. Nevertheless, radar can be of lifesaving value by alerting forecasters to the potential for flash flooding. The most successful technique for improving radar rainfall estimates has been to “calibrate” the radar with rain gages. Simple techniques that combine sparse gage reports (one gage per 1000–2000 km²) with radar produce smaller measurement errors (10–30%) than either system alone. When high-accuracy rainfall measurements are needed (average error less than about 10–20%), the advantage of radar is diminished, since the number of gages required for calibration is itself sufficient to provide the desired accuracy.
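
As background for the gage “calibration” idea summarized above, the sketch below (not taken from the paper) illustrates one simple form of the approach: radar reflectivity is converted to rain rate with a Z-R power law, here assuming the classic Marshall-Palmer coefficients a = 200 and b = 1.6, and a single mean-field bias factor computed from a few gage-radar pairs is then applied to the radar estimates. All numeric inputs are hypothetical and serve only to show the mechanics.

```python
import numpy as np

def reflectivity_to_rain_rate(dbz, a=200.0, b=1.6):
    """Convert reflectivity (dBZ) to rain rate (mm/h) via Z = a * R**b.

    a = 200, b = 1.6 are the Marshall-Palmer values; real relationships
    vary with the drop-size distribution, one of the error sources noted
    in the abstract.
    """
    z = 10.0 ** (dbz / 10.0)        # dBZ -> linear reflectivity factor (mm^6 m^-3)
    return (z / a) ** (1.0 / b)     # invert the power law for R

def gage_bias_factor(gage_rates, radar_rates):
    """Mean-field adjustment: ratio of summed gage to summed radar rainfall
    at the gage sites (one simple way to 'calibrate' radar with gages)."""
    return np.sum(gage_rates) / np.sum(radar_rates)

# Hypothetical sparse gage network (a few gages over ~1000-2000 km^2).
radar_dbz_at_gages = np.array([38.0, 42.0, 35.0])        # reflectivity above each gage
radar_rates = reflectivity_to_rain_rate(radar_dbz_at_gages)
gage_rates = np.array([12.0, 22.0, 7.0])                  # gage-measured rates (mm/h)

bias = gage_bias_factor(gage_rates, radar_rates)
# Apply the bias everywhere else in the radar field, e.g. at a 40 dBZ pixel.
adjusted_rate = bias * reflectivity_to_rain_rate(40.0)
print(f"bias factor = {bias:.2f}, adjusted rain rate = {adjusted_rate:.1f} mm/h")
```

More elaborate adjustment schemes vary the correction in space rather than using one field-wide factor, but even this single multiplicative bias conveys how sparse gage reports can be combined with radar coverage.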