Using Deep Learning for Restoration of Precipitation Echoes in Radar Data

Abstract
Raw data from meteorological radars are often corrupted by unwanted signals generically called clutter. Hills, tall buildings, atmospheric turbulence, birds, and insects yield patterns that complicate the interpretation of radar images and may introduce bias into quantitative precipitation estimates (QPE). Clutter differs from precipitating echoes in both its polarimetric signature and its characteristic shapes. This work deals with the removal of clutter. The core idea is to use a fully convolutional network (FCN) to take clutter shapes into account. A straightforward supervised learning approach would require radar images paired with their cleaned counterparts, which are not available. We developed a weakly supervised learning method that circumvents this issue and only requires auxiliary data from rain gauges. The additivity of reflectivity allows the learning problem to be cast as a supervised restoration task with noisy targets. This problem is solved by successively training a standard FCN (U-net). As ground truth is missing, standard metrics cannot be used for a global evaluation. Nevertheless, our method is quantitatively assessed on two clutter classes: ground clutter and interference. A case study completes the evaluation, and a qualitative comparison with the Météo-France algorithm is performed on a couple of difficult cases.
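To make the idea of a supervised restoration task with noisy targets concrete, the sketch below shows one training step of a small fully convolutional encoder–decoder regressed against noisy targets with an MSE loss. It is purely illustrative and not the paper's implementation: the network depth, channel counts, loss, optimizer, and tensor shapes are all assumptions, and the random tensors stand in for cluttered reflectivity maps and their noisy restoration targets.

```python
# Illustrative sketch only: a minimal FCN trained on noisy targets.
# Architecture and hyperparameters are assumptions, not the paper's configuration.
import torch
import torch.nn as nn

class TinyFCN(nn.Module):
    """A reduced U-net-like network: one downsampling and one upsampling stage."""
    def __init__(self, in_ch=1, hidden=16):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(in_ch, hidden, 3, padding=1), nn.ReLU(),
            nn.Conv2d(hidden, hidden, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(hidden, hidden, 2, stride=2), nn.ReLU(),
            nn.Conv2d(hidden, in_ch, 3, padding=1),
        )

    def forward(self, x):
        return self.dec(self.enc(x))

def train_step(model, optimizer, cluttered, noisy_target):
    """One optimization step: the target itself may be noisy; averaged over
    many samples, the regression still tends toward the underlying clean signal."""
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(cluttered), noisy_target)
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    model = TinyFCN()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(4, 1, 64, 64)  # stand-in for cluttered reflectivity maps
    y = torch.randn(4, 1, 64, 64)  # stand-in for noisy restoration targets
    print(train_step(model, opt, x, y))
```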
Funding Information
  • CNES/TOSCA
  • IPSL funding; work conducted in collaboration with Météo-France