Abstract
This paper addresses the fusion of multimodal images into a single greylevel image. A multiresolution technique based on the wavelet multiscale edge representation is applied. The fusion retains only the modulus maxima of the wavelet coefficients from the different bands and combines them. After reconstruction, a synthetic image is obtained that simultaneously contains the edge information from all bands. Noise reduction is performed by removing the noise-related modulus maxima. In several experiments on test images and multispectral satellite images, we demonstrate that the proposed technique outperforms mapping techniques such as PCA and SOM, as well as other wavelet-based fusion techniques.
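As a rough illustration of the idea of combining the strongest wavelet responses across bands, the sketch below fuses two images by keeping, at each position and scale, the detail coefficient with the larger magnitude. This is a simplified maximum-selection rule on a standard discrete wavelet decomposition (via PyWavelets), not the paper's modulus-maxima edge representation; the wavelet choice (`db2`), decomposition level, and averaging of the approximation band are illustrative assumptions.

```python
import numpy as np
import pywt  # PyWavelets


def fuse_wavelet_max(img_a, img_b, wavelet="db2", level=2):
    """Fuse two same-size greylevel images by maximum-magnitude
    selection of wavelet detail coefficients (simplified proxy for
    retaining the strongest edge responses across bands)."""
    ca = pywt.wavedec2(img_a, wavelet, level=level)
    cb = pywt.wavedec2(img_b, wavelet, level=level)

    # Average the coarse approximation band (an assumption; other
    # combination rules are possible for the lowpass residual).
    fused = [0.5 * (ca[0] + cb[0])]

    # For each scale, keep the detail coefficient with the larger
    # absolute value -- a stand-in for selecting modulus maxima.
    for (ha, va, da), (hb, vb, db) in zip(ca[1:], cb[1:]):
        pick = lambda x, y: np.where(np.abs(x) >= np.abs(y), x, y)
        fused.append((pick(ha, hb), pick(va, vb), pick(da, db)))

    return pywt.waverec2(fused, wavelet)


# Toy usage: one image with a vertical edge, one with a horizontal edge.
a = np.zeros((16, 16))
a[:, 8:] = 1.0
b = np.zeros((16, 16))
b[8:, :] = 1.0
fused = fuse_wavelet_max(a, b)
```

In the fused result, edge structure from both inputs survives because the selection rule favours the band with the stronger local response at every scale and orientation.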