Comparison and improvement of wavelet‐based image fusion
- 21 December 2007
- journal article
- research article
- Published by Informa UK Limited in International Journal of Remote Sensing
- Vol. 29 (3), 673-691
- https://doi.org/10.1080/01431160701313826
Abstract
The wavelets used in image fusion can be categorized into three general classes: orthogonal, biorthogonal, and non‐orthogonal. Although these wavelets share some common properties, each wavelet also has unique image decomposition and reconstruction characteristics that lead to different fusion results. This paper compares image‐fusion methods that use wavelets of the above three general classes, and theoretically analyses the factors that lead to different fusion results. Normally, when a wavelet transform alone is used for image fusion, the fusion result is not good. However, if a wavelet transform is integrated with a traditional fusion method, such as an IHS transform or a PCA transform, better fusion results may be achieved. Therefore, this paper also discusses methods to improve wavelet‐based fusion by integrating an IHS or a PCA transform. Because the substitution in the IHS or PCA transform is limited to a single component, using the wavelet transform only to improve or modify that component, and the IHS or PCA transform to fuse the image, makes the fusion process simpler and faster, and also better preserves colour information. IKONOS and QuickBird image data are used to evaluate seven kinds of wavelet fusion methods (orthogonal wavelet fusion with decimation, orthogonal wavelet fusion without decimation, biorthogonal wavelet fusion with decimation, biorthogonal wavelet fusion without decimation, wavelet fusion based on the ‘à trous’ algorithm, wavelet and IHS transform integration, and wavelet and PCA transform integration). The fusion results are compared graphically, visually, and statistically, and show that the wavelet‐integrated methods can improve the fusion result, reduce ringing or aliasing effects to some extent, and make the whole image smoother.
Comparisons of the final results also show that the outcome is affected by the type of wavelet (orthogonal, biorthogonal, or non‐orthogonal), by decimation or undecimation, and by the number of wavelet‐decomposition levels.
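The IHS-integrated scheme the abstract outlines can be sketched in a few lines: extract high-frequency detail from the panchromatic band with the undecimated 'à trous' transform (B3 spline kernel, dilated per level), then inject that detail only into the intensity component of the multispectral image. This is a minimal illustrative sketch under assumptions, not the authors' exact algorithm: the function names, the fast additive IHS formulation (adding the intensity change to every band), and the two-level decomposition are choices made for the example.

```python
import numpy as np

def atrous_detail(pan, levels=2):
    """Extract high-frequency detail from a panchromatic band using the
    undecimated 'a trous' transform with the B3 spline kernel [1,4,6,4,1]/16.
    Returns pan minus its smoothed approximation (the summed wavelet planes)."""
    kernel = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0
    smooth = pan.astype(float)
    for level in range(levels):
        step = 2 ** level            # kernel dilation ('holes') at this level
        pad = 2 * step               # padding covers the largest tap offset
        padded = np.pad(smooth, pad, mode="reflect")
        # separable convolution with the dilated kernel: rows, then columns
        rows = np.zeros_like(padded)
        for i, w in enumerate(kernel):
            rows += w * np.roll(padded, (i - 2) * step, axis=0)
        cols = np.zeros_like(padded)
        for i, w in enumerate(kernel):
            cols += w * np.roll(rows, (i - 2) * step, axis=1)
        smooth = cols[pad:-pad, pad:-pad]
    return pan.astype(float) - smooth

def ihs_wavelet_fuse(ms, pan, levels=2):
    """Additive IHS fusion with 'a trous' detail injection: only the intensity
    component is modified, so each band receives the same detail increment.
    ms is an (H, W, bands) multispectral array; pan is a co-registered (H, W) band."""
    ms = ms.astype(float)
    detail = atrous_detail(pan, levels)
    # new_band = band + (new_I - I), with new_I = I + detail
    return ms + detail[..., None]
```

Because only one component is substituted, the wavelet decomposition runs once on the panchromatic band rather than on every multispectral band, which is the source of the speed-up the abstract mentions.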
This publication has 16 references indexed in Scilit:
- Comparison between Mallat's and the ‘à trous’ discrete wavelet transform based algorithms for the fusion of multispectral and panchromatic images. International Journal of Remote Sensing, 2005
- A wavelet-based image fusion tutorial. Pattern Recognition, 2004
- Context-driven fusion of high spatial and spectral resolution images based on oversampled multiresolution analysis. IEEE Transactions on Geoscience and Remote Sensing, 2002
- The joint use of IHS transform and redundant wavelet decomposition for fusing multispectral and panchromatic images. International Journal of Remote Sensing, 2002
- A categorization of multiscale-decomposition-based image fusion schemes with a performance study for a digital camera application. Proceedings of the IEEE, 1999
- Multiresolution-based image fusion with additive wavelet decomposition. IEEE Transactions on Geoscience and Remote Sensing, 1999
- Review article: Multisensor image fusion in remote sensing: Concepts, methods and applications. International Journal of Remote Sensing, 1998
- A wavelet transform method to merge Landsat TM and SPOT panchromatic data. International Journal of Remote Sensing, 1998
- Image merging and data fusion by means of the discrete two-dimensional wavelet transform. Journal of the Optical Society of America A, 1995
- A theory for multiresolution signal decomposition: the wavelet representation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1989