Image fusion using steerable dyadic wavelet transform
An image fusion algorithm based on multiscale analysis along arbitrary orientations is presented. After a steerable dyadic wavelet transform decomposition of multi-sensor images is carried out, the maximum local oriented energy is determined at each scale and spatial position. Maximum local oriented energy and local dominant orientation are used to combine the transform coefficients obtained from the analysis of each input image. Reconstruction from the modified coefficients yields a fused image. Examples of multi-sensor fusion, and of fusion across different settings of a single sensor, are demonstrated.
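The fusion scheme described above can be sketched in simplified form. The sketch below substitutes an isotropic à trous (undecimated dyadic) decomposition and plain local energy for the paper's steerable transform and oriented energy, so it captures only the overall structure (decompose each input, select coefficients by maximum local energy at each scale and position, reconstruct); the kernel, window sizes, and wrap-around boundary handling are illustrative assumptions, not the authors' method.

```python
import numpy as np

def _smooth(img, step):
    # Separable B3-spline smoothing [1,4,6,4,1]/16 with holes ("a trous").
    # np.roll gives periodic boundaries -- a simplification for this sketch.
    k = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0
    offs = np.arange(-2, 3) * step
    h = sum(w * np.roll(img, o, axis=1) for w, o in zip(k, offs))
    return sum(w * np.roll(h, o, axis=0) for w, o in zip(k, offs))

def atrous(img, levels):
    # Undecimated dyadic decomposition: detail_j = approx_j - approx_{j+1}.
    # By construction, sum(details) + final approx reconstructs img exactly.
    details, approx = [], img.astype(float)
    for j in range(levels):
        s = _smooth(approx, 2 ** j)
        details.append(approx - s)
        approx = s
    return details, approx

def fuse(img_a, img_b, levels=3):
    # Decompose both inputs, pick the coefficient with larger local energy
    # (squared coefficient smoothed at the matching scale), average the
    # coarse residuals, then reconstruct by summation.
    da, ca = atrous(img_a, levels)
    db, cb = atrous(img_b, levels)
    fused = 0.5 * (ca + cb)
    for j, (a, b) in enumerate(zip(da, db)):
        ea = _smooth(a * a, 2 ** j)  # local (non-oriented) energy of input A
        eb = _smooth(b * b, 2 ** j)  # local (non-oriented) energy of input B
        fused += np.where(ea >= eb, a, b)
    return fused
```

In the paper, the selection step additionally uses the local dominant orientation of the steerable filter responses; here the `np.where` maximum-energy rule stands in for that combination rule.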
Also Published In
Proceedings: International Conference on Image Processing: October 23-26, 1995, Washington, D.C., vol. 3 (Los Alamitos, Calif.: IEEE Computer Society Press, 1995), pp. 232-235.
More About This Work
- Academic Units: Biomedical Engineering
- Published Here: August 18, 2010