Research Papers

Fast, simple, and good pan-sharpening method

Author Affiliations
Gintautas Palubinskas

German Aerospace Center DLR, Remote Sensing Technology Institute, Oberpfaffenhofen, 82234 Wessling, Germany

J. Appl. Remote Sens. 7(1), 073526 (Aug 09, 2013). doi:10.1117/1.JRS.7.073526
History: Received November 5, 2012; Revised March 21, 2013; Accepted July 9, 2013

Open Access

Abstract.  Pan-sharpening of optical remote sensing multispectral imagery aims to include spatial information from a high-resolution image (high frequencies) into a low-resolution image (low frequencies) while preserving the spectral properties of the low-resolution image. From a signal processing view, a general fusion filtering framework (GFF) can be formulated, which is well suited to the fusion of multiresolution and multisensor data such as optical-optical and optical-radar imagery. To reduce computation time, a simple and fast variant of GFF, the high-pass filtering method (HPFM), is proposed, which performs filtering in the signal domain and thus avoids time-consuming FFT computations. A new joint quality measure combining spectral and spatial measures, with a proper normalization of the variable ranges, is proposed for quality assessment. The quality and speed of six pan-sharpening methods, component substitution (CS), Gram-Schmidt (GS) sharpening, Ehlers fusion, Amélioration de la Résolution Spatiale par Injection de Structures (ARSIS), GFF, and HPFM, were evaluated on WorldView-2 satellite remote sensing data. Experiments showed that the HPFM method outperforms all the other fusion methods used in this study, even its parent method GFF. Moreover, it is more than four times faster than the GFF method and competitive in speed with the CS and GS methods.


Multiresolution image fusion, also known as pan-sharpening, aims to include spatial information from a high-resolution image, e.g., a panchromatic or synthetic aperture radar (SAR) image, into a low-resolution image, e.g., a multispectral or hyperspectral image, while preserving the spectral properties of the low-resolution image. A large number of algorithms and methods to solve this problem were introduced during the last two decades. They can be divided into two large groups: methods based on a linear spectral transformation followed by component substitution (CS)1 and methods based on a spatial frequency decomposition, usually performed by means of high-pass filtering (HPF)2,3 or multiresolution analysis.4,5,6 Sometimes it is quite difficult to find one's way among all these methods, though some classification attempts have already been made.6,7,8 We propose to look at these methods from a signal processing view. This type of analysis allowed us to recognize similarities and differences of various methods quite easily and thus to perform a systematic classification of most known multiresolution image fusion approaches and methods.9 Moreover, this analysis resulted in a general fusion filtering framework (GFF) for multiresolution image fusion. State-of-the-art methods are sometimes quite time consuming, which restricts their use in practice. To reduce computation time, a simple and fast variant of the GFF method, further called the high-pass filtering method (HPFM), is proposed in this paper. It performs filtering in the signal domain and thus avoids time-consuming FFT computations.

In parallel to the development of pan-sharpening methods, many attempts were undertaken to assess their quality quantitatively, usually using measures originating from image processing such as mean square error, cross-correlation (CC), the structural similarity (SSIM) index,10 Wald’s protocol,11 and, more recently, joint measures: the product of a spectral measure based on SSIM and a spatial measure based on CC12 and the quality with no reference (QNR) measure.13 Some comparisons are presented in Refs. 14 and 15. Recently a statistical evaluation of the most popular pan-sharpening quality assessment measures was performed in Refs. 16, 17, and 18. To measure two different image properties such as spectral and spatial quality, at least two measures are needed, which makes the task of ranking different fusion methods not easy. Following these results, a new joint quality measure (JQM) based on both spectral and spatial quality measures was proposed. This measure is enhanced here by a practical normalization of the measures and is successfully used for quantitative fusion quality assessment in this paper.

The paper is organized in the following way. First, the four pan-sharpening methods, GFF, HPFM, CS, and the Ehlers fusion method, are introduced. Then, a new JQM based on both spectral and spatial quality measures is described. Finally, experiments with very-high-resolution WorldView-2 (WV-2) satellite optical remote sensing data are performed, followed by discussion and conclusions.

In this section the following four pan-sharpening methods are described: GFF, HPFM, CS, and Ehlers fusion method.

General Fusion Filtering

Here the GFF method is shortly introduced in order to better understand the rationale behind the new image fusion method introduced in Sec. 2.2. For a detailed description of the GFF method, see Ref. 9.

Let us denote by ms_k a low-resolution image, which can be a multispectral/hyperspectral or any other image, with bands indexed by k = 1, …, n (n ∈ {1, 2, …}), and by pan a high-resolution image, e.g., a panchromatic band, the intensity image of a synthetic aperture radar (SAR), or any other image. Many existing multiresolution methods or algorithms can be seen as implementations of a general fusion framework:

  • Low-resolution image interpolation: msi_k = I(ms_k);
  • Fusion: msf_k = F(msi_k, pan);
  • Histogram matching: msf_k = M(msf_k, ms_k).

Each band is processed independently; band indices are omitted further on for the sake of clarity. The first and third steps can be included in the fusion step, depending on the method. Usually, I is a bilinear (BIL) or cubic convolution (CUB) interpolation and F is a linear or other function of the images. In the following we formulate the GFF fusion method with interpolation and fusion in one step.
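As a concrete illustration, the three steps above can be sketched in Python. All names here are illustrative, not from the paper's software: nearest-neighbour upsampling, a 3×3 box low-pass, and first/second moment matching stand in for the actual interpolation, designed filter, and histogram matching.

```python
import numpy as np

def interpolate(ms, scale):
    """Step I: upsample one low-resolution band (nearest neighbour for brevity;
    the paper recommends bilinear or cubic convolution)."""
    return np.kron(ms, np.ones((scale, scale)))

def box_low(img):
    """A 3x3 box low-pass filter standing in for the designed lpf."""
    p = np.pad(img, 1, mode='edge')
    h, w = img.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def fuse(msi, pan):
    """Step F: additive fusion -- add only the high-pass part of pan."""
    return msi + pan - box_low(pan)

def histogram_match(fused, ms):
    """Step M: simplified to mean/standard-deviation (moment) matching."""
    return (fused - fused.mean()) / fused.std() * ms.std() + ms.mean()

def pansharpen_band(ms, pan, scale):
    """The full three-step pipeline for a single band."""
    msi = interpolate(ms, scale)
    return histogram_match(fuse(msi, pan), ms)
```

With a constant pan image the high-pass term vanishes, so the pipeline reduces to interpolation plus moment matching, which is the spectral-preservation ideal.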

In order to preserve the spectral properties of a low-resolution image ms, one should add only high-frequency information from the high-resolution image pan, thus preventing the mixing of low-frequency information of pan with the low-resolution image. The natural way to do this is in the spectral or Fourier domain (the signal processing view).

First, both images are transformed into spectral/Fourier space: MS = FFT(ms) and PAN = FFT(pan). Then, high-frequency components are extracted from PAN (solid line, Fig. 1) and added to the zero-padded spectrum of MS (dotted line, Fig. 1):

MS_F = ZP(W · MS) + PAN · HPF,  (1)

where ZP stands for zero padding, W(f) = α + (1 − α) · cos(2πf / PBW_LR) is a Hamming window with α = 0.54 (to avoid aliasing/ringing), PBW_LR is the processing bandwidth of the low-resolution image, and HPF is a high-pass filter. The cutoff frequency of HPF controls the amount of detail added to the low-resolution image. Equivalently, Eq. (1) can be rewritten with a low-pass filter (LPF) as

MS_F = ZP(W · MS) + PAN · (1 − LPF).  (2)

Fig. 1: Addition of the spectra of the high-resolution (HR, solid line) and low-resolution (LR, dotted line) images. PBW stands for processing bandwidth, f for frequency, and f_cutoff_HR for the cutoff frequency of the high-pass filter.

Finally, the inverse Fourier transform delivers a fused image with enhanced spatial resolution:

ms_f = FFT⁻¹(MS_F).  (3)

Thus the GFF method performs image fusion in the Fourier domain. Due to the known equivalence of filtering in the signal and spectral domains, and assuming that interpolation is performed in the signal domain, Eqs. (1) to (3) can be rewritten as

msf = msi + pan * hpf = msi + pan − pan * lpf,  (4)

where * denotes convolution and hpf, lpf are the signal-domain counterparts of HPF and LPF.

Equation (4) defines the fusion function F introduced above and helps to better understand the relation of the proposed method to already known methods.9 Note that Eq. (4) is not identical to GFF, because different interpolation methods are used.
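The signal/spectral equivalence invoked here can be checked numerically in one dimension: circular convolution with the impulse response hpf equals multiplication by HPF in the Fourier domain. The Gaussian-based high-pass below is only an illustrative filter choice.

```python
import numpy as np

rng = np.random.default_rng(0)
pan = rng.standard_normal(64)

f = np.fft.fftfreq(64)                        # normalized frequency grid
HPF = 1.0 - np.exp(-0.5 * (f / 0.15) ** 2)    # illustrative high-pass: 1 - Gaussian LPF
hpf = np.real(np.fft.ifft(HPF))               # its signal-domain impulse response

# pan * hpf in the signal domain (circular convolution) ...
conv = np.array([sum(pan[(n - m) % 64] * hpf[m] for m in range(64)) for n in range(64)])
# ... equals FFT^-1(PAN . HPF) in the spectral domain
spec = np.real(np.fft.ifft(np.fft.fft(pan) * HPF))

assert np.allclose(conv, spec)
```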

The GFF method was first introduced in Ref. 9, where its similarities to and differences from the high-pass filtering method of Ref. 2 are discussed.

So the only parameter to be selected is the cutoff frequency f_cutoff_HR, which controls the amount of filtering. In this paper a Gaussian filter is used:

LPF = exp[−(1/2)(f / f_cutoff_HR)²].  (5)

Of course, any other filter, e.g., Butterworth, can be used; our experience showed no significant influence of the filter type on the fusion quality. Thus the GFF method depends only on the filtering parameter f_cutoff_HR. We will see in Sec. 4.4 that the optimal value of f_cutoff_HR is 0.15 for WV-2 satellite data, which is well supported by other studies (e.g., Ref. 19) and existing experience. Further studies on more data are planned. Moreover, we note that a band-dependent filtering parameter can easily be implemented and can increase the quality of pan-sharpening, as already known from other studies.20
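Under these choices, the core of GFF can be sketched as follows. This is a simplification: the multispectral band is assumed to be already interpolated to the pan grid, so the zero-padding and Hamming-window step of Eq. (1) is replaced by a plain FFT of msi, as in the Eq. (4) equivalent.

```python
import numpy as np

def gaussian_lpf(shape, fcutoff=0.15):
    """Gaussian low-pass filter of Eq. (5) on the 2-D FFT frequency grid."""
    fy = np.fft.fftfreq(shape[0])[:, None]
    fx = np.fft.fftfreq(shape[1])[None, :]
    f = np.sqrt(fy ** 2 + fx ** 2)            # radial normalized frequency
    return np.exp(-0.5 * (f / fcutoff) ** 2)

def gff_fuse(msi, pan, fcutoff=0.15):
    """Fourier-domain fusion of Eq. (2): add only the high frequencies of pan."""
    lpf = gaussian_lpf(pan.shape, fcutoff)
    MSF = np.fft.fft2(msi) + np.fft.fft2(pan) * (1.0 - lpf)
    return np.real(np.fft.ifft2(MSF))         # inverse transform, Eq. (3)
```

A very large cutoff makes LPF ≈ 1 everywhere, so no detail is injected and the fused band equals msi; a small cutoff injects almost all non-DC content of pan.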

High-Pass Filtering Method

As seen from Eq. (4), it is possible to avoid the Fourier transform in the GFF method in order to reduce computation time. Thus a simple and fast variant of the GFF method, further called HPFM, can be derived as follows:

  1. Instead of zero padding in the spectral domain, an interpolation of the multispectral image in the signal domain is performed.
  2. A high-pass filter HPF or low-pass filter LPF with the desired properties is built in the spectral domain and then transformed into the signal domain as hpf or lpf, respectively.
  3. Equation (4) is applied for image fusion in the signal domain using convolution with the designed filter.
  4. Finally, histogram matching is performed.

The proposed method differs from the method introduced in Ref. 2 in the way the filter is constructed and the histogram matching is performed. Usually a simple boxcar filter is used in the signal domain, which makes a fine selection of the amount of filtering difficult. For the HPFM the same filters as for the GFF method, e.g., Eq. (5), can be used. Note that due to the addition/subtraction operations in Eq. (4), the HPFM can be interpreted as an additive model. The multiplicative model can be written as

msf = (msi / (pan * lpf)) · pan.  (6)

Thus the HPFM method depends on the filtering parameter f_cutoff_HR, the model (additive or multiplicative), and the interpolation method.
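A minimal sketch of HPFM in the signal domain, covering both models. A small box filter stands in for the spectrally designed kernel here only to keep the sketch short; as noted above, a boxcar makes fine control of the filtering difficult, so in practice the Gaussian-derived kernel of Eq. (5) would be used.

```python
import numpy as np

def box_low(img, r=1):
    """Signal-domain low-pass: (2r+1)x(2r+1) box filter (stand-in for the designed lpf)."""
    p = np.pad(img, r, mode='edge')
    h, w = img.shape
    k = 2 * r + 1
    return sum(p[i:i + h, j:j + w] for i in range(k) for j in range(k)) / k ** 2

def hpfm(msi, pan, model='additive'):
    """Eq. (4) (additive) or Eq. (6) (multiplicative) fusion in the signal domain."""
    low = box_low(pan)
    if model == 'additive':
        return msi + pan - low
    return msi / low * pan        # Eq. (6); assumes the low-passed pan is positive
```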

Component Substitution

The CS method is one of the simplest and perhaps oldest image fusion methods. Here a short description following the recent enhancement of the intensity-hue-saturation transformation method1 is given. Under the assumption that msi and pan are highly correlated, one can calculate the intensity, or mean, as

i = (1/n) Σ_{k=1}^{n} msi_k,  (7)
where n is the number of multispectral bands.

Now CS fusion using the intensity of Eq. (7) can be written as

msf = msi − i + pan.  (8)

Similarly as for the HPFM, this fusion can be seen as an additive model. The multiplicative model can be written as

msf = (msi / i) · pan.  (9)

Thus a selection of model and interpolation method is required for this method.
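Eqs. (7) to (9) translate directly into code; this is a sketch that assumes well-behaved data ranges (in particular a strictly positive intensity for the multiplicative model).

```python
import numpy as np

def cs_fuse(msi_bands, pan, model='additive'):
    """Component substitution: replace the intensity of the band stack by pan."""
    i = np.mean(msi_bands, axis=0)                    # intensity, Eq. (7)
    if model == 'additive':
        return [b - i + pan for b in msi_bands]       # Eq. (8)
    return [b / i * pan for b in msi_bands]           # Eq. (9); assumes i > 0
```

If pan happens to equal the intensity i, both models return the interpolated bands unchanged, which is the spectral-preservation ideal.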

Ehlers Fusion

For a detailed description of the Ehlers fusion method, see Refs. 3 and 19. Its relation to the GFF method is discussed in Ref. 9. The Ehlers method depends on three parameters: the interpolation method (cubic convolution was used, as recommended by the authors), the cutoff frequency f_cutoff_HR for high-pass filtering of the panchromatic image, and the cutoff frequency f_cutoff_I for low-pass filtering of the intensity calculated according to Eq. (7). For the Ehlers fusion in this paper, the Gaussian filter [Eq. (5)] was again used for both types of filtering. Finally, we note that our own implementation of the method, further called EhlersX, in the Xdibias software environment was used; the filtering in this implementation is performed in the spectral domain. Validation of our implementation against the original Ehlers software implementation in MATLAB yielded comparable results. As for the other filtering methods (GFF and HPFM), the optimal parameter values for f_cutoff_HR and f_cutoff_I are 0.15 for WV-2 data, which coincides well with the experience of the authors of the method.

WV-2 satellite remote sensing data over the city of Munich in southern Germany were used in our experiments. For scene details see Table 1.

Table 1: Scene parameters for WorldView-2 data over Munich city.

The quality of pan-sharpening is usually measured by spectral and/or spatial quality measures to cover both attributes of a processing result. Measures calculated for the whole image are called global. Window-based measures are calculated in selected areas or, e.g., using a sliding window, and can distinguish image parts with different quality. The latter measures are outside the scope of this paper.

Spectral Quality

Many spectral quality measures have already been proposed in the literature, e.g., Refs. 5 and 15. A recent comparison17,18 showed that the correlation (CORR) between the original spectral bands and the corresponding low-pass filtered and subsampled pan-sharpened bands is one of the best. It measures the spectral preservation of a pan-sharpening method for individual bands or, by averaging, for all bands:

CORR = (1/n) Σ_{k=1}^{n} CC(ms_k, (fms_k * lpf)↓),  (10)

where fms_k is the pan-sharpened band k, * denotes convolution with a low-pass filter, and ↓ denotes subsampling to the original multispectral resolution.

It has high values (the optimal value is 1) for good spectral preservation and low values for poor preservation of the spectral characteristics. This measure alone is not able to assess the quality of a fusion result, because it is calculated only at the reduced image resolution/scale.
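A sketch of Eq. (10), with a 2×2 block mean standing in for the low-pass filtering and subsampling:

```python
import numpy as np

def cc(a, b):
    """Pearson cross-correlation coefficient between two images."""
    a, b = a.ravel() - a.mean(), b.ravel() - b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

def corr_measure(ms_bands, fused_bands, scale):
    """Eq. (10): average CC between each original band and the degraded fused band
    (a scale x scale block mean stands in for lpf + subsampling)."""
    vals = []
    for ms, f in zip(ms_bands, fused_bands):
        h, w = ms.shape
        degraded = f.reshape(h, scale, w, scale).mean(axis=(1, 3))
        vals.append(cc(ms, degraded))
    return sum(vals) / len(vals)
```

A fused image whose degraded version reproduces the original bands exactly scores the optimal value of 1.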

Spatial Quality

The same investigation17,18 showed a preference for the SSIM between the original panchromatic band and the pan-sharpened bands for spatial quality assessment:

SSIM = (1/n) Σ_{k=1}^{n} SSIM(pan, fms_k).  (11)

It exhibits high values (optimal value 1) for high spatial quality and low values for low spatial quality. Note that due to the different widths of the spectral ranges of the multispectral and panchromatic bands, the CC as proposed in Ref. 12 may not be sufficient because of possible differences in mean and standard deviation; SSIM accounts for such differences much better.

Joint Quality Measure

In the ideal case, a pan-sharpening method should exhibit both high spectral and high spatial quality measure values. In practice this is not possible, because, e.g., for the GFF method (and likewise for other filtering methods) different parameters (amounts of filtering) lead to different image qualities: a larger high-pass filtering parameter value leads to a higher spectral quality while reducing the spatial quality, and vice versa [see Fig. 2(b)]. None of the known separate quality measures can fulfill this requirement as a sole measure.14 Thus, a JQM could be helpful to achieve optimal parameter selection, find the best trade-off between spectral and spatial quality, or find the best method for a particular application.

Fig. 2: Joint quality measure JQM (a) and separate quality measures (b) CORR and SSIM of the HPFM pan-sharpening method with filtering parameters ranging from 0.05 to 0.7, for WorldView-2 Munich data.

One could think of a simple average or product12 of the two measures: one for spectral and another for spatial quality (in that particular case, SSIM for spectral quality and CC for spatial quality). A quality measure derived in such a way can easily be biased due to the different value ranges of the separate measures. Moreover, CC for spatial quality can be insufficient for data exhibiting different spectral properties, as already stated in Sec. 4.2.

In this paper we propose a new JQM that is based on CORR for spectral quality10 and SSIM for spatial quality,11 the measures resulting from the discussion in Secs. 4.1 and 4.2.

Due to the different ranges of the two measures [SSIM is usually lower than CORR, as can be seen in Fig. 2(b)], we propose to normalize one of the measures before averaging (producing a joint measure) using a linear scaling transform

SSIM_norm = (SSIM − SSIM_min) / (SSIM_max − SSIM_min) · (CORR_max − CORR_min) + CORR_min,  (12)
where SSIM_min stands for the minimum of all SSIM values; the other minimum and maximum values are defined similarly. For example, these values can be calculated from the results of the HPFM method with different filtering parameters (see Sec. 4.4). We note that mean and standard deviation values (standardization) could be used for normalization instead of the extreme range values, at the risk of errors when the number of samples is insufficient.

Now averaging the spectral measure and the normalized spatial measure,

JQM = (CORR + SSIM_norm) / 2,  (13)

delivers a much more meaningful joint quality measure, which is well suited for parameter selection or for the comparison of different methods.
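Eqs. (12) and (13) reduce to a few lines; the numeric values in the usage note below are made-up placeholders, not measurements from the paper.

```python
def ssim_to_corr_range(ssim, ssim_min, ssim_max, corr_min, corr_max):
    """Eq. (12): linearly map the SSIM range onto the CORR range."""
    scale = (corr_max - corr_min) / (ssim_max - ssim_min)
    return (ssim - ssim_min) * scale + corr_min

def jqm(corr, ssim, ssim_min, ssim_max, corr_min, corr_max):
    """Eq. (13): average of spectral CORR and range-normalized spatial SSIM."""
    return (corr + ssim_to_corr_range(ssim, ssim_min, ssim_max, corr_min, corr_max)) / 2.0
```

By construction the normalized SSIM equals corr_min at ssim_min and corr_max at ssim_max, so neither measure dominates the average.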

How to Normalize Quality Measures?

For the normalization proposed in Eq. (12), we need to define four parameters: the minimum and maximum values of the two variables CORR and SSIM. These can be derived in different ways, e.g., from existing experience (subjective) or from experiments with many different methods exhibiting different fusion quality and thus covering the whole range of spectral and spatial qualities. We follow the latter approach with the following optimizations. Due to the diversity of content in an image (remote sensing scene), we propose a data-driven estimation of the extreme values. To reduce computation time, we propose to use a single filtering-based method, e.g., HPFM, with two different parameters producing the best and worst qualities: a small parameter value (0.05) for high spatial and at the same time low spectral quality, and a large parameter value (0.7) for low spatial and at the same time high spectral quality [see Fig. 2(b)]. Thus only two runs of the very fast HPFM method deliver the required four parameters. The JQM peak around a filtering parameter of 0.15 suggests the optimal parameter value for HPFM [Fig. 2(a)].

How to Use JQM for Comparison of Different Methods?

The JQM introduced in Eq. (13) requires the normalization underlying SSIM_norm, which is not available for all methods. As shown in the previous section, this is not a problem, because the extreme values can be derived using a reference method, e.g., HPFM. Then Eq. (13) can be rewritten using Eq. (12) as

JQM = (CORR + A · SSIM + B) / 2,  (14)

where

A = (CORR_max − CORR_min) / (SSIM_max − SSIM_min),  (15)

B = −SSIM_min · (CORR_max − CORR_min) / (SSIM_max − SSIM_min) + CORR_min.  (16)
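Folding Eq. (12) into the constants of Eqs. (15) and (16) can be checked numerically (again with placeholder ranges): A·SSIM + B must reproduce the normalized SSIM at both ends of the range.

```python
def ab_constants(ssim_min, ssim_max, corr_min, corr_max):
    """Eqs. (15) and (16): constants such that A*SSIM + B equals SSIM_norm of Eq. (12)."""
    a = (corr_max - corr_min) / (ssim_max - ssim_min)                         # Eq. (15)
    b = -ssim_min * (corr_max - corr_min) / (ssim_max - ssim_min) + corr_min  # Eq. (16)
    return a, b

a, b = ab_constants(0.5, 0.8, 0.9, 0.99)
assert abs(a * 0.5 + b - 0.9) < 1e-9    # ssim_min maps to corr_min
assert abs(a * 0.8 + b - 0.99) < 1e-9   # ssim_max maps to corr_max
```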

Applying the extreme value ranges as proposed above (Sec. 4.4) results in the A and B values given in Table 2, which are data dependent but very easy and fast to derive. Thus the proposed JQM [Eq. (14)] can be used to estimate the quality of any pan-sharpening method on these data.

Table 2: Estimated values of A and B according to Eqs. (15) and (16) from the proposed extreme range values, used for the calculation of JQM.

An additional margin of 10% added to these extreme values (to avoid out-of-range values) ensures an appropriate comparison or quality assessment of other pan-sharpening methods. The proposed normalization of quality measures is thus data dependent, which means it should be performed individually for each image/scene. This is not a great drawback, however, because the normalization has to be estimated only once per scene, and a fast fusion method such as HPFM with different parameter settings can be used.

We note that one more joint quality measure exists, QNR,13 which is not included in this study but is one of the topics of a follow-up paper.

In this section we compare different pan-sharpening methods using the JQM proposed in the previous section. Two more well-known fusion methods were added to the comparison: an Amélioration de la Résolution Spatiale par Injection de Structures (ARSIS)21 variant (the à trous wavelet transform model for wavelet fusion,4 implementation of Ref. 22) and Gram-Schmidt (GS) spectral sharpening implemented in ENVI software, with the averaging method for low-resolution file calculation and bilinear resampling. Results of the quality assessment for the various methods are presented in the next two sections using the values of A and B from Table 2. Note that for the quality assessment only bands spectrally overlapping with the panchromatic band are used, to assure physically justified results.8 Thus, the following three bands were excluded from further analysis: coastal, NIR1, and NIR2.

Comparison of Interpolation Methods

In many papers, in addition to the assessment of fusion methods, quality measures for interpolation alone (resampling of the low-resolution image to the high-resolution grid) are also reported. A comparison of the four most popular (fast enough for operational applications) interpolation methods is presented in Table 3 and visualized in Fig. 3.

Table 3: Quality measures of various interpolation methods for WorldView-2 Munich data.

Fig. 3: Joint quality measure JQM (a) and separate quality measures (b) CORR and SSIM of different interpolation methods: (1) nearest neighbor, (2) zero padding, (3) bilinear interpolation, and (4) cubic interpolation, for WorldView-2 Munich data.

The BIL and CUB interpolation methods exhibit the best spectral quality (CORR), which is supported by existing experience. More surprising are the results for spatial quality (SSIM): here the best is nearest neighbor, followed by BIL. JQM suggests BIL as the interpolation method for pan-sharpening, which agrees well with existing experience. Note that SSIM is designed for the quality assessment of fusion methods, whereas an interpolation method is not a fusion method and contains no information from the panchromatic band. Thus the SSIM and JQM results for interpolation methods should be treated cautiously and cannot be compared directly with the results of the next section.

Comparison of Pan-Sharpening Methods

Comparison of quality and speed of six pan-sharpening methods (some of them with different parameters) is presented in Table 4 and visualized in Fig. 4.

Table 4: Quality measures and computation time of various pan-sharpening methods for WorldView-2 Munich data, on an Intel Core 2 Quad CPU Q9450 at 2.66 GHz.

Fig. 4: Joint quality measure JQM (a) and separate quality measures (b) CORR and SSIM of 12 pan-sharpening methods for WorldView-2 Munich data (see method numbers in Table 4).

We see that the HPFM method [all four parameter settings (methods 2 to 5)] outperforms its parent GFF method (method 1). This is because zero-padding interpolation smooths the multispectral image to a much greater extent than, e.g., bilinear interpolation (compare the CORR values in Table 3). The multiplicative model is better than the additive model, whereas the difference between the BIL and CUB interpolation methods is negligible. The optimal filtering parameter 0.15 [see Fig. 2(a)] was used for all filtering methods in this paper; existing experience supports this value even for other data and sensors. The two parameter settings 0.05 (method 6) and 0.7 (method 7) of HPFM were used to derive the normalizing constants A and B, because their results exhibit extreme spectral and spatial qualities, as seen in Fig. 4(b). Their JQM is comparable with that of the ARSIS method, known for its high spatial quality. As expected, both CS methods and GS sharpening exhibit the lowest JQM; Ehlers fusion falls somewhere between this group and ARSIS. These observations are fully supported by existing experience and visual analysis (Fig. 5).

Fig. 5: Multispectral bilinear interpolated (bands 5, 3, 2) and GFF, HPFM, CS, Ehlers, and ARSIS pan-sharpened images of WorldView-2 Munich data.

Additionally, the computation times of the methods are presented in Table 4 for a multispectral image of size 1024×1024, a panchromatic image of size 4096×4096, and all eight bands of the WV-2 Munich data. The proposed HPFM pan-sharpening method is more than four times faster than its parent GFF fusion method. Moreover, its speed is comparable with that of the classical CS and GS methods. The Ehlers and ARSIS fusion methods are about two times slower than GFF and thus are less suitable for operational applications.

A simplified version of the GFF method, the fast, simple, and good HPFM, is introduced, with a potential for operational remote sensing applications. It performs filtering in the signal domain, thus avoiding time-consuming FFT computations.

A new JQM based on both spectral and spatial quality measures (carefully selected from previous studies) is used to assess the fusion quality of six pan-sharpening methods, GFF, HPFM (with different parameter settings), CS, GS, Ehlers fusion, and ARSIS, on very-high-resolution WV-2 satellite optical remote sensing data. Only spectral bands whose spectral range overlaps with that of the panchromatic band were used for quality assessment, to ensure a physically consistent evaluation. The proposed JQM allows a comfortable ranking of different methods using a single quality measure.

Experiments showed that the HPFM pan-sharpening method exhibits the best fusion quality among the popular methods tested (even better than its parent method GFF) while requiring less than a quarter of the computation time of the GFF method. Thus the HPFM method, being competitive in speed with known fast methods such as CS and GS while exhibiting much higher quality, is a good candidate for operational applications. The new JQM allowed a correct ranking of the different pan-sharpening methods, consistent with existing experience and visual analysis, making it a suitable quality measure both for selecting the parameters of a particular fusion method and for comparing different methods.

We would like to thank DigitalGlobe and European Space Imaging for the collection and provision of the WorldView-2 scene over the city of Munich.

References

1. Tu, T. M., et al., “A new look at IHS-like image fusion methods,” Inform. Fusion 2(3), 177–186 (2001).
2. Hill, J., et al., “A local correlation approach for the fusion of remote sensing data with different spatial resolution in forestry applications,” in Proc. of Int. Archives of Photogrammetry and Remote Sensing, Vol. 32, Part 7-4-3 W6, pp. 167–174, ISPRS, Valladolid, Spain (1999).
3. Klonus, S., and Ehlers, M., “Image fusion using the Ehlers spectral characteristics preservation algorithm,” GISci. Rem. Sens. 44(2), 93–116 (2007).
4. Aiazzi, B., et al., “Context-driven fusion of high spatial and spectral resolution images based on oversampled multiresolution analysis,” IEEE Trans. Geosci. Rem. Sens. 40(10), 2300–2312 (2002).
5. Alparone, L., et al., “Comparison of pansharpening algorithms: outcome of the 2006 GRS-S data-fusion contest,” IEEE Trans. Geosci. Rem. Sens. 45(10), 3012–3021 (2007).
6. Aiazzi, B., et al., “A comparison between global and context-adaptive pansharpening of multispectral images,” IEEE Geosci. Rem. Sens. Lett. 6(2), 302–306 (2009).
7. Wang, Z., et al., “A comparative analysis of image fusion methods,” IEEE Trans. Geosci. Rem. Sens. 43(6), 1391–1402 (2005).
8. Thomas, C., et al., “Synthesis of multispectral images to high spatial resolution: a critical review of fusion methods based on remote sensing physics,” IEEE Trans. Geosci. Remote Sens. 46(5), 1301–1312 (2008).
9. Palubinskas, G., and Reinartz, P., “Multi-resolution, multi-sensor image fusion: general fusion framework,” in Proc. of Joint Urban Remote Sensing Event, pp. 313–316, IEEE, Piscataway, NJ (2011).
10. Wang, Z., et al., “Image quality assessment: from error visibility to structural similarity,” IEEE Trans. Image Process. 13(4), 600–612 (2004).
11. Wald, L., Ranchin, T., and Mangolini, M., “Fusion of satellite images of different spatial resolutions: assessing the quality of resulting images,” Photogramm. Eng. Rem. Sens. 63(6), 691–699 (1997).
12. Padwick, C., et al., “WorldView-2 pan-sharpening,” in Proc. American Society for Photogrammetry and Remote Sensing, p. 13, ASPRS, Bethesda, Maryland (2010).
13. Alparone, L., et al., “Multispectral and panchromatic data fusion assessment without reference,” Photogramm. Eng. Rem. Sens. 74(2), 193–200 (2008).
14. Li, S., Li, Z., and Gong, J., “Multivariate statistical analysis of measures for assessing the quality of image fusion,” Int. J. Image Data Fusion 1(1), 47–66 (2010).
15. Du, Q., et al., “On the performance evaluation of pan-sharpening techniques,” IEEE Geosci. Rem. Sens. Lett. 4(4), 518–522 (2007).
16. Makarau, A., Palubinskas, G., and Reinartz, P., “Multiresolution image fusion: phase congruency for spatial consistency assessment,” in Proc. of ISPRS Technical Commission VII Symposium: 100 Years ISPRS, Advancing Remote Sensing Science, Vol. XXXVIII, Part 7B, pp. 383–388, ISPRS, Vienna, Austria (2010).
17. Makarau, A., Palubinskas, G., and Reinartz, P., “Selection of numerical measures for pan-sharpening assessment,” in Proc. Int. Geoscience and Remote Sensing Symp., pp. 2264–2267, IEEE, Piscataway, NJ (2012).
18. Makarau, A., Palubinskas, G., and Reinartz, P., “Analysis and selection of pan-sharpening assessment measures,” J. Appl. Rem. Sens. 6(1), 063548 (2012).
19. Klonus, S., “Optimierung und Auswirkungen von ikonischen Bildfusionsverfahren zur Verbesserung von fernerkundlichen Auswerteverfahren” (“Optimization and effects of iconic image fusion methods for the improvement of remote sensing analysis procedures”), Ph.D. dissertation, Universität Osnabrück, Germany (2011).
20. Aiazzi, B., et al., “MTF-tailored multiscale fusion of high-resolution MS and Pan imagery,” Photogramm. Eng. Rem. Sens. 72(5), 591–596 (2006).
21. Ranchin, T., and Wald, L., “Fusion of high spatial and spectral resolution images: the ARSIS concept and its implementation,” Photogramm. Eng. Rem. Sens. 66(1), 49–61 (2000).
22. Canty, M. J., “IDL extensions for ENVI,” 2009, http://mcanty.homepage.t-online.de/software.html (25 March 2013).


Gintautas Palubinskas received MS and PhD degrees in mathematics from Vilnius University, Vilnius, Lithuania, in 1981, and Institute of Mathematics and Informatics (IMI), Vilnius, Lithuania, in 1991, respectively. His doctoral dissertation was on spatial image recognition. He was a research scientist at the IMI from 1981 to 1997. From 1993 to 1997, he was a visiting research scientist at German Remote Sensing Data Center, DLR; the Department of Geography, Swansea University, Wales, U.K.; Institute of Navigation, Stuttgart University, Germany; Max-Planck-Institute of Cognitive Neuroscience, Leipzig, Germany. Since 1997, he has been a research scientist at German Remote Sensing Data Center (later Remote Sensing Technology Institute), German Aerospace Center DLR. He is the author/coauthor of about 40 papers published in peer-reviewed journals. Current interests are in image processing, image fusion, classification, change detection, traffic monitoring, data fusion for optical and SAR remote sensing applications.

© The Authors. Published by SPIE under a Creative Commons Attribution 3.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.



Figures

Graphic Jump LocationF1 :

Addition of spectra of high-resolution (HR, solid line) and low-resolution (LR, dotted line) images. PBW stands for processing bandwidth, f for frequency, and fcutoff_HR for cutoff frequency of a high-pass filter.

Fig. 2: Joint quality measure JQM (a) and separate quality measures (b) CORR and SSIM of the HPFM pan-sharpening method with filtering parameters ranging from 0.05 to 0.7 for WorldView-2 Munich data.

Fig. 3: Joint quality measure JQM (a) and separate quality measures (b) CORR and SSIM of different interpolation methods: (1) nearest neighbor, (2) zero padding, (3) bilinear interpolation, and (4) cubic interpolation for WorldView-2 Munich data.

Fig. 4: Joint quality measure JQM (a) and separate quality measures (b) CORR and SSIM of 12 pan-sharpening methods for WorldView-2 Munich data (see method numbers in Table 4).

Fig. 5: Multispectral bilinear interpolated (bands 5, 3, 2) and GFF, HPFM, CS, Ehlers, and ARSIS pan-sharpened images of WorldView-2 Munich data.

Tables

Table 1: Scene parameters for WorldView-2 data over Munich city.

Table 2: Estimated values of A and B according to Eqs. (15) and (16) from the proposed extreme range values used for calculation of JQM.

Table 3: Quality measures of various interpolation methods for WorldView-2 Munich data.

Table 4: Quality measures and computation time of various pan-sharpening methods for WorldView-2 Munich data on an Intel Core 2 Quad CPU Q9450 at 2.66 GHz.

References

1. Tu T. M. et al., "A new look at IHS-like image fusion methods," Inform. Fusion 2(3), 177–186 (2001).
2. Hill J. et al., "A local correlation approach for the fusion of remote sensing data with different spatial resolution in forestry applications," in Proc. Int. Archives of Photogrammetry and Remote Sensing, Vol. 32, Part 7-4-3 W6, pp. 167–174, ISPRS, Valladolid, Spain (1999).
3. Klonus S. and Ehlers M., "Image fusion using the Ehlers spectral characteristics preservation algorithm," GISci. Remote Sens. 44(2), 93–116 (2007).
4. Aiazzi B. et al., "Context-driven fusion of high spatial and spectral resolution images based on oversampled multiresolution analysis," IEEE Trans. Geosci. Remote Sens. 40(10), 2300–2312 (2002).
5. Alparone L. et al., "Comparison of pansharpening algorithms: outcome of the 2006 GRS-S data-fusion contest," IEEE Trans. Geosci. Remote Sens. 45(10), 3012–3021 (2007).
6. Aiazzi B. et al., "A comparison between global and context-adaptive pansharpening of multispectral images," IEEE Geosci. Remote Sens. Lett. 6(2), 302–306 (2009).
7. Wang Z. et al., "A comparative analysis of image fusion methods," IEEE Trans. Geosci. Remote Sens. 43(6), 1391–1402 (2005).
8. Thomas C. et al., "Synthesis of multispectral images to high spatial resolution: a critical review of fusion methods based on remote sensing physics," IEEE Trans. Geosci. Remote Sens. 46(5), 1301–1312 (2008).
9. Palubinskas G. and Reinartz P., "Multi-resolution, multi-sensor image fusion: general fusion framework," in Proc. Joint Urban Remote Sensing Event, pp. 313–316, IEEE, Piscataway, New Jersey (2011).
10. Wang Z. et al., "Image quality assessment: from error visibility to structural similarity," IEEE Trans. Image Process. 13(4), 600–612 (2004).
11. Wald L., Ranchin T., and Mangolini M., "Fusion of satellite images of different spatial resolutions: assessing the quality of resulting images," Photogramm. Eng. Remote Sens. 63(6), 691–699 (1997).
12. Padwick C. et al., "WorldView-2 pan-sharpening," in Proc. American Society for Photogrammetry and Remote Sensing, p. 13, ASPRS, Bethesda, Maryland (2010).
13. Alparone L. et al., "Multispectral and panchromatic data fusion assessment without reference," Photogramm. Eng. Remote Sens. 74(2), 193–200 (2008).
14. Li S., Li Z., and Gong J., "Multivariate statistical analysis of measures for assessing the quality of image fusion," Int. J. Image Data Fusion 1(1), 47–66 (2010).
15. Du Q. et al., "On the performance evaluation of pan-sharpening techniques," IEEE Geosci. Remote Sens. Lett. 4(4), 518–522 (2007).
16. Makarau A., Palubinskas G., and Reinartz P., "Multiresolution image fusion: phase congruency for spatial consistency assessment," in Proc. ISPRS Technical Commission VII Symposium – 100 Years ISPRS – Advancing Remote Sensing Science, Vol. XXXVIII, Part 7B, pp. 383–388, ISPRS, Vienna, Austria (2010).
17. Makarau A., Palubinskas G., and Reinartz P., "Selection of numerical measures for pan-sharpening assessment," in Proc. Int. Geoscience and Remote Sensing Symp., pp. 2264–2267, IEEE, Piscataway, New Jersey (2012).
18. Makarau A., Palubinskas G., and Reinartz P., "Analysis and selection of pan-sharpening assessment measures," J. Appl. Remote Sens. 6(1), 063548 (2012).
19. Klonus S., "Optimierung und Auswirkungen von ikonischen Bildfusionsverfahren zur Verbesserung von fernerkundlichen Auswerteverfahren," Ph.D. dissertation, Universität Osnabrück, Germany (2011).
20. Aiazzi B. et al., "MTF-tailored multiscale fusion of high-resolution MS and Pan imagery," Photogramm. Eng. Remote Sens. 72(5), 591–596 (2006).
21. Ranchin T. and Wald L., "Fusion of high spatial and spectral resolution images: the ARSIS concept and its implementation," Photogramm. Eng. Remote Sens. 66(1), 49–61 (2000).
22. Canty M. J., "IDL extensions for ENVI," 2009, http://mcanty.homepage.t-online.de/software.html (accessed 25 Mar 2013).
