Remote Sensing Applications and Decision Support

Spatiotemporal image-fusion model for enhancing the temporal resolution of Landsat-8 surface reflectance images using MODIS images

Author Affiliations
Khaled Hazaymeh

University of Calgary, Schulich School of Engineering, Department of Geomatics Engineering, 2500 University Drive NW, Calgary, Alberta T2N 1N4, Canada

Quazi K. Hassan

University of Calgary, Schulich School of Engineering, Department of Geomatics Engineering, 2500 University Drive NW, Calgary, Alberta T2N 1N4, Canada

J. Appl. Remote Sens. 9(1), 096095 (Jan 16, 2015). doi:10.1117/1.JRS.9.096095
History: Received October 23, 2014; Accepted December 18, 2014

Open Access

Abstract.  Our aim was to evaluate a spatiotemporal image-fusion model (STI-FM) for enhancing the temporal resolution (i.e., from 16 to 8 days) of Landsat-8 surface reflectance images by utilizing moderate-resolution imaging spectroradiometer (MODIS) images, and to assess its applicability over a heterogeneous, agriculture-dominant semiarid region in Jordan. Our proposed model had two major components: (i) establishing relationships between two 8-day MODIS composite images acquired at two different times (i.e., time 1 and time 2); and (ii) generating synthetic Landsat-8 surface reflectance images at time 2 as a function of the Landsat-8 image available at time 1 and the relationships constructed in the first component. We evaluated the synthetic images against the actual Landsat-8 images and observed strong relations between them. For example, the coefficient of determination (r2) was in the range: (i) 0.72 to 0.82; (ii) 0.71 to 0.79; and (iii) 0.78 to 0.83 for the red, near-infrared (NIR), and shortwave infrared [SWIR (2.2 μm)] spectral bands, respectively. In addition, the root mean square error (RMSE) and absolute average difference (AAD) values were: (i) between 0.003 and 0.004, and 0.0002, respectively, for the red band; (ii) 0.005 and 0.0003, respectively, for the NIR band; and (iii) 0.004 and between 0.0001 and 0.0002, respectively, for the SWIR (2.2 μm) band. The developed method would be useful in understanding the dynamics of environmental issues (e.g., agricultural drought and irrigation management) that require both relatively high spatial (i.e., 30 m) and high temporal (i.e., 8 days) resolution images.


Since the 1970s, earth observation remote sensing satellites have been providing an enormous amount of information in the form of images for monitoring various environmental phenomena.1 These images can be categorized into several classes on the basis of their spectral characteristics, such as (i) visible (i.e., in the range 0.4 to 0.7 μm); (ii) near-infrared (NIR: 0.7 to 0.9 μm); (iii) shortwave infrared (SWIR: 1.0 to 2.5 μm); (iv) thermal infrared (TIR: 3 to 14 μm); and (v) microwave (i.e., 3 mm to 3 m). In general, different satellites may acquire images over a particular spectral range; however, these images may differ significantly in both spatial and temporal resolutions. Usually, the relatively high spatial resolution images have low temporal resolution, and vice versa.2 For example, the moderate-resolution imaging spectroradiometer (MODIS) and Landsat satellites have similar spectral ranges and also provide consistent surface reflectance,3–5 while they vary in their spatial and temporal resolutions (see Table 1 for details). It is interesting to note that some environmental applications (e.g., agricultural drought, irrigation management, and grassland monitoring) require both high spatial (e.g., 30 m) and high temporal (e.g., weekly) resolutions6–8 because of the rapid changes involved. In order to address this, a new data fusion research area (called spatiotemporal data fusion) has emerged during the past several years. These techniques have been applied successfully to predict synthetic high-resolution images for different environmental applications, such as vegetation indices,5,9 evapotranspiration,10 urban heat island,11 forest change detection,12 and surface temperature.2 In most of these instances, the fusion techniques have merged Landsat with MODIS images in order to generate synthetic images at the spatial resolution of Landsat (i.e., 30 m) and the temporal resolution of MODIS (i.e., 1 to 8 days).13–18 In addition, some researchers have used other data, such as: (i) Landsat images and medium-resolution imaging spectrometer (MERIS) images, which have spatial resolutions in the range 300 to 1200 m and a 3-day temporal resolution;19,20 and (ii) HJ-1 CCD satellite images, which have 30-m spatial resolution and a 4-day temporal resolution, combined with MODIS.21 In general, these techniques can be broadly divided into three groups: (i) the spatial-temporal adaptive reflectance-fusion model (STARFM)-based techniques; (ii) unmixing-based fusion techniques; and (iii) sparse representation-based fusion techniques. Some example cases are briefly described in Table 2.

Table 1. Comparison between spectral, spatial, and temporal resolutions of MODIS and Landsat-8 images.

Table 2. Description of some of the spatiotemporal data fusion techniques implemented over the visible and shortwave infrared spectral bands.

In general, the above-mentioned techniques are complex and require relatively long processing times. In order to address these issues, we developed a new data fusion technique called the spatiotemporal image-fusion model (STI-FM) and demonstrated its effectiveness in enhancing the temporal resolution of Landsat-8 surface temperatures using MODIS data.24 Here, we proposed to evaluate the applicability of the STI-FM, with the required modifications, in generating a synthetic Landsat-8 image [i.e., synth-L(t2)] for the red, NIR, and SWIR spectral bands by integrating a pair of MODIS images [i.e., M(t1) and M(t2)] and a Landsat-8 image [i.e., L(t1)]. The rationale behind choosing these spectral bands was that they have been widely used in the calculation of vegetation greenness and wetness conditions and are effective in monitoring these parameters over short time periods, such as the plants' growing seasons. Our aim in this paper was to implement this technique over a heterogeneous, agriculture-dominant semiarid region in Jordan, Middle East.

General Description of the Study Area

The country of Jordan is located in the Middle East. It is divided into three major geographic areas (i.e., Jordan Valley, Mountainous Heights Plateau, and Eastern Desert) [see Fig. 1(a)]. Our study area [Fig. 1(b)] is located in the northwestern part of the Mountainous Heights Plateau, where the elevation varies in the range 600 to 1100 m above mean sea level. Geographically, it lies between 31°47′N to 32°29′N and 35°37′E to 36°00′E, covering approximately 3500 km2 [see Fig. 1(b)]. In terms of climate, it experiences Mediterranean climatic conditions with a hot, dry summer (i.e., average temperature 25°C with no rainfall during May to August); a cool, wet winter (i.e., average temperature 5 to 7°C with 250 to 650 mm rainfall during November to February); and two transition seasons (i.e., spring during March and April, and fall during September and October). In general, about 70% of the annual evaporation occurs in the dry season (i.e., May to August), with an annual potential evaporation of 1900 mm. In our study area, the major agricultural activities include: (i) cereal crops (i.e., wheat and/or barley grown between November and July), which account for 65.5% of the area; (ii) orchards (i.e., olives, apple, nectarine, etc.), which occupy 25.5%; and (iii) grazing and forestry, which account for 9.0%.25,26

Fig. 1:

(a) Map of Jordan illustrating three major geographic regions; (b) a Landsat-8 image at 30-m spatial resolution covering the study area as shown by the black polygon.

Data Requirements and Preprocessing

In this study, we employed remote sensing data acquired by two satellite systems: (i) Landsat-8 Operational Land Imager data, freely available from the United States Geological Survey (USGS); and (ii) MODIS data, freely available from the National Aeronautics and Space Administration (NASA). For both satellites, we selected the red (i.e., in the range 0.62 to 0.67 μm), NIR (i.e., 0.84 to 0.88 μm), and SWIR (i.e., 2.10 to 2.29 μm) spectral bands. For Landsat-8, we obtained five almost cloud-free images acquired between June and August 2013 at 30-m spatial resolution. In addition, we obtained nine MODIS 8-day composite surface reflectance products (i.e., MOD09A1) at 500-m spatial resolution for the same period. The 8-day composite images lessen the probability of cloud contamination relative to the daily MODIS products.27 The selected dates are presented in Fig. 2.

Fig. 2:

Acquisition dates of moderate-resolution imaging spectroradiometer (MODIS) and Landsat-8 imagery used in the study.

Preprocessing of MODIS images

The acquired MODIS surface reflectance products were originally provided in the sinusoidal projection. We used the MODIS reprojection tool (MRT 4.1)28 to subset the images to the spatial extent of the study area and reproject them into the coordinate system of the Landsat-8 images [i.e., Universal Transverse Mercator (Zone 36N, WGS84)]. Then, we coregistered these images to the Landsat-8 images to allow accurate geographic comparisons and to reduce potential geometric errors (e.g., in position and orientation), as spatial misregistration can influence the derived information. Finally, we checked for cloud-contaminated pixels using the quality control band (i.e., the 500-m state flag; another layer available in the MOD09A1 dataset) and excluded them from further analysis.
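As an illustration of the cloud-screening step, the following is a minimal sketch (not the authors' implementation) of masking cloud-contaminated MOD09A1 pixels with the 500-m state quality layer, assuming the reflectance and quality layers have already been subset and reprojected to UTM Zone 36N (e.g., with MRT 4.1) and exported as GeoTIFFs; the file names and the bit layout assumed for the cloud state are illustrative assumptions.

```python
# Minimal sketch: mask cloud-contaminated MOD09A1 pixels using the 500-m state QA layer.
# Assumes the surface reflectance and QA layers were already subset/reprojected to
# UTM Zone 36N (e.g., with MRT 4.1) and exported as GeoTIFFs; file names are hypothetical.
import numpy as np
import rasterio

def mask_mod09a1_clouds(reflectance_path, state_qa_path, nodata=np.nan):
    """Return a reflectance array with cloudy/mixed pixels set to `nodata`."""
    with rasterio.open(reflectance_path) as src:
        refl = src.read(1).astype("float32") * 0.0001  # MOD09A1 scale factor
    with rasterio.open(state_qa_path) as src:
        state = src.read(1).astype("uint16")

    cloud_state = state & 0b11   # assumed bits 0-1: 00 clear, 01 cloudy, 10 mixed
    refl[cloud_state != 0] = nodata  # keep only clear-sky pixels
    return refl

# Example (hypothetical file names):
# red_t1 = mask_mod09a1_clouds("MOD09A1_2013169_b01_utm36n.tif",
#                              "MOD09A1_2013169_state_utm36n.tif")
```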

Preprocessing of Landsat-8 images

The Landsat-8 images were available in the form of digital numbers (DN). These DN values were converted into top-of-atmosphere (TOA) reflectance using the following equation from Ref. 29:

$$\rho_{TOA} = \frac{M \cdot DN + A}{\sin(\theta_{SE})}, \tag{1}$$

where $\rho_{TOA}$ is the band-specific Landsat-8 TOA reflectance; $M$ is the band-specific multiplicative rescaling factor; $A$ is the band-specific additive rescaling factor; and $\theta_{SE}$ is the local sun elevation angle at the scene center. The values of $A$, $M$, and $\theta_{SE}$ were available in the metadata file of each image.
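For illustration, here is a small sketch of Eq. (1), assuming the rescaling factors and sun elevation angle are read from the scene's MTL metadata file; the numeric values in the usage comment are placeholders rather than values from any particular scene.

```python
# Sketch of Eq. (1): Landsat-8 DN to TOA reflectance with sun-elevation correction.
# M, A, and theta_SE come from the scene's MTL metadata file; the values in the
# usage example below are placeholders, not taken from any particular scene.
import numpy as np

def dn_to_toa_reflectance(dn, mult_factor, add_factor, sun_elev_deg):
    """rho_TOA = (M * DN + A) / sin(theta_SE), Eq. (1)."""
    dn = dn.astype("float64")
    return (mult_factor * dn + add_factor) / np.sin(np.deg2rad(sun_elev_deg))

# Example with placeholder metadata values (REFLECTANCE_MULT_BAND_x,
# REFLECTANCE_ADD_BAND_x, SUN_ELEVATION):
# toa_red = dn_to_toa_reflectance(dn_red, 2.0e-5, -0.1, 65.3)
```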

In order to transform the TOA reflectance into surface reflectance, we employed a simple but effective atmospheric correction method that does not require any information about the atmospheric conditions during the image acquisition period. This was done using the MODIS surface reflectance images, based on the fact that MODIS and Landsat have consistent surface reflectance values.3,4,13,30 The procedure was accomplished in three distinct steps. In the first step, we applied an averaging method over a moving window of 17 × 17 pixels (i.e., approximately the equivalent of 500 × 500 m) to up-scale the Landsat-8 ρTOA images to the spatial resolution of the MODIS images (i.e., 500 m). This was done to make the Landsat-8 and MODIS images comparable in terms of spatial resolution and to increase the spectral reliability.31 In this way, the spectral variance between the images decreases while the spatial autocorrelation increases; these effects have been investigated in different studies.32–35 In the second step, we determined linear relationships between the up-scaled Landsat-8 and MODIS images for each of the spectral bands by generating scatter plots between them. In the third step, the coefficients of the linear relationships (i.e., slope and intercept) were applied to the original Landsat-8 ρTOA images (i.e., at 30 m) to generate the Landsat-8 surface reflectance images. It is worth mentioning that the Landsat Ecosystem Disturbance Adaptive Processing System atmospheric correction algorithm was not applicable to Landsat-8 due to the absence of climate data records.36 Finally, we employed the Landsat-8 quality assessment (QA) band to identify cloud-contaminated pixels and excluded them from further analysis.
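The following is a minimal sketch of this three-step adjustment under simplifying assumptions (a single band, non-overlapping 17 × 17 blocks in place of a moving window, and a MODIS array already aligned with the aggregated Landsat grid); it is intended only to illustrate the logic, not to reproduce the authors' processing chain.

```python
# Minimal sketch of the three-step TOA-to-surface-reflectance adjustment described above:
# (1) aggregate Landsat-8 TOA reflectance over 17x17-pixel blocks (~500 m), (2) fit a
# per-band linear relation against the coregistered 500-m MODIS surface reflectance,
# (3) apply the fitted slope/intercept back to the 30-m TOA image.
import numpy as np

def block_average(arr, block=17):
    """Average non-overlapping block x block windows (edge pixels are trimmed)."""
    rows = (arr.shape[0] // block) * block
    cols = (arr.shape[1] // block) * block
    a = arr[:rows, :cols].reshape(rows // block, block, cols // block, block)
    return np.nanmean(a, axis=(1, 3))

def toa_to_surface(landsat_toa_30m, modis_sr_500m, block=17):
    upscaled = block_average(landsat_toa_30m, block)
    x = upscaled.ravel()
    y = modis_sr_500m[:upscaled.shape[0], :upscaled.shape[1]].ravel()
    ok = np.isfinite(x) & np.isfinite(y)
    slope, intercept = np.polyfit(x[ok], y[ok], 1)  # per-band linear relation
    return slope * landsat_toa_30m + intercept       # 30-m surface reflectance estimate
```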

Figure 3 shows a schematic diagram of the STI-FM framework. It consists of two major components: (i) establishing the relationships between the MODIS images acquired at two different times [i.e., M(t1) and M(t2)]; and (ii) generating the synthetic Landsat-8 surface reflectance images at time 2 [i.e., synth-L(t2)] by combining the Landsat-8 image acquired at time 1 [i.e., L(t1)] with the relationships constructed in the first component, followed by validation of the synthetic images.

Fig. 3:

Schematic diagram of the proposed spatiotemporal image-fusion model for enhancing the temporal resolution of Landsat-8 surface reflectance images.

In order to establish relations between the two MODIS images [i.e., M(t1) and M(t2)], we performed the following steps:

  • We calculated a ratio image [i.e., M(t2)/M(t1)] using the pair of MODIS images for each of the spectral bands of interest in order to determine the rate of the temporal change between the two dates.
  • We plotted M(t1) against M(t2) in order to assess the trends of change in surface reflectance between the two images; this was determined using a pixel-wise linear regression analysis.
  • Based on the results of the previous two steps, the ratio image was classified into three clusters of land cover change, based on the assumption that a variation (rate of change) of approximately ±15% in surface reflectance (i.e., albedo) would be common for various natural surfaces (e.g., conifer forests, deciduous forests, agricultural crops, grass, etc.).37,38 These clusters included: (i) negligible change [i.e., rate of change within ±15%; M(t2) ≈ M(t1)]; (ii) negative change [i.e., decrease of more than 15%; M(t2) < M(t1)]; and (iii) positive change [i.e., increase of more than 15%; M(t2) > M(t1)]. We considered this an important issue that had not been taken into consideration in other spatial and temporal fusion models. For each of the three clusters, we produced cluster-specific scatter plots between M(t1) and M(t2), and performed linear regressions (see Fig. 4 for details); a code sketch of this clustering and regression step is given after this list.
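The sketch below (a simplified illustration, not the authors' code) classifies the MODIS ratio image into the three change clusters using the ±15% threshold and fits a separate linear regression M(t2) = a·M(t1) + b for each cluster; the inputs are assumed to be coregistered two-dimensional arrays for a single spectral band.

```python
# Minimal sketch of STI-FM component 1: classify the ratio image M(t2)/M(t1) into
# negligible/negative/positive change clusters using the +/-15% threshold, then fit a
# separate pixel-wise linear regression for each cluster.
import numpy as np

NEGLIGIBLE, NEGATIVE, POSITIVE = 0, 1, 2

def classify_change(m_t1, m_t2, threshold=0.15):
    ratio = m_t2 / m_t1
    clusters = np.full(ratio.shape, NEGLIGIBLE, dtype="uint8")
    clusters[ratio < 1.0 - threshold] = NEGATIVE   # decrease of more than 15%
    clusters[ratio > 1.0 + threshold] = POSITIVE   # increase of more than 15%
    return clusters

def cluster_regressions(m_t1, m_t2, clusters):
    """Return {cluster label: (slope, intercept)} from cluster-specific regressions."""
    coeffs = {}
    for label in (NEGLIGIBLE, NEGATIVE, POSITIVE):
        mask = (clusters == label) & np.isfinite(m_t1) & np.isfinite(m_t2)
        if mask.sum() > 1:
            slope, intercept = np.polyfit(m_t1[mask], m_t2[mask], 1)
            coeffs[label] = (slope, intercept)
    return coeffs
```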

Fig. 4:

Conceptual relationships between the two MODIS images at two different times.

In generating the synthetic Landsat-8 surface reflectance image [i.e., synth-L(t2)] at 8-day intervals, we employed the Landsat-8 image acquired at time 1 [i.e., L(t1)] in conjunction with the classified image and the cluster-specific linear regression models derived in the previous steps. To perform this combination, we applied different conditional linear functions based on the classified MODIS image to assign the surface reflectance value of each pixel in the synthetic image, instead of using one linear equation for the entire scene. For example, in order to generate the synthetic Landsat-8 image of day of year (DOY) 185, we first calculated the linear regressions between the MODIS images of DOY 169 to 176 and DOY 185 to 192, and then applied the regression coefficients to the Landsat-8 surface reflectance image of DOY 169. Here, our image-fusion model assumed that the linear relationship between the two MODIS images would be applicable and comparable to the linear relationship between the two corresponding Landsat-8 images, as they would have consistent values. Therefore, the use of this linear regression would be practical for generating synthetic Landsat-8 images.
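A minimal sketch of this second component follows, reusing the cluster-specific coefficients from the previous sketch; it assumes the 500-m cluster map has already been resampled (e.g., by nearest neighbour) to the 30-m Landsat grid, which is a simplification added here.

```python
# Sketch of STI-FM component 2: apply the cluster-specific coefficients (from the
# previous sketch) to the Landsat-8 surface reflectance at time 1 to predict the
# synthetic image at time 2.
import numpy as np

def predict_synthetic_landsat(l_t1, clusters_30m, coeffs):
    """Conditional linear functions per change cluster, rather than one scene-wide fit."""
    synth = np.full_like(l_t1, np.nan, dtype="float32")
    for label, (slope, intercept) in coeffs.items():
        mask = clusters_30m == label
        synth[mask] = slope * l_t1[mask] + intercept
    return synth

# Example for DOY 185 (hypothetical variable names):
# coeffs = cluster_regressions(modis_doy169, modis_doy185, clusters)
# synth_doy185 = predict_synthetic_landsat(landsat_doy169, clusters_30m, coeffs)
```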

Validating STI-FM

Upon producing the synth-L(t2) images, we evaluated them against the actual L(t2) images, which were available only at the 16-day Landsat-8 temporal resolution. In this case, we used two methods: (i) qualitative evaluation, which involved visual examination; and (ii) quantitative evaluation using statistical measures, namely the coefficient of determination (r2), root mean square error (RMSE), and absolute average difference (AAD). The equations for these statistical measures are as follows:

$$r^2 = \left[\frac{\sum\left(A(t)-\overline{A(t)}\right)\left(S(t)-\overline{S(t)}\right)}{\sqrt{\sum\left(A(t)-\overline{A(t)}\right)^2\,\sum\left(S(t)-\overline{S(t)}\right)^2}}\right]^2, \tag{2}$$

$$RMSE = \sqrt{\frac{\sum\left[S(t)-A(t)\right]^2}{n}}, \tag{3}$$

$$AAD = \frac{1}{n}\sum\left|S(t)-A(t)\right|, \tag{4}$$

where $A(t)$ and $S(t)$ are the actual and the synthetic Landsat-8 surface reflectance images; $\overline{A(t)}$ and $\overline{S(t)}$ are the mean values of the actual and the synthetic Landsat-8 images; and $n$ is the number of observations.
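As an illustration, a short sketch of Eqs. (2) to (4) is given below, computed over the valid (cloud-free) pixels shared by the actual and synthetic images; this is an assumed implementation, not the authors' code.

```python
# Minimal sketch of the evaluation metrics in Eqs. (2)-(4).
import numpy as np

def evaluate(actual, synthetic):
    ok = np.isfinite(actual) & np.isfinite(synthetic)
    a, s = actual[ok], synthetic[ok]
    r = np.corrcoef(a, s)[0, 1]
    return {
        "r2": r ** 2,                            # Eq. (2)
        "rmse": np.sqrt(np.mean((s - a) ** 2)),  # Eq. (3)
        "aad": np.mean(np.abs(s - a)),           # Eq. (4)
    }
```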

Evaluation of the Relationships Between MODIS Images Acquired at Two Different Times

Figure 5 shows the relation between the 8-day composite MODIS images acquired on two different dates for the red, NIR, and SWIR (2.13 μm) spectral bands during the period June 2 to August 12, 2013. It reveals that a strong relation existed for each of the clusters (i.e., negligible change, negative change, and positive change) over all of the spectral bands during the period of observation. For example, the r2, slope, and intercept values were in the range: (i) 0.85 to 0.95, 0.95 to 1.07, and 0.003 to 0.01, respectively, for the negligible change cluster; (ii) 0.81 to 0.94, 0.77 to 0.94, and 0.0005 to 0.04, respectively, for the negative change cluster; and (iii) 0.79 to 0.91, 1.05 to 1.21, and 0.01 to 0.08, respectively, for the positive change cluster. The regression analysis showed that the negligible change clusters had the highest correlation values because no significant changes occurred in the study area between the two 8-day composite MODIS images of interest (i.e., acquired at 16-day intervals). Note that we were unable to compare our findings, as no similar studies were found in the literature. Although the use of 8-day MODIS composites might reduce cloud contamination, it might introduce another issue. For example, the observations in two consecutive MODIS composites might potentially be 2 to 16 days apart, as the 8-day composites were generated based on the minimum-blue criterion,27 which coincides with the most clear-sky day during the composite period of interest. The quantification of the impact of such 8-day composites relative to daily data was beyond the scope of this paper, but might be an interesting issue for further exploration.

Fig. 5:

Relation between the 8-day composite MODIS surface reflectance images acquired at time 1 [i.e., M(t1)] and time 2 [i.e., M(t2)] for the red [(a)–(d)], near-infrared (NIR) [(e)–(h)], and shortwave infrared [SWIR (2.13 μm)] [(i)–(l)] spectral bands during the period June 2 to August 12, 2013 [i.e., day of year (DOY) between 153 and 224]. Note that for each of the panels, three clusters (i.e., negligible change, negative change, and positive change) are formed as per Fig. 3. Also, the dotted and continuous lines represent the 1:1 and regression lines, respectively.

Evaluation of the Synthetic Landsat-8 Surface Reflectance Images

Prior to conducting quantitative evaluations, we performed qualitative evaluations by comparing the actual and synthetic Landsat-8 images. In these cases, we generated pseudocolor composite images by placing the NIR, red, and SWIR spectral bands in the red, green, and blue color planes of the display; such an example is shown in Fig. 6. Figure 6 shows the actual Landsat-8 image acquired on June 18, 2013 (DOY 169) [Fig. 6(a)] and its corresponding synthetic Landsat-8 image [Fig. 6(b)], which was produced using an image pair of Landsat-8 and MODIS acquired on DOY 153 and one MODIS image acquired on DOY 169. We evaluated four different land cover types [i.e., agricultural lands in Figs. 6(a1) and 6(b1); forests in Figs. 6(a2) and 6(b2); water bodies in Figs. 6(a3) and 6(b3); and urban areas in Figs. 6(a4) and 6(b4)]. In general, we observed that the visual clues (e.g., location, shape, size, and texture in particular) were reproduced in the synthetic images with negligible differences in comparison to the actual images. However, the tones (i.e., the DN values representing the surface reflectance) showed some differences. These differences might arise from the use of 500-m spatial resolution MODIS surface reflectance images in calculating the Landsat-8 surface reflectance values at 30-m resolution.
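For reference, the following is a small sketch of how such a false-color composite could be assembled for visual comparison (NIR, red, and SWIR in the red, green, and blue planes); the 2 to 98 percentile stretch is an assumption added here purely for display and is not described in the paper.

```python
# Sketch of the qualitative check: display NIR/red/SWIR in the R/G/B planes for a
# side-by-side comparison of the actual and synthetic Landsat-8 images.
import numpy as np
import matplotlib.pyplot as plt

def stretch(band, lo=2, hi=98):
    """Simple percentile stretch for display only (assumed, not from the paper)."""
    p_lo, p_hi = np.nanpercentile(band, [lo, hi])
    return np.clip((band - p_lo) / (p_hi - p_lo), 0, 1)

def false_colour(nir, red, swir):
    return np.dstack([stretch(nir), stretch(red), stretch(swir)])

# Example (hypothetical variable names):
# fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 5))
# ax1.imshow(false_colour(nir_actual, red_actual, swir_actual)); ax1.set_title("Actual")
# ax2.imshow(false_colour(nir_synth, red_synth, swir_synth)); ax2.set_title("Synthetic")
# plt.show()
```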

Fig. 6:

Example comparison between pseudocolor composite images, created by placing the NIR, red, and SWIR spectral bands in the red, green, and blue color planes, respectively, for the actual and synthetic Landsat-8 images of June 18, 2013. Note that the panels [(a1), (b1)], [(a2), (b2)], [(a3), (b3)], and [(a4), (b4)] show enlarged images for agricultural land, forest, water body, and urban area, respectively, for both the actual and synthetic images. The synthetic image was produced using an image pair of Landsat-8 and MODIS acquired on DOY 153 and one MODIS image acquired on DOY 169.

Figure 7 shows the relationship between the actual and synthetic Landsat-8 surface reflectance images for the red, NIR, and SWIR (2.2 μm) spectral bands for DOY 169, 185, 201, and 217. It demonstrates that strong relations existed between the actual and synthetic images for all the spectral bands of interest over the period of study. In the context of the linear regression analysis, the r2, slope, and intercept values were in the range: (i) 0.72 to 0.82, 0.86 to 0.98, and 0.0009 to 0.042, respectively, for the red spectral band; (ii) 0.71 to 0.79, 0.80 to 0.87, and 0.166 to 0.0800, respectively, for the NIR spectral band; and (iii) 0.78 to 0.83, 0.91 to 0.94, and 0.0096 to 0.0367, respectively, for the SWIR (2.2 μm) spectral band. In the context of the RMSE analyses, the values were: (i) between 0.003 and 0.004 for the red spectral band; (ii) 0.005 for the NIR spectral band; and (iii) 0.004 for the SWIR (2.2 μm) spectral band. In addition, the AAD values were 0.0002, 0.0003, and between 0.0001 and 0.0002 for the red, NIR, and SWIR (2.2 μm) spectral bands, respectively.

Fig. 7:

Relation between the actual Landsat-8 surface reflectance image and its corresponding synthetic Landsat-8 surface reflectance image for the red [(a)–(d)], NIR [(e)–(h)], and SWIR (2.2 μm) [(i)–(l)] spectral bands during DOY 169 (i.e., June 18, 2013) [(a), (e), (i)], DOY 185 (i.e., July 4, 2013) [(b), (f), (j)], DOY 201 (i.e., July 20, 2013) [(c), (g), (k)], and DOY 217 (i.e., August 5, 2013) [(d), (h), (l)]. The dotted and continuous lines represent the 1:1 and regression lines, respectively.

It is worthwhile to note that our findings were quite similar to, or in some cases better than, those of other studies. For example: (i) Gao et al.13 implemented STARFM over boreal forest and obtained AAD values of 0.004, 0.0129, and 0.0078 for the red, NIR, and SWIR (2.2 μm) spectral bands, respectively; (ii) Roy et al.14 applied a semiphysical fusion model over two study sites in the United States (Oregon and Idaho) and obtained AAD values of 0.015, 0.22, and 0.28 for the Oregon site for the red, NIR, and SWIR (2.2 μm) spectral bands, respectively; (iii) Zhu et al.15 applied ESTARFM over heterogeneous regions and achieved AAD values of 0.0095 and 0.0196 for the red and NIR spectral bands, respectively; (iv) Walker et al.5 used STARFM to generate synthetic Landsat ETM+ surface reflectance images over dryland forests and found r2 values of 0.85 and 0.51 for the red and NIR spectral bands, respectively; (v) Song and Huang23 employed a sparse representation-based synthetic technique over boreal forests and found r2 values of 0.71 and 0.90, RMSE values of 0.02 and 0.03, and AAD values of 0.01 and 0.21 for the red and NIR spectral bands, respectively; and (vi) Zhang et al.18 used ESTDFM and observed r2 values of 0.73 and 0.82, and AAD values of 0.009 and 0.0167 for the red and NIR spectral bands, respectively. It is also important to mention that the proposed model would be applicable to other satellite systems that have similar spectral and orbital characteristics, such as other Landsat series, MODIS, MERIS, and ASTER. In addition, the model is a relatively simple and easily reproducible approach that might satisfy most users' needs; such a simple and less sophisticated method might be the most suitable for many applications. Although our results demonstrated strong relations between the actual and synthetic Landsat-8 images, some issues would be worthwhile to consider for further improvements, such as:

  • In this study, we used MODIS surface reflectance images at 500-m spatial resolution. However, it would be possible to use such images acquired at 250-m spatial resolution in the case of red and NIR spectral bands in particular, which might enhance the quality of the synthetic image.31
  • One of the major requirements for the input images [i.e., L(t1), M(t1), and M(t2)] is that they be free from cloud contamination. However, it might not be possible to have images completely free from such contamination. In such events, we might use the cloud-infilling algorithm described in Chowdhury and Hassan,39 which requires images acquired on current and previous dates.
  • Although the use of the MODIS surface reflectance products to generate Landsat-8 surface reflectance images led to reasonable synthetic Landsat-8 surface reflectance images, it would be interesting to use climate data records and compare the outcome with the method adopted here. However, such climate data records were not available for Landsat-8 images at the time of this study.40

In this study, we demonstrated the applicability of the STI-FM technique for enhancing the temporal resolution of Landsat-8 images from 16 to 8 days using 8-day MODIS-based surface reflectance images, and demonstrated its implementation over a heterogeneous, agriculture-dominant semiarid region in Jordan. Our results showed that the proposed method could generate synthetic Landsat-8 surface reflectance images for the red, NIR, and SWIR spectral bands with relatively strong accuracies (r2, RMSE, and AAD values were in the range 0.71 to 0.83, 0.003 to 0.005, and 0.0001 to 0.0003, respectively). In general, our method can be considered a simple one because it does not require any correction parameters or high-quality land-use maps in order to predict the synthetic images. Despite this accuracy and simplicity, we recommend that the proposed method be thoroughly evaluated prior to adoption in environmental conditions other than semiarid regions like ours.

We would like to thank Yarmouk University in Jordan for providing partial support in the form of a PhD scholarship to Mr. K. Hazaymeh; and the Natural Sciences and Engineering Research Council of Canada for a Discovery grant to Dr. Q. Hassan. We would also like to thank the USGS and NASA for providing the Landsat-8 and MODIS images free of cost.

References

1. Jensen J., Remote Sensing of the Environment: An Earth Resource Perspective, 2nd ed., Pearson Prentice Hall, New Jersey (2007).
2. Weng Q., Fu P., Gao F., "Generating daily land surface temperature at Landsat resolution by fusing Landsat and MODIS data," Remote Sens. Environ. 145, 55–67 (2014).
3. Masek J. G. et al., "A Landsat surface reflectance dataset," IEEE Trans. Geosci. Remote Sens. 3(1), 68–72 (2006).
4. Feng M. et al., "Quality assessment of Landsat surface reflectance products using MODIS data," Comp. Geosci. 38(1), 9–22 (2012).
5. Walker J. J. et al., "Evaluation of Landsat and MODIS data fusion products for analysis of dryland forest phenology," Remote Sens. Environ. 117, 381–393 (2012).
6. Tong A., He Y., "Comparative analysis of SPOT, Landsat, MODIS, and AVHRR normalized difference vegetation index data on the estimation of leaf area index in a mixed grassland ecosystem," J. Appl. Remote Sens. 7(1), 073599 (2013).
7. Rocha C. et al., "Remote sensing based crop coefficients for water management in agriculture," Chapter 8 in Sustainable Development-Authoritative and Leading Edge Content for Environmental Management, Curkovic S., Ed., pp. 167–192, InTech Open Access, Croatia (2012).
8. Atzberger C., "Advances in remote sensing of agriculture: context description, existing operational monitoring systems and major information needs," Remote Sens. 5(2), 949–981 (2013).
9. Udelhoven T., "Long term data fusion for a dense time series analysis with MODIS and Landsat imagery in an Australian Savanna," J. Appl. Remote Sens. 6(1), 063512 (2012).
10. Cammalleri C. et al., "A data fusion approach for mapping daily evapotranspiration at field scale," Water Resour. Res. 49(8), 4672–4686 (2013).
11. Huang B. et al., "Generating high spatiotemporal resolution land surface temperature for urban heat island monitoring," IEEE Geosci. Remote Sens. Lett. 10(5), 1011–1015 (2013).
12. Hilker T. et al., "Generation of dense time series synthetic Landsat data through data blending with MODIS using a spatial and temporal adaptive reflectance fusion model," Remote Sens. Environ. 113(9), 1988–1999 (2009).
13. Gao F. et al., "On the blending of the Landsat and MODIS surface reflectance: predicting daily Landsat surface reflectance," IEEE Trans. Geosci. Remote Sens. 44(8), 2207–2218 (2006).
14. Roy D. P. et al., "Multi-temporal MODIS–Landsat data fusion for relative radiometric normalization, gap filling, and prediction of Landsat data," Remote Sens. Environ. 112(6), 3112–3130 (2008).
15. Zhu X. et al., "An enhanced spatial and temporal adaptive reflectance fusion model for complex heterogeneous regions," Remote Sens. Environ. 114(11), 2610–2623 (2010).
16. Fu D. et al., "An improved image fusion approach based on enhanced spatial and temporal the adaptive reflectance fusion model," Remote Sens. 5(12), 6346–6360 (2013).
17. Wu M. et al., "Use of MODIS and Landsat time series data to generate high-resolution temporal synthetic Landsat data using a spatial and temporal reflectance fusion model," J. Appl. Remote Sens. 6(1), 063507 (2012).
18. Zhang W. et al., "An enhanced spatial and temporal data fusion model for fusing Landsat and MODIS surface reflectance to generate high temporal Landsat-like data," Remote Sens. 5(10), 5346–5368 (2013).
19. Zurita-Milla R. et al., "Downscaling time series of MERIS full resolution data to monitor vegetation seasonal dynamics," Remote Sens. Environ. 113(9), 1874–1885 (2009).
20. Zurita-Milla R. et al., "Using MERIS fused images for land-cover mapping and vegetation status assessment in heterogeneous landscapes," Int. J. Remote Sens. 32(4), 973–991 (2011).
21. Meng J., Du X., Wu B., "Generation of high spatial and temporal resolution NDVI and its application in crop biomass estimation," Int. J. Digital Earth 6(3), 203–218 (2013).
22. Huang B., Song H., "Spatiotemporal reflectance fusion via sparse representation," IEEE Trans. Geosci. Remote Sens. 50(10), 3707–3716 (2012).
23. Song H., Huang B., "Spatiotemporal satellite image fusion through one-pair image learning," IEEE Trans. Geosci. Remote Sens. 51(4), 1883–1896 (2013).
24. Hazaymeh K., Hassan Q. K., "Fusion of MODIS and Landsat-8 surface temperature images: a new approach," PLoS One, in press.
25. Food and Agriculture Organization (FAO), "Country pasture/forage resource profiles, Jordan," (2006), http://www.fao.org/ag/agp/agpc/doc/counprof/PDF%20files/Jordan.pdf (3 October 2014).
26. Jordan Ministry of Environment (JME), "National strategy and action plan to combat desertification," (2006), http://ag.arizona.edu/oals/IALC/jordansoils/_html/NAP.pdf (3 June 2014).
27. Vermote E. F., Vermeulen A., "MODIS algorithm technical background document, atmospheric correction algorithm: spectral reflectances (MOD09), Version 4.0," (1999), http://modis.gsfc.nasa.gov/data/atbd/atbd_mod08.pdf (17 June 2014).
28. Dwyer G., Schmidt M. J., "The MODIS reprojection tool," Chapter 13 in Earth Science Satellite Remote Sensing, Vol. 2, Data, Computational Processing, and Tools, Qu V. V. et al., Eds., pp. 162–177, Springer, Tsinghua University Press, China (2006).
29. United States Geological Survey (USGS), "Using the USGS Landsat 8 product," (2013), http://landsat.usgs.gov/Landsat8_Using_Product.php (3 June 2014).
30. Feng M. et al., "Global surface reflectance products from Landsat: assessment using coincident MODIS observations," Remote Sens. Environ. 134, 276–293 (2013).
31. Ling Y. et al., "Effects of spatial resolution ratio in image fusion," Int. J. Remote Sens. 29(7), 2157–2167 (2008).
32. He H. S., Ventura S. J., Mladenoff D. J., "Effects of spatial aggregation approaches on classified satellite imagery," Int. J. Geogr. Inf. Sci. 16(1), 93–109 (2002).
33. Goodin D. G., Henebry G. M., "The effect of rescaling on fine spatial resolution NDVI data: a test using multi-resolution aircraft sensor data," Int. J. Remote Sens. 23(18), 3865–3871 (2002).
34. Ju J., Gopal S., Kolaczyk E. D., "On the choice of spatial and categorical scale in remote sensing land cover classification," Remote Sens. Environ. 96(1), 62–77 (2005).
35. Nelson M. D. et al., "Effects of satellite image spatial aggregation and resolution on estimates of forest land area," Int. J. Remote Sens. 30(8), 1913–1940 (2009).
36. NASA, "Landsat surface reflectance climate data record," (2014), http://landsat.usgs.gov/CDR_LSR.php (19 October 2014).
37. Oke T. R., Boundary Layer Climates, 2nd ed., Routledge, New York (1987).
38. Ahrens C. E., Jackson C. D., Jackson P. L., Meteorology Today: An Introduction to Weather, Climate, and the Environment, 8th ed., Brooks/Cole Cengage Learning, Stamford (2012).
39. Chowdhury E. H., Hassan Q. K., "Use of remote sensing-derived variables in developing a forest fire danger forecasting system," Nat. Hazards 67(2), 321–334 (2013).
40. USGS, "Landsat climate data record (CDR) surface reflectance. Product guide, version 4," (2014), http://landsat.usgs.gov/documents/cdr_sr_product_guide_v40.pdf (3 September 2014).

Khaled Hazaymeh received his BA degree in geography and spatial planning from Yarmouk University, Jordan, in 2004 and his MSc degree in remote sensing and GIS from the University Putra Malaysia in 2009. He is currently pursuing his PhD degree in earth observation in the Department of Geomatics Engineering at the University of Calgary, Canada. His research interests focus on environmental modeling using remote sensing techniques.

Quazi K. Hassan received his PhD degree in remote sensing and ecological modeling from the University of New Brunswick, Canada. He is currently an associate professor in the Department of Geomatics Engineering at the University of Calgary. His research interests include the integration of remote sensing, environmental modeling, and GIS in addressing environmental issues.

© The Authors. Published by SPIE under a Creative Commons Attribution 3.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
