In agriculture, optical remote sensing plays an important role in crop monitoring and yield estimation. However, frequent cloud cover and rain limit the application of optical remote sensing. Synthetic aperture radar (SAR) has been widely used in studies of the ocean, atmosphere, land, and space, as well as in military applications, owing to its all-weather capability, its ability to penetrate clouds and the surface layer, and the diversity of information it carries. However, it is difficult to classify ground objects with high accuracy from SAR data alone. Considering the complementary features of these two data sources, we propose a framework to improve crop classification in cloudy and rainy areas based on the optical-SAR response mechanism. Specifically, the method trains a parametric analytic model in an area where both kinds of data are available and applies it in an area with only SAR data to derive optical time-series features. Crops in the second area are then classified by a long short-term memory (LSTM) network. As an example, a parametric analytic model trained in Lixian County was applied to Xifeng County to classify crops with an overall accuracy (OA) of 61%, demonstrating the robustness of the method.
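The classification stage described above feeds derived optical time-series features through an LSTM. The following is a minimal NumPy sketch of the LSTM forward pass for one pixel's time series; the weights here are random placeholders (not the paper's trained model), and the dimensions (12 time steps, 4 spectral features, 3 crop classes) are illustrative assumptions, not values from the study.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; gate pre-activations are stacked as [input, forget, cell, output]."""
    z = W @ x + U @ h + b            # shape (4H,)
    H = h.shape[0]
    i = sigmoid(z[0:H])              # input gate
    f = sigmoid(z[H:2*H])            # forget gate
    g = np.tanh(z[2*H:3*H])          # candidate cell state
    o = sigmoid(z[3*H:4*H])          # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def classify_series(series, W, U, b, W_out, b_out):
    """Run the LSTM over a (T, D) time series; classify from the last hidden state."""
    H = U.shape[1]
    h, c = np.zeros(H), np.zeros(H)
    for x in series:
        h, c = lstm_step(x, h, c, W, U, b)
    logits = W_out @ h + b_out
    return int(np.argmax(logits))    # predicted crop class index

# Illustrative dimensions: 12 acquisition dates, 4 optical features, 3 crop classes.
rng = np.random.default_rng(0)
T, D, H, n_classes = 12, 4, 8, 3
W = rng.normal(size=(4*H, D)) * 0.1
U = rng.normal(size=(4*H, H)) * 0.1
b = np.zeros(4*H)
W_out = rng.normal(size=(n_classes, H)) * 0.1
b_out = np.zeros(n_classes)

series = rng.normal(size=(T, D))     # stand-in for derived optical time-series features
label = classify_series(series, W, U, b, W_out, b_out)
```

In practice the weights would be learned by backpropagation through time on labeled fields in the training county; this sketch only shows the recurrence and final-state classification structure.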
Feature-based change detection technologies using multitemporal remote sensing images are widely applied to detect newly increased built-up areas (NIBUA) during the observation period. This paper proposes an automatic object-based NIBUA extraction method for high-resolution remote sensing images that integrates a spectral feature, an edge-derived line-density-based visual saliency (LDVS) feature, and a texture-derived built-up presence index (PanTex) feature. In the proposed method, image segmentation is first employed to obtain objects as the basic units of detection. Next, because built-up areas in high-resolution images are complex, LDVS and PanTex images are produced for each temporal image. Then, to highlight built-up areas in complex scenes, a comprehensive measure for each object is calculated by integrating the newly increased measures from the spectral, LDVS, and PanTex features via Dempster–Shafer evidence fusion. Finally, object-based NIBUA are extracted by binarizing the fused newly increased measure image. Comparison studies and experimental results demonstrate that our method achieves robust extraction of NIBUA from high-resolution remote sensing images with high detection accuracy. We conclude that this automatic approach can reduce the manual workload of interpreters and the cost of monitoring large regions. The method can be employed in a variety of applications, such as illegal construction monitoring, land use/cover map updating, and city planning.
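The fusion step described above combines per-object evidence from multiple features using Dempster's rule of combination. The following is a small self-contained sketch of that rule for one object with two sources of evidence; the frame of discernment ({new, old}), the mass values, and the 0.5 binarization threshold are illustrative assumptions, not the paper's calibrated settings.

```python
def ds_combine(m1, m2):
    """Dempster's rule: combine two mass functions over the same frame.

    Masses are dicts mapping frozenset hypotheses to belief mass; mass
    assigned to conflicting (disjoint) pairs is redistributed by normalization.
    """
    combined = {}
    conflict = 0.0
    for A, mA in m1.items():
        for B, mB in m2.items():
            inter = A & B
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mA * mB
            else:
                conflict += mA * mB
    norm = 1.0 - conflict
    return {A: v / norm for A, v in combined.items()}

# Frame of discernment: is this object newly built-up or not?
NEW = frozenset({"new"})
OLD = frozenset({"old"})
THETA = NEW | OLD                       # total ignorance

# Hypothetical per-object masses from two features (illustrative numbers only).
m_spectrum = {NEW: 0.6, OLD: 0.1, THETA: 0.3}
m_ldvs     = {NEW: 0.5, OLD: 0.2, THETA: 0.3}

fused = ds_combine(m_spectrum, m_ldvs)  # a third feature (PanTex) would be folded in the same way
is_new = fused.get(NEW, 0.0) > 0.5      # simple binarization threshold on the fused measure
```

Because Dempster's rule is associative, the PanTex evidence can be combined into the result with another `ds_combine` call, after which thresholding the fused mass on "new" yields the object-level NIBUA decision.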