Development of drone acquisition imagery and AI-based field crop status and growth prediction model
Conference Poster + Paper, 17 October 2023
Seung-Hwan Go and Jong-Hwa Park

Abstract
Drone-based remote sensing technology is developing rapidly and is being applied in a variety of ways, from seed sowing to disease management and field maintenance. Analysis of the crop growth environment and prediction of crop performance vary depending on the climate, soil environment, topography, and applied technology of the target area. Crop growth is a complex trait determined by factors such as genotype, growing environment, and their interactions. To accurately predict growth conditions and growth, the functional relationships among these interacting factors must be understood through data analysis. Interpreting growth-related relationships requires both a comprehensive dataset and powerful modeling algorithms. This study aimed to use drone imagery and AI technology to develop a model for predicting the cultivation status and growth of various field crops. The developed model covers the entire process of drone image acquisition, image processing, AI algorithm application, and production of farmland, crop status, and growth information. This paper presents the overall configuration of the growth prediction model and the results of the AI-based cultivation area extraction model developed in the first stage. Classifying cultivated crops by field is important for identifying the cultivated area and predicting yield. Advances in drone remote sensing (RS) and AI technology have made it possible to precisely analyze the characteristics of field crops from images. The purpose of this study was to create and evaluate an AI-based cultivated crop classification model using the reflectance and texture characteristics of drone RGB images. The major crops cultivated during the crop classification survey period were kimchi cabbage, soybean, and rice. The texture features used in this model are Haralick texture characteristics derived from the gray-level co-occurrence matrix (GLCM). Eight texture factors were used to build the model, including mean, variance, contrast, homogeneity, correlation, ASM (angular second moment), and dissimilarity. Two AI models, a support vector classifier (SVC) and a random forest classifier (RFC), were built in this study. For the SVC-based classification model, the hyperparameters C and gamma were set to 1.5 and 0.01, respectively, and a radial basis function (RBF) kernel was used. The cross-validation accuracy was 0.88 and the test set accuracy was 0.91. The maximum depth of the RFC-based classification model was set to 8 and the number of trees was set to 500. The cross-validation accuracy of the RFC-based model was 0.95, and the test set accuracy was 0.89. Training times were 90 seconds for the SVC model and 7,200 seconds for the RFC model. Considering both classification accuracy and training time, the SVC-based classification model was judged the more advantageous. The findings of this study are expected to improve the precision of crop cultivation area identification using AI technology and to serve as a useful tool for agricultural production management and forecasting by farmers and government agencies.
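
As an illustration of the GLCM texture-extraction step described in the abstract, the following minimal sketch computes Haralick-style statistics from one band of a drone RGB patch using scikit-image. The window size, quantization level (32 grey levels), pixel distance, and angle set are assumptions chosen for illustration, not values reported in the study.

# Minimal sketch: GLCM (Haralick-style) texture statistics for one image band.
# Quantization level, distance, and angles are illustrative assumptions.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_texture_features(band, levels=32):
    """Compute GLCM texture statistics for a single-band patch.

    band : 2-D integer array (one channel of a drone RGB patch),
           already rescaled to the range [0, levels - 1].
    """
    glcm = graycomatrix(band, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=levels, symmetric=True, normed=True)

    feats = {}
    # Properties available directly from graycoprops, averaged over angles.
    for prop in ("contrast", "dissimilarity", "homogeneity", "ASM", "correlation"):
        feats[prop] = graycoprops(glcm, prop).mean()

    # GLCM mean and variance computed from the normalized co-occurrence matrix.
    p = glcm.mean(axis=(2, 3))              # average over distances and angles
    i = np.arange(levels).reshape(-1, 1)
    mean = (i * p).sum()
    feats["mean"] = mean
    feats["variance"] = (((i - mean) ** 2) * p).sum()
    return feats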
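The classifier comparison can likewise be sketched with scikit-learn using the hyperparameters stated in the abstract (SVC: RBF kernel, C = 1.5, gamma = 0.01; RFC: 500 trees, maximum depth 8). The feature matrix X (reflectance plus texture features), the labels y, the number of cross-validation folds, and the train/test split ratio are placeholders, not the study's actual data pipeline.

# Minimal sketch: SVC and RFC with the hyperparameters reported in the abstract.
# X, y, cv folds, and the split ratio are placeholder assumptions.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.svm import SVC

def evaluate_models(X, y, cv=5, random_state=0):
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=random_state)

    models = {
        "SVC": SVC(kernel="rbf", C=1.5, gamma=0.01),
        "RFC": RandomForestClassifier(n_estimators=500, max_depth=8,
                                      random_state=random_state),
    }
    for name, model in models.items():
        cv_acc = cross_val_score(model, X_tr, y_tr, cv=cv).mean()
        test_acc = model.fit(X_tr, y_tr).score(X_te, y_te)
        print(f"{name}: cross-validation {cv_acc:.2f}, test {test_acc:.2f}")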
(2023) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Seung-Hwan Go and Jong-Hwa Park "Development of drone acquisition imagery and AI-based field crop status and growth prediction model", Proc. SPIE 12727, Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV, 127271Q (17 October 2023); https://doi.org/10.1117/12.2684522
KEYWORDS: RGB color model, Artificial intelligence, Image processing, Data modeling, Agriculture, Image acquisition