Abstract
Accurate identification of vegetation in mining areas is crucial for conducting pre-mining ecological assessments and post-mining ecological monitoring. However, vegetation in mining areas is often highly heterogeneous, including both field crops and naturally scattered vegetation, which poses great challenges for fine vegetation mapping. Feature combinations are an important factor influencing vegetation mapping. Thus, to effectively identify the vegetation, this study utilized unmanned aerial vehicle (UAV) RGB imagery to extract vegetation indices and textures, and then selected features based on standard deviation and difference coefficient. By integrating the selected optimal features with RGB images, different combinations were constructed and classified using a Support Vector Machine (SVM). The results demonstrated that the combination of RGB and all selected features yielded the highest accuracy, followed by the combination of RGB and a single type of texture, and then the combination of RGB and VIs, which indicated that texture features were more important than VIs for vegetation identification. The OA and Kappa for the best combination were 87.76% and 0.8351 for study area A, and 88.74% and 0.8505 for study area B, indicating the effectiveness of the adopted method. Besides, compared with the commonly used random forest (RF) feature selection method, the adopted method avoided complex parameter settings and constructed a superior optimal combination, which further proved the simplicity and effectiveness of difference-coefficient-based feature selection for vegetation classification in highly heterogeneous environments, contributing to more accurate ecological assessments and monitoring.
Citation: Dong J, Zhang J, Zhang S, Yu Z, Song Z, Meng T (2025) Vegetation extraction through UAV RGB imagery and efficient feature selection. PLoS One 20(5): e0322180. https://doi.org/10.1371/journal.pone.0322180
Editor: Daniel Capella Zanotta, Universidade do Vale do Rio dos Sinos, BRAZIL
Received: July 10, 2024; Accepted: March 18, 2025; Published: May 9, 2025
Copyright: © 2025 Dong et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: The data underlying the results presented in the study are available from https://doi.org/10.6084/m9.figshare.28374311.v1
Funding: J.Z. received funding from the Open Fund of State Key Laboratory of Water Resource Protection and Utilization in Coal Mining, grant number GJNY-20-113-15, provided by the State Key Laboratory of Water Resources Protection and Utilization in Coal Mining(http://wpu.energy.com.cn/). Z.S. received funding from Youth Fund of the National Natural Science Foundation of China, grant number 52204183, provided by the National Natural Science Foundation of China(https://www.nsfc.gov.cn/). The sponsors or funders did not play any role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing interests: The authors have declared that no competing interests exist.
Introduction
The coal mining region in Ordos, Inner Mongolia, serves as a crucial source of energy for the northwest region of China. The Xinjie Taigemiao mining area, situated in Ordos, falls under the jurisdiction of Ejin Horo Banner and Wushen Banner. This area features exceptional overall construction conditions, including an ideal location and ample coal reserves. It is a large-scale, undeveloped integrated coalfield that can serve as a model for the creation of a large-scale, intensive, and intelligent mining area. To effectively analyze the characteristics of regional ecological changes, post-mining land reclamation, and ecological restoration, it is necessary to accurately determine land use and surface vegetation distribution.
With the development of remote sensing technology, it has gradually become an effective method for vegetation monitoring and land use investigation due to its large coverage area and strong detection capabilities [1–3]. However, when studies focus on the working face of mining areas, they require high-resolution images that are expensive and highly dependent on weather conditions during satellite transit [4], which makes it difficult to access appropriate data for vegetation identification. Recently, UAV remote sensing has been widely used since it offers high flexibility, high spatial resolution and low susceptibility to weather and cloud cover [5,6]. Scholars have applied UAV-based RGB images to distinguish different ground objects in tasks such as tree mapping [7,8] and crop identification [9,10], and proved their feasibility and effectiveness. Compared with other sensors, RGB cameras are more affordable and have therefore increasingly become the choice for vegetation studies.
Currently, there are mainly two types of vegetation mapping methods based on UAV RGB imagery. The first makes use of vegetation indices (VIs) to increase the differences between vegetation and non-vegetation, and then extracts the vegetation by threshold segmentation. For example, Wang et al. [11] constructed the Visible-band Difference Vegetation Index (VDVI) to achieve high-precision recognition of healthy green vegetation. Zhang et al. [12] designed a New Green-Red Vegetation Index (NGRVI) according to the characteristics of arid and semi-arid vegetation and realized effective extraction. Although this type of method is simple to operate, it is often limited to identifying a single type of land cover.
The second type of method is based on image enhancement and texture feature extraction. Li et al. [13] and Zhou et al. [14] proposed object-based image analysis (OBIA) methods to identify shrub species and wetland vegetation, respectively; they first segmented the image into homogeneous regions, and then used feature extraction or selection methods to integrate feature combinations of spectrum, VIs and textures to differentiate objects. Although OBIA is often adopted for high-resolution images, it faces challenges in determining the optimal segmentation scale [15], and its accuracy is closely related to the experience and prior knowledge of image operators [16]. Thus, considering their simplicity and effectiveness, pixel-based analysis methods are still among the most used methods [16,17]. Feature combinations are an important factor influencing vegetation mapping results, and random forest (RF) has been used for feature selection [17,18]. However, it requires determining the optimal number of trees, predictor variables and features by trial and error. Feature selection based on the coefficient of variation is a simple and efficient method that has shown better performance than some state-of-the-art methods [19,20]. Han et al. [21] analyzed the variation coefficients and difference coefficients of RGB textures between maize and other vegetation and then selected the mean of the green band and the homogeneity of the blue band to identify maize. Guo et al. [22] used the above methods to analyze HSV (hue, saturation and value) and 24 texture features, and finally selected brightness, saturation and the red second-order moment to identify farmland crops such as grape, maize and cotton. Hu et al. [23] also applied the same method to extract subsided cultivated land.
Although the above studies showed good identification performance, they only focused on farmland crops with homogeneous spatial distribution, leaving a gap for highly heterogeneous environments such as mining areas with both field crops and naturally growing vegetation. Besides, VIs are effective features for vegetation identification [24–26], but they were not considered in those feature selections.
To address the above issues, this study explored the effectiveness of vegetation mapping in highly heterogeneous areas with a feature selection method based on standard deviation and difference coefficients, which avoids the complex parameter settings of commonly used feature selection methods. This study integrated UAV-based RGB imagery with selected vegetation indices (VIs), RGB textures, and HSV textures, and then used a support vector machine (SVM) to classify the different constructed feature combinations to identify the vegetation of the Xinjie Taigemiao mining area. The contributions of the study are as follows: (1) to determine whether the difference-coefficient-based feature selection method can be applied to the extraction of non-crop plants with high heterogeneity; (2) to validate the effectiveness of incorporating VIs into the feature selection; and (3) to explore the optimal combinations of RGB images with other features. This work could provide new insights for ecological analysis before and after mining, such as diversity investigation and mining impact evaluation.
Materials and methods
Study area and data acquisition
This paper selected the initial working face of the Xinjie Taigemiao mining area as the study area, as shown in Fig 1. The vector of the administrative map was from the Resource and Environmental Science Data Platform: https://www.resdc.cn/DOI/DOI.aspx?DOIID=121. This region is located in the ecological functional area of typical grassland desertification control of the Ordos Plateau, and the landscapes are mainly hills and undulating plains. It has an inland semi-arid climate with heterogeneous vegetation such as crops, trees and shrubs. This study was carried out in two experimental areas. Study area A covers about 9.9 × 10⁴ m², and the main land cover types were Zea mays (maize), Helianthus annuus (sunflower), Pinus sylvestris, grassland and bare land (Fig 2). Among them, the Pinus sylvestris varied greatly in density and crown size. Study area B covers about 2.4 × 10⁴ m², and the main land cover types were maize, Artemisia arenaria, Salix mongolia, grassland and bare land, in which Artemisia arenaria and Salix mongolia showed scattered distributions.
A presents the experimental region; B presents the verification region. The map was created using ArcGIS from Esri (http://www.arcgis.com). The base map images were author-owned RGB image data acquired from UAV surveys.
The DJI M600 was selected as the UAV platform, equipped with a Phase One IXM 100 MP visible-light digital camera to acquire high-resolution RGB images with a spectral range of 0.4–0.7 µm. The camera's sensor type was CMOS with a dynamic range of 83 dB. The camera supported a range of specifically designed RSM lenses from 35 mm to 300 mm, and the maximum field of view was 63°. The UAV flight height was set to 500 m, and the overlaps were 80% and 70% in the flight direction and the lateral direction, respectively. The photo interval was 3 seconds. The data were acquired in September 2022, at which time most of the sunflowers were blooming. The orthoimages with 10.0 cm/pixel spatial resolution were generated with Agisoft Metashape 1.8.4 and stored in TIFF format. Each band of the RGB image contained 8-bit information with a value range of 0–255.
Workflow overview
Since a UAV RGB image only contains red, green, and blue bands, it is difficult to distinguish different vegetation types using RGB color information alone. Therefore, this paper extracted VIs, RGB textures and HSV textures to improve the differentiation between vegetation types, and the extracted features were normalized to make them comparable. Then, the standard deviations and inter-class difference coefficients between different vegetation types were compared and analyzed to select optimal features and construct feature combinations with the RGB image. This paper constructed the following combinations: RGB & optimal VIs, RGB & optimal RGB texture features, RGB & optimal HSV texture features, and RGB & optimal VIs & optimal RGB texture features & optimal HSV texture features. Then SVM was conducted on the combinations to identify the vegetation. Visual interpretation combined with a confusion matrix was used to evaluate the accuracy and determine the optimal feature combination. The workflow of the study is shown in Fig 3.
Feature extraction
Vegetation index extraction.
Vegetation indices have been proven effective for vegetation identification. However, unlike multispectral images, UAV visible-light images cannot use traditional indices such as the Normalized Difference Vegetation Index (NDVI) since they do not contain a near-infrared band. In this situation, scholars have constructed a variety of indices for RGB images to increase the differentiation between vegetation and non-vegetation. To explore the feasibility of UAV visible-light images for vegetation classification and identification, commonly used visible-light vegetation indices were adopted, including the Red Green Ratio Index (RGRI) [27], Normalized Green Blue Difference Index (NGBDI) [28], Normalized Green Red Difference Index (NGRDI) [29], Excess Green Index (EXG) [30], and Visible-band Difference Vegetation Index (VDVI) [11], as shown in Table 1.
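The five indices above can be computed directly from the red, green and blue bands using their commonly published formulas. A minimal NumPy sketch (the function name and the epsilon guard against division by zero are our own, and the bands are assumed to be scaled to [0, 1]):

```python
import numpy as np

def visible_vis(r, g, b, eps=1e-6):
    """Compute five visible-band vegetation indices from RGB bands in [0, 1].

    Illustrative helper (not the authors' code); `eps` avoids division by
    zero over very dark pixels.
    """
    r, g, b = (np.asarray(x, dtype=np.float64) for x in (r, g, b))
    return {
        "RGRI":  r / (g + eps),                            # Red Green Ratio Index
        "NGBDI": (g - b) / (g + b + eps),                  # Normalized Green Blue Difference Index
        "NGRDI": (g - r) / (g + r + eps),                  # Normalized Green Red Difference Index
        "EXG":   2 * g - r - b,                            # Excess Green Index
        "VDVI":  (2 * g - r - b) / (2 * g + r + b + eps),  # Visible-band Difference Vegetation Index
    }
```

For a vegetated pixel such as (R, G, B) = (0.2, 0.6, 0.2), NGRDI and VDVI are both positive, reflecting the dominance of the green band.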
Texture extraction based on RGB image.
Textures can reveal the detailed structures of target objects and are among the most important features in remote sensing images. They reflect homogeneity and, combined with spectral features, can mitigate the phenomena of "same object, different spectrum" and "same spectrum, different objects", thereby improving vegetation classification accuracy [23,24]. To extract the textures of the RGB image, the ENVI software was used to perform co-occurrence measures with the default parameters as follows. The filter window size was set to 3 × 3, and the co-occurrence shifts in X and Y were set to 1. The gray level was set to 64. The extracted texture features included the mean, variance, homogeneity, contrast, dissimilarity, entropy, second moment, and correlation of the R, G, and B bands, resulting in a total of 24 features.
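The gray-level co-occurrence measures behind these features can be sketched in pure NumPy. The function below is a simplified stand-in for the ENVI computation: it quantizes one band to 64 gray levels and builds a single co-occurrence matrix for the (1, 1) pixel shift over the whole band, whereas ENVI evaluates these statistics in a sliding 3 × 3 window per pixel; correlation is omitted for brevity:

```python
import numpy as np

def glcm_features(band, levels=64, dx=1, dy=1):
    """Co-occurrence texture measures from one 8-bit band (illustrative)."""
    # Quantize the 8-bit band to `levels` gray levels (ENVI default: 64).
    q = np.clip(band.astype(np.int64) * levels // 256, 0, levels - 1)
    # Co-occurrence counts for the (dx, dy) pixel shift, then normalize.
    a = q[:q.shape[0] - dy, :q.shape[1] - dx]
    b = q[dy:, dx:]
    p = np.zeros((levels, levels))
    np.add.at(p, (a.ravel(), b.ravel()), 1.0)
    p /= p.sum()
    i, j = np.indices((levels, levels))
    mean = (i * p).sum()
    pos = p[p > 0]
    return {
        "mean": mean,
        "variance": ((i - mean) ** 2 * p).sum(),
        "homogeneity": (p / (1.0 + (i - j) ** 2)).sum(),
        "contrast": ((i - j) ** 2 * p).sum(),
        "dissimilarity": (np.abs(i - j) * p).sum(),
        "entropy": -(pos * np.log(pos)).sum(),
        "second_moment": (p ** 2).sum(),
    }
```

As a sanity check, a perfectly uniform band yields zero contrast and entropy and maximal homogeneity and second moment, consistent with the definitions of these measures.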
Texture extraction based on HSV image.
The HSV color space consists of hue, saturation, and value (brightness). The three components of an RGB image are always highly correlated with each other, and the conversion from RGB space to HSV space decouples the chromaticity (H and S) from the brightness (V) [31], which can enhance the readability of an image and has proved useful for object identification and extraction. Thus, the color space transform was performed as shown in Fig 4. It could be observed that there were obvious differences in hue, saturation, and texture between different vegetation types. Therefore, texture filtering was applied to the converted image with the same settings as the extraction of RGB textures. A total of 24 texture features were then obtained for the H, S, and V components.
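The RGB-to-HSV conversion itself is standard; an illustrative per-pixel loop around the Python standard library's `colorsys` module is shown below (a full orthoimage would use a vectorized conversion instead, but the per-pixel result is the same):

```python
import colorsys
import numpy as np

def rgb_to_hsv_image(rgb):
    """Convert an RGB image (floats in [0, 1], shape HxWx3) to HSV.

    Hue and saturation carry the chromaticity; value carries the brightness,
    which is what makes the decoupling useful for texture analysis.
    """
    hsv = np.empty_like(rgb, dtype=np.float64)
    for row, col in np.ndindex(rgb.shape[:2]):
        hsv[row, col] = colorsys.rgb_to_hsv(*rgb[row, col])
    return hsv
```

For example, a pure-green pixel (0, 1, 0) maps to hue 1/3, saturation 1 and value 1.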
(a) Pinus sylvestris, (b) Sunflower, (c) Maize, (d) Grass.
Feature selection
Feature analysis and selection were conducted using the mean, standard deviation (S), and inter-class difference coefficient (P) of different vegetation samples in the images. The calculation formulas were as follows [32]:

x̄ = (1/n) Σᵢ xᵢ

S = √[ (1/n) Σᵢ (xᵢ − x̄)² ]

P = |x̄₁ − x̄₂| / [(x̄₁ + x̄₂)/2] × 100%

where xᵢ represents the sample grayscale value of the same class, n is the number of samples, x̄ is the sample mean, P is the inter-class difference coefficient, and x̄₁ and x̄₂ are the sample mean values of the two classes being compared.
The standard deviation reflects the degree of dispersion of a feature: the lower the dispersion, the more conducive the feature is to extracting the object. The inter-class difference coefficient reflects the difference between classes: the higher the difference coefficient, the easier it is to distinguish different vegetation types.
This study only compared and analyzed the standard deviation and difference coefficient among vegetation types, since the obvious spectral differences between bare land and vegetation made them easy to distinguish. The specific selection method was as follows. Firstly, the difference coefficients between the first class and the other classes were sorted in descending order, and features with difference coefficients greater than 70% were selected. Then, the standard deviations of the first class were sorted, and features with standard deviations less than 30% were chosen. This process was repeated until all the classes had been compared and analyzed, after which the feature combinations for distinguishing different vegetation types were determined.
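The procedure above can be sketched as follows. This is a simplified interpretation, not the authors' code: the function name is ours, the 70% and 30% thresholds come from the text, and for brevity it keeps only the single best-ranked feature per class pair (the study additionally weighs how often a feature recurs across pairs):

```python
import numpy as np

def select_features(samples, p_thresh=0.70, s_thresh=0.30):
    """Select features by inter-class difference coefficient and std dev.

    `samples` maps class name -> (n_samples, n_features) array of
    normalized feature values. Returns sorted feature indices.
    """
    classes = list(samples)
    means = {c: samples[c].mean(axis=0) for c in classes}
    stds = {c: samples[c].std(axis=0) for c in classes}
    selected = set()
    for a_i, a in enumerate(classes):
        for b in classes[a_i + 1:]:
            # Difference coefficient relative to the pair's average mean.
            p = np.abs(means[a] - means[b]) / ((means[a] + means[b]) / 2 + 1e-12)
            # Walk features in descending order of difference coefficient.
            for f in np.argsort(p)[::-1]:
                if p[f] > p_thresh and stds[a][f] < s_thresh:
                    selected.add(int(f))
                    break  # keep the best feature for this class pair
    return sorted(selected)
```

With two synthetic classes that are well separated on feature 0 but identical on feature 1, the sketch returns only feature 0.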
Vegetation identification
SVM classification is a commonly used machine learning method which can automatically find support vectors with significant discriminative power for classification. SVM constructs optimal hyperplanes that maximize class margins through kernel functions, which map linearly inseparable data into a high-dimensional space. It has the characteristics of simple structure, strong adaptability and global optimality, and can effectively solve problems such as high-dimensional features, nonlinearity, overfitting and uncertainty. Previous studies showed that SVM could achieve good accuracy in vegetation extraction [33], and that the radial basis kernel function was more suitable for distinguishing different types of crops [34]. Therefore, this paper used the radial basis kernel function of SVM to distinguish the vegetation based on the different optimal feature combinations.
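A minimal sketch of RBF-kernel SVM classification is shown below using scikit-learn. In the study, each pixel's feature vector would stack the RGB bands with the selected VI and texture layers; here the data are synthetic stand-ins, and the standardization step is our addition (features were normalized in the workflow above):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Two synthetic, well-separated pixel classes with 5 features each.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (50, 5)),   # e.g. "maize" pixels
               rng.normal(2.0, 0.3, (50, 5))])  # e.g. "pine" pixels
y = np.array([0] * 50 + [1] * 50)

# Radial basis kernel SVM, as adopted in the study.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X, y)
```

After fitting, `clf.predict` assigns a class to each pixel's feature vector; in practice the trained model is applied to every pixel of the stacked feature image.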
Results and analysis
Feature selection result
Selection of VIs.
Table 2 shows the standard deviations and difference coefficients of the VIs for the different vegetation types of study area A. It could be found that the difference coefficients varied greatly; for example, the maximum difference coefficient between sunflower and maize was 59.31% and the minimum was 2.47%, which indicated that not all features were conducive to the identification and analysis of vegetation. According to the feature selection method, this paper first determined the features to distinguish maize and Pinus sylvestris. RGRI had the largest difference coefficient and a smaller standard deviation, indicating that RGRI could be used to distinguish between maize and Pinus sylvestris. Similarly, it was found that RGRI could distinguish between maize and grassland, and EXG could distinguish between maize and sunflower, between Pinus sylvestris and sunflower, and between Pinus sylvestris and grassland. Therefore, RGRI and EXG were selected as the optimal VIs for study area A. The same procedure was applied to study area B, and the results showed that the optimal VIs were RGRI and NGRDI.
Selection of RGB texture.
Table 3 shows the standard deviations and difference coefficients of the RGB textures for the different vegetation types of study area A. The features to distinguish between Pinus sylvestris and maize, determined according to the feature sorting, were the contrast of the red and blue bands and the variance of the blue band. Similarly, the contrast and variance of the red and blue bands were determined to distinguish Pinus sylvestris from sunflower and grassland. The features to distinguish grassland from maize, Pinus sylvestris and sunflower were the variance and contrast of the green and red bands. The means of the red, green and blue bands had larger difference coefficients and smaller standard deviations for distinguishing between sunflower and maize. Finally, the contrast and variance of the red band as well as the contrast of the blue band were determined to be the optimal RGB texture features based on the frequency and standard deviation of these features. The same procedure was applied to study area B, and the results showed that the optimal RGB textures were the variance and contrast of the green band as well as the contrast of the blue band. Thus, it could be seen that variance and contrast played an important role in distinguishing vegetation types among the RGB textures.
Selection of HSV texture.
Table 4 shows the standard deviations and difference coefficients of the HSV textures for the different vegetation types of study area A. The second moment of brightness could be used to distinguish between maize and Pinus sylvestris. The second moment and variance of saturation as well as the second moment and mean of brightness could be used to distinguish between maize and sunflower. The features to distinguish between maize and grassland were the variance of hue and saturation as well as the contrast of hue and brightness. The second moment and contrast of saturation as well as the mean and second moment of brightness had larger difference coefficients and smaller standard deviations for distinguishing between Pinus sylvestris and sunflower. The features to distinguish between Pinus sylvestris and grassland were the contrast of hue and saturation as well as the variance of hue, saturation and brightness. The contrast and variance of hue and saturation could be used to differentiate between sunflower and grassland. Finally, the contrast of hue, the variance of saturation and the second moment of brightness were determined to be the optimal HSV texture features based on the frequency and standard deviation of these features. The same procedure was applied to study area B, and the results showed that the optimal HSV textures were the variance of hue as well as the variance and contrast of brightness. Thus, it could be seen that variance and contrast also played an important role in distinguishing vegetation types among the HSV textures.
Classification results with different feature combinations
This paper performed SVM classification on the above feature combinations for study area A and obtained the classification results shown in Fig 5. It could be observed that all the combinations identified Pinus sylvestris with larger crowns well, owing to their obvious differences from other vegetation, as indicated by the pink circle in Fig 5a. However, the identification varied greatly for Pinus sylvestris with smaller crowns. For the combination of RGB with selected VIs, there were cases where Pinus sylvestris was misclassified as maize, as indicated in Fig 5a. Besides, there were more omission cases for sunflowers, where only the flower heads were recognized while the leaves were missed, as marked in Fig 5a. When using the combination of RGB with selected RGB textures, some maize was misclassified as Pinus sylvestris, as indicated in the top left of Fig 5b. This might be due to slight blurriness of this area, causing differences in RGB texture compared with the surrounding area; the combination of RGB with selected VIs was not affected by this. Besides, Pinus sylvestris was still misclassified as maize in several areas, such as the marked area in the top right of Fig 5b, and the same went for the combination of RGB with selected HSV textures, as indicated in Fig 5c. However, sunflower recognition with the third combination was obviously better than with the previous two combinations. The combination of RGB with selected VIs, RGB textures and HSV textures significantly improved on the aforementioned issues, but, like the previous combinations, it still had difficulty identifying the small sunflowers that had not yet blossomed, as marked in Fig 5d.
(a) RGB& selected VIs; (b) RGB& selected RGB texture; (c) RGB& selected HSV texture; (d) RGB& selected VIs &RGB texture & HSV texture; (e) RGB& selected features by RF1; (f) RGB& selected features by RF2.
The classification results of study area B are shown in Fig 6. There were misclassification errors between vegetation types for the combination of the RGB image with selected VIs, as marked in Fig 6a. Among them, many pixels belonging to maize and grass were classified as Salix mongolia. This situation improved when textures were included. However, there were some misclassification cases between Artemisia arenaria and Salix mongolia for the combination of RGB with its textures. The same went for the combination of RGB with HSV textures, where there were also cases of Artemisia arenaria and grass being classified as maize and Salix mongolia, respectively. The combination of RGB with selected VIs, RGB textures and HSV textures improved significantly on the other combinations.
(a) RGB& selected VIs; (b) RGB& selected RGB texture; (c) RGB& selected HSV texture; (d) RGB& selected VIs &RGB texture & HSV texture; (e) RGB& selected features by RF1; (f) RGB& selected features by RF2.
The confusion matrix was used to evaluate the accuracy of vegetation extraction, and validation samples were randomly and uniformly selected on the image through visual interpretation. There was a total of 104021 validation samples for study area A, including 36518 maize samples, 29633 Pinus sylvestris samples, 14289 sunflower samples, 15859 grassland samples, and 7722 bare land samples. For study area B, there was a total of 70315 validation samples, including 20010 maize samples, 24220 Salix mongolia samples, 10952 Artemisia arenaria samples, 9972 grassland samples, and 5161 bare land samples.
The accuracy evaluation results are shown in Tables 5 and 6. The combination of RGB & optimal VIs had the lowest overall accuracy (OA) and Kappa values, followed by RGB & optimal textures, while the combination of RGB & all selected features obtained the highest accuracy. This indicated that the inclusion of VIs was beneficial for vegetation identification in the heterogeneous areas, but that they were less important than the texture features.
As to study area A, the highest OA and Kappa values, obtained with the best combination, were 87.76% and 0.8351, respectively. All the vegetation types obtained producer and user accuracies exceeding 80%, among which Pinus sylvestris had the lowest user accuracy, mainly due to the low distinguishability between maize and Pinus sylvestris, resulting in many Pinus sylvestris pixels being misclassified as maize. Although the OA difference between the third and fourth combinations was only 2.3%, they showed large accuracy differences for some vegetation types. For example, the gap in producer accuracy for Pinus sylvestris between these two combinations was 10.87%.
As to study area B, the highest OA and Kappa values, obtained with the best combination, were 88.74% and 0.8505, respectively. Maize had producer and user accuracies exceeding 90%, while Artemisia arenaria had the lowest producer and user accuracies, possibly due to its spectral similarity and partial spatial overlap with Salix mongolia.
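The OA and Kappa values reported above follow the standard confusion-matrix definitions, which can be computed as (function name is ours):

```python
import numpy as np

def oa_kappa(cm):
    """Overall accuracy and Cohen's kappa from a confusion matrix.

    `cm` has reference classes on the rows and predicted classes on the
    columns; kappa discounts the agreement expected by chance.
    """
    cm = np.asarray(cm, dtype=np.float64)
    total = cm.sum()
    oa = np.trace(cm) / total
    # Chance agreement from the row and column marginals.
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total ** 2
    kappa = (oa - pe) / (1 - pe)
    return oa, kappa
```

A perfect diagonal matrix gives OA = 1 and Kappa = 1, while a classifier no better than chance gives a Kappa near 0 even when OA looks moderate.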
Comparison with other feature selection methods
To prove the effectiveness of the difference-coefficient-based method, we also adopted the widely used RF method for the experiment. We constructed two feature sets in the following ways: (1) selecting features from the VIs, RGB textures and HSV textures by RF, respectively, with the number of each kind equal to that of the difference-coefficient-based method; (2) pooling the VIs, RGB textures and HSV textures and then selecting 8 features by RF (the best combination with the difference-coefficient-based method had 8 selected features). These two feature sets are denoted "selected features by RF1" and "selected features by RF2". The number of trees and predictor variables was set to 500 and 5 by trial and error. For study area A, NGBDI, EXG and the means of the red, green, blue, hue, saturation and brightness bands were selected by RF1; RGRI, the means of the blue, hue, saturation and brightness bands, the correlation of the hue and brightness bands, and the contrast of the brightness band were selected by RF2. For study area B, the features selected by RF1 were the same as those of study area A; NGBDI, the means of the red, blue, hue and saturation bands, and the variance, contrast and entropy of the hue band were selected by RF2. The selected features were combined with the RGB image to perform classification; the results are shown in Figs 5 and 6, and the accuracy evaluations in Tables 5 and 6. It could be observed that the recognition from the RF2 combination was better than that from the RF1 combination.
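The RF2-style selection can be sketched with scikit-learn's impurity-based importances: train a random forest on the pooled feature stack, rank all candidates by importance, and keep the top k. The data below are synthetic (only feature 3 carries signal); the 500 trees and 5 predictor variables per split mirror the settings reported above:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic pixel samples: 10 candidate features, only feature 3 informative.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
y = (X[:, 3] > 0).astype(int)

# 500 trees, 5 candidate predictors per split, as in the study.
rf = RandomForestClassifier(n_estimators=500, max_features=5, random_state=0)
rf.fit(X, y)

# Rank all features by importance and keep the top 8.
top_k = np.argsort(rf.feature_importances_)[::-1][:8]
```

This illustrates the trial-and-error burden the difference-coefficient method avoids: the tree count, `max_features` and the number of retained features all have to be tuned.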
As to study area A, there were large numbers of misclassifications between Pinus sylvestris and maize, and omission cases for bloomed sunflowers, as indicated in Fig 5e for the classification result from the RF1 combination, leading to a lower OA and Kappa of 73.75% and 0.6432. The RF2 combination could effectively identify the not-yet-bloomed sunflowers, but it misclassified much Pinus sylvestris and maize as sunflower, as shown in Fig 5f. Thus, its OA and Kappa were lower than those of the best combination from the difference-coefficient-based method. As to study area B, many pixels belonging to maize and grass were classified as Salix mongolia, as indicated in Fig 6e for the classification result from the RF1 combination, leading to a lower OA and Kappa of 80.00% and 0.7332. For the RF2 combination, there were many misclassifications between Salix mongolia and Artemisia arenaria, as shown in Fig 6f, leading to lower accuracy than the best combination from the difference-coefficient-based method.
Discussion
This paper chose the more affordable UAV RGB imagery to map these cover types by constructing feature combinations and classifying them. VIs, RGB textures and HSV textures were extracted and selected by analyzing the standard deviation and difference coefficient. The results showed that the identification accuracy varied greatly for the different combinations. Among them, the accuracy of RGB & VIs was much lower than that of RGB & a single type of texture, indicating that texture was more important than the used VIs, which is consistent with previous findings on desert vegetation [16]. Besides, this paper made comparisons with feature combinations selected by the widely used RF, and the results showed that our constructed optimal combination was superior, which further proved that the proposed method is simple and effective. Notably, the introduction of a band selection method based on standard deviation and difference coefficients provides a streamlined yet robust approach to feature selection, enhancing classification performance in complex environments such as mining areas.
Many factors can affect vegetation extraction, such as growth condition, image quality and spatial heterogeneity. For example, a small portion of the sunflowers in study area A had not yet bloomed and showed differences in color and texture from the bloomed sunflowers, which exacerbated the phenomenon of "same object, different spectrum", as shown in Fig 7. The spectrum of the not-yet-bloomed sunflowers (marked by the red line) was closer to that of maize (marked by the blue line), resulting in these sunflowers being misclassified as maize. Besides, longer flight campaigns increase the probability of encountering weather changes [35], including illumination and wind, thereby influencing image quality, for example by decreasing the contrast of ground objects. This might be one of the reasons for the misclassification between Pinus sylvestris and maize. Additionally, changes in density and scattered distributions increase spatial heterogeneity, making classification more challenging.
UAVs can offer high-frequency observations. Thus, future research can adopt multi-temporal UAV imagery to leverage varying phenological characteristics to increase the differentiation between vegetation types, thereby further improving the accuracy.
Conclusion
The vegetation distribution map of a mining area is an important basis for conducting pre-mining ecological assessments and post-mining ecological monitoring. This paper adopted UAV RGB imagery to identify highly heterogeneous vegetation: different feature combinations were constructed based on standard deviation and difference coefficients and then classified with SVM. The results showed that the combination of RGB & all selected features achieved the highest accuracy, followed by the combinations of RGB & a single type of texture, and then RGB & VIs. This indicated that texture features were more important than VIs for vegetation discrimination. The OA and Kappa of the best combination were 87.76% and 0.8351 for study area A, and 88.74% and 0.8505 for study area B, and the producer and user accuracies for most vegetation types exceeded 80%, indicating that the adopted method was effective for vegetation identification. Moreover, our constructed optimal combination was superior to that constructed by RF, which further proved that the proposed method is simple and effective.
References
- 1. Sutton A, Fisher A, Metternicht G. Assessing the accuracy of landsat vegetation fractional cover for monitoring Australian drylands. Remote Sens. 2022;14(24):6322.
- 2. Wellmann T, Schug F, Haase D, Pflugmacher D, van der Linden S. Green growth? On the relation between population density, land use and vegetation cover fractions in a city using a 30-years Landsat time series. Landsc Urban Plan. 2020;202:103857.
- 3. Abdolalizadeh Z, Ghorbani A, Mostafazadeh R, Moameri M. Rangeland canopy cover estimation using Landsat OLI data and vegetation indices in Sabalan rangelands, Iran. Arab J Geosci. 2020;13:1–13.
- 4. Dai J, Zhang G, Guo P, Zeng T, Cui M, Xue J. Classification method of major crops in northern Xinjiang based on UAV remote sensing visible light image. J Agric Eng. 2018;34(18):122–9.
- 5. Yang Z, Yu X, Dedman S, Rosso M, Zhu J, Yang J, et al. UAV remote sensing applications in marine monitoring: Knowledge visualization and review. Sci Total Environ. 2022;838(Pt 1):155939. pmid:35577092
- 6. Zhang H, Wang L, Tian T, Yin J. A review of unmanned aerial vehicle low-altitude remote sensing (UAV-LARS) Use in agricultural monitoring in China. Remote Sens. 2021;13(6):1221.
- 7. Pearse GD, Tan AYS, Watt MS, Franz MO, Dash JP. Detecting and mapping tree seedlings in UAV imagery using convolutional neural networks and field-verified data. ISPRS J Photogramm Remote Sens. 2020;168:156–69.
- 8. Schiefer F, Kattenborn T, Frick A, Frey J, Schall P, Koch B, et al. Mapping forest tree species in high resolution UAV-based RGB-imagery by means of convolutional neural networks. ISPRS J Photogramm Remote Sens. 2020;170:205–15.
- 9. Liu S, Yin D, Feng H, Li Z, Xu X, Shi L, et al. Estimating maize seedling number with UAV RGB images and advanced image processing methods. Precision Agric. 2022;23(5):1604–32.
- 10. Bai X, Liu P, Cao Z, Lu H, Xiong H, Yang A, et al. Rice Plant Counting, Locating, and Sizing Method Based on High-Throughput UAV RGB Images. Plant Phenomics. 2023;5:0020. pmid:37040495
- 11. Wang X, Wang M, Wang S, Wu Y. Vegetation information extraction based on visible light band UAV remote sensing. J Agric Eng. 2015;31(5):152–9.
- 12. Zhang X, Zhang F, Qi Y, Deng L, Wang X, Yang S. New research methods for vegetation information extraction based on visible light remote sensing images from an unmanned aerial vehicle (UAV). Int J Appl Earth Obs Geoinf. 2019;78:215–26.
- 13. Li Z, Ding J, Zhang H, Feng Y. Classifying individual shrub species in UAV images—A case study of the Gobi region of Northwest China. Remote Sens. 2021;13(24):4995.
- 14. Xia Y. Object-Based wetland vegetation classification using multi-feature selection of unoccupied aerial vehicle RGB imagery. Remote Sens. 2021;13:4910.
- 15. Li X, Shao G. Object-based urban vegetation mapping with high-resolution aerial photography as a single data source. Int J Remote Sens. 2012;34(3):771–89.
- 16. Zhou H, Fu L, Sharma R, Lei Y, Guo J. A hybrid approach of combining random forest with texture analysis and VDVI for desert vegetation mapping based on UAV RGB data. Remote Sens. 2021;13(10):1891.
- 17. Feng Q, Liu J, Gong J. UAV remote sensing for urban vegetation mapping using random forest and texture analysis. Remote Sens. 2015;7(1):1074–94.
- 18. Grybas H, Congalton RG. A comparison of Multi-Temporal RGB and multispectral UAS imagery for tree species classification in heterogeneous new hampshire forests. Remote Sens. 2021;13(13):2631.
- 19. Su P, Tarkoma S, Pellikka PKE. Band Ranking via Extended Coefficient of Variation for Hyperspectral Band Selection. Remote Sens. 2020;12(20):3319.
- 20. Chaudhari K, Thakkar A. Neural network systems with an integrated coefficient of variation-based feature selection for stock price and trend prediction. Expert Syst Appl. 2023;219:119527.
- 21. Han W, Li G, Yuan M, Zhang L, Shi Z. Extraction method of maize planting information based on UAV remote sensing technology. Trans Chinese Soc Agric Mach. 2017;48(1):139–47.
- 22. Guo P, Wu F, Dai J, Wang H, Xu L, Zhang G. Comparison of farmland crop classification methods based on visible light images of unmanned aerial vehicles. Trans Chinese Soc Agric Eng. 2017;33(13):112–9.
- 23. Hu X, Li X. Comparison of subsided cultivated land extraction methods in high-groundwater level coal mines based on unmanned aerial vehicle. J China Coal Soc. 2019;44(11):3547–55.
- 24. Guo Y, Wang H, Wu Z, Wang S, Sun H, Senthilnath J, et al. Modified red blue vegetation index for chlorophyll estimation and yield prediction of maize from visible images captured by UAV. Sensors (Basel). 2020;20(18):5055. pmid:32899582
- 25. Saponaro M, Agapiou A, Hadjimitsis D, Tarantino E. Influence of spatial resolution for vegetation indices’ extraction using visible bands from unmanned aerial vehicles’ orthomosaics datasets. Remote Sens. 2021;13(16):3238.
- 26. Agapiou A. Vegetation extraction using visible-bands from openly licensed unmanned aerial vehicle imagery. Drones. 2020;4(2):27.
- 27. Gamon J, Surfus J. Assessing leaf pigment content and activity with a reflectometer. New Phytol. 1999;143(1):105–17.
- 28. Woebbecke DM, Meyer GE, Bargen KV, Mortensen DA. Plant species identification, size, and enumeration using machine vision techniques on near-binary images. Proceedings of SPIE-Int Soc Opt Eng. 1993;1836:208–19.
- 29. Hunt E, Cavigelli M, Daughtry C, Mcmurtrey J, Walthall C. Evaluation of digital photography from model aircraft for remote sensing of crop biomass and nitrogen status. Precision Agric. 2005;6(4):359–78.
- 30. Woebbecke DM, Meyer GE, Von Bargen K, Mortensen DA. Color indices for weed identification under various soil, residue, and lighting conditions. Trans ASAE. 1995;38(1):259–69.
- 31. Pekel J-F, Vancutsem C, Bastin L, Clerici M, Vanbogaert E, Bartholomé E, et al. A near real-time water surface detection method based on HSV transformation of MODIS multi-spectral time series data. Remote Sens Environ. 2014;140:704–16.
- 32. Dai J, Zhang G, Guo P, Zeng T, Cui M, Xue J. Classification method of main crops in northern Xinjiang based on UAV visible waveband images. Trans Chinese Soc Agric Eng. 2018;34(18):122–9.
- 33. Perez-Ortiz M, Pena J, Gutierrez P, Torres-Sanchez J, Hervas-Martinez C, Lopez-Granados F. Selecting patterns and features for between- and within-crop-row weed mapping using UAV-imagery. Expert Syst Appl. 2016;47(1):85–94.
- 34. Wang X, Wang Z, Jin G, Yang J. Land reserve prediction using different kernel based support vector regression. Trans Chinese Soc Agric Eng. 2014;30(04):204–11.
- 35. Wang Y, Yang Z, Kootstra G, Khan HA. The impact of variable illumination on vegetation indices and evaluation of illumination correction methods on chlorophyll content estimation using UAV imagery. Plant Methods. 2023;19(1):51. pmid:37245050