Precision identification of areca nut yellowing disease-infected plants using UAV remote sensing based on deep learning

Abstract: Areca yellowing disease is a systemic disease that severely endangers the growth of areca palms, and precise monitoring of infected plants is crucial for the sustainable development of the areca nut industry. To address the low accuracy of current UAV remote sensing identification of infected plants and its susceptibility to interference from similar ground objects such as discolored shrubs, this study used 5-band multispectral imagery acquired by a UAV-mounted RedEdge-M multispectral camera. Sensitive spectral indicators were extracted by calculating Euclidean and Jeffreys-Matusita (J-M) distances in combination with separability analysis, and a standardized detection dataset was built through image standardization preprocessing, sample slicing, and manual verification, yielding 18,520 high-quality samples that were divided into training, validation, and test sets at a ratio of 8:1:1. A precise identification model for infected plants was then obtained by improving the YOLO (You Only Look Once) v10 algorithm: GhostNet was introduced into the backbone network to enhance the representation of effective information in key bands and improve the modeling of fine-grained disease textures; a bidirectional feature pyramid network (BiFPN) was adopted to strengthen multi-scale feature aggregation in complex forest stand environments; and the Shape-aware Intersection over Union (SIoU) loss was introduced to improve the localization accuracy of infected plants through its bounding box regression mechanism. The results show that the enhanced cross-scale feature fusion of the improved YOLOv10 model raised the mAP@0.5 to 90.6%, an improvement of approximately 20% over models such as Faster R-CNN, YOLOv8, and YOLOv10; the precision was 91.2% and the recall was 92.7%, both significantly better than other classic deep learning models. The model enables rapid and accurate detection of areca palms infected with yellowing disease and can meet the practical needs of inspection and monitoring of the disease.
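For reference, the following is a minimal sketch (not the authors' code) of how the Euclidean and Jeffreys-Matusita (J-M) distances used in the separability analysis above could be computed from per-class pixel spectra; the function names, array shapes, and file names are illustrative assumptions.

```python
import numpy as np

def euclidean_separation(x1: np.ndarray, x2: np.ndarray) -> float:
    """Euclidean distance between the mean spectra of two classes."""
    return float(np.linalg.norm(x1.mean(axis=0) - x2.mean(axis=0)))

def jm_distance(x1: np.ndarray, x2: np.ndarray, eps: float = 1e-6) -> float:
    """Jeffreys-Matusita (J-M) distance between two classes of samples.

    x1, x2: (n_samples, n_bands) pixel spectra or spectral-index values,
    e.g. infected palms vs. discolored shrubs. Each class is treated as
    multivariate normal. The result lies in [0, 2]; values approaching 2
    indicate that the two classes are highly separable.
    """
    m1, m2 = x1.mean(axis=0), x2.mean(axis=0)
    # Regularize covariances slightly so the inverse/determinant are stable
    c1 = np.cov(x1, rowvar=False) + eps * np.eye(x1.shape[1])
    c2 = np.cov(x2, rowvar=False) + eps * np.eye(x2.shape[1])
    c = (c1 + c2) / 2.0
    d = (m1 - m2).reshape(-1, 1)
    # Bhattacharyya distance between two Gaussian distributions
    b = (d.T @ np.linalg.inv(c) @ d).item() / 8.0
    b += 0.5 * np.log(np.linalg.det(c) / np.sqrt(np.linalg.det(c1) * np.linalg.det(c2)))
    return float(2.0 * (1.0 - np.exp(-b)))

# Hypothetical usage: two matrices of 5-band reflectance values per class
# infected = np.load("infected_pixels.npy")   # shape (n, 5)
# shrub = np.load("shrub_pixels.npy")         # shape (m, 5)
# print(jm_distance(infected, shrub), euclidean_separation(infected, shrub))
```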

     

Abstract: Areca yellowing disease is a systemic disease that severely compromises the growth of Areca catechu palms, and precise monitoring of infected plants is crucial for the sustainable development of the areca nut industry. Traditional monitoring methods, which rely primarily on visual inspection and manual surveys, are often inefficient and have a limited operational scope. Unmanned aerial vehicle (UAV) remote sensing, with its distinct advantages of high spatial resolution, flexible response capabilities, and low operational costs, offers an effective solution to the challenges posed by the frequent cloudy and rainy weather in Hainan, which often degrades the quality and availability of satellite optical imagery. However, current UAV-based remote sensing approaches for monitoring areca yellowing disease still suffer from several limitations, including low identification accuracy and significant interference from similar ground objects, such as discolored shrubs. To address these challenges, this study began with field sampling to assess the disease severity of A. catechu palms and to characterize the observed disease symptoms and their separability in remote sensing imagery. We then employed a MicaSense RedEdge-M multispectral camera mounted on a DJI Phantom 4 Pro V2.0 UAV to acquire 5-band multispectral imagery encompassing the blue, green, red, near-infrared, and red-edge bands. The Euclidean and Jeffreys-Matusita (J-M) distances were calculated to extract sensitive spectral indices with high separability, which were subsequently used to construct four distinct datasets for training typical models. A comprehensive and standardized detection dataset was then established through a rigorous process of image preprocessing, sample slicing, and manual verification, resulting in a total of 18,520 high-quality samples. This final dataset was partitioned into training, validation, and test sets at a ratio of 8:1:1. A precise identification model for infected plants was developed by implementing a series of targeted improvements to the YOLO (You Only Look Once) v10 algorithm. First, GhostNet was incorporated into the backbone network. This lightweight feature generation mechanism enhances the representation of effective information from key multispectral bands, reduces redundant feature computations, and improves the modeling capability for fine-grained disease textures. Second, in the feature fusion stage, a bidirectional feature pyramid network (BiFPN) was employed to replace parts of the original feature pyramid structure. With its learnable weights and bidirectional cross-scale fusion, this modification strengthens multi-scale feature aggregation, particularly in complex forest stand environments. Finally, the Shape-aware Intersection over Union (SIoU) loss function was introduced into the regression branch of the detection head. It provides a more stable and geometrically constrained bounding box regression mechanism, which not only improves the localization accuracy of infected plants but also accelerates the model's convergence. The results demonstrate that the enhanced cross-scale feature fusion capability of the improved YOLOv10 model increased the mAP@0.5 to 90.6%, representing an improvement of approximately 20% compared to models such as Faster R-CNN, YOLOv8, and YOLOv10. Furthermore, the model achieved a precision of 91.2% and a recall of 92.7%, both of which significantly outperformed other classic deep learning models. The model thus achieves rapid and accurate detection of A. catechu palms infected with yellowing disease, effectively meeting the practical demands for inspection and monitoring of the disease.
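The abstract describes BiFPN only at a high level; as an illustration of its core fusion rule, the sketch below shows a generic fast normalized (weighted) fusion node in PyTorch. It is a hypothetical example under common BiFPN assumptions, not the authors' implementation, and the module name, tensor shapes, and channel counts are invented for illustration.

```python
import torch
import torch.nn as nn

class WeightedFusion(nn.Module):
    """Fast normalized fusion node as used in BiFPN (EfficientDet-style).

    Fuses N feature maps of identical shape with learnable, non-negative
    weights that are normalized to sum to ~1. A full BiFPN additionally
    resamples features across pyramid levels and applies depthwise
    separable convolutions after fusion; those steps are omitted here.
    """

    def __init__(self, num_inputs: int, eps: float = 1e-4):
        super().__init__()
        self.w = nn.Parameter(torch.ones(num_inputs))
        self.eps = eps

    def forward(self, feats):
        # feats: list of tensors, each of shape (B, C, H, W)
        w = torch.relu(self.w)            # enforce non-negative weights
        w = w / (w.sum() + self.eps)      # fast normalized fusion
        return sum(wi * f for wi, f in zip(w, feats))

# Example: fuse a top-down feature with the same-level input feature,
# assuming both have already been resampled to 64 channels at 40x40.
fuse = WeightedFusion(num_inputs=2)
p4_in, p4_td = torch.randn(1, 64, 40, 40), torch.randn(1, 64, 40, 40)
print(fuse([p4_in, p4_td]).shape)  # torch.Size([1, 64, 40, 40])
```

The learnable, normalized weights let the network decide how much each pyramid level contributes at every fusion node, which is the mechanism credited above with strengthening multi-scale aggregation in cluttered forest-stand scenes.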

     
