
Lightweight detection method for cotton pests and diseases based on GSBS-YOLOv8

  • Abstract: To address the large scale variation of disease and pest targets, the weak features of small objects, and the constraints of edge-side deployment in natural cotton-field scenes, this study proposes a lightweight cotton disease and pest detection model, GSBS-YOLOv8. GSConv and Ghost C2f modules are introduced into the backbone and feature fusion networks to reduce the parameter count and computational complexity; a Bidirectional Feature Pyramid Network (BiFPN) is embedded in the neck, and the Shape-IoU loss function is adopted for bounding-box regression to strengthen multi-scale feature representation and target localization. Experimental results show that GSBS-YOLOv8 improves mean Average Precision (mAP) by 1.0 percentage point, reduces the parameter count from 3.01×10⁶ to 1.19×10⁶, Floating Point Operations (FLOPs) from 8.2 G to 4.7 G, and model size from 5.8 MB to 2.7 MB; inference reaches 101.3 frames/s in a Graphics Processing Unit (GPU) environment and 43.7 frames/s on the Jetson Xavier NX platform after TensorRT acceleration. These results indicate that the model markedly compresses model scale while maintaining detection accuracy, providing methodological support for edge-side intelligent monitoring of cotton diseases and pests.
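The parameter savings claimed for the GSConv modules can be illustrated with a back-of-envelope count. The sketch below follows the common GSConv structure from the slim-neck literature (a standard convolution producing half the output channels, followed by a depthwise convolution on those channels, then concatenation and channel shuffle); the channel sizes are hypothetical and this is not the paper's exact implementation.

```python
# Rough parameter comparison: plain k x k convolution vs. a GSConv-style block.
# The channel shuffle step adds no parameters, so it is omitted from the count.

def std_conv_params(c_in, c_out, k=3):
    """Parameters of a plain k x k convolution (bias ignored)."""
    return k * k * c_in * c_out

def gsconv_params(c_in, c_out, k=3, dw_k=5):
    """GSConv sketch: a k x k conv to c_out/2 channels, then a dw_k x dw_k
    depthwise conv on those c_out/2 channels; outputs are concatenated
    and shuffled (no extra parameters)."""
    half = c_out // 2
    return k * k * c_in * half + dw_k * dw_k * half

# Hypothetical 128-channel stage, as might appear mid-backbone.
std = std_conv_params(128, 128)   # 3*3*128*128 = 147456
gs = gsconv_params(128, 128)      # 3*3*128*64 + 5*5*64 = 75328
print(std, gs, round(1 - gs / std, 3))
```

Roughly halving the per-layer convolution parameters in this way is consistent with the overall 60.5% parameter reduction reported for the full model, where Ghost C2f modules contribute additional savings.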

     

    Abstract: To address the pronounced scale variation of disease and pest targets, weak feature representation of small objects, and the limitations of edge deployment in natural cotton-field scenes, a lightweight cotton disease and pest detection model, GSBS-YOLOv8, was proposed on the basis of You Only Look Once version 8 (YOLOv8). The objective was to improve detection accuracy under field conditions while substantially reducing model complexity, computational burden, and deployment cost. In the model design, GSConv and Ghost C2f modules were introduced into the backbone network and feature fusion stages to replace part of the conventional convolutional structure, thereby reducing parameter redundancy, improving feature reuse, and lowering computational overhead without weakening semantic extraction capability. A Bidirectional Feature Pyramid Network (BiFPN) was further embedded into the neck to strengthen the bidirectional propagation and weighted fusion of shallow spatial information and deep semantic information, so that fine-grained cues of small disease spots and pests at different imaging distances were more effectively retained. In addition, the Shape-IoU loss was adopted for bounding-box regression to enhance geometric matching between predicted boxes and target contours, which improved localization robustness for small, slender, and irregular targets under complex canopy backgrounds. Experimental results showed that the proposed model achieved a 1.0 percentage point increase in mean Average Precision (mAP) over the baseline YOLOv8 model, indicating that the lightweight reconstruction not only preserved detection capability but also improved the recognition of multi-scale disease and pest targets. Meanwhile, the number of model parameters decreased from 3.01 million to 1.19 million, corresponding to a reduction of 60.5%, while Floating Point Operations (FLOPs) decreased from 8.2 G to 4.7 G, corresponding to a reduction of 42.7%. 
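The weighted fusion performed by the BiFPN neck can be sketched as the "fast normalized fusion" from the original BiFPN design: each input feature map receives a learnable non-negative weight, and the output is their weighted average. The lists of floats below stand in for real feature tensors, and the fixed weights stand in for learned ones.

```python
# Minimal sketch of BiFPN fast normalized fusion:
#   out = sum_i(w_i * f_i) / (sum_i w_i + eps), with w_i clamped >= 0.

def fast_normalized_fusion(features, weights, eps=1e-4):
    """Fuse same-shaped feature maps with ReLU-clamped, normalized weights."""
    w = [max(0.0, wi) for wi in weights]      # ReLU keeps each weight >= 0
    total = sum(w) + eps                      # eps avoids division by zero
    length = len(features[0])
    return [
        sum(w[i] * features[i][j] for i in range(len(features))) / total
        for j in range(length)
    ]

# Toy inputs: a shallow high-resolution map and an upsampled deep map.
shallow = [1.0, 2.0, 3.0]
deep = [4.0, 4.0, 4.0]
fused = fast_normalized_fusion([shallow, deep], weights=[1.0, 3.0])
print([round(v, 3) for v in fused])
```

Because the weights are learned per fusion node, the network can emphasize shallow spatial detail for small lesions and pests while still drawing on deep semantic context, which matches the abstract's rationale for embedding BiFPN in the neck.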
The model size was compressed from 5.8 MB to 2.7 MB, representing a reduction of 53.4%, which markedly lowered storage and transmission burden and improved the feasibility of deployment on resource-constrained hardware platforms. The simultaneous improvement in accuracy and compression efficiency showed that the optimized architecture enhanced feature representation efficiency and alleviated the common trade-off between lightweight design and detection performance. In the inference evaluation, the model reached 101.3 frames per second on a Graphics Processing Unit (GPU), demonstrating strong real-time performance in a high-throughput computing environment. After TensorRT acceleration on the Jetson Xavier NX platform, the inference speed still reached 43.7 frames per second, which confirmed that the model maintained efficient execution under embedded conditions and satisfied the practical requirement of real-time field monitoring. The overall results showed that the coordinated introduction of lightweight convolutional design, ghost feature generation, bidirectional multi-scale fusion, and improved regression loss effectively balanced accuracy and efficiency, enhanced small-object perception and localization performance, and avoided the substantial decline in detection capability that often accompanied model compression. Therefore, GSBS-YOLOv8 provided an effective lightweight solution for intelligent edge-side monitoring of cotton diseases and pests in natural field environments, and the study offered technical support for subsequent applications in rapid field scouting, precision prevention and control, and low-power agricultural vision systems.
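The compression percentages quoted above follow directly from the reported before/after figures, as the short check below confirms.

```python
# Sanity check of the reported reductions: parameters 3.01M -> 1.19M,
# FLOPs 8.2 G -> 4.7 G, model size 5.8 MB -> 2.7 MB.

def reduction_pct(before, after):
    """Relative reduction in percent, rounded to one decimal place."""
    return round((before - after) / before * 100, 1)

print(reduction_pct(3.01, 1.19))  # parameters: 60.5
print(reduction_pct(8.2, 4.7))    # FLOPs: 42.7
print(reduction_pct(5.8, 2.7))    # model size: 53.4
```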

     
