Abstract:
This study aimed at the accurate prevention and control of flying insect pests in strawberry greenhouses. A pest monitoring system was proposed based on an improved YOLOv8n model (YOLO-Pest): an inspection robot collected images of yellow sticky traps, and the improved deep learning algorithm was used to accurately identify and count small-target pests. The experiment was carried out in a solar greenhouse at the Yongqing Precision Test Base of the Academy of Agricultural Planning and Engineering from February to June 2024. Patrol robots captured 510 raw images of yellow sticky traps. These images were preprocessed with color space segmentation and affine transformation correction, and then augmented by rotation, mirroring, and brightness adjustment, yielding 2550 training samples. All pest targets were uniformly labeled as a single "Pest" category, and the dataset was divided into training, validation, and test sets at a ratio of 8:1:1. Statistical analysis indicated that the data contained a large number of small-sized pest targets. The YOLO-Pest model improved the neck and head networks of the baseline YOLOv8n with four key modifications: (1) a micro-target detection head at a resolution of 160×160 pixels was added to detect pests as small as 4×4 pixels; (2) the original C2f feature fusion layer was replaced by a C2f-Dual module, in which Dual-Path Grouped Convolution (DualConv) reduces the computational cost while keeping feature extraction efficient; (3) a Haar Wavelet Downsampling (HWD) module was introduced to preserve the key details of small targets; and (4) a Dynamic Sampling (DySample) upsampling module was used to alleviate the loss of high-frequency information during resolution scaling.
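The augmentation step described above (rotation, mirroring, and brightness adjustment) could be sketched as follows. This is a minimal illustration only: the specific rotation angles and brightness factors are assumptions for demonstration, not the paper's actual settings.

```python
import numpy as np

def augment(img):
    """Return simple augmented copies of an H x W x C image:
    90-degree rotations, mirroring, and brightness scaling.
    Angles and brightness factors are illustrative assumptions."""
    out = [np.rot90(img, k) for k in (1, 2, 3)]   # rotation by 90/180/270 degrees
    out.append(np.fliplr(img))                    # horizontal mirror
    out.append(np.flipud(img))                    # vertical mirror
    for factor in (0.8, 1.2):                     # brightness adjustment (hypothetical factors)
        scaled = np.clip(img.astype(float) * factor, 0, 255)
        out.append(scaled.astype(img.dtype))
    return out
```

Combined with the original image, a handful of such transforms per raw image is consistent with expanding 510 raw images into 2550 training samples.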
A series of experiments was performed on a platform equipped with a 13th-generation Intel Core i7-13620H processor and an RTX 4060 GPU. The results show that the YOLO-Pest model achieved 92.4% precision, 91.5% recall, and 95.2% average precision (AP50), improvements of 3.4, 2.1, and 1.9 percentage points, respectively, over the baseline YOLOv8n model. The predicted pest counts were strongly correlated with the real data, where the coefficient of determination (
R²) was 0.996, the mean absolute error (MAE) was 1.62, the root mean square error (RMSE) was 3.14, and the relative error (RE) was 4.16%. Comparisons with Faster R-CNN, SSD, and YOLOv5n further verified the advantage of YOLO-Pest in small-object detection. In addition, bilinear interpolation was applied to the daily incremental counts from 15 pairs of sticky traps to generate a global heat map of the greenhouse insect situation, representing the spread trend of pests and diseases. The correlation between the predicted and measured values was R²=0.972 (P<0.05). A dynamic classification (mild, moderate, and severe) was then realized to monitor the severity of pest infestation. This system provides accurate management for monitoring flying insect pests in greenhouses: the YOLO-Pest model enhances the detection of small-target pests, and the accuracy of the global heat map of greenhouse insect conditions was verified. However, flying insect pests were only identified and counted as a single unified class. In the future, a higher-resolution camera can be used to capture sample images, and intelligent equipment can be designed for sticky traps to classify and detect insect pests in unmanned farms.
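The agreement metrics reported above (R², MAE, RMSE, RE) for predicted versus manually counted pests can be computed with standard definitions. The sketch below assumes RE is the total absolute error relative to the total true count, expressed as a percentage; the paper's exact RE formula is not stated here, so treat that definition as an assumption.

```python
import numpy as np

def count_metrics(y_true, y_pred):
    """Standard agreement metrics between true and predicted pest counts.
    RE is assumed to be total |error| / total true count, in percent."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_pred - y_true
    mae = np.mean(np.abs(err))                          # mean absolute error
    rmse = np.sqrt(np.mean(err ** 2))                   # root mean square error
    re = np.sum(np.abs(err)) / np.sum(y_true) * 100.0   # relative error, %
    ss_res = np.sum(err ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot                          # coefficient of determination
    return {"MAE": mae, "RMSE": rmse, "RE_percent": re, "R2": r2}
```

With per-trap counts as inputs, values such as R²=0.996 and MAE=1.62 indicate near-perfect agreement between automatic and manual counting.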
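The heat-map step, bilinear interpolation of sparse trap counts onto a dense grid followed by severity binning, could be sketched as below. The trap layout (a regular 3×5 grid) and the mild/moderate/severe thresholds are hypothetical assumptions for illustration; the paper's actual trap positions and class boundaries may differ.

```python
import numpy as np

def bilinear_upsample(grid, out_rows, out_cols):
    """Bilinearly interpolate a coarse 2D grid of trap counts onto a
    finer out_rows x out_cols grid (assumes traps on a regular grid)."""
    rows, cols = grid.shape
    r = np.linspace(0, rows - 1, out_rows)
    c = np.linspace(0, cols - 1, out_cols)
    r0 = np.floor(r).astype(int); r1 = np.minimum(r0 + 1, rows - 1)
    c0 = np.floor(c).astype(int); c1 = np.minimum(c0 + 1, cols - 1)
    wr = (r - r0)[:, None]   # fractional row weights
    wc = (c - c0)[None, :]   # fractional column weights
    top = grid[r0][:, c0] * (1 - wc) + grid[r0][:, c1] * wc
    bot = grid[r1][:, c0] * (1 - wc) + grid[r1][:, c1] * wc
    return top * (1 - wr) + bot * wr

def severity_levels(heat, thresholds=(10, 30)):
    """Bin interpolated counts into 0=mild, 1=moderate, 2=severe.
    Threshold values are hypothetical."""
    return np.digitize(heat, bins=list(thresholds))
```

A 3×5 array of daily incremental counts upsampled this way yields a smooth greenhouse-wide heat map, and `severity_levels` reproduces the mild/moderate/severe dynamic classification.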