Abstract:
The yellow-feather broiler is a chicken breed raised specifically for meat production in the poultry industry, and rapid, accurate detection of dead chickens in stacked cages is essential for large-scale production. Existing machine-vision methods cannot fully meet this need, mainly because of severe target occlusion, frequent illumination variations, and the small size and similar posture of dead chickens in high-density rearing environments. Visible-light images alone and conventional deep-learning detectors also offer limited robustness under weak lighting and heavy occlusion, especially when deployed on low-cost inspection robots. In this study, an improved YOLO11n model combined with infrared–visible image fusion was proposed to detect dead chickens in caged poultry houses, with the aim of enhancing robustness, accuracy, and real-time performance in complex cage-rearing environments and enabling deployment on resource-constrained inspection robots. The fusion strategy exploited thermal infrared images for temperature perception and visible-light images for structural and texture representation. Specifically, the scale-invariant feature transform (SIFT) was employed to extract feature points for matching between infrared and visible images with significantly different resolutions, and the least median of squares (LMedS) algorithm was applied to eliminate mismatches and improve registration robustness. Laplacian pyramid fusion was then used to generate fused images that preserve detail and structural consistency while keeping computational complexity low enough for real-time applications. The original YOLO11n lightweight detector was modified to further improve detection performance under occlusion and small-target conditions. A DySample dynamic upsampling module was introduced to enhance multi-scale feature recovery, thereby improving the representation of small and partially occluded targets in high-density cage environments. In addition, the Focal Loss function was incorporated into the training process to alleviate the class imbalance between dead chickens and background samples, reducing training bias and improving the model's robustness and recall. All models were trained and evaluated on three datasets: visible-light, thermal infrared, and fused images. Experimental results demonstrated that detection on fused images significantly outperformed single-modality inputs across multiple evaluation metrics, and the improved model achieved higher precision, recall, and mean average precision on the fused dataset than on the visible-light or thermal infrared datasets. With fused images as input, the improved YOLO11n-DS-Focal-Loss model achieved a detection precision of 96.8%, a recall of 92.6%, and mAP@50 and mAP@50:95 values of 94.9% and 68.3%, respectively, with a single-image inference time of only 8.3 ms, effectively balancing accuracy and real-time performance. The proposed model was also compared with mainstream lightweight detectors, including YOLOv5n, YOLOv6n, YOLOv8n, YOLOv10n, and YOLO12n, and showed superior accuracy, efficiency, and stability, particularly under dense rearing and partial occlusion.
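To make the registration-and-fusion stage concrete, a minimal sketch is given below, assuming OpenCV/NumPy and single-channel (grayscale) inputs; the match ratio, pyramid depth, and the averaging fusion rule are illustrative placeholders, not the exact settings used in this study.

```python
# Minimal sketch of SIFT + LMedS registration and Laplacian pyramid fusion
# (illustrative parameters; grayscale inputs assumed).
import cv2
import numpy as np

def register_ir_to_visible(ir, vis, ratio=0.75):
    """Warp the infrared image onto the visible image using SIFT + LMedS."""
    sift = cv2.SIFT_create()
    kp_ir, des_ir = sift.detectAndCompute(ir, None)
    kp_vis, des_vis = sift.detectAndCompute(vis, None)
    # Ratio test on k-nearest-neighbour matches to discard ambiguous pairs.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    raw = matcher.knnMatch(des_ir, des_vis, k=2)
    good = [m for m, n in raw if m.distance < ratio * n.distance]
    src = np.float32([kp_ir[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_vis[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    # Least median of squares rejects the remaining mismatches.
    H, _ = cv2.findHomography(src, dst, cv2.LMEDS)
    return cv2.warpPerspective(ir, H, (vis.shape[1], vis.shape[0]))

def laplacian_pyramid_fusion(a, b, levels=4):
    """Fuse two registered grayscale images in the Laplacian domain
    (simple coefficient averaging is used here for brevity)."""
    def lap_pyramid(img):
        g = [img.astype(np.float32)]
        for _ in range(levels):
            g.append(cv2.pyrDown(g[-1]))
        lp = [g[i] - cv2.pyrUp(g[i + 1], dstsize=g[i].shape[1::-1])
              for i in range(levels)]
        lp.append(g[-1])           # low-frequency residual
        return lp
    fused = [(la + lb) / 2 for la, lb in zip(lap_pyramid(a), lap_pyramid(b))]
    out = fused[-1]
    for lvl in reversed(fused[:-1]):
        out = cv2.pyrUp(out, dstsize=lvl.shape[1::-1]) + lvl
    return np.clip(out, 0, 255).astype(np.uint8)
```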
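The class-imbalance handling can likewise be illustrated with a binary focal-loss term; the sketch below assumes a PyTorch setting, and the alpha and gamma values are the common defaults rather than the hyper-parameters tuned in this work.

```python
# Illustrative binary focal loss (common default alpha/gamma, not the
# values tuned for this study).
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    """Down-weights easy background samples so the scarce dead-chicken
    (positive) examples contribute more to the gradient."""
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p = torch.sigmoid(logits)
    p_t = p * targets + (1 - p) * (1 - targets)            # prob. of the true class
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
    return (alpha_t * (1 - p_t) ** gamma * ce).mean()
```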
Field inspection experiments were conducted in a real caged poultry house to further verify the effectiveness of the approach. Increasing the inspection frequency significantly reduced missed detections caused by the complete occlusion of dead chickens, especially for broilers aged 40 days or older with dense feather coverage and larger body sizes. As the number of inspection rounds increased, previously occluded dead chickens became visible from different viewpoints, improving the completeness and reliability of detection. In conclusion, infrared–visible image fusion combined with the improved YOLO11n framework provides an effective and practical solution for detecting dead chickens in large-scale caged poultry houses. Its high accuracy, real-time performance, and low computational complexity make it suitable for deployment on low-cost embedded inspection robots, and these findings provide a valuable reference for poultry house inspection in modern poultry farming.