

Behavior recognition and object tracking for automated monitoring of group-housed pigs

  • Abstract: Behavior recognition, tracking, and automated monitoring of group-housed pigs are key components of modern smart livestock farming. To address target loss and identity (ID) switching caused by illumination changes, pig collisions, and dense occlusion in real farming environments, this study proposes an SR-YOLO+OC-SORT method for pig behavior recognition, tracking, and automated monitoring. First, building on the YOLOv8n network, a spatial group enhancement (SGE) module is applied and a re-parameterized generalized feature pyramid network (RepGFPN) structure is introduced to construct the SR-YOLO detector, improving detection stability and accuracy in complex environments. Then, based on the pig detection and behavior recognition results of SR-YOLO, the OC-SORT (observation-centric simple online and real-time tracking) tracker is used to track the behavior of individual pigs. Finally, using the behavior recognition and tracking results, the time proportions of four behavior categories (lying, standing, eating, and other) are computed for each pig, enabling automated behavior monitoring of individual pigs in group-housed environments. Experimental results show that, for pig detection, SR-YOLO achieves a mean average precision of 90.1% and an F1-score of 84.4%, improvements of 3.1 and 1.0 percentage points, respectively, over YOLOv8n. For pig tracking, the OC-SORT tracker reaches 85.2% higher order tracking accuracy (HOTA) and 96.7% multiple object tracking accuracy (MOTA), demonstrating good overall tracking performance. The results confirm that the SR-YOLO+OC-SORT method for pig behavior recognition and object tracking is practical and can provide reliable technical support for monitoring the health status of group-housed pigs.
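
The final monitoring step, counting each tracked pig's share of time spent lying, standing, eating, or in other behaviors, can be illustrated with a minimal sketch. The per-frame record layout and function name below are assumptions for illustration only, not the authors' implementation.

```python
from collections import Counter, defaultdict

# One record per pig per frame, as a detector+tracker pipeline might emit
# (the (frame_idx, track_id, behavior) layout is an assumption for this sketch).
FrameRecord = tuple[int, int, str]

BEHAVIORS = ("lie", "stand", "eat", "other")

def behavior_time_ratios(records: list[FrameRecord]) -> dict[int, dict[str, float]]:
    """Return, for each track ID, the fraction of observed frames per behavior."""
    per_pig: dict[int, Counter] = defaultdict(Counter)
    for _frame_idx, track_id, behavior in records:
        per_pig[track_id][behavior] += 1  # each record covers one annotated frame

    ratios = {}
    for track_id, counts in per_pig.items():
        total = sum(counts.values())
        ratios[track_id] = {b: counts[b] / total for b in BEHAVIORS}
    return ratios

if __name__ == "__main__":
    demo = [
        (0, 1, "lie"), (1, 1, "lie"), (2, 1, "stand"), (3, 1, "eat"),
        (0, 2, "stand"), (1, 2, "stand"), (2, 2, "other"), (3, 2, "stand"),
    ]
    print(behavior_time_ratios(demo))
```

At the 5 frames per second used for annotation, each counted frame corresponds to 0.2 s, so the same counts can also be reported as absolute durations if needed.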

     

    Abstract: Precise tracking and behavior recognition of group-housed swine are critical for intelligent livestock farming. However, in real farming environments, object detection and tracking are significantly challenged by pig crowding, occlusion, and illumination variation. To address these issues, this study proposed a multi-object tracking and behavior analysis framework named SR-YOLO+OC-SORT, which integrated an improved YOLOv8n detector with a robust tracking module. Firstly, the original C2f module of YOLOv8n was replaced with a spatially aware C2f_SGE module, which retained the efficient feature extraction of C2f while introducing a spatial enhancement mechanism. This mechanism strengthened the spatial information and semantic expression of the feature maps, significantly suppressing noise regions and enhancing the response of informative feature areas under uneven lighting and occlusion. Moreover, a lightweight RepGFPN structure was introduced into the neck to enhance feature fusion and detection robustness under complex conditions. Secondly, the detection results from SR-YOLO were fed into the OC-SORT tracker, which effectively maintained pig identity consistency even in challenging scenarios involving severe occlusion, dense grouping, and varying illumination. Finally, an automatic behavior monitoring algorithm was designed by combining the behavior categories with the OC-SORT tracking trajectories, enabling time-based analysis of four typical pig behaviors (standing, lying, eating, and other). To validate the effectiveness of the proposed method, experiments were conducted on two datasets: a public dataset and a private dataset. The public dataset comprised 10 video segments, with 6 used for training and validation and 4 for testing. The private dataset was collected from a commercial pig farm in Foshan City and consisted of 12 one-minute video clips, of which 8 were used for training and 4 for testing. All videos were captured by fixed overhead cameras at a resolution of 2688×1520 pixels and were recorded and annotated at 5 frames per second, enabling stable documentation of the pigs' behavioral activities within the pens. To ensure behavioral diversity, key frames were extracted from the raw videos using FFmpeg 6.0, and the different behaviors were precisely annotated with the DarkLabel tool. The dataset exhibited significant diversity across multiple dimensions, including pig body size, stocking density, and housing environment, which made behavioral analyses based on it more broadly applicable and valuable for validation, thereby effectively supporting intelligent management in practical farming operations. Experimental results showed that the proposed method achieved superior performance in both detection and tracking. In terms of detection, SR-YOLO achieved a mAP@0.5 (mean average precision at an IoU threshold of 0.5) of 90.1% and an F1-score of 84.4% on the public dataset, and 85.6% mAP@0.5 and an 83.7% F1-score on the private dataset, outperforming mainstream detectors such as YOLOv5, YOLOv6, and YOLOv10. For the multi-object tracking task, the SR-YOLO+OC-SORT framework outperformed classical approaches such as ByteTrack and BoT-SORT, achieving 83.2% HOTA (higher order tracking accuracy), 94.0% MOTA (multiple object tracking accuracy), and 92.0% IDF1 (identity F1-score) on the public dataset, and 85.2%, 96.7%, and 96.8% for HOTA, MOTA, and IDF1 on the private dataset, respectively. Furthermore, based on the behavior tracking information, individual pig behaviors were monitored and analyzed over time. The experimental results demonstrate that the proposed detection and tracking framework achieves superior accuracy and robust performance under diverse real-world conditions, providing a scalable technical solution for the automatic monitoring of pig behavior in intelligent farming systems.
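
A minimal sketch of how the detection and tracking stages could be chained per frame is shown below. The `SRYOLODetector` and `OCSORTTracker` classes are hypothetical placeholders, since the abstract does not expose the actual interfaces of the trained SR-YOLO model or the OC-SORT implementation used by the authors.

```python
import cv2  # OpenCV is used here only for video decoding

# Hypothetical wrappers standing in for the trained SR-YOLO detector and the
# OC-SORT tracker described in the paper; their interfaces are assumptions.
class SRYOLODetector:
    def detect(self, frame):
        """Return a list of (x1, y1, x2, y2, score, behavior_class) per pig."""
        raise NotImplementedError

class OCSORTTracker:
    def update(self, detections):
        """Associate detections with existing tracks; return (track_id, box, behavior)."""
        raise NotImplementedError

def run_pipeline(video_path: str, detector: SRYOLODetector, tracker: OCSORTTracker):
    """Yield per-frame (frame_idx, track_id, behavior) records for later statistics."""
    cap = cv2.VideoCapture(video_path)
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        detections = detector.detect(frame)   # boxes + behavior labels from SR-YOLO
        tracks = tracker.update(detections)   # identity-consistent tracks from OC-SORT
        for track_id, _box, behavior in tracks:
            yield frame_idx, track_id, behavior
        frame_idx += 1
    cap.release()
```

The records yielded by such a loop are exactly what the behavior time-proportion statistics described above would consume.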

     
