Inter-Row Navigation Method for Greenhouse Robots Based on the Fusion of Camera and LiDAR

  • Abstract: To address the complex greenhouse environment, where the ground is bumpy and branches and leaves occlude the path, an inter-row navigation method for greenhouse robots based on the fusion of camera and LiDAR data was developed. First, an improved U-Net model was used to segment the road region of the image accurately and quickly. Second, the ground point cloud was pre-segmented by fusing the image segmentation result, reducing the point-cloud tilt caused by ground undulation. Then, an improved K-means algorithm was used to cluster the crop-row point cloud rapidly, and the cluster centers were taken as points in the main stem region of each crop row, reducing the influence of occluding branches and leaves on crop-row centerline extraction. Finally, the RANSAC algorithm was used to fit the line equations of the crop rows on both sides and to compute the navigation line. Navigation-line accuracy was evaluated experimentally; validation was carried out in two greenhouse scenarios at three typical greenhouse-robot operating speeds. The results showed that the segmentation quality and runtime of the image model met the requirements of the subsequent point-cloud pre-segmentation; experiments on point-cloud frames captured on bumpy ground showed that the method effectively corrects the ground point cloud; compared with grid-based height-difference segmentation of the ground point cloud, the segmentation quality was better while the added per-frame processing time was negligible; and on the test set, the navigation line was accurately extracted for more than 94% of the data frames, with an average angle error of no more than 1.45°. The method meets the requirements for autonomous navigation of greenhouse robots along crop rows.
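The final step of the pipeline, fitting the two crop-row lines with RANSAC and deriving a navigation line between them, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: it assumes 2D ground-plane coordinates (x forward, y lateral), models each row as y = mx + b, and takes the navigation line as the midline of the two fitted row lines; the function names and parameters (`n_iter`, `inlier_tol`) are hypothetical.

```python
import numpy as np

def ransac_line(points, n_iter=200, inlier_tol=0.05, seed=None):
    """Fit a 2D line y = m*x + b to `points` (N x 2) with a basic RANSAC loop."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iter):
        i, j = rng.choice(len(points), size=2, replace=False)
        (x1, y1), (x2, y2) = points[i], points[j]
        if abs(x2 - x1) < 1e-9:
            continue  # skip near-vertical sample pairs for the y = mx + b model
        m = (y2 - y1) / (x2 - x1)
        b = y1 - m * x1
        # perpendicular distance from every point to the candidate line
        d = np.abs(m * points[:, 0] - points[:, 1] + b) / np.hypot(m, 1.0)
        inliers = d < inlier_tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # least-squares refit on the inliers of the best candidate
    x, y = points[best_inliers, 0], points[best_inliers, 1]
    m, b = np.polyfit(x, y, 1)
    return m, b

def navigation_line(left_pts, right_pts):
    """Navigation line taken as the midline between the fitted left/right rows."""
    ml, bl = ransac_line(left_pts, seed=0)
    mr, br = ransac_line(right_pts, seed=0)
    return (ml + mr) / 2.0, (bl + br) / 2.0
```

Because the row points here stand in for the K-means cluster centers of the stem regions, occluding foliage contributes only a few outlying points, which the RANSAC inlier test discards before the refit.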

     
