
Extraction of Erigeron breviscapus Planting Information by Unmanned Aerial Vehicle Remote Sensing Based on Weakly Supervised Semantic Segmentation

Abstract: To obtain spatial planting information in a timely manner and support the protection and utilization of Erigeron breviscapus, and to address the fuzzy inter-ridge boundaries of the crop and the difficulty of producing finely annotated training data, an unmanned aerial vehicle (UAV) remote sensing method for extracting Erigeron breviscapus planting information is proposed, combining an RGB band maximum difference method with weakly supervised semantic segmentation. First, Erigeron breviscapus is annotated at the bounding-box level to build a weakly labeled data set, reducing annotation time and cost. Then, a lightweight U-Net network performs weakly supervised semantic segmentation on the weakly labeled data set to obtain a coarse extraction of Erigeron breviscapus. Finally, the RGB band maximum difference method removes non-Erigeron breviscapus pixels from the coarse extraction result, yielding a fine extraction of the planting area. Experimental results show that the proposed method achieves intersection-over-union (IoU) scores of 90.55%, 90.74% and 86.63% on three selected Erigeron breviscapus scenes, exceeding the accuracy of both the object-oriented classification method and the maximum likelihood method; ablation experiments further verify the method's effectiveness.
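The abstract does not give the exact formulation of the RGB band maximum difference method. A minimal sketch, assuming it thresholds the largest difference among a pixel's R, G and B values to separate vegetation from spectrally flat background (the function name and threshold value are hypothetical, not from the paper):

```python
import numpy as np

def rgb_max_difference_mask(image: np.ndarray, threshold: float = 20.0) -> np.ndarray:
    """Return a boolean mask of pixels whose maximum inter-band
    difference, max(R, G, B) - min(R, G, B), exceeds `threshold`.

    image: H x W x 3 array in (R, G, B) order, any numeric dtype.
    The default threshold is a placeholder; the paper reports none.
    """
    img = image.astype(np.float64)
    band_max = img.max(axis=-1)   # per-pixel maximum over the 3 bands
    band_min = img.min(axis=-1)   # per-pixel minimum over the 3 bands
    return (band_max - band_min) > threshold
```

In the pipeline described above, such a mask would be intersected with the coarse U-Net segmentation to discard non-Erigeron breviscapus pixels.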

     
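The reported IoU figures compare binary extraction masks against reference labels. A standard binary IoU computation (not the authors' code) looks like:

```python
import numpy as np

def binary_iou(pred: np.ndarray, truth: np.ndarray) -> float:
    """Intersection-over-union (Jaccard index) of two binary masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    # If both masks are empty, define IoU as 1 (perfect agreement).
    return float(intersection) / float(union) if union else 1.0
```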
