Citation: SHI Xiaochen, ZHANG Kai, CHEN Zilin, et al. High-throughput phenotypic parameter extraction method for maize plants based on point cloud instance segmentation[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2026, 42(8): 180-190. DOI: 10.11975/j.issn.1002-6819.202509196

High-throughput phenotypic parameter extraction method for maize plants based on point cloud instance segmentation

Abstract: Maize (Zea mays L.) is a major crop for global food security. Breeding programs aimed at improving yield, stress resistance, and adaptability depend on the high-throughput, non-destructive, and accurate acquisition of plant phenotypic parameters. Three-dimensional (3D) point clouds acquired by LiDAR can describe plant architecture in far greater detail than 2D imaging. However, their widespread application has been hindered by the difficulty of accurately segmenting individual plant instances within dense populations in real-world fields. Conventional clustering or geometry-based algorithms cannot cope with the convoluted spatial arrangement, complex plant morphologies, extensive canopy adhesion (where the leaves of adjacent plants are tightly interwoven), and mutual occlusion among plants. The resulting fragmented or incorrectly merged plant instances severely constrain the efficient and reliable extraction of phenotypic data. In this study, an instance segmentation framework, 3D-MaizeNet, was proposed that integrates LiDAR data with deep learning to accurately extract individual maize plants for the high-throughput measurement of key agronomic traits such as plant height and stem height. The framework comprises three stages. (1) To preserve the structural integrity of individual plants, which is easily compromised by simplistic preprocessing, an adaptive block segmentation strategy based on crop row detection was introduced. The row-planting pattern of the field was used to divide the large-scale point cloud into plant-centric blocks, effectively minimizing interference from overlapping canopies in adjacent rows, and a high-quality, field-derived point cloud dataset was constructed for robust model training. (2) A local spatial encoding module was designed to learn fine-grained geometric features from complex canopy structures (e.g., leaf angles and stem orientations). Concurrently, an attention aggregation down-sampling module was integrated to reduce the loss of key spatial features during feature extraction, selectively preserving salient information so that tightly packed plants can be distinguished. (3) Based on the high-fidelity instance segmentation results, a pipeline was established for the high-throughput quantification of plant height and stem height, two pivotal phenotypic parameters closely related to yield potential and lodging resistance. Field-scanned data were collected to validate the efficacy of the framework. Experimental results showed that 3D-MaizeNet achieved a mean average precision (mAP) of 95.9% and an overall accuracy of 96.4% in instance segmentation, indicating superior performance in identifying and delineating individual plants. Furthermore, the extracted key traits showed strong correlations with manual ground-truth measurements, with coefficients of determination (R²) of 0.91 and 0.89 for plant height and stem height, respectively. This high-throughput and precise phenotyping platform can provide technical support for advancing maize genomics, genome-wide association studies (GWAS), and ultimately molecular breeding of next-generation crops.
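As a rough illustration of stage (1), the sketch below splits a field-scale cloud into per-row blocks by locating point-density peaks across the row direction. It assumes rows run roughly parallel to the y-axis of the scan and that the row spacing is approximately known; the function name `split_into_row_blocks`, the 5 cm bin size, and the peak-finding heuristic are assumptions made for illustration, not the authors' exact adaptive block segmentation.

```python
import numpy as np
from scipy.signal import find_peaks

def split_into_row_blocks(points, bin_size=0.05, min_row_gap=0.4):
    """Split a field-scale point cloud (N x 3, metres) into per-row blocks.

    Assumes crop rows run roughly parallel to the y-axis, so row centres show
    up as peaks in a point-density histogram along x (illustrative only).
    """
    x = points[:, 0]
    edges = np.arange(x.min(), x.max() + bin_size, bin_size)
    density, _ = np.histogram(x, bins=edges)
    # Row centres: local maxima separated by at least the expected row spacing.
    peaks, _ = find_peaks(density, distance=max(1, int(min_row_gap / bin_size)))
    centres = edges[peaks] + bin_size / 2
    # Cut the cloud midway between adjacent row centres.
    cuts = np.concatenate(([x.min() - 1e-6],
                           (centres[:-1] + centres[1:]) / 2,
                           [x.max() + 1e-6]))
    return [points[(x >= lo) & (x < hi)] for lo, hi in zip(cuts[:-1], cuts[1:])]
```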
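Stage (2)'s attention aggregation down-sampling can be sketched as a small PyTorch layer that replaces max pooling over each sampled centre's K nearest neighbours with learned attention weights, so salient points are retained as resolution is reduced. The class name `AttentiveDownsample`, the tensor layout, and the single-linear scoring are illustrative assumptions; the paper's exact layer design is not reproduced here.

```python
import torch
import torch.nn as nn

class AttentiveDownsample(nn.Module):
    """Attention-weighted neighbour aggregation used during down-sampling.

    Instead of max pooling the features of each sampled centre's K nearest
    neighbours, a learned score re-weights them before summation.
    """
    def __init__(self, channels: int):
        super().__init__()
        self.score = nn.Linear(channels, channels)  # per-neighbour attention logits

    def forward(self, neigh_feats: torch.Tensor) -> torch.Tensor:
        # neigh_feats: (B, M, K, C) = features of K neighbours per sampled centre
        weights = torch.softmax(self.score(neigh_feats), dim=2)  # attention over K
        return (weights * neigh_feats).sum(dim=2)                # (B, M, C)

# Example: 2 blocks, 128 sampled centres, 16 neighbours, 32-channel features
feats = torch.randn(2, 128, 16, 32)
pooled = AttentiveDownsample(32)(feats)  # -> torch.Size([2, 128, 32])
```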
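For stage (3), a minimal sketch of trait extraction from one segmented plant instance is given below: plant height is taken as the canopy top minus the ground level, and stem height is approximated from points close to a vertical stem axis fitted through the lowest part of the plant. The helper name `plant_and_stem_height`, the 10 cm base slice, and the 5 cm stem radius are illustrative assumptions rather than the paper's exact definitions and fitting procedure.

```python
import numpy as np

def plant_and_stem_height(plant_pts, ground_z=None, stem_radius=0.05):
    """Estimate plant height and stem height (metres) for one segmented plant.

    plant_pts: (N, 3) array of a single plant instance, z pointing up.
    """
    z = plant_pts[:, 2]
    if ground_z is None:
        ground_z = np.percentile(z, 1)            # robust ground-level estimate
    plant_height = z.max() - ground_z

    # Fit a vertical stem axis through the lowest 10 cm of the plant, then take
    # the highest point lying within `stem_radius` of that axis as the stem top.
    base = plant_pts[z < ground_z + 0.10]
    axis_xy = np.median(base[:, :2], axis=0)
    r = np.linalg.norm(plant_pts[:, :2] - axis_xy, axis=1)
    stem_pts = plant_pts[r < stem_radius]
    stem_height = stem_pts[:, 2].max() - ground_z if len(stem_pts) else 0.0
    return plant_height, stem_height
```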