Abstract:
Leaf Area Index (LAI) is a key parameter for describing canopy development and growth status in apple orchards. This study developed a multimodal unmanned aerial vehicle (UAV)-based framework for LAI estimation across three phenological stages of apple trees: the end-flowering stage, the spring shoot growth stage, and the fruit expansion stage. Multispectral imagery and light detection and ranging (LiDAR) point cloud data were acquired, and ground measurements were collected synchronously. A fused feature set was then constructed by integrating spectral, textural, and three-dimensional canopy structural information. Specifically, 15 spectral indices, 40 texture features, and 27 structural descriptors were extracted to characterize canopy reflectance, spatial heterogeneity, and geometric architecture. Seven regression models were evaluated, including Backpropagation Neural Network (BPNN), Deep Neural Network (DNN), Gaussian Process Regression (GPR), Random Forest (RF), Ridge Regression (RIDGE), and Support Vector Regression (SVR). For each phenological stage, model performance was assessed using repeated random validation, with the coefficient of determination (R²), root mean square error (RMSE), and mean absolute error (MAE) as evaluation metrics. In addition, ablation experiments were conducted to quantify the individual and combined contributions of spectral, textural, and structural information, and Shapley Additive Explanations (SHAP) were used to interpret feature importance across growth stages. The results showed that the DNN achieved the best overall performance and the most stable accuracy among all models, with validation R² values of 0.86, 0.87, and 0.89 for the end-flowering, spring shoot growth, and fruit expansion stages, respectively. These results indicate that the DNN was highly effective for LAI estimation under different canopy conditions and phenological stages, whereas the other models showed greater fluctuations in validation performance, suggesting lower robustness to stage-dependent canopy variation. The ablation analysis further demonstrated that canopy structural information was the dominant information source for LAI estimation in apple trees: used alone, structural features consistently matched or outperformed the spectral and textural feature groups. Their advantage was especially evident during the spring shoot growth stage, when R² reached 0.79, reflecting the strong sensitivity of canopy geometry to leaf area changes during rapid vegetative development. At the end-flowering stage, fusing structural and textural information yielded the best dual-source performance, increasing R² to 0.84 and reducing RMSE to 0.16. A similar pattern was observed during the fruit expansion stage, where the same combination again produced the strongest two-source performance, with an R² of 0.85 and an RMSE of 0.13. When spectral, textural, and structural information were fully integrated, estimation accuracy improved further, reaching R² values of 0.86, 0.87, and 0.89 across the three stages, with corresponding RMSEs of 0.14, 0.14, and 0.12. The interpretation analysis confirmed the dominant role of canopy structural information in all growth stages.
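To make the evaluation protocol concrete, the following is a minimal sketch of the repeated random validation and feature-group ablation described above. The abstract does not specify the data pipeline or model hyperparameters, so a synthetic feature matrix (with the stated group sizes of 15 spectral, 40 texture, and 27 structural features) and a Random Forest stand in as placeholders; only the split-and-score protocol is illustrated.

```python
# Minimal sketch of repeated random validation with a feature-group ablation.
# Data, sample size, and the regressor are hypothetical placeholders.
from itertools import combinations

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples = 120  # hypothetical number of sampled trees
groups = {"spectral": 15, "texture": 40, "structure": 27}  # feature counts from the study
X = {g: rng.normal(size=(n_samples, k)) for g, k in groups.items()}
y = rng.normal(loc=2.5, scale=0.5, size=n_samples)  # placeholder LAI values

def evaluate(feature_sets, n_repeats=10):
    """Repeated random train/validation splits; returns mean R2, RMSE, MAE."""
    Xc = np.hstack([X[g] for g in feature_sets])
    scores = []
    for seed in range(n_repeats):
        X_tr, X_va, y_tr, y_va = train_test_split(
            Xc, y, test_size=0.3, random_state=seed
        )
        model = RandomForestRegressor(n_estimators=200, random_state=seed)
        pred = model.fit(X_tr, y_tr).predict(X_va)
        scores.append((
            r2_score(y_va, pred),
            np.sqrt(mean_squared_error(y_va, pred)),
            mean_absolute_error(y_va, pred),
        ))
    return np.mean(scores, axis=0)

# Ablation: single groups, two-source pairs, and the full three-source fusion.
for r in (1, 2, 3):
    for combo in combinations(groups, r):
        r2, rmse, mae = evaluate(combo)
        print(f"{'+'.join(combo):28s} R2={r2:.2f} RMSE={rmse:.2f} MAE={mae:.2f}")
```

In the study itself, this loop would be run separately for each phenological stage and each of the seven regressors; the sketch only shows the shared protocol.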
Structural descriptors related to canopy density and height distribution contributed most strongly to prediction, whereas spectral and textural information mainly served as complementary inputs that enhanced model discrimination under complex canopy conditions. Overall, this study demonstrates that multimodal feature fusion combined with deep learning can provide accurate and interpretable LAI estimation for apple orchards across phenological stages, and offers methodological support for orchard canopy monitoring and precision management.
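The group-level interpretation can be sketched in the same setting. The abstract names SHAP but not the explainer configuration, so the example below assumes a tree ensemble with shap.TreeExplainer on the same hypothetical feature layout, and aggregates mean absolute SHAP values per feature group to compare spectral, textural, and structural contributions.

```python
# Minimal sketch of group-level SHAP interpretation; the model, data, and
# explainer choice are assumptions, not the study's exact configuration.
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
groups = {"spectral": 15, "texture": 40, "structure": 27}
X = np.hstack([rng.normal(size=(120, k)) for k in groups.values()])
y = rng.normal(loc=2.5, scale=0.5, size=120)  # placeholder LAI values

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
shap_values = shap.TreeExplainer(model).shap_values(X)  # (n_samples, n_features)

# Mean |SHAP| per feature group indicates each group's overall contribution.
start = 0
for name, k in groups.items():
    contrib = np.abs(shap_values[:, start:start + k]).mean()
    print(f"{name:10s} mean |SHAP| = {contrib:.4f}")
    start += k
```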