Soybean Lodging Classification and Yield Prediction Using Multimodal UAV Data Fusion and Deep Learning (2025)

Abstract

UAV remote sensing is widely used in the agricultural sector due to its non-destructive, rapid, and cost-effective advantages. This study utilized two years of field data with multisource fused imagery of soybeans to evaluate lodging conditions and investigate the impact of lodging grade information on yield prediction. Unlike traditional approaches that build empirical lodging models using band reflectance, vegetation indices, and texture features, this research introduces a transfer learning framework. This framework employs a ResNet18 encoder to directly extract features from raw images, bypassing the complexity of manual feature extraction processes. To address the imbalance in the lodging dataset, the Synthetic Minority Over-sampling Technique (SMOTE) strategy was employed in the feature space to balance the training set. The findings reveal that deep learning effectively extracts meaningful features from UAV imagery, outperforming traditional methods in lodging grade classification across all growth stages. At 65 days after emergence (DAE), lodging grade classification using ResNet18 features achieved the highest accuracy (accuracy = 0.76, recall = 0.76, F1 score = 0.73), significantly exceeding the performance of traditional methods. However, classification accuracy was relatively low in plots with higher lodging grades (lodging grades = 3, 5, 7), with an accuracy of 0.42 and an F1 score of 0.56. After applying the SMOTE module to balance the samples, the classification accuracy in plots with higher lodging grades improved to 0.65, an increase of 54.76%. To improve accuracy in yield prediction, this study integrates lodging information with other features, such as canopy spectral reflectance, vegetation indices, and texture features, using two multimodal data fusion strategies: input-level fusion (ResNet-EF) and intermediate-level fusion (ResNet-MF).
The findings reveal that the intermediate-level fusion strategy consistently outperforms input-level fusion in yield prediction accuracy across all growth stages. Specifically, the intermediate-level fusion model incorporating measured lodging grade information achieved the highest prediction accuracy at 85 DAE (R² = 0.65, RMSE = 529.56 kg/ha). Furthermore, when predicted lodging information was used, the model's performance remained comparable to that of the measured lodging grades, underscoring the critical role of lodging factors in enhancing yield estimation accuracy.
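The SMOTE step described in the abstract balances the training set by synthesizing new minority-class samples in feature space. The sketch below is a minimal NumPy illustration of that idea, not the authors' implementation: each synthetic sample is a random interpolation between a minority point and one of its k nearest minority-class neighbours.

```python
import numpy as np

def smote_oversample(X_min, n_new, k=5, seed=None):
    """SMOTE-style oversampling: interpolate each synthetic sample between a
    random minority point and one of its k nearest minority neighbours."""
    rng = np.random.default_rng(seed)
    n = len(X_min)
    k = min(k, n - 1)
    # pairwise Euclidean distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # never pick a point as its own neighbour
    nbrs = np.argsort(d, axis=1)[:, :k]  # k nearest neighbours per point
    out = []
    for _ in range(n_new):
        i = rng.integers(n)              # random minority sample
        j = nbrs[i, rng.integers(k)]     # one of its k nearest neighbours
        lam = rng.random()               # interpolation fraction in [0, 1)
        out.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.asarray(out)

# toy imbalanced feature space: three minority-class samples in 2-D
X_min = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
X_new = smote_oversample(X_min, n_new=5, k=2, seed=0)
print(X_new.shape)  # (5, 2)
```

In practice the paper applies this in the space of ResNet18-encoded features; libraries such as imbalanced-learn provide a production-grade `SMOTE` with the same interpolation principle.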
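The intermediate-level fusion strategy (ResNet-MF) encodes each modality separately and concatenates the latent codes before the prediction head, whereas input-level fusion (ResNet-EF) stacks raw inputs first. The following is a hypothetical NumPy sketch of the intermediate-level pattern with toy data and a closed-form ridge head; the shapes, random projections, and yield values are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy per-plot features: CNN image embeddings plus hand-crafted features
# (canopy reflectance, vegetation indices, texture, lodging grade).
img_feat = rng.normal(size=(40, 512))   # e.g. global-pooled ResNet18 features
aux_feat = rng.normal(size=(40, 12))    # spectral / VI / texture / lodging features
yield_kg = rng.normal(loc=3000.0, scale=500.0, size=40)

def project(X, dim, seed):
    # stand-in per-modality encoder: random linear map + ReLU
    W = np.random.default_rng(seed).normal(size=(X.shape[1], dim)) / np.sqrt(X.shape[1])
    return np.maximum(X @ W, 0.0)

def ridge_fit(X, y, lam=1.0):
    # closed-form ridge regression head on the fused representation
    Xb = np.hstack([X, np.ones((len(X), 1))])
    A = Xb.T @ Xb + lam * np.eye(Xb.shape[1])
    return np.linalg.solve(A, Xb.T @ y)

# Intermediate-level fusion: encode each modality to a shared latent size,
# concatenate the latent codes, then regress yield on the fused vector.
fused = np.hstack([project(img_feat, 16, 1), project(aux_feat, 16, 2)])
w = ridge_fit(fused, yield_kg)
pred = np.hstack([fused, np.ones((len(fused), 1))]) @ w
print(fused.shape, pred.shape)  # (40, 32) (40,)
```

The design point the abstract reports is that fusing at this latent level, rather than stacking raw inputs, consistently gave better yield prediction across growth stages.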

Original language: English
Article number: 1490
Journal: Remote Sensing
Volume: 17
Issue number: 9
DOI: 10.3390/rs17091490
State: Published - May 2025
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2025 by the authors.

Keywords

  • data fusion
  • deep learning
  • lodging classification
  • UAV
  • yield

ASJC Scopus subject areas

  • General Earth and Planetary Sciences


Cite this

  • APA
  • Author
  • BIBTEX
  • Harvard
  • Standard
  • RIS
  • Vancouver

Xu, X., Fang, Y., Sun, G., Zhang, Y., Wang, L., Chen, C., Ren, L., Meng, L., Li, Y., Qiu, L., Guo, Y., Yu, H., & Ma, Y. (2025). Soybean Lodging Classification and Yield Prediction Using Multimodal UAV Data Fusion and Deep Learning. Remote Sensing, 17(9), Article 1490. https://doi.org/10.3390/rs17091490

Xu, Xingmei; Fang, Yushi; Sun, Guangyao et al. / Soybean Lodging Classification and Yield Prediction Using Multimodal UAV Data Fusion and Deep Learning. In: Remote Sensing. 2025; Vol. 17, No. 9.

@article{88fa61220452491e8b4470af75fec8f3,

title = "Soybean Lodging Classification and Yield Prediction Using Multimodal UAV Data Fusion and Deep Learning",

abstract = "UAV remote sensing is widely used in the agricultural sector due to its non-destructive, rapid, and cost-effective advantages. This study utilized two years of field data with multisource fused imagery of soybeans to evaluate lodging conditions and investigate the impact of lodging grade information on yield prediction. Unlike traditional approaches that build empirical lodging models using band reflectance, vegetation indices, and texture features, this research introduces a transfer learning framework. This framework employs a ResNet18 encoder to directly extract features from raw images, bypassing the complexity of manual feature extraction processes. To address the imbalance in the lodging dataset, the Synthetic Minority Over-sampling Technique (SMOTE) strategy was employed in the feature space to balance the training set. The findings reveal that deep learning effectively extracts meaningful features from UAV imagery, outperforming traditional methods in lodging grade classification across all growth stages. On the 65 days after emergence (DAE), lodging grade classification using ResNet18 features achieved the highest accuracy (Accuracy = 0.76, recall = 0.76, F1 score = 0.73), significantly exceeding the performance of traditional methods. However, classification accuracy was relatively low in plots with higher lodging grades (lodging grades = 3, 5, 7), with an accuracy of 0.42 and an F1 score of 0.56. After applying the SMOTE module to balance the samples, the classification accuracy in plots with higher lodging grades improved to 0.65, marking an increase of 54.76%. To improve accuracy in yield prediction, this study integrates lodging information with other features, such as canopy spectral reflectance, vegetation indices, and texture features, using two multimodal data fusion strategies: input-level fusion (ResNet-EF) and intermediate-level fusion (ResNet-MF). 
The findings reveal that the intermediate-level fusion strategy consistently outperforms input-level fusion in yield prediction accuracy across all growth stages. Specifically, the intermediate-level fusion model incorporating measured lodging grade information achieved the highest prediction accuracy on the 85 DAE (R2 = 0.65, RMSE = 529.56 kg/ha). Furthermore, when predicted lodging information was used, the model{\textquoteright}s performance remained comparable to that of the measured lodging grades, underscoring the critical role of lodging factors in enhancing yield estimation accuracy.",

keywords = "data fusion, deep learning, lodging classification, UAV, yield",

author = "Xingmei Xu and Yushi Fang and Guangyao Sun and Yong Zhang and Lei Wang and Chen Chen and Lisuo Ren and Lei Meng and Yinghui Li and Lijuan Qiu and Yan Guo and Helong Yu and Yuntao Ma",

note = "Publisher Copyright: {\textcopyright} 2025 by the authors.",

year = "2025",

month = may,

doi = "10.3390/rs17091490",

language = "English",

volume = "17",

journal = "Remote Sensing",

issn = "2072-4292",

publisher = "Multidisciplinary Digital Publishing Institute (MDPI)",

number = "9",

}

Xu, X, Fang, Y, Sun, G, Zhang, Y, Wang, L, Chen, C, Ren, L, Meng, L, Li, Y, Qiu, L, Guo, Y, Yu, H & Ma, Y 2025, 'Soybean Lodging Classification and Yield Prediction Using Multimodal UAV Data Fusion and Deep Learning', Remote Sensing, vol. 17, no. 9, 1490. https://doi.org/10.3390/rs17091490

Soybean Lodging Classification and Yield Prediction Using Multimodal UAV Data Fusion and Deep Learning. / Xu, Xingmei; Fang, Yushi; Sun, Guangyao et al.
In: Remote Sensing, Vol. 17, No. 9, 1490, 05.2025.

Research output: Contribution to journal › Article › peer-review

TY - JOUR

T1 - Soybean Lodging Classification and Yield Prediction Using Multimodal UAV Data Fusion and Deep Learning

AU - Xu, Xingmei

AU - Fang, Yushi

AU - Sun, Guangyao

AU - Zhang, Yong

AU - Wang, Lei

AU - Chen, Chen

AU - Ren, Lisuo

AU - Meng, Lei

AU - Li, Yinghui

AU - Qiu, Lijuan

AU - Guo, Yan

AU - Yu, Helong

AU - Ma, Yuntao

N1 - Publisher Copyright: © 2025 by the authors.

PY - 2025/5

Y1 - 2025/5

N2 - UAV remote sensing is widely used in the agricultural sector due to its non-destructive, rapid, and cost-effective advantages. This study utilized two years of field data with multisource fused imagery of soybeans to evaluate lodging conditions and investigate the impact of lodging grade information on yield prediction. Unlike traditional approaches that build empirical lodging models using band reflectance, vegetation indices, and texture features, this research introduces a transfer learning framework. This framework employs a ResNet18 encoder to directly extract features from raw images, bypassing the complexity of manual feature extraction processes. To address the imbalance in the lodging dataset, the Synthetic Minority Over-sampling Technique (SMOTE) strategy was employed in the feature space to balance the training set. The findings reveal that deep learning effectively extracts meaningful features from UAV imagery, outperforming traditional methods in lodging grade classification across all growth stages. On the 65 days after emergence (DAE), lodging grade classification using ResNet18 features achieved the highest accuracy (Accuracy = 0.76, recall = 0.76, F1 score = 0.73), significantly exceeding the performance of traditional methods. However, classification accuracy was relatively low in plots with higher lodging grades (lodging grades = 3, 5, 7), with an accuracy of 0.42 and an F1 score of 0.56. After applying the SMOTE module to balance the samples, the classification accuracy in plots with higher lodging grades improved to 0.65, marking an increase of 54.76%. To improve accuracy in yield prediction, this study integrates lodging information with other features, such as canopy spectral reflectance, vegetation indices, and texture features, using two multimodal data fusion strategies: input-level fusion (ResNet-EF) and intermediate-level fusion (ResNet-MF). 
The findings reveal that the intermediate-level fusion strategy consistently outperforms input-level fusion in yield prediction accuracy across all growth stages. Specifically, the intermediate-level fusion model incorporating measured lodging grade information achieved the highest prediction accuracy on the 85 DAE (R2 = 0.65, RMSE = 529.56 kg/ha). Furthermore, when predicted lodging information was used, the model’s performance remained comparable to that of the measured lodging grades, underscoring the critical role of lodging factors in enhancing yield estimation accuracy.

AB - UAV remote sensing is widely used in the agricultural sector due to its non-destructive, rapid, and cost-effective advantages. This study utilized two years of field data with multisource fused imagery of soybeans to evaluate lodging conditions and investigate the impact of lodging grade information on yield prediction. Unlike traditional approaches that build empirical lodging models using band reflectance, vegetation indices, and texture features, this research introduces a transfer learning framework. This framework employs a ResNet18 encoder to directly extract features from raw images, bypassing the complexity of manual feature extraction processes. To address the imbalance in the lodging dataset, the Synthetic Minority Over-sampling Technique (SMOTE) strategy was employed in the feature space to balance the training set. The findings reveal that deep learning effectively extracts meaningful features from UAV imagery, outperforming traditional methods in lodging grade classification across all growth stages. On the 65 days after emergence (DAE), lodging grade classification using ResNet18 features achieved the highest accuracy (Accuracy = 0.76, recall = 0.76, F1 score = 0.73), significantly exceeding the performance of traditional methods. However, classification accuracy was relatively low in plots with higher lodging grades (lodging grades = 3, 5, 7), with an accuracy of 0.42 and an F1 score of 0.56. After applying the SMOTE module to balance the samples, the classification accuracy in plots with higher lodging grades improved to 0.65, marking an increase of 54.76%. To improve accuracy in yield prediction, this study integrates lodging information with other features, such as canopy spectral reflectance, vegetation indices, and texture features, using two multimodal data fusion strategies: input-level fusion (ResNet-EF) and intermediate-level fusion (ResNet-MF). 
The findings reveal that the intermediate-level fusion strategy consistently outperforms input-level fusion in yield prediction accuracy across all growth stages. Specifically, the intermediate-level fusion model incorporating measured lodging grade information achieved the highest prediction accuracy on the 85 DAE (R2 = 0.65, RMSE = 529.56 kg/ha). Furthermore, when predicted lodging information was used, the model’s performance remained comparable to that of the measured lodging grades, underscoring the critical role of lodging factors in enhancing yield estimation accuracy.

KW - data fusion

KW - deep learning

KW - lodging classification

KW - UAV

KW - yield

UR - http://www.scopus.com/inward/record.url?scp=105004892922&partnerID=8YFLogxK

U2 - 10.3390/rs17091490

DO - 10.3390/rs17091490

M3 - Article

AN - SCOPUS:105004892922

SN - 2072-4292

VL - 17

JO - Remote Sensing

JF - Remote Sensing

IS - 9

M1 - 1490

ER -

Xu X, Fang Y, Sun G, Zhang Y, Wang L, Chen C et al. Soybean Lodging Classification and Yield Prediction Using Multimodal UAV Data Fusion and Deep Learning. Remote Sensing. 2025 May;17(9):1490. doi: 10.3390/rs17091490
