Aisha Naseer, Madiha Amjad, Ali Raza, Kashif Munir, Aseel Smerat, Henry Fabian Gongora, Carlos Eduardo Uc Rios, Imran Ashraf
{"title":"基于田间图像的小麦作物生育期识别的杂交传递神经网络。","authors":"Aisha Naseer, Madiha Amjad, Ali Raza, Kashif Munir, Aseel Smerat, Henry Fabian Gongora, Carlos Eduardo Uc Rios, Imran Ashraf","doi":"10.1038/s41598-025-96332-9","DOIUrl":null,"url":null,"abstract":"<p><p>Wheat is one of the world's most widely cultivated cereal crops and is a primary food source for a significant portion of the population. Wheat goes through several distinct developmental phases, and accurately identifying these stages is essential for precision farming. Determining wheat growth stages accurately is crucial for increasing the efficiency of agricultural yield in wheat farming. Preliminary research identified obstacles in distinguishing between these stages, negatively impacting crop yields. To address this, this study introduces an innovative approach, MobDenNet, based on data collection and real-time wheat crop stage recognition. The data collection utilized a diverse image dataset covering seven growth phases 'Crown Root', 'Tillering', 'Mid Vegetative', 'Booting', 'Heading', 'Anthesis', and 'Milking', comprising 4496 images. The collected image dataset underwent rigorous preprocessing and advanced data augmentation to refine and minimize biases. This study employed deep and transfer learning models, including MobileNetV2, DenseNet-121, NASNet-Large, InceptionV3, and a convolutional neural network (CNN) for performance comparison. Experimental evaluations demonstrated that the transfer model MobileNetV2 achieved 95% accuracy, DenseNet-121 achieved 94% accuracy, NASNet-Large achieved 76% accuracy, InceptionV3 achieved 74% accuracy, and the CNN achieved 68% accuracy. The proposed novel hybrid approach, MobDenNet, that synergistically merges the architectures of MobileNetV2 and DenseNet-121 neural networks, yields highly accurate results with precision, recall, and an F1 score of 99%. We validated the robustness of the proposed approach using the k-fold cross-validation. The proposed research ensures the detection of growth stages with great promise for boosting agricultural productivity and management practices, empowering farmers to optimize resource distribution and make informed decisions.</p>","PeriodicalId":21811,"journal":{"name":"Scientific Reports","volume":"15 1","pages":"11822"},"PeriodicalIF":3.9000,"publicationDate":"2025-04-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11976961/pdf/","citationCount":"0","resultStr":"{\"title\":\"Novel hybrid transfer neural network for wheat crop growth stages recognition using field images.\",\"authors\":\"Aisha Naseer, Madiha Amjad, Ali Raza, Kashif Munir, Aseel Smerat, Henry Fabian Gongora, Carlos Eduardo Uc Rios, Imran Ashraf\",\"doi\":\"10.1038/s41598-025-96332-9\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Wheat is one of the world's most widely cultivated cereal crops and is a primary food source for a significant portion of the population. Wheat goes through several distinct developmental phases, and accurately identifying these stages is essential for precision farming. Determining wheat growth stages accurately is crucial for increasing the efficiency of agricultural yield in wheat farming. Preliminary research identified obstacles in distinguishing between these stages, negatively impacting crop yields. To address this, this study introduces an innovative approach, MobDenNet, based on data collection and real-time wheat crop stage recognition. 
The data collection utilized a diverse image dataset covering seven growth phases 'Crown Root', 'Tillering', 'Mid Vegetative', 'Booting', 'Heading', 'Anthesis', and 'Milking', comprising 4496 images. The collected image dataset underwent rigorous preprocessing and advanced data augmentation to refine and minimize biases. This study employed deep and transfer learning models, including MobileNetV2, DenseNet-121, NASNet-Large, InceptionV3, and a convolutional neural network (CNN) for performance comparison. Experimental evaluations demonstrated that the transfer model MobileNetV2 achieved 95% accuracy, DenseNet-121 achieved 94% accuracy, NASNet-Large achieved 76% accuracy, InceptionV3 achieved 74% accuracy, and the CNN achieved 68% accuracy. The proposed novel hybrid approach, MobDenNet, that synergistically merges the architectures of MobileNetV2 and DenseNet-121 neural networks, yields highly accurate results with precision, recall, and an F1 score of 99%. We validated the robustness of the proposed approach using the k-fold cross-validation. The proposed research ensures the detection of growth stages with great promise for boosting agricultural productivity and management practices, empowering farmers to optimize resource distribution and make informed decisions.</p>\",\"PeriodicalId\":21811,\"journal\":{\"name\":\"Scientific Reports\",\"volume\":\"15 1\",\"pages\":\"11822\"},\"PeriodicalIF\":3.9000,\"publicationDate\":\"2025-04-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11976961/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Scientific Reports\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://doi.org/10.1038/s41598-025-96332-9\",\"RegionNum\":2,\"RegionCategory\":\"综合性期刊\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MULTIDISCIPLINARY SCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Scientific Reports","FirstCategoryId":"103","ListUrlMain":"https://doi.org/10.1038/s41598-025-96332-9","RegionNum":2,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MULTIDISCIPLINARY SCIENCES","Score":null,"Total":0}
Novel hybrid transfer neural network for wheat crop growth stages recognition using field images.
Wheat is one of the world's most widely cultivated cereal crops and a primary food source for a significant portion of the population. Wheat passes through several distinct developmental phases, and accurately identifying these stages is essential for precision farming. Determining growth stages accurately is crucial for improving yield efficiency in wheat cultivation. Preliminary research identified obstacles in distinguishing between these stages, which negatively impact crop yields. To address this, this study introduces MobDenNet, an approach based on field data collection and real-time recognition of wheat crop stages. The data collection produced a diverse dataset of 4496 images covering seven growth phases: 'Crown Root', 'Tillering', 'Mid Vegetative', 'Booting', 'Heading', 'Anthesis', and 'Milking'. The collected images underwent rigorous preprocessing and advanced data augmentation to refine the dataset and minimize biases. For performance comparison, the study employed deep learning and transfer learning models, including MobileNetV2, DenseNet-121, NASNet-Large, InceptionV3, and a convolutional neural network (CNN). Experimental evaluations showed that the transfer learning model MobileNetV2 achieved 95% accuracy, DenseNet-121 achieved 94%, NASNet-Large achieved 76%, InceptionV3 achieved 74%, and the CNN achieved 68%. The proposed hybrid approach, MobDenNet, which merges the MobileNetV2 and DenseNet-121 architectures, yields highly accurate results, with precision, recall, and F1 score all at 99%. The robustness of the approach was validated using k-fold cross-validation. The proposed method detects growth stages reliably and holds great promise for boosting agricultural productivity and management practices, helping farmers optimize resource allocation and make informed decisions.
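The abstract mentions preprocessing and data augmentation but does not specify the transforms. As a purely illustrative sketch, not the authors' pipeline, an on-the-fly augmentation stage for wheat field images could look like the following; the transform choices and parameters are assumptions.

```python
# Hedged sketch of an on-the-fly augmentation pipeline for wheat field images.
# The specific transforms and parameters used in the paper are not given in the
# abstract; the choices below are illustrative assumptions only.
import tensorflow as tf

augment = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),   # canopies have no preferred left/right orientation
    tf.keras.layers.RandomRotation(0.05),       # rotate up to roughly 18 degrees either way
    tf.keras.layers.RandomZoom(0.1),            # mild scale jitter from varying camera distance
    tf.keras.layers.RandomContrast(0.1),        # simulate changing field lighting
], name="field_image_augmentation")

# Example: augment a batch of 224x224 RGB crops during training only.
images = tf.random.uniform((8, 224, 224, 3), maxval=255.0)
augmented = augment(images, training=True)
```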
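The abstract describes MobDenNet as a hybrid of the MobileNetV2 and DenseNet-121 architectures but does not give the fusion details. Below is a minimal sketch of one plausible realization, assuming the two ImageNet-pretrained backbones run in parallel on the same image and their pooled features are concatenated before a small classification head; the input size, head layout, and training setup are assumptions, not the authors' implementation.

```python
# Minimal sketch of a MobDenNet-style hybrid: two pretrained backbones in parallel,
# pooled features concatenated, then a small softmax head for the seven stages.
# Input size, head width, dropout, and optimizer are illustrative assumptions.
import tensorflow as tf

NUM_CLASSES = 7                 # Crown Root, Tillering, Mid Vegetative, Booting, Heading, Anthesis, Milking
IMG_SHAPE = (224, 224, 3)

def build_hybrid_model():
    inputs = tf.keras.Input(shape=IMG_SHAPE)    # raw RGB pixels in [0, 255]

    # Each backbone receives its own standard ImageNet preprocessing.
    mob_in = tf.keras.layers.Lambda(tf.keras.applications.mobilenet_v2.preprocess_input)(inputs)
    den_in = tf.keras.layers.Lambda(tf.keras.applications.densenet.preprocess_input)(inputs)

    mobilenet = tf.keras.applications.MobileNetV2(
        include_top=False, weights="imagenet", input_shape=IMG_SHAPE, pooling="avg")
    densenet = tf.keras.applications.DenseNet121(
        include_top=False, weights="imagenet", input_shape=IMG_SHAPE, pooling="avg")

    # Freeze the pretrained weights so only the fusion head is trained initially.
    mobilenet.trainable = False
    densenet.trainable = False

    mob_feat = mobilenet(mob_in)    # (batch, 1280) global-average-pooled features
    den_feat = densenet(den_in)     # (batch, 1024) global-average-pooled features

    fused = tf.keras.layers.Concatenate()([mob_feat, den_feat])
    x = tf.keras.layers.Dense(256, activation="relu")(fused)
    x = tf.keras.layers.Dropout(0.3)(x)
    outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)

    model = tf.keras.Model(inputs, outputs, name="mobdennet_sketch")
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_hybrid_model()
model.summary()
```

The k-fold cross-validation reported in the abstract could be wrapped around such a model by, for example, using scikit-learn's StratifiedKFold to split the 4496 images by stage label before each training run.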
Journal introduction:
We publish original research from all areas of the natural sciences, psychology, medicine and engineering. You can learn more about what we publish by browsing our specific scientific subject areas below or by exploring all articles and collections in Scientific Reports.
Scientific Reports has a 2-year impact factor of 4.380 (2021) and is the 6th most-cited journal in the world, with more than 540,000 citations in 2020 (Clarivate Analytics, 2021).
•Engineering
Engineering covers all aspects of engineering, technology, and applied science. It plays a crucial role in the development of technologies to address some of the world's biggest challenges, helping to save lives and improve the way we live.
•Physical sciences
Physical sciences are those academic disciplines that aim to uncover the underlying laws of nature — often written in the language of mathematics. It is a collective term for areas of study including astronomy, chemistry, materials science and physics.
•Earth and environmental sciences
Earth and environmental sciences cover all aspects of Earth and planetary science and broadly encompass solid Earth processes, surface and atmospheric dynamics, Earth system history, climate and climate change, marine and freshwater systems, and ecology. The field also considers the interactions between humans and these systems.
•Biological sciences
Biological sciences encompass all the divisions of natural sciences examining various aspects of vital processes. The concept includes anatomy, physiology, cell biology, biochemistry and biophysics, and covers all organisms, from microorganisms and animals to plants.
•Health sciences
The health sciences study health, disease and healthcare. This field of study aims to develop knowledge, interventions and technology for use in healthcare to improve the treatment of patients.