Title: Enhanced visual detection of litchi fruit in complex natural environments based on unmanned aerial vehicle (UAV) remote sensing
Authors: Changjiang Liang, Juntao Liang, Weiguang Yang, Weiyi Ge, Jing Zhao, Zhaorong Li, Shudai Bai, Jiawen Fan, Yubin Lan, Yongbing Long
DOI: 10.1007/s11119-025-10220-w (https://doi.org/10.1007/s11119-025-10220-w)
Journal: Precision Agriculture (JCR Q1, AGRICULTURE, MULTIDISCIPLINARY; impact factor 5.4)
Publication date: 2025-01-22
Publication type: Journal Article
Citations: 0
Abstract
Rapid and accurate detection of fruits is crucial for estimating yields and making scientific decisions in litchi orchards. However, litchis grow in complex natural environments, characterized by variable lighting, severe occlusion from branches and leaves, small fruit sizes, and dense overlapping, all of which pose significant challenges for accurate detection. This paper addressed this problem by proposing a method that combines unmanned aerial vehicle (UAV) remote sensing and deep learning for litchi detection. A remote sensing image dataset of litchi fruit was first constructed. Subsequently, an improved algorithm, YOLOv7-MSRSF, was developed. Experimental results demonstrated that YOLOv7-MSRSF's mean average precision (mAP) reached 96.1%, outperforming YOLOv7 and pure transformer algorithms by 3.7% and 20.6%, respectively. Tests on 24 randomly selected images demonstrated that integrating the Swin-transformer module into YOLOv7 improved litchi fruit detection accuracy under severe occlusion, dense overlapping, and variable lighting by 19.55%, 6.63%, and 13.94%, respectively. YOLOv7-MSRSF showed further improvements under these three complex conditions, with detection accuracy increasing by 26.99%, 9.82%, and 18.68%, respectively, reaching 89.16%, 97.79%, and 95.56%. Furthermore, the Real-ESRGAN algorithm significantly enhanced the YOLOv7-MSRSF model's recognition accuracy for objects in low-resolution images captured by high-altitude drones. The average detection accuracy on three images collected at 27.5 m above the canopy reached 82.2%, an improvement of 70.6 percentage points over the 11.6% achieved before super-resolution processing. The proposed method offers valuable guidance for detecting small, dense agricultural objects in large-scale, complex natural environments.
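The super-resolution gain quoted in the abstract is a difference in percentage points, not a relative improvement; a minimal Python sketch using only the figures stated in the abstract makes the arithmetic explicit:

```python
# Detection accuracy on three 27.5 m UAV images, as reported in the abstract:
# 11.6% before Real-ESRGAN super-resolution, 82.2% after.
before, after = 11.6, 82.2

# The reported gain is the absolute difference in percentage points.
gain = round(after - before, 1)
print(gain)  # 70.6
```

Note that the relative improvement (82.2 / 11.6 ≈ 7x) would be a much larger number; the paper's 70.6% figure is the absolute percentage-point difference.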
About the journal:
Precision Agriculture publishes the most innovative research in the field of precision agriculture. It provides an effective forum for disseminating original and fundamental research and practical experience in the rapidly advancing area of precision farming.
Topics addressed include, but are not limited to:
Natural Resources Variability: Soil and landscape variability, digital elevation models, soil mapping, geostatistics, geographic information systems, microclimate, weather forecasting, remote sensing, management units, scale, etc.
Managing Variability: Sampling techniques, site-specific nutrient and crop protection chemical recommendation, crop quality, tillage, seed density, seed variety, yield mapping, remote sensing, record keeping systems, data interpretation and use, crops (corn, wheat, sugar beets, potatoes, peanut, cotton, vegetables, etc.), management scale, etc.
Engineering Technology: Computers, positioning systems, DGPS, machinery, tillage, planting, nutrient and crop protection implements, manure, irrigation, fertigation, yield monitor and mapping, soil physical and chemical characteristic sensors, weed/pest mapping, etc.
Profitability: MEY, net returns, BMPs, optimum recommendations, crop quality, technology cost, sustainability, social impacts, marketing, cooperatives, farm scale, crop type, etc.
Environment: Nutrient, crop protection chemicals, sediments, leaching, runoff, practices, field, watershed, on/off farm, artificial drainage, ground water, surface water, etc.
Technology Transfer: Skill needs, education, training, outreach, methods, surveys, agri-business, producers, distance education, Internet, simulation models, decision support systems, expert systems, on-farm experimentation, partnerships, quality of rural life, etc.