Enhancing visual autonomous navigation in row-based crops with effective synthetic data generation

Mauro Martini, Marco Ambrosio, Alessandro Navone, Brenno Tuberga, Marcello Chiaberge

Precision Agriculture, published 2024-06-11. DOI: 10.1007/s11119-024-10157-6
Introduction
Service robotics has recently been enhancing precision agriculture, enabling many automated processes built on efficient autonomous navigation solutions. However, data generation and in-field validation campaigns hinder the progress of large-scale autonomous platforms. Simulated environments and deep visual perception are spreading as successful tools for speeding up the development of robust navigation with low-cost RGB-D cameras.
Materials and methods
In this context, the contribution of this work resides in a complete framework to fully exploit synthetic data for robust visual control of mobile robots. A wide, realistic multi-crop dataset is accurately generated to train deep semantic segmentation networks, enabling robust performance in challenging real-world conditions. An automatic parametric approach enables easy customization of virtual field geometry and features for fast, reliable evaluation of navigation algorithms.
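As a purely illustrative aid, the sketch below shows what a parametric description of a row-based virtual field might look like; the parameter names (row count, row spacing, plant spacing, position jitter) and the generator function are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch of a parametric row-crop field layout.
# All parameter names and defaults are illustrative assumptions,
# not the authors' actual field-generation interface.
from dataclasses import dataclass
import random


@dataclass
class FieldParams:
    rows: int = 8                    # number of crop rows
    row_length_m: float = 30.0       # length of each row
    row_spacing_m: float = 1.5       # distance between adjacent rows
    plant_spacing_m: float = 0.4     # distance between plants along a row
    position_jitter_m: float = 0.05  # random offset applied to each plant


def generate_plant_positions(p: FieldParams, seed: int = 0):
    """Return (x, y) plant positions on a regular row grid with small
    random perturbations, as a simple stand-in for an automatic
    parametric field generator."""
    rng = random.Random(seed)
    positions = []
    plants_per_row = int(p.row_length_m / p.plant_spacing_m)
    for r in range(p.rows):
        for i in range(plants_per_row):
            x = i * p.plant_spacing_m + rng.uniform(-p.position_jitter_m, p.position_jitter_m)
            y = r * p.row_spacing_m + rng.uniform(-p.position_jitter_m, p.position_jitter_m)
            positions.append((x, y))
    return positions


if __name__ == "__main__":
    field = FieldParams(rows=4, row_length_m=10.0)
    print(len(generate_plant_positions(field)), "plants generated")
```

Varying such parameters (and a random seed) would allow many distinct virtual fields to be produced automatically for benchmarking navigation algorithms under controlled geometric conditions.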
Results and conclusion
The high quality of the generated synthetic dataset is demonstrated through extensive experimentation with real crop images and by benchmarking the resulting robot navigation in both virtual and real fields with relevant metrics.
Journal description
Precision Agriculture promotes the most innovative results coming from the research in the field of precision agriculture. It provides an effective forum for disseminating original and fundamental research and experience in the rapidly advancing area of precision farming.
The field of precision agriculture spans many topics; those addressed include, but are not limited to:
Natural Resources Variability: Soil and landscape variability, digital elevation models, soil mapping, geostatistics, geographic information systems, microclimate, weather forecasting, remote sensing, management units, scale, etc.
Managing Variability: Sampling techniques, site-specific nutrient and crop protection chemical recommendation, crop quality, tillage, seed density, seed variety, yield mapping, remote sensing, record keeping systems, data interpretation and use, crops (corn, wheat, sugar beets, potatoes, peanut, cotton, vegetables, etc.), management scale, etc.
Engineering Technology: Computers, positioning systems, DGPS, machinery, tillage, planting, nutrient and crop protection implements, manure, irrigation, fertigation, yield monitor and mapping, soil physical and chemical characteristic sensors, weed/pest mapping, etc.
Profitability: MEY, net returns, BMPs, optimum recommendations, crop quality, technology cost, sustainability, social impacts, marketing, cooperatives, farm scale, crop type, etc.
Environment: Nutrient, crop protection chemicals, sediments, leaching, runoff, practices, field, watershed, on/off farm, artificial drainage, ground water, surface water, etc.
Technology Transfer: Skill needs, education, training, outreach, methods, surveys, agri-business, producers, distance education, Internet, simulation models, decision support systems, expert systems, on-farm experimentation, partnerships, quality of rural life, etc.