{"title":"Recognition of field roads based on improved U-Net++ Network","authors":"Lili Yang, Yuanbo Li, Mengshuai Chang, Yuanyuan Xu, Bingbing Hu, Xinxin Wang, Caicong Wu","doi":"10.25165/j.ijabe.20231602.7941","DOIUrl":null,"url":null,"abstract":": Unmanned driving of agricultural machinery has garnered significant attention in recent years, especially with the development of precision farming and sensor technologies. To achieve high performance and low cost, perception tasks are of great importance. In this study, a low-cost and high-safety method was proposed for field road recognition in unmanned agricultural machinery. The approach of this study utilized point clouds, with low-resolution lidar point clouds as inputs, generating high-resolution point clouds and Bird's Eye View (BEV) images that were encoded with several basic statistics. Using a BEV representation, road detection was reduced to a single-scale problem that could be addressed with an improved U-Net++ neural network. Three enhancements were proposed for U-Net++: 1) replacing the convolutional kernel in the original U-Net++ with an Asymmetric Convolution Block (ACBlock); 2) adding a multi-branch Asymmetric Dilated Convolutional Block (MADC) in the highest semantic information layer; 3) adding an Attention Gate (AG) model to the long-skip-connection in the decoding stage. The results of experiments of this study showed that our algorithm achieved a Mean Intersection Over Union of 96.54% on the 16-channel point clouds, which was 7.35 percentage points higher than U-Net++. Furthermore, the average processing time of the model was about 70 ms, meeting the time requirements of unmanned driving in agricultural machinery. The proposed method of this study can be applied to enhance the perception ability of unmanned agricultural machinery thereby increasing the safety of field road driving.","PeriodicalId":13895,"journal":{"name":"International Journal of Agricultural and Biological Engineering","volume":"112 1","pages":""},"PeriodicalIF":2.2000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Agricultural and Biological Engineering","FirstCategoryId":"97","ListUrlMain":"https://doi.org/10.25165/j.ijabe.20231602.7941","RegionNum":2,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"AGRICULTURAL ENGINEERING","Score":null,"Total":0}
Abstract
Unmanned driving of agricultural machinery has garnered significant attention in recent years, especially with the development of precision farming and sensor technologies. To achieve high performance at low cost, perception tasks are of great importance. In this study, a low-cost, high-safety method was proposed for field road recognition in unmanned agricultural machinery. The approach used low-resolution lidar point clouds as inputs to generate high-resolution point clouds and Bird's Eye View (BEV) images encoded with several basic statistics. With the BEV representation, road detection was reduced to a single-scale problem that could be addressed with an improved U-Net++ neural network. Three enhancements were proposed for U-Net++: 1) replacing the convolutional kernels in the original U-Net++ with Asymmetric Convolution Blocks (ACBlock); 2) adding a multi-branch Asymmetric Dilated Convolution block (MADC) in the layer carrying the highest-level semantic information; 3) adding an Attention Gate (AG) to the long skip connections in the decoding stage. Experimental results showed that the algorithm achieved a Mean Intersection over Union of 96.54% on 16-channel point clouds, 7.35 percentage points higher than U-Net++. Furthermore, the average processing time of the model was about 70 ms, meeting the timing requirements of unmanned driving for agricultural machinery. The proposed method can be applied to enhance the perception ability of unmanned agricultural machinery, thereby increasing the safety of field road driving.
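To make the two most widely known of these building blocks concrete, the following is a minimal PyTorch sketch of an Asymmetric Convolution Block and an Attention Gate. It is an illustration under assumptions, not the paper's implementation: channel widths, normalization choices, and the exact AG wiring (here following the common design where a decoder gating signal re-weights encoder skip features) are hypothetical, and the MADC module is not shown.

```python
# Hedged sketch of an ACBlock and an Attention Gate as commonly described
# in the literature; configuration details are assumptions for illustration.
import torch
import torch.nn as nn


class ACBlock(nn.Module):
    """Asymmetric Convolution Block: a square k x k convolution strengthened
    by parallel 1 x k and k x 1 branches whose outputs are summed."""

    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        p = k // 2
        self.square = nn.Sequential(nn.Conv2d(in_ch, out_ch, (k, k), padding=(p, p)),
                                    nn.BatchNorm2d(out_ch))
        self.hor = nn.Sequential(nn.Conv2d(in_ch, out_ch, (1, k), padding=(0, p)),
                                 nn.BatchNorm2d(out_ch))
        self.ver = nn.Sequential(nn.Conv2d(in_ch, out_ch, (k, 1), padding=(p, 0)),
                                 nn.BatchNorm2d(out_ch))
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        # Sum of the square, horizontal, and vertical branches
        return self.act(self.square(x) + self.hor(x) + self.ver(x))


class AttentionGate(nn.Module):
    """Attention Gate: re-weights encoder skip features with a coarser
    gating signal from the decoder before they are concatenated."""

    def __init__(self, skip_ch, gate_ch, inter_ch):
        super().__init__()
        self.theta = nn.Conv2d(skip_ch, inter_ch, 1)   # project skip features
        self.phi = nn.Conv2d(gate_ch, inter_ch, 1)     # project gating signal
        self.psi = nn.Conv2d(inter_ch, 1, 1)           # scalar attention map
        self.act = nn.ReLU(inplace=True)

    def forward(self, skip, gate):
        # gate is assumed to already match the skip feature resolution
        a = torch.sigmoid(self.psi(self.act(self.theta(skip) + self.phi(gate))))
        return skip * a                                # attended skip features


if __name__ == "__main__":
    x = torch.randn(1, 32, 64, 64)      # e.g. a BEV feature map
    g = torch.randn(1, 64, 64, 64)      # decoder gating features
    feats = ACBlock(32, 64)(x)
    out = AttentionGate(64, 64, 32)(feats, g)
    print(out.shape)                    # torch.Size([1, 64, 64, 64])
```

In a U-Net++-style network, blocks like these would replace the plain double convolutions in each node, with the attention gate applied to the long skip connections feeding the decoder, as the abstract describes.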
Journal Introduction
International Journal of Agricultural and Biological Engineering (IJABE, https://www.ijabe.org) is a peer-reviewed, open-access international journal. IJABE, started in 2008, is a joint publication co-sponsored by the US-based Association of Overseas Chinese Agricultural, Biological and Food Engineers (AOCABFE) and the China-based Chinese Society of Agricultural Engineering (CSAE). The ISSN 1934-6344 and eISSN 1934-6352 numbers for the print and online editions of IJABE have been registered in the US. IJABE is published in both online and print versions by the Chinese Academy of Agricultural Engineering.