Title: Development of a sensor fusion method for crop row tracking operations
Authors: B. Benet, R. Lenain, V. Rousseau
DOI: 10.1017/S2040470017000310 (https://doi.org/10.1017/S2040470017000310)
Journal: Advances in Animal Biosciences, vol. 69, no. 1, pp. 583-589
Publication date: 2017-07-01
Citations: 3
Abstract
A sensor fusion method was developed to track crop rows across various crops and vegetation levels. The approach fuses data from a laser sensor, an inertial measurement unit, and a color camera to extract, in real time, the set of points belonging to the crop rows while rejecting noise such as grass or leaves in the environment. A Hough transform or least-squares (LS) fit is then applied to obtain the geometric parameters of the crop line, and automatic control steers the robot to track the row at a desired lateral offset, taking the robot's angular deviation and the temporal dynamics into account so that the task is performed accurately and without oscillation. The results demonstrate the robustness of the fusion method, which yields stable autonomous navigation for crop row tracking, particularly in vineyards subject to many perturbations such as bumps, holes, and mud, at speeds between 1 and 2 m s⁻¹. The mean lateral error between the desired and obtained trajectories varied between 0.10 and 0.40 m, depending on speed and perturbations.
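The pipeline the abstract describes (fit a line to the crop-row points, then control on the lateral and angular deviations) can be illustrated with a minimal sketch. This is not the authors' implementation: the least-squares fit, the point-to-line distance, and the proportional steering law with gains `k_lat` and `k_ang` are all illustrative assumptions standing in for the paper's Hough/LS extraction and its control scheme.

```python
import numpy as np

def fit_crop_row(points: np.ndarray) -> tuple[float, float]:
    """Least-squares fit of a line y = a*x + b to crop-row points (N x 2 array).
    Stand-in for the paper's Hough/LS crop-line extraction step."""
    x, y = points[:, 0], points[:, 1]
    a, b = np.polyfit(x, y, 1)
    return float(a), float(b)

def lateral_error(a: float, b: float, desired_offset: float) -> float:
    """Perpendicular distance from the robot origin (0, 0) to the fitted line
    a*x - y + b = 0, minus the desired lateral offset."""
    return abs(b) / np.hypot(a, 1.0) - desired_offset

def steering_command(lat_err: float, ang_err: float,
                     k_lat: float = 0.5, k_ang: float = 1.0) -> float:
    """Hypothetical proportional law combining lateral and angular deviation;
    the paper's actual controller also accounts for temporal dynamics."""
    return -(k_lat * lat_err + k_ang * ang_err)
```

In practice the points fed to `fit_crop_row` would be the fused laser/camera detections after noise rejection, and the angular error would come from the fitted slope and the IMU heading.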