RumexWeeds: A grassland dataset for agricultural robotics
Ronja Güldenring, Frits K. van Evert, Lazaros Nalpantidis
Journal of Field Robotics, 40(6), pp. 1639–1656, 2023. DOI: 10.1002/rob.22196
Abstract
Computer vision can lead toward more sustainable agricultural production by enabling robotic precision agriculture. Vision-equipped robots are being deployed in the fields to take care of crops and control weeds. However, publicly available agricultural datasets containing both image data and data from navigational robot sensors are scarce. Our real-world dataset RumexWeeds targets the detection of the grassland weeds Rumex obtusifolius L. and Rumex crispus L. RumexWeeds includes whole image sequences instead of individual static images, which is rare for computer vision image datasets yet crucial for robotic applications. This allows for more robust object detection that incorporates temporal aspects and considers different viewpoints of the same object. Furthermore, RumexWeeds includes data from additional navigational robot sensors—GNSS, IMU, and odometry—which can increase robustness when fed to detection models alongside the images. In total, the dataset includes 5510 images with 15,519 manual bounding box annotations, collected at three different farms on four different days in summer and autumn 2021. Additionally, RumexWeeds includes a subset of 340 ground-truth pixel-wise annotations. The dataset is publicly available at https://dtu-pas.github.io/RumexWeeds/. In this paper we also use RumexWeeds to provide baseline weed detection results obtained with a state-of-the-art object detector; in this way we elucidate interesting characteristics of the dataset.
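To make the dataset's structure concrete, below is a minimal sketch of how one might iterate over an image sequence together with its bounding-box annotations and the accompanying GNSS/IMU/odometry readings. The directory layout, file names, and JSON keys used here are assumptions for illustration only; the actual on-disk format is documented on the dataset page linked above.

```python
# Hypothetical sketch of reading a RumexWeeds-style image sequence.
# File names, JSON keys, and directory layout are assumptions, not the
# dataset's documented format (see https://dtu-pas.github.io/RumexWeeds/).
import json
from pathlib import Path

def load_sequence(seq_dir: str):
    """Yield (image_path, boxes, nav) tuples for one image sequence."""
    seq = Path(seq_dir)
    # Assumed layout: an images/ folder plus one JSON file of per-frame metadata.
    with open(seq / "annotations.json") as f:
        frames = json.load(f)
    for frame in frames:
        image_path = seq / "images" / frame["file_name"]
        # Each box: [x_min, y_min, width, height] plus a class label,
        # e.g. "rumex_obtusifolius" or "rumex_crispus" (assumed names).
        boxes = [(b["bbox"], b["label"]) for b in frame.get("boxes", [])]
        # Navigational sensor readings associated with this frame.
        nav = {k: frame.get(k) for k in ("gnss", "imu", "odometry")}
        yield image_path, boxes, nav

if __name__ == "__main__":
    for image_path, boxes, nav in load_sequence("RumexWeeds/farm1/seq_01"):
        print(image_path.name, len(boxes), "boxes", nav["gnss"])
```

Because annotations are organized per sequence rather than per static image, a loader like this can expose consecutive frames of the same plant, which is what enables detectors to exploit temporal context and multiple viewpoints.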
About the journal:
The Journal of Field Robotics seeks to promote scholarly publications dealing with the fundamentals of robotics in unstructured and dynamic environments.
The Journal focuses on experimental robotics and encourages publication of work that has both theoretical and practical significance.