GrapeSLAM: UAV-based monocular visual dataset for SLAM, SfM and 3D reconstruction with trajectories under challenging illumination conditions

Impact Factor: 1.0 | Q3 | Multidisciplinary Sciences
Kaiwen Wang, Sergio Vélez, Lammert Kooistra, Wensheng Wang, João Valente
{"title":"GrapeSLAM: UAV-based monocular visual dataset for SLAM, SfM and 3D reconstruction with trajectories under challenging illumination conditions","authors":"Kaiwen Wang ,&nbsp;Sergio Vélez ,&nbsp;Lammert Kooistra ,&nbsp;Wensheng Wang ,&nbsp;João Valente","doi":"10.1016/j.dib.2025.111495","DOIUrl":null,"url":null,"abstract":"<div><div>SLAM (Simultaneous Localization and Mapping) is an efficient method for robot to percept surrendings and make decisions, especially for robots in agricultural scenarios. Perception and path planning in an automatic way is crucial for precision agriculture. However, there are limited public datasets to implement and develop robotic algorithms for agricultural environments. Therefore, we collected dataset “GrapeSLAM”. The ``GrapeSLAM'' dataset comprises video data collected from vineyards to support agricultural robotics research. Data collection involved two primary methods: (1) unmanned aerial vehicle (UAV) for capturing videos under different illumination conditions, and (2) trajectories of the UAV during each flight collected by RTK and IMU. The UAV used was Phantom 4 RTK, equipped with a high resolution camera, flying at around 1 to 3 meters above ground level.</div></div>","PeriodicalId":10973,"journal":{"name":"Data in Brief","volume":"60 ","pages":"Article 111495"},"PeriodicalIF":1.0000,"publicationDate":"2025-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Data in Brief","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2352340925002276","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"MULTIDISCIPLINARY SCIENCES","Score":null,"Total":0}
Citations: 0

Abstract

SLAM (Simultaneous Localization and Mapping) is an efficient method for robots to perceive their surroundings and make decisions, especially in agricultural scenarios. Automated perception and path planning are crucial for precision agriculture. However, few public datasets are available for implementing and developing robotic algorithms in agricultural environments. We therefore collected the “GrapeSLAM” dataset, which comprises video data collected from vineyards to support agricultural robotics research. Data collection involved two primary methods: (1) an unmanned aerial vehicle (UAV) capturing videos under different illumination conditions, and (2) recording of the UAV's trajectory during each flight using RTK and an IMU. The UAV used was a Phantom 4 RTK, equipped with a high-resolution camera, flying at approximately 1 to 3 meters above ground level.
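
Since the dataset pairs monocular UAV video with RTK/IMU flight trajectories, a typical first step for SLAM or SfM experiments is to extract frames from a video and load the corresponding trajectory log. The sketch below illustrates this under assumptions: the file names and the CSV column layout (timestamp, lat, lon, alt, roll, pitch, yaw) are hypothetical placeholders, not the dataset's documented structure.

```python
# Minimal sketch (assumptions): extract frames from one GrapeSLAM video and
# load an accompanying RTK/IMU trajectory. File names and the CSV column
# layout are hypothetical placeholders, not the dataset's documented format.
import csv

import cv2  # OpenCV for video decoding


def extract_frames(video_path, step=10):
    """Yield (frame_index, BGR image) every `step` frames from a video file."""
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:
            yield idx, frame
        idx += 1
    cap.release()


def load_trajectory(csv_path):
    """Read an assumed trajectory CSV into a list of dicts with float values."""
    rows = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            rows.append({k: float(v) for k, v in row.items()})
    return rows


if __name__ == "__main__":
    # Hypothetical file names; replace with the actual dataset paths.
    for i, frame in extract_frames("vineyard_flight_01.mp4", step=30):
        print(f"frame {i}: {frame.shape}")
    traj = load_trajectory("vineyard_flight_01_trajectory.csv")
    print(f"loaded {len(traj)} trajectory samples")
```

The frame step and any synchronization between frames and trajectory samples would need to follow the dataset's actual timestamps and frame rate.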
Source journal

Data in Brief (Multidisciplinary Sciences)
CiteScore: 3.10
Self-citation rate: 0.00%
Articles per year: 996
Review time: 70 days

Journal description: Data in Brief provides a way for researchers to easily share and reuse each other's datasets by publishing data articles that:
- Thoroughly describe your data, facilitating reproducibility.
- Make your data, which is often buried in supplementary material, easier to find.
- Increase traffic towards associated research articles and data, leading to more citations.
- Open up doors for new collaborations.
Because you never know what data will be useful to someone else, Data in Brief welcomes submissions that describe data from all research areas.