2019 European Conference on Mobile Robots (ECMR): Latest Publications

Sensor Aware Lidar Odometry
2019 European Conference on Mobile Robots (ECMR) Pub Date: 2019-07-22 DOI: 10.1109/ECMR.2019.8870929
D. Kovalenko, Mikhail Korobkin, Andrey Minin
Abstract: A lidar odometry method that integrates knowledge about the physics of the sensor into the computation is proposed. A model of measurement error enables higher precision in estimating the covariance of point normals. Adjacent laser beams are used in an outlier correspondence rejection scheme. The method ranks on the KITTI leaderboard with a positioning error of 1.37%; 3.67% is achieved in comparison with the LOAM method on an internal dataset.
Citations: 10
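The entry above hinges on estimating point normals and their covariance from lidar returns. A minimal sketch of the standard PCA normal fit, with a purely hypothetical linear range-noise model standing in for the paper's measurement-error model (the coefficients `a` and `b` are invented for illustration):

```python
import numpy as np

def estimate_normal(neighbors):
    """Fit a plane to a point's neighbors by PCA; the normal is the
    eigenvector of the scatter matrix with the smallest eigenvalue."""
    centered = neighbors - neighbors.mean(axis=0)
    scatter = centered.T @ centered / len(neighbors)
    eigvals, eigvecs = np.linalg.eigh(scatter)  # eigenvalues in ascending order
    return eigvecs[:, 0]

def range_noise_sigma(r, a=0.01, b=0.002):
    """Hypothetical lidar range-noise model: std grows linearly with range."""
    return a + b * r

# Points sampled near the plane z = 0, so the normal should be close to (0, 0, 1).
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(50, 3))
pts[:, 2] = 0.001 * rng.standard_normal(50)
n = estimate_normal(pts)
print(np.abs(n))  # dominant component is along z
```

A sensor-aware scheme would weight each neighbor by its `range_noise_sigma` when building the scatter matrix; the unweighted fit above is only the baseline.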
Motion Control for Steerable Wheeled Mobile Manipulation
2019 European Conference on Mobile Robots (ECMR) Pub Date: 2019-07-19 DOI: 10.1109/ECMR.2019.8870958
M. Sorour, A. Cherubini, P. Fraisse
Abstract: In this paper, we address the problem of long-travel mobile manipulation for steerable wheeled mobile robots (SWMR) operating in human-shared environments. On one hand, a small footprint is required while maintaining a fixed arm configuration, to make robot motion predictable for nearby individuals during the long traverse. On the other hand, redundancy resolution poses a challenge, since there is no direct kinematic mapping between the task and joint spaces for SWMR. Hence, we propose a redundancy resolution algorithm that switches between three modes of operation based on the Euclidean norm of the motion task error. In particular, we employ a floating-base model for the mobile platform and enhance the end-effector motion performance by predicting the error between this model and the actual SWMR one. This error is then compensated using the highly responsive arm manipulator. The proposed methodology is validated in simulation on a Neobotix MPO-700 SWMR with a Kuka LWR-IV manipulator mounted on it.
Citations: 6
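The three-mode switching described in the abstract can be illustrated with a toy selector on the task-error norm; the thresholds and mode names below are assumptions for illustration, not values from the paper:

```python
import numpy as np

# Hypothetical thresholds (metres); the paper's actual values are not given here.
COARSE, FINE = 0.5, 0.05

def select_mode(task_error):
    """Pick one of three control modes from the Euclidean norm of the
    motion-task error, echoing the switching scheme described above."""
    e = np.linalg.norm(task_error)
    if e > COARSE:
        return "platform_dominant"   # large error: drive the mobile base
    if e > FINE:
        return "coordinated"         # medium error: base and arm together
    return "arm_dominant"            # small residual: responsive arm compensates

print(select_mode([1.0, 0.2, 0.0]))  # platform_dominant
```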
Stochastic Optimization for Trajectory Planning with Heteroscedastic Gaussian Processes
2019 European Conference on Mobile Robots (ECMR) Pub Date: 2019-07-17 DOI: 10.1109/ECMR.2019.8870970
Luka Petrović, Juraj Peršić, Marija Seder, Ivan Marković
Abstract: Trajectory optimization methods for motion planning attempt to generate trajectories that minimize a suitable objective function. Such methods efficiently find solutions even for high-degree-of-freedom robots. However, a globally optimal solution is often intractable in practice, and state-of-the-art trajectory optimization methods are thus prone to local minima, especially in cluttered environments. In this paper, we propose a novel motion planning algorithm that employs stochastic optimization based on the cross-entropy method in order to tackle the local-minima problem. We represent trajectories as samples from a continuous-time Gaussian process and introduce heteroscedasticity to generate powerful trajectory priors better suited for collision avoidance in motion planning problems. Our experimental evaluation shows that the proposed approach yields a more thorough exploration of the solution space and a higher success rate in complex environments than a current state-of-the-art Gaussian-process-based trajectory optimization method, GPMP2, while having comparable execution time.
Citations: 7
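The cross-entropy method at the core of the abstract above is a generic sampling-based optimizer; this sketch shows plain CEM on a toy quadratic cost, not the paper's heteroscedastic Gaussian-process trajectory prior:

```python
import numpy as np

def cross_entropy_method(cost, mu, sigma, n_samples=100, n_elite=10, iters=50):
    """Generic cross-entropy method: sample candidates from a Gaussian,
    keep the lowest-cost elites, refit the Gaussian to them, repeat."""
    rng = np.random.default_rng(0)
    for _ in range(iters):
        samples = rng.normal(mu, sigma, size=(n_samples, len(mu)))
        costs = np.array([cost(s) for s in samples])
        elites = samples[np.argsort(costs)[:n_elite]]
        mu = elites.mean(axis=0)
        sigma = elites.std(axis=0) + 1e-6  # floor keeps sampling alive
    return mu

# Toy objective: a 2-D quadratic with its minimum at (3, -2).
best = cross_entropy_method(lambda x: (x[0] - 3) ** 2 + (x[1] + 2) ** 2,
                            mu=np.zeros(2), sigma=np.ones(2) * 5.0)
print(best)  # close to (3, -2)
```

In the paper the sampled objects are whole trajectories drawn from a GP prior and the cost includes collision terms; the elite-refit loop is the same.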
Stereo Event Lifetime and Disparity Estimation for Dynamic Vision Sensors
2019 European Conference on Mobile Robots (ECMR) Pub Date: 2019-07-17 DOI: 10.1109/ECMR.2019.8870946
Antea Hadviger, Ivan Marković, I. Petrović
Abstract: Event-based cameras are biologically inspired sensors that output asynchronous pixel-wise brightness changes in the scene, called events. They have a high dynamic range and a temporal resolution of a microsecond, as opposed to standard cameras, which output frames at fixed rates and suffer from motion blur. Forming stereo pairs of such cameras opens novel application possibilities, since depth can be readily estimated for each event; however, to fully exploit the asynchronous nature of the sensor and avoid accumulating events over fixed time intervals, stereo event lifetime estimation should be employed. In this paper, we propose a novel method for event lifetime estimation for stereo event cameras, allowing the generation of sharp gradient images of events that serve as input to disparity estimation methods. Since a single brightness change triggers events in both event-camera sensors, we propose a method for single-shot event lifetime and disparity estimation, with association via stereo matching. The proposed method is approximately twice as fast and more accurate than estimating lifetimes separately for each sensor and then stereo matching. Results are validated on real-world data through multiple stereo event-camera experiments.
Citations: 3
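A common definition of event lifetime in the event-camera literature is the time the moving edge that triggered the event needs to cross one pixel, i.e. the inverse of its image-plane speed. A minimal sketch under that definition, assuming the flow velocity is already estimated; this is not the paper's stereo-coupled, single-shot estimator:

```python
import numpy as np

def event_lifetime(flow_velocity, eps=1e-9):
    """Lifetime of an event: the time for the edge that caused it to move
    one pixel, i.e. the inverse of the image-plane speed in px/s."""
    speed = np.linalg.norm(flow_velocity)
    return 1.0 / max(speed, eps)

# An edge moving at 200 px/s keeps its event valid for about 5 ms.
print(event_lifetime((200.0, 0.0)))  # 0.005
```

The paper's contribution is estimating this quantity once per stereo-matched event pair instead of once per sensor, which is where the speed and accuracy gains come from.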
Real-time Vision-based Depth Reconstruction with NVidia Jetson
2019 European Conference on Mobile Robots (ECMR) Pub Date: 2019-07-16 DOI: 10.1109/ECMR.2019.8870936
A. Bokovoy, K. Muravyev, K. Yakovlev
Abstract: Vision-based depth reconstruction is a challenging problem, extensively studied in computer vision but still lacking a universal solution. Reconstructing depth from a single image is particularly valuable to mobile robotics, as it can be embedded into modern vision-based simultaneous localization and mapping (vSLAM) methods, providing them with the metric information needed to construct accurate maps at real scale. Depth reconstruction is typically done nowadays via fully convolutional neural networks (FCNNs). In this work we experiment with several FCNN architectures and introduce a few enhancements aimed at increasing both the effectiveness and the efficiency of inference. We experimentally determine the solution that provides the best performance/accuracy tradeoff and is able to run on NVidia Jetson with framerates exceeding 16 FPS for 320 × 240 input. We also evaluate the suggested models by conducting monocular vSLAM of an unknown indoor environment on NVidia Jetson TX2 in real time. An open-source implementation of the models and the inference node for Robot Operating System (ROS) is available at https://github.com/CnnDepth/tx2_fcnn_node.
Citations: 14
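One simple way a metric depth prediction can supply scale to a monocular vSLAM map, which is otherwise only defined up to scale, is a robust median ratio between predicted and reconstructed depths. This is a generic sketch of that idea, not necessarily how the authors integrate the FCNN output:

```python
import numpy as np

def metric_scale(slam_depths, predicted_depths):
    """Estimate the scale factor aligning an up-to-scale vSLAM map with
    metric depths predicted by the network (robust median of per-point ratios)."""
    ratios = np.asarray(predicted_depths) / np.asarray(slam_depths)
    return float(np.median(ratios))

# vSLAM depths equal metric depths divided by an unknown scale (here 4x).
metric = np.array([2.0, 4.0, 8.0, 1.0])
slam = metric / 4.0
print(metric_scale(slam, metric))  # 4.0
```

The median makes the estimate tolerant to outlier depth predictions, which matters when a lightweight FCNN runs at Jetson-class compute budgets.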
Learning Objectness from Sonar Images for Class-Independent Object Detection
2019 European Conference on Mobile Robots (ECMR) Pub Date: 2019-07-01 DOI: 10.1109/ECMR.2019.8870959
Matias Valdenegro-Toro
Abstract: Detecting novel objects without class information is not trivial, as it is difficult to generalize from a small training set. This is an interesting problem for underwater robotics, as modeling marine objects is inherently more difficult in sonar images and training data might not be available a priori. Detection proposal algorithms can be used for this purpose but usually require a large number of output bounding boxes. In this paper we propose the use of a fully convolutional neural network that regresses an objectness value directly from a forward-looking sonar image. By ranking proposals by objectness, we can produce high recall (96%) with only 100 proposals per image. In comparison, EdgeBoxes requires 5000 proposals to achieve a slightly better recall of 97%, while Selective Search requires 2000 proposals to achieve 95% recall. We also show that our method outperforms a template matching baseline by a considerable margin, and is able to generalize to completely new objects. We expect that this kind of technique can be used in the field to find lost objects under the sea.
Citations: 6
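Ranking proposals by a regressed objectness score and measuring recall at a fixed proposal budget, as in the evaluation above, can be sketched as follows (the boxes and scores are toy data, not from the paper):

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

def recall_at_k(proposals, scores, ground_truth, k, thresh=0.5):
    """Fraction of ground-truth objects matched (IoU >= thresh) by one of
    the k highest-objectness proposals."""
    top = [proposals[i] for i in np.argsort(scores)[::-1][:k]]
    hit = sum(any(iou(g, p) >= thresh for p in top) for g in ground_truth)
    return hit / len(ground_truth)

gt = [(0, 0, 10, 10)]
props = [(50, 50, 60, 60), (1, 1, 11, 11)]
print(recall_at_k(props, [0.9, 0.8], gt, k=2))  # 1.0
```

The paper's point is that a learned objectness ranking reaches high recall with ~100 proposals where EdgeBoxes and Selective Search need thousands; the metric computed here is the same one those numbers refer to.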
Joint Vision-Based Navigation, Control and Obstacle Avoidance for UAVs in Dynamic Environments
2019 European Conference on Mobile Robots (ECMR) Pub Date: 2019-05-03 DOI: 10.1109/ECMR.2019.8870944
Ciro Potena, D. Nardi, A. Pretto
Abstract: This work addresses the problem of coupling vision-based navigation systems for Unmanned Aerial Vehicles (UAVs) with robust obstacle avoidance capabilities. The former problem is solved by maximizing the visibility of the points of interest, while the latter is modeled by means of ellipsoidal repulsive areas. The whole problem is transcribed into an Optimal Control Problem (OCP) and solved in a few milliseconds by leveraging state-of-the-art numerical optimization. The resulting trajectories are well suited to reaching the specified goal location while avoiding obstacles with a safety margin and minimizing the probability of losing sight of the target of interest. Combining this technique with proper ellipsoid shaping (i.e., augmenting the shape proportionally to the obstacle velocity or to the obstacle detection uncertainties) results in robust obstacle avoidance behavior. We validate our approach in extensive simulated experiments that show effective capabilities to satisfy all the constraints even in challenging conditions. We release the open-source implementation with this paper.
Citations: 8
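The ellipsoidal repulsive areas mentioned above reduce to a quadratic-form test per obstacle; the `inflate` rule below is a hypothetical stand-in for the velocity-proportional shaping the abstract describes (the gain is invented for illustration):

```python
import numpy as np

def ellipsoid_violation(p, center, semi_axes):
    """Positive inside the repulsive ellipsoid, non-positive outside.
    semi_axes are the ellipsoid radii along each world axis."""
    d = (np.asarray(p, float) - np.asarray(center, float)) / np.asarray(semi_axes, float)
    return 1.0 - float(d @ d)

def inflate(semi_axes, obstacle_velocity, gain=0.5):
    """Hypothetical shaping rule: grow the ellipsoid along the obstacle's
    velocity, echoing the velocity-proportional augmentation above."""
    return np.asarray(semi_axes, float) + gain * np.abs(np.asarray(obstacle_velocity, float))

# A point two radii away along z is safely outside the unit sphere.
print(ellipsoid_violation([0.0, 0.0, 2.0], [0, 0, 0], [1, 1, 1]))  # -3.0
```

In the OCP, `ellipsoid_violation(p, ...) <= 0` becomes a path constraint on the UAV position at every collocation point, which is what gives the safety margin mentioned in the abstract.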