Humanoid navigation planning using future perceptive capability

P. Michel, Joel E. Chestnutt, S. Kagami, K. Nishiwaki, J. Kuffner, T. Kanade
{"title":"Humanoid navigation planning using future perceptive capability","authors":"P. Michel, Joel E. Chestnutt, S. Kagami, K. Nishiwaki, J. Kuffner, T. Kanade","doi":"10.1109/ICHR.2008.4755972","DOIUrl":null,"url":null,"abstract":"We present an approach to navigation planning for humanoid robots that aims to ensure reliable execution by augmenting the planning process to reason about the robotpsilas ability to successfully perceive its environment during operation. By efficiently simulating the robotpsilas perception system during search, our planner generates a metric, the so-called perceptive capability, that quantifies the dasiasensabilitypsila of the environment in each state given the task to be accomplished. We have applied our method to the problem of planning robust autonomous walking sequences as performed by an HRP-2 humanoid. A fast GPU-accelerated 3D tracker is used for perception, with a footstep planner incorporating reasoning about the robotpsilas perceptive capability. When combined with a controller capable of adaptively adjusting the height of swing leg trajectories, HRP-2 is able to navigate around obstacles and climb stairs in dynamically changing environments. Reasoning about the future perceptive capability ensures that sensing remains operational throughout the walking sequence and yields higher task success rates than perception-unaware planning.","PeriodicalId":402020,"journal":{"name":"Humanoids 2008 - 8th IEEE-RAS International Conference on Humanoid Robots","volume":"175 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2008-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Humanoids 2008 - 8th IEEE-RAS International Conference on Humanoid Robots","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICHR.2008.4755972","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 5

Abstract

We present an approach to navigation planning for humanoid robots that aims to ensure reliable execution by augmenting the planning process to reason about the robot's ability to successfully perceive its environment during operation. By efficiently simulating the robot's perception system during search, our planner generates a metric, the so-called perceptive capability, that quantifies the 'sensability' of the environment in each state given the task to be accomplished. We have applied our method to the problem of planning robust autonomous walking sequences as performed by an HRP-2 humanoid. A fast GPU-accelerated 3D tracker is used for perception, with a footstep planner incorporating reasoning about the robot's perceptive capability. When combined with a controller capable of adaptively adjusting the height of swing leg trajectories, HRP-2 is able to navigate around obstacles and climb stairs in dynamically changing environments. Reasoning about the future perceptive capability ensures that sensing remains operational throughout the walking sequence and yields higher task success rates than perception-unaware planning.
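To make the idea of folding perceptive capability into footstep search concrete, the minimal sketch below shows one plausible way a planner could penalize stances from which the terrain ahead would be poorly observed. It is not the authors' implementation: the sensor model, the `perceptive_capability` scoring, the cost weighting, and all function and parameter names are assumptions introduced purely for illustration.

```python
import heapq
import itertools
import math

# Hypothetical sketch: uniform-cost footstep search over (x, y, heading) states
# whose edge cost grows when a candidate stance would leave the region toward
# the goal outside a simulated head-mounted sensor's field of view.

SENSOR_FOV = math.radians(60)   # assumed horizontal field of view
SENSOR_RANGE = 2.0              # assumed usable sensing range in metres


def perceptive_capability(state, goal):
    """Score in [0, 1] for how well the area toward the goal is sensed from
    this stance: 1 when the goal direction lies well inside the sensor cone,
    decaying with angular offset and with distance beyond the sensing range."""
    x, y, heading = state
    dx, dy = goal[0] - x, goal[1] - y
    dist = math.hypot(dx, dy)
    bearing = abs((math.atan2(dy, dx) - heading + math.pi) % (2 * math.pi) - math.pi)
    angular = max(0.0, 1.0 - bearing / (SENSOR_FOV / 2))
    radial = 1.0 if dist <= SENSOR_RANGE else SENSOR_RANGE / dist
    return angular * radial


def plan_footsteps(start, goal, w_percept=1.0, step=0.25, tol=0.3):
    """Uniform-cost search where each edge costs
    step length + w_percept * (1 - perceptive capability of the new stance)."""
    actions = [(step, 0.0), (step, math.radians(30)), (step, math.radians(-30))]
    tie = itertools.count()                     # tie-breaker for the heap
    frontier = [(0.0, next(tie), start, [start])]
    visited = set()
    while frontier:
        cost, _, state, path = heapq.heappop(frontier)
        if math.hypot(goal[0] - state[0], goal[1] - state[1]) < tol:
            return path
        key = (round(state[0], 2), round(state[1], 2), round(state[2], 1))
        if key in visited:
            continue
        visited.add(key)
        for length, dtheta in actions:
            heading = state[2] + dtheta
            nxt = (state[0] + length * math.cos(heading),
                   state[1] + length * math.sin(heading),
                   heading)
            edge = length + w_percept * (1.0 - perceptive_capability(nxt, goal))
            heapq.heappush(frontier, (cost + edge, next(tie), nxt, path + [nxt]))
    return None


if __name__ == "__main__":
    path = plan_footsteps(start=(0.0, 0.0, 0.0), goal=(1.5, 0.5))
    print(f"{len(path)} footsteps planned" if path else "no path found")
```

With the perception weight set to zero the planner degenerates to plain shortest-step search; raising it biases the search toward stances that keep the way ahead observable, which is the qualitative trade-off the paper's perception-aware planning makes.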