Flexible Task-Specific Control Using Active Vision

R. Firby, M. Swain
{"title":"灵活的任务特定控制使用主动视觉","authors":"R. Firby, M. Swain","doi":"10.1109/AIHAS.1992.636877","DOIUrl":null,"url":null,"abstract":"This paper is about the interface between continuous and discrete robot control. We advocate encapsulating continuous actions and their related sensing strategies into behaviors called situation specific activities, which can be constructed by a symbolic reactive planner. Task- specific, real-time perception is a fundamental part of these activities. While researchers have successfully used primitive touch and sonar sensors in such situations, it is more problematic to achieve reasonable performance with complex signals such as those from a video camera. Active vision routines are suggested as a means of incorporating visual data into real time control and as one mechanism for designating aspects of the world in an indexical-functional manner. Active vision routines are a particularly flexible sensing methodology because different routines extract different functional attributes from the world using the same sensor. In fact, there will often be different active vision routines for extracting the same functional attribute using different processing techniques. This allows an agent substantial leeway to instantiate its activities in different ways under different circumstances using different active vision routines. We demonstrate the utility of this architecture with an object tracking example. A control system is presented that can be reconfigured by a reactive planner to achieve different tasks. We show how this system allows us to build interchangeable tracking activities that use either color histogram or motion based active vision routines.","PeriodicalId":442147,"journal":{"name":"Proceedings of the Third Annual Conference of AI, Simulation, and Planning in High Autonomy Systems 'Integrating Perception, Planning and Action'.","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1992-04-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Flexible Task-Specific Control Using Active Vision\",\"authors\":\"R. Firby, M. Swain\",\"doi\":\"10.1109/AIHAS.1992.636877\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper is about the interface between continuous and discrete robot control. We advocate encapsulating continuous actions and their related sensing strategies into behaviors called situation specific activities, which can be constructed by a symbolic reactive planner. Task- specific, real-time perception is a fundamental part of these activities. While researchers have successfully used primitive touch and sonar sensors in such situations, it is more problematic to achieve reasonable performance with complex signals such as those from a video camera. Active vision routines are suggested as a means of incorporating visual data into real time control and as one mechanism for designating aspects of the world in an indexical-functional manner. Active vision routines are a particularly flexible sensing methodology because different routines extract different functional attributes from the world using the same sensor. In fact, there will often be different active vision routines for extracting the same functional attribute using different processing techniques. This allows an agent substantial leeway to instantiate its activities in different ways under different circumstances using different active vision routines. 
We demonstrate the utility of this architecture with an object tracking example. A control system is presented that can be reconfigured by a reactive planner to achieve different tasks. We show how this system allows us to build interchangeable tracking activities that use either color histogram or motion based active vision routines.\",\"PeriodicalId\":442147,\"journal\":{\"name\":\"Proceedings of the Third Annual Conference of AI, Simulation, and Planning in High Autonomy Systems 'Integrating Perception, Planning and Action'.\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1992-04-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the Third Annual Conference of AI, Simulation, and Planning in High Autonomy Systems 'Integrating Perception, Planning and Action'.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/AIHAS.1992.636877\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the Third Annual Conference of AI, Simulation, and Planning in High Autonomy Systems 'Integrating Perception, Planning and Action'.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AIHAS.1992.636877","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4

Abstract

This paper is about the interface between continuous and discrete robot control. We advocate encapsulating continuous actions and their related sensing strategies into behaviors called situation specific activities, which can be constructed by a symbolic reactive planner. Task-specific, real-time perception is a fundamental part of these activities. While researchers have successfully used primitive touch and sonar sensors in such situations, it is more problematic to achieve reasonable performance with complex signals such as those from a video camera. Active vision routines are suggested as a means of incorporating visual data into real-time control and as one mechanism for designating aspects of the world in an indexical-functional manner. Active vision routines are a particularly flexible sensing methodology because different routines extract different functional attributes from the world using the same sensor. In fact, there will often be different active vision routines for extracting the same functional attribute using different processing techniques. This allows an agent substantial leeway to instantiate its activities in different ways under different circumstances using different active vision routines. We demonstrate the utility of this architecture with an object tracking example. A control system is presented that can be reconfigured by a reactive planner to achieve different tasks. We show how this system allows us to build interchangeable tracking activities that use either color-histogram or motion-based active vision routines.
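The abstract gives no implementation detail, but the architecture it outlines can be illustrated with a minimal sketch: interchangeable active vision routines that extract the same functional attribute (here, a target's image location) behind a common interface, with a reactive planner choosing which routine a tracking activity uses. The sketch below is hypothetical; the class names (ActiveVisionRoutine, ColorHistogramRoutine, MotionRoutine, TrackingActivity) and the stubbed perception logic are illustrative assumptions, not code from the paper.

```python
# Hypothetical sketch of interchangeable active vision routines behind a
# common interface, so a reactive planner can instantiate the same tracking
# activity with different perception strategies. All names are illustrative.

from abc import ABC, abstractmethod
from typing import Optional, Tuple

Location = Tuple[int, int]  # (x, y) image coordinates of the tracked target


class ActiveVisionRoutine(ABC):
    """Extracts one functional attribute (here, target location) from a frame."""

    @abstractmethod
    def locate(self, frame) -> Optional[Location]:
        """Return the target's image location, or None if it was not found."""


class ColorHistogramRoutine(ActiveVisionRoutine):
    """Placeholder for a color-histogram matcher."""

    def __init__(self, reference_histogram):
        self.reference_histogram = reference_histogram

    def locate(self, frame) -> Optional[Location]:
        # Real code would match the reference histogram against the frame and
        # return the best-matching location; here we stub the frame center.
        h, w = len(frame), len(frame[0])
        return (w // 2, h // 2)


class MotionRoutine(ActiveVisionRoutine):
    """Placeholder for a motion-based detector (e.g. frame differencing)."""

    def __init__(self):
        self.previous_frame = None

    def locate(self, frame) -> Optional[Location]:
        if self.previous_frame is None:
            self.previous_frame = frame
            return None  # need two frames before motion can be detected
        # Real code would difference frames and localize the moving region.
        self.previous_frame = frame
        return (0, 0)


class TrackingActivity:
    """A situation-specific activity: a control step plus its sensing routine."""

    def __init__(self, routine: ActiveVisionRoutine):
        self.routine = routine  # chosen by the reactive planner

    def step(self, frame) -> Optional[Location]:
        target = self.routine.locate(frame)
        if target is not None:
            pass  # a real system would issue a servo command toward `target`
        return target


# The planner reconfigures the activity by choosing a routine per situation:
frame = [[0] * 64 for _ in range(48)]  # dummy 64x48 "image"
activity = TrackingActivity(ColorHistogramRoutine(reference_histogram=None))
print(activity.step(frame))            # (32, 24)
activity.routine = MotionRoutine()     # swap perception without changing control
print(activity.step(frame))            # None on the first frame
```

The design point this sketch tries to capture is that the control loop in TrackingActivity is unchanged when the planner swaps the perception routine, which is what would make the tracking activities interchangeable.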