Smart resource reconfiguration by exploiting dynamics in perceptual tasks
Deepak R. Karuppiah, R. Grupen, A. Hanson, E. Riseman
2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, 5 December 2005
DOI: 10.1109/IROS.2005.1545247
Citations: 20
Abstract
In robot and sensor networks, one of the key challenges is to decide when and where to deploy sensory resources to gather information of optimal value. The problem is essentially one of planning, scheduling and controlling the sensors in the network to acquire data from an environment that is constantly varying. The dynamic nature of the problem precludes the use of traditional rule-based strategies that can handle only quasi-static context changes. Automatic context derivation procedures are thus essential for providing fault recovery and fault pre-emption in such systems. We posit that the quality of a sensor network configuration depends on sensor coverage and geometry, sensor allocation policies, and the dynamic processes in the environment. In this paper, we show how these factors can be manipulated in an adaptive framework for robust run-time resource management. We demonstrate our ideas in a people tracking application using a network of multiple cameras. The task specification for our multi-camera network is one of allocating a camera pair that can best localize a human subject given the current context. The system automatically derives policies for switching between camera pairs that enable robust tracking while being attentive to performance measures. Our approach is unique in that we do not make any a priori assumptions about the scene or the activities that take place in the scene. Models of motion dynamics in the scene and the camera network configuration steer the policies to provide robust tracking.
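To make the camera-pair allocation idea concrete, below is a minimal illustrative sketch, not the authors' implementation. It assumes hypothetical ground-plane camera positions and scores each pair purely by triangulation geometry (pairs that subtend an angle near 90 degrees at the subject localize best), then switches the active pair only when another pair is clearly better, to avoid thrashing as the subject moves. A real system, as the paper describes, would also fold in sensor coverage, allocation policies, and learned models of scene dynamics.

```python
"""Illustrative sketch: score camera pairs for localizing a tracked subject
and switch pairs with hysteresis. Camera layout and scoring are assumptions
made for this example, not the method from the paper."""
import math
from dataclasses import dataclass
from itertools import combinations


@dataclass
class Camera:
    name: str
    x: float  # ground-plane position (metres)
    y: float


def pair_score(a: Camera, b: Camera, subject: tuple) -> float:
    """Score a pair by how close the angle subtended at the subject is to
    90 degrees; near-collinear pairs triangulate depth poorly."""
    ax, ay = a.x - subject[0], a.y - subject[1]
    bx, by = b.x - subject[0], b.y - subject[1]
    na, nb = math.hypot(ax, ay), math.hypot(bx, by)
    if na == 0 or nb == 0:
        return 0.0
    cos_theta = (ax * bx + ay * by) / (na * nb)
    theta = math.acos(max(-1.0, min(1.0, cos_theta)))
    return math.sin(theta)  # 1.0 at 90 degrees, 0.0 when collinear


def select_pair(cameras, subject, current_pair=None, margin=0.1):
    """Pick the best-scoring pair, but keep the current pair unless another
    pair beats it by `margin`, so the policy does not switch on every frame."""
    scored = {frozenset((a.name, b.name)): pair_score(a, b, subject)
              for a, b in combinations(cameras, 2)}
    best = max(scored, key=scored.get)
    if current_pair is not None and scored.get(current_pair, 0.0) + margin >= scored[best]:
        return current_pair
    return best


if __name__ == "__main__":
    cams = [Camera("c0", 0, 0), Camera("c1", 10, 0), Camera("c2", 0, 10)]
    pair = None
    # The subject walks across the room; the active pair switches only when
    # another pair offers a clearly better triangulation geometry.
    for t in range(5):
        subject = (2.0 * t, 3.0)
        pair = select_pair(cams, subject, pair)
        print(t, sorted(pair))
```

The hysteresis margin stands in for the paper's broader point: switching policies should weigh the cost of reconfiguration against the expected gain in localization quality under the current scene dynamics.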