Latest publications from the 2009 IEEE International Workshop on Robotic and Sensors Environments

Hybrid networking infrastructure for greenhouse management system
2009 IEEE International Workshop on Robotic and Sensors Environments Pub Date: 2009-12-18 DOI: 10.1109/ROSE.2009.5355979
O. Mirabella, M. Brischetto
Abstract: In this paper we focus on the problems related to the management of a farm made up of several greenhouses. Managing this kind of farm requires data acquisition in each greenhouse and the transfer of those data to a control unit, which is usually located in a control room separated from the production area. At present, data transfer between the greenhouses and the control system is mainly provided by a suitable wired communication system, such as a Fieldbus. In such contexts, even though replacing the wired system with a fully wireless one can appear very attractive, a fully wireless system can introduce some disadvantages. A solution based on a hybrid wired/wireless network, in which CAN and ZigBee protocols are used, is presented along with the problems that this integration involves. In particular, in order to integrate the wireless section with the wired one at the Data Link Layer, a suitable multi-protocol bridge has been implemented, while, at the application layer, a porting of SDS services onto ZigBee, which we call ZSDS, allows network resources to be accessed independently of the network segment to which they are connected.
Citations: 2
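To make the data-link-layer integration concrete, the sketch below shows one possible shape for such a bridge: frames arriving on the CAN side are looked up in a forwarding table and re-encapsulated toward the ZigBee segment, and vice versa. The frame format, address mappings, and I/O callbacks are hypothetical placeholders, not the implementation described in the paper.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Frame:
    """Generic data-link frame as seen by the bridge (hypothetical format)."""
    src_id: int      # CAN identifier or ZigBee short address
    payload: bytes

class ProtocolBridge:
    """Forwards frames between a CAN segment and a ZigBee segment
    according to a static identifier-to-address mapping."""
    def __init__(self, can_tx: Callable[[Frame], None],
                 zigbee_tx: Callable[[Frame], None],
                 can_to_zigbee: Dict[int, int], zigbee_to_can: Dict[int, int]):
        self.can_tx, self.zigbee_tx = can_tx, zigbee_tx
        self.can_to_zigbee, self.zigbee_to_can = can_to_zigbee, zigbee_to_can

    def on_can_frame(self, frame: Frame) -> None:
        dest = self.can_to_zigbee.get(frame.src_id)
        if dest is not None:                      # drop frames with no mapping
            self.zigbee_tx(Frame(dest, frame.payload))

    def on_zigbee_frame(self, frame: Frame) -> None:
        dest = self.zigbee_to_can.get(frame.src_id)
        if dest is not None:
            self.can_tx(Frame(dest, frame.payload))
```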
A modified bootstrap filter
2009 IEEE International Workshop on Robotic and Sensors Environments Pub Date: 2009-12-18 DOI: 10.1109/ROSE.2009.5356002
Qi Cheng, P. Bondon
Abstract: This paper presents a new method to draw particles in the particle filter. The standard bootstrap filter draws particles randomly from the prior density, which does not use the latest observation. Some improvements consist in using an extended Kalman filter or an unscented Kalman filter to produce the importance distribution, in order to move the particles from the domain of low likelihood to the domain of high likelihood by using the latest observation. These methods work well when the state noise is small. We propose a modified bootstrap filter which uses a new method to draw the particles in the scenario of large state noise. We show through numerical examples that it outperforms the bootstrap filter with the same computational complexity.
Citations: 1
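For readers unfamiliar with the baseline that the paper modifies, here is a minimal sketch of the standard bootstrap filter for a generic scalar state-space model; the transition and measurement functions, noise levels, and resampling choice are illustrative assumptions, not the authors' setup.

```python
import numpy as np

def bootstrap_filter(observations, n_particles=1000,
                     f=lambda x: 0.5 * x + 25 * x / (1 + x**2),  # illustrative transition
                     h=lambda x: x**2 / 20.0,                    # illustrative measurement
                     q_std=1.0, r_std=1.0, rng=None):
    """Standard bootstrap filter: propagate particles through the prior,
    weight them by the observation likelihood, then resample."""
    rng = np.random.default_rng() if rng is None else rng
    particles = rng.normal(0.0, 1.0, n_particles)   # initial particle cloud
    estimates = []
    for y in observations:
        # 1) Draw from the prior (transition) density -- no use of the new observation.
        particles = f(particles) + rng.normal(0.0, q_std, n_particles)
        # 2) Weight by the likelihood of the current observation.
        weights = np.exp(-0.5 * ((y - h(particles)) / r_std) ** 2) + 1e-300
        weights /= weights.sum()
        estimates.append(np.sum(weights * particles))
        # 3) Multinomial resampling to avoid weight degeneracy.
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
    return np.array(estimates)
```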
A multi-modal gesture recognition system in a Human-Robot Interaction scenario
2009 IEEE International Workshop on Robotic and Sensors Environments Pub Date: 2009-12-18 DOI: 10.1109/ROSE.2009.5355984
Zhi Li, R. Jarvis
Abstract: Recognition of non-verbal gestures is essential for robots to understand a user's state and intention in a Human-Robot Interaction (HRI) scenario. In this paper a multi-modal system is proposed to recognize a user's hand gestures and estimate body poses from the robot's viewpoint only. A range camera is employed to acquire depth data at a high frame rate. Depth data is useful for image segmentation and for object detection and localization in 3D space. A pair of stereo cameras is used to sense the user's head gestures and eye gaze direction, which provide useful information about the user's attention direction. Both hand shapes and hand trajectories are recognized. Full configurations of body poses are estimated using a model-based algorithm. Poses are tracked by a particle filter and refined by a gradient-based search in the neighborhood of the particles with the largest weights.
Citations: 15
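The pose-tracking step described above combines a particle filter with a local, gradient-style refinement of the highest-weight particles. The sketch below illustrates that idea with a generic likelihood function; the finite-difference ascent, step size, and number of refined particles are assumptions for illustration only, not the authors' exact search procedure.

```python
import numpy as np

def refine_top_particles(particles, weights, likelihood, k=10, step=0.01, iters=5):
    """Locally refine the k highest-weight particles by a finite-difference
    ascent on the likelihood, as a stand-in for the gradient-based search.

    particles: (N, dim) array of pose hypotheses; weights: (N,) array.
    likelihood: callable mapping a pose vector to a scalar score.
    """
    top = np.argsort(weights)[-k:]
    dim = particles.shape[1]
    for i in top:
        x = particles[i].copy()
        for _ in range(iters):
            grad = np.zeros(dim)
            for d in range(dim):
                e = np.zeros(dim)
                e[d] = step
                grad[d] = (likelihood(x + e) - likelihood(x - e)) / (2 * step)
            x += step * grad                  # move uphill on the likelihood
        particles[i] = x
        weights[i] = likelihood(x)
    weights /= weights.sum()                  # renormalize after refinement
    return particles, weights
```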
Pose and motion estimation of a moving rigid body with few features
2009 IEEE International Workshop on Robotic and Sensors Environments Pub Date: 2009-12-18 DOI: 10.1109/ROSE.2009.5355973
Valentin Borsu, P. Payeur
Abstract: This paper proposes a reliable solution to the problem of estimating the motion of a rigid object moving freely in 3D space, through the use of a passive vision system. The feature-based tracking technique builds upon the selection of a consistent set of features and their tracking on a frame-by-frame basis. A thorough investigation is conducted to determine a proper vision system setup, resulting in a configuration that covers the complete range of motions that the object may exhibit. While the system relies on low-resolution cameras, the proposed algorithm provides subpixel accuracy in the pose estimation of the rigid body and its associated motion. The algorithm is experimentally validated and operates within an execution timeframe that makes it suitable for real-time processing applications.
Citations: 6
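A common building block for this kind of rigid-body pose estimation is recovering the rotation and translation that best align a small set of 3D feature points between two frames. The SVD-based least-squares sketch below is a generic illustration of that step, not the authors' exact algorithm, and it assumes the point correspondences have already been established by the feature tracker.

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rotation R and translation t such that R @ P_i + t ~= Q_i,
    for corresponding 3D feature points P, Q of shape (N, 3)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```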
Wireless electronic nose network for real-time gas monitoring system
2009 IEEE International Workshop on Robotic and Sensors Environments Pub Date: 2009-12-18 DOI: 10.1109/ROSE.2009.5355983
Young Wung Kim, Sang Jin Lee, G. Kim, G. Jeon
Abstract: We present a study on the development and testing of a wireless electronic nose network (WENn) for real-time monitoring of gas mixtures of NH3 and H2S, the main malodors in various environments. The proposed WENn is based on an embedded PC, an electronic olfactory system, wireless sensor network (WSN) technology, and neuro-fuzzy network algorithms. The WENn used in this work takes advantage of recent advances in low-power wireless communication platforms and uses micro gas sensors with SnO2-CuO and SnO2-Pt sensing films for detecting the presence of the target gases. Each node in the network performs, in real time, classification and concentration estimation of the binary gas mixtures using fuzzy ART and ARTMAP neural networks, together with measurement of humidity and temperature at its location, and then transmits the computed results to a sink node via a ZigBee-ready RF transceiver. In addition, a monitoring manager virtual instrument (MMVI) is developed using LabVIEW to efficiently monitor the analyzed gas information from the sensor nodes. To test the reproducibility and reliability of the WENn, on-line experiments are conducted with the gas monitoring system.
Citations: 24
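The classification step above relies on fuzzy ART networks. The sketch below shows the core fuzzy ART operations (complement coding, category choice, vigilance test, learning) on generic feature vectors; the parameter values are illustrative, and this is not the authors' trained network or their ARTMAP extension.

```python
import numpy as np

class FuzzyART:
    """Minimal fuzzy ART: unsupervised category learning with complement coding."""
    def __init__(self, dim, rho=0.75, alpha=0.001, beta=1.0):
        self.rho, self.alpha, self.beta = rho, alpha, beta
        self.w = np.empty((0, 2 * dim))       # category weight vectors

    def _code(self, x):
        x = np.clip(np.asarray(x, float), 0.0, 1.0)
        return np.concatenate([x, 1.0 - x])   # complement coding

    def train(self, x):
        """Present one sample; return the index of the category it joins."""
        I = self._code(x)
        if len(self.w) == 0:
            self.w = I[None, :].copy()
            return 0
        m = np.minimum(I, self.w)             # fuzzy AND with each category
        T = m.sum(axis=1) / (self.alpha + self.w.sum(axis=1))   # choice function
        for j in np.argsort(T)[::-1]:         # try the best category first
            if m[j].sum() / I.sum() >= self.rho:                # vigilance test
                self.w[j] = self.beta * m[j] + (1 - self.beta) * self.w[j]
                return j
        self.w = np.vstack([self.w, I])       # no match: commit a new category
        return len(self.w) - 1
```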
Large area smart tactile sensor for rescue robot
2009 IEEE International Workshop on Robotic and Sensors Environments Pub Date: 2009-12-18 DOI: 10.1109/ROSE.2009.5355985
F. Vidal-Verdú, María José Barquero, J. Serón, A. García-Cerezo
Abstract: This paper presents the design of a tactile sensor intended to cover the forearm of the rescue robot ALACRAN. This robot is able to lift hundreds of kilograms, so the sensor has to be carefully designed because the robot will manipulate human beings who could be hurt. The robot therefore has to be aware that it is in contact with a human being and of how hard he or she is being pressed. A binary output alone is not sufficient, because contact is often necessary, for instance when a human is held in the arms of the robot. For this sort of operation a kind of artificial skin must provide information about the contact between the robot and the human. This skin will cover the hands to carry out fine manipulation, but it will also cover large areas such as the forearms. Some devices have been proposed to meet this demand. This paper presents one such sensor, obtained by arranging commercial force sensing resistors. Design issues related to this approach and results are presented.
Citations: 9
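Reading an array of commercial force sensing resistors typically amounts to scanning a row/column matrix and assembling a pressure map. The sketch below illustrates that pattern; the row-select and ADC-read functions are hypothetical hardware hooks, and the threshold-based contact test is an assumption rather than the paper's processing chain.

```python
import numpy as np

def scan_fsr_matrix(select_row, read_column_adc, n_rows, n_cols):
    """Build a tactile image from a row/column FSR matrix.

    select_row(r):       hypothetical hook that energizes row r.
    read_column_adc(c):  hypothetical hook returning the raw ADC value of column c.
    """
    image = np.zeros((n_rows, n_cols))
    for r in range(n_rows):
        select_row(r)
        for c in range(n_cols):
            image[r, c] = read_column_adc(c)
    return image

def contact_detected(image, threshold):
    """Simple contact test: any taxel above a calibration threshold."""
    return bool(np.any(image > threshold))
```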
Tactile robotic mapping of unknown surfaces: an application to oil well exploration
2009 IEEE International Workshop on Robotic and Sensors Environments Pub Date: 2009-12-01 DOI: 10.1109/ROSE.2009.5355990
Francesco Mazzini, Daniel T. Kettler, S. Dubowsky, Julio Guerrero
Abstract: World oil demand and advanced oil recovery techniques have made it economically attractive to rehabilitate previously abandoned oil wells. This requires relatively fast mapping of the shape and location of the down-hole well structures. Practical factors prohibit the use of visual and other range sensors in this situation. Here, the feasibility of robotic tactile mapping is studied. A method is developed that uses only the robot joint encoders and avoids any force or tactile sensor, which are complex and unreliable in such a hostile environment. This paper addresses the general problem of intelligent tactile exploration of constrained internal geometries where time is critical. It is assumed that the time required to move a manipulator to acquire a new touch point outweighs the computational time. This approach models the down-hole structures with geometric primitives and focuses on exploration efficiency by intelligently searching for new touch points to build the geometric models. The algorithms developed here are shown in simulations and hardware experiments to substantially reduce the data acquisition effort for exploration with a tactile manipulator.
Citations: 20
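Fitting geometric primitives to sparse touch points is central to this approach. As a simple illustration, the least-squares (Kasa) circle fit below estimates the center and radius of a circular cross-section, for example a pipe wall, from a handful of probed contact points; the choice of this particular fit is a generic assumption, not the paper's specific primitive-fitting routine.

```python
import numpy as np

def fit_circle(points):
    """Kasa least-squares circle fit.

    points: (N, 2) array of touch locations in a cross-section plane.
    Returns (center, radius).
    """
    x, y = points[:, 0], points[:, 1]
    # x^2 + y^2 = a*x + b*y + c is linear in the unknowns (a, b, c).
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = x**2 + y**2
    a, b, c = np.linalg.lstsq(A, rhs, rcond=None)[0]
    center = np.array([a / 2.0, b / 2.0])
    radius = np.sqrt(c + center @ center)
    return center, radius
```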
Multiobjective selection of features for pattern recognition
2009 IEEE International Workshop on Robotic and Sensors Environments Pub Date: 2009-11-01 DOI: 10.1109/ROSE.2009.5355996
L. Ferariu, D. Panescu
Abstract: The paper proposes a novel pattern recognition system based on a flexible genetic selection of relevant features. Firstly, a hybrid set of competing features is determined, aggregating the results provided by several different basic extractors, such as principal component analysis, two-dimensional Fourier transformation, grey-level analysis and geometric analysis. Subsequently, the most suitable features are chosen, in accordance with the specific properties of the particular visual patterns that have to be recognized, via a multiobjective optimization performed in terms of classification accuracy, parsimony and computational requirements. Pareto-optimal solutions are searched using genetic techniques based on hierarchical encoding. To adapt the selection pressure imposed by the conflicting objectives, a new algorithm for fitness computation is proposed. It efficiently exploits the concept of dominance analysis through a progressive articulation between the decision mechanism and the search procedure. The experimental trials, performed within the context of a holonic palletizing manufacturing system, illustrate the enhanced adaptation capabilities of the designed pattern recognition subsystem.
Citations: 2
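The selection above is driven by Pareto dominance over classification accuracy, parsimony, and computational cost. The helper below filters a set of candidate feature subsets down to the non-dominated (Pareto-optimal) ones, assuming all objectives are expressed as quantities to be minimized; it illustrates the dominance test only, not the genetic algorithm, its hierarchical encoding, or the paper's fitness computation.

```python
import numpy as np

def pareto_front(objectives):
    """Return indices of non-dominated candidates.

    objectives: (N, M) array, one row per candidate feature subset,
    e.g. columns = (classification error, number of features, evaluation cost),
    all to be minimized.
    """
    obj = np.asarray(objectives, float)
    keep = []
    for i in range(len(obj)):
        # Candidate i is dominated if some other row is no worse on every
        # objective and strictly better on at least one.
        dominated = np.any(
            np.all(obj <= obj[i], axis=1) & np.any(obj < obj[i], axis=1)
        )
        if not dominated:
            keep.append(i)
    return keep
```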
Active people tracking by a PTZ camera in IP surveillance system
2009 IEEE International Workshop on Robotic and Sensors Environments Pub Date: 2009-11-01 DOI: 10.1109/ROSE.2009.5355997
Parisa Darvish Zadeh Varcheie, Guillaume-Alexandre Bilodeau
Abstract: In this paper, we propose a fuzzy feature-based method for online body tracking using an IP PTZ camera. Because the camera uses a built-in web server, camera control entails response time and network delays, and thus the frame rate is irregular and generally low (3-7 fps). Our method has been designed specifically to perform in such conditions. In every frame, it detects candidate targets by extracting moving targets using optical flow, a sampling method, and appearance. The target is detected among the samples using a fuzzy classifier. Results show that our system has a good target detection precision (> 93%) and low track fragmentation.
Citations: 14
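To give a feel for how fuzzy feature-based scoring can rank candidate targets, the sketch below combines two normalized cues (motion magnitude and appearance similarity) through simple membership functions and a fuzzy AND (minimum). The membership shapes, thresholds, and choice of cues are illustrative assumptions, not the authors' classifier.

```python
import numpy as np

def ramp_membership(value, low, high):
    """Piecewise-linear membership: 0 below `low`, 1 above `high`."""
    return float(np.clip((value - low) / (high - low), 0.0, 1.0))

def score_candidate(motion_magnitude, appearance_similarity):
    """Fuzzy AND (min) of 'is moving' and 'looks like the target'."""
    mu_motion = ramp_membership(motion_magnitude, 0.5, 3.0)           # pixels/frame, illustrative
    mu_appearance = ramp_membership(appearance_similarity, 0.4, 0.9)  # similarity in [0, 1]
    return min(mu_motion, mu_appearance)

def best_candidate(candidates):
    """candidates: list of (motion_magnitude, appearance_similarity) tuples."""
    scores = [score_candidate(m, a) for m, a in candidates]
    return int(np.argmax(scores)), max(scores)
```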
Data visualization: From body sensor network to social networks
2009 IEEE International Workshop on Robotic and Sensors Environments Pub Date: 2009-11-01 DOI: 10.1109/ROSE.2009.5356001
Mohamed Abdur Rahman, Abdulmotaleb El Saddik, W. Gueaieb
Abstract: Sensors can capture very sensitive and valuable information without human intervention and send it to a remote location. However, capturing sensory data from a Body Sensor Network (BSN) and sending it to social networks is a challenging task, because it requires a number of distributed networks to work together seamlessly. The task becomes more challenging when both the BSN and the social networks are mobile. It requires a framework which can handle the mobility of both the BSN and the members of the social networks and can send the sensory data to the social networks for real-time visualization. In this paper, we propose an open-source framework, named SenseFace, which seamlessly incorporates a four-tier network, including a BSN, a cellular network, the Internet and an overlay network consisting of social networks, to pass sensory data from a mobile BSN to the overlay network. The overlay network can intelligently manage one's social network and produce different data visualization formats suitable for email, fax, voicemail, SMS, MMS, the APRS network, IM networks such as hotmail, gmail and yahoo, and existing social networks such as Facebook, YouTube, LinkedIn, delicious, Wordpress, etc. Finally, we present the framework design and the hardware and software that have been used for its implementation.
Citations: 15
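As a minimal illustration of the relay pattern described above (and not of SenseFace's actual interfaces), the sketch below packages one body-sensor reading as JSON and posts it to an overlay-network gateway over HTTP; the gateway URL and message fields are hypothetical.

```python
import json
import time
import urllib.request

def push_reading(sensor_id, value, gateway_url="http://overlay.example.org/bsn"):
    """Send one BSN sample to a (hypothetical) overlay-network gateway."""
    message = {
        "sensor_id": sensor_id,
        "value": value,
        "timestamp": time.time(),
    }
    request = urllib.request.Request(
        gateway_url,
        data=json.dumps(message).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=5) as response:
        return response.status  # gateway acknowledgement code
```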