2018 IEEE Region 10 Humanitarian Technology Conference (R10-HTC): Latest Publications

Second Order Volterra Filter for Appliance Modelling
2018 IEEE Region 10 Humanitarian Technology Conference (R10-HTC) Pub Date : 2018-12-01 DOI: 10.1109/R10-HTC.2018.8629851
M. Akram, Neelanga Thelasingha, R. Godaliyadda, Parakrama B. Ekanayake, J. Ekanayake
{"title":"Second Order Volterra Filter for Appliance Modelling","authors":"M. Akram, Neelanga Thelasingha, R. Godaliyadda, Parakrama B. Ekanayake, J. Ekanayake","doi":"10.1109/R10-HTC.2018.8629851","DOIUrl":"https://doi.org/10.1109/R10-HTC.2018.8629851","url":null,"abstract":"Availability of large quantities of residential electrical consumption data is bringing considerable attention towards load monitoring, load forecasting, load disaggregation and demand response. Load modelling is the first and most essential step in achieving all the above said tasks. Even though many appliance modelling schemes are presented in the literature, no considerably influential work has been done on modelling appliances under voltage fluctuating environment. Motivated by this fact, we present the design and analysis of a Volterra based appliance modelling scheme which can be used in a voltage fluctuating environment. Principles of Volterra filter, least mean square algorithm for Volterra filter coefficient approximation and applicability of Volterra filter for appliance modelling are discussed. Further, a case study is presented to validate and identify the performance of the model using a data set obtained from a real household. Obtained results show that, Volterra filter can be utilized as an efficient tool for appliance modelling in a supply voltage fluctuating environment. Finally, how Volterra filter modelling can be extended to achieve the non intrusive load monitoring task is discussed.","PeriodicalId":404432,"journal":{"name":"2018 IEEE Region 10 Humanitarian Technology Conference (R10-HTC)","volume":"108 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124067943","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 1
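The abstract above mentions adapting the Volterra filter coefficients with a least mean square (LMS) algorithm. Below is a minimal Python sketch of a second-order Volterra filter trained with LMS; the memory length, step size and the synthetic voltage/current signals are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a second-order Volterra filter adapted with LMS.
# Memory length, step size and the synthetic signals are assumptions.
import numpy as np

def volterra_lms(x, d, memory=5, mu=1e-3):
    """Adapt first- and second-order Volterra kernels so the filter output
    tracks the desired signal d (e.g. appliance current) from input x
    (e.g. fluctuating supply voltage)."""
    h1 = np.zeros(memory)                  # linear kernel
    h2 = np.zeros((memory, memory))        # quadratic kernel
    y = np.zeros(len(x))
    for n in range(memory - 1, len(x)):
        xv = x[n - memory + 1:n + 1][::-1]  # x[n], x[n-1], ..., x[n-memory+1]
        y[n] = h1 @ xv + xv @ h2 @ xv       # second-order Volterra output
        e = d[n] - y[n]                     # instantaneous error
        h1 += mu * e * xv                   # LMS update, linear kernel
        h2 += mu * e * np.outer(xv, xv)     # LMS update, quadratic kernel
    return y, h1, h2

# Toy usage with a synthetic nonlinear "appliance" response.
rng = np.random.default_rng(0)
x = rng.normal(size=5000)                   # stand-in for supply voltage samples
d = 0.8 * x + 0.2 * np.roll(x, 1) ** 2      # stand-in for measured current
y, h1, h2 = volterra_lms(x, d)
print("final MSE:", np.mean((d[-500:] - y[-500:]) ** 2))
```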
Developing a voice controlled wheelchair with enhanced safety through multimodal approach
2018 IEEE Region 10 Humanitarian Technology Conference (R10-HTC) Pub Date : 2018-12-01 DOI: 10.1109/R10-HTC.2018.8629829
S. Priyanayana, A. G. Buddhika, P. Jayasekara
{"title":"Developing a voice controlled wheelchair with enhanced safety through multimodal approach","authors":"S. Priyanayana, A. G. Buddhika, P. Jayasekara","doi":"10.1109/R10-HTC.2018.8629829","DOIUrl":"https://doi.org/10.1109/R10-HTC.2018.8629829","url":null,"abstract":"The rising population of disabled and elderly community and lack of caretakers to look after them have become a crisis in most countries. Voice controlled wheelchairs give its user a chance to interact with the wheelchair in a humane manne. Due to the uncertaities of the voice commands and unreliability of using one interaction modality, user safety is compromised. In natural wheelchair user and caretaker conversation a lot of distance related uncertain instructions like ‘little’ and ‘far’ will be used. With incorporating these uncertain terms, the interpretation of voice commands will be enhanced. In most existing systems, Joystick support is used as backup modality. Even though this is intended as a safety feature, accidental joystick operation can lead to unfortunate situations. Therfore a reliable safety system considering multimodal aspect of two modalities is needed. Therefore this paper proposes a voice controlled intelligent wheelchair system incorporating uncertain voice information with safe navigation system using multmodal approach. Uncertain Information Module(UIM) have been introduced in order to interpret uncertain voice information and Multimodal Safety System is introduced in order to ensure user safety. Usability experiments have been carried out to evaluate the reliability of voice control system and Multimodal safety system.","PeriodicalId":404432,"journal":{"name":"2018 IEEE Region 10 Humanitarian Technology Conference (R10-HTC)","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132029561","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 7
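As a hypothetical illustration of how an Uncertain Information Module (UIM) might resolve vague distance words such as 'little' and 'far' into motion targets, here is a small Python sketch. The term-to-range table, the command grammar and the helper names are assumptions for this sketch, not the paper's actual rules.

```python
# Hypothetical sketch: mapping uncertain distance terms in a voice command
# to a motion target. Ranges and grammar are assumptions, not the paper's.
from dataclasses import dataclass

# Assumed mapping from uncertain terms to (min, max) distances in metres.
UNCERTAIN_TERMS = {
    "little": (0.25, 0.75),
    "bit":    (0.25, 0.75),
    "far":    (2.0, 4.0),
}

@dataclass
class MotionTarget:
    direction: str        # e.g. "forward", "left"
    distance_m: float     # resolved distance estimate

def interpret(command: str, default_m: float = 1.0) -> MotionTarget:
    """Resolve a command like 'move forward a little' into a MotionTarget,
    taking the midpoint of the assumed range for any uncertain term."""
    words = command.lower().split()
    direction = next((w for w in words
                      if w in ("forward", "backward", "left", "right")), "forward")
    distance = default_m
    for w in words:
        if w in UNCERTAIN_TERMS:
            lo, hi = UNCERTAIN_TERMS[w]
            distance = (lo + hi) / 2.0
            break
    return MotionTarget(direction, distance)

print(interpret("move forward a little"))   # MotionTarget(direction='forward', distance_m=0.5)
print(interpret("go far to the left"))      # MotionTarget(direction='left', distance_m=3.0)
```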
A Multispectral Imaging System to Assess Meat Quality
2018 IEEE Region 10 Humanitarian Technology Conference (R10-HTC) Pub Date : 2018-12-01 DOI: 10.1109/R10-HTC.2018.8629858
W. G. C. Bandara, G. Prabhath, D. M. K. V. B. Dissanayake, H. Herath, G. Godaliyadda, M. Ekanayake, S. S. P. Vithana, S. Demini, T. Madhujith
{"title":"A Multispectral Imaging System to Assess Meat Quality","authors":"W. G. C. Bandara, G. Prabhath, D. M. K. V. B. Dissanayake, H. Herath, G. Godaliyadda, M. Ekanayake, S. S. P. Vithana, S. Demini, T. Madhujith","doi":"10.1109/R10-HTC.2018.8629858","DOIUrl":"https://doi.org/10.1109/R10-HTC.2018.8629858","url":null,"abstract":"Multispectral imaging uses reflectance information of a number of discrete spectral bands to classify samples according to their quality defined using standard parameters. A multispectral image is rich in information compared to a normal RGB image. Therefore, a multispectral image can be used to classify samples more accurately than an RGB image. This paper discusses a design of a multispectral imaging system that can be used to assess the quality of meat. The system is comprised of six LEDs with nominal wavelengths between 405 nm and 740 nm. The light emitted from LEDs reach the meat sample placed inside a dark chamber through an integrating hemisphere. LEDs are lighted one at a time and images of the meat sample are captured for each flash separately using a smartphone camera. Eventually, all the images of the meat sample, taken at a specific time instance were integrated to form the multispectral image. The meat samples stored at $4 circ mathrm {c}$ were imaged up to four days at predetermined time intervals using the designed system. Once the data acquisition was completed, all the pixels of the multispectral image were represented as points in high dimensional space, which was then reduced to a lower dimensional space using Principal Component Analysis (PCA). It was observed that images of meat sample obtained at different time instances clustered into different regions in the lower dimensional space. The experiment was performed with chicken meat samples. This proves the viability of using multispectral imaging as a non-invasive and non-destructive method of assessing meat quality according to certain quality parameters. Off-the-shelf electronic components and a regular smartphone were used to build the system, thus making the system cost-effective.","PeriodicalId":404432,"journal":{"name":"2018 IEEE Region 10 Humanitarian Technology Conference (R10-HTC)","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128416919","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 8
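The abstract above describes stacking the six per-band captures into one spectrum per pixel and reducing the dimensionality with PCA. The following Python sketch illustrates that projection step; the image size and the SVD-based PCA implementation are assumptions, with random values standing in for real reflectance captures.

```python
# Minimal sketch of the PCA step: six per-band images become one
# 6-dimensional spectrum per pixel, projected onto two principal components.
import numpy as np

def pca_project(pixels, n_components=2):
    """Project pixel spectra (N x bands) onto their top principal components."""
    centered = pixels - pixels.mean(axis=0)
    # SVD of the centered data gives the principal directions in Vt.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T

# Stand-in for six 64x64 band images captured under the six LED flashes.
rng = np.random.default_rng(1)
bands = rng.random((6, 64, 64))
pixels = bands.reshape(6, -1).T            # (4096 pixels, 6 bands)
scores = pca_project(pixels, n_components=2)
print(scores.shape)                        # (4096, 2): one 2-D point per pixel
```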
Deep Learning-Based Eye Gaze Controlled Robotic Car
2018 IEEE Region 10 Humanitarian Technology Conference (R10-HTC) Pub Date : 2018-12-01 DOI: 10.1109/R10-HTC.2018.8629836
Dipayan Saha, Munia Ferdoushi, Md. Tanvir Emrose, Subrata Das, S. Hasan, Asir Intisar Khan, C. Shahnaz
{"title":"Deep Learning-Based Eye Gaze Controlled Robotic Car","authors":"Dipayan Saha, Munia Ferdoushi, Md. Tanvir Emrose, Subrata Das, S. Hasan, Asir Intisar Khan, C. Shahnaz","doi":"10.1109/R10-HTC.2018.8629836","DOIUrl":"https://doi.org/10.1109/R10-HTC.2018.8629836","url":null,"abstract":"In recent years Eye gaze tracking (EGT) has emerged as an attractive alternative to conventional communication modes. Gaze estimation can be effectively used in human-computer interaction, assistive devices for motor-disabled persons, autonomous robot control systems, safe car driving, diagnosis of diseases and even in human sentiment assessment. Implementation in any of these areas however mostly depends on the efficiency of detection algorithm along with usability and robustness of detection process. In this context we have proposed a Convolutional Neural Network (CNN) architecture to estimate the eye gaze direction from detected eyes which outperforms all other state of the art results for Eye-Chimera dataset. The overall accuracies are 90.21% and 99.19% for Eye-Chimera and HPEG datasets respectively. This paper also introduces a new dataset EGDC for which proposed algorithm finds 86.93% accuracy. We have developed a real-time eye gaze controlled robotic car as a prototype for possible implementations of our algorithm.","PeriodicalId":404432,"journal":{"name":"2018 IEEE Region 10 Humanitarian Technology Conference (R10-HTC)","volume":"107 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125611181","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 6
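The abstract does not spell out the CNN architecture, so the following PyTorch sketch is only a hypothetical stand-in for a gaze-direction classifier of the kind described: the layer sizes, the 3x36x60 eye-patch input and the seven-class output are assumptions, not the paper's design.

```python
# Hypothetical CNN for classifying gaze direction from eye crops.
# Input resolution, layer sizes and class count are assumptions.
import torch
import torch.nn as nn

class GazeCNN(nn.Module):
    def __init__(self, n_classes: int = 7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                     # 36x60 -> 18x30
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                     # 18x30 -> 9x15
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 9 * 15, 128), nn.ReLU(),
            nn.Linear(128, n_classes),           # logits over gaze directions
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Predicted gaze class for a batch of two dummy eye patches.
model = GazeCNN()
logits = model(torch.randn(2, 3, 36, 60))
print(logits.argmax(dim=1))                      # e.g. tensor([3, 5])
```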
R10-HTC 2018 Author Index
2018 IEEE Region 10 Humanitarian Technology Conference (R10-HTC) Pub Date : 2018-12-01 DOI: 10.1109/r10-htc.2018.8629812
{"title":"R10-HTC 2018 Author Index","authors":"","doi":"10.1109/r10-htc.2018.8629812","DOIUrl":"https://doi.org/10.1109/r10-htc.2018.8629812","url":null,"abstract":"","PeriodicalId":404432,"journal":{"name":"2018 IEEE Region 10 Humanitarian Technology Conference (R10-HTC)","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115267593","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Plastic Waste is Exponentially Filling our Oceans, but where are the Robots?
2018 IEEE Region 10 Humanitarian Technology Conference (R10-HTC) Pub Date : 2018-09-04 DOI: 10.1109/R10-HTC.2018.8629805
Juan Rojas
{"title":"Plastic Waste is Exponentially Filling our Oceans, but where are the Robots?","authors":"Juan Rojas","doi":"10.1109/R10-HTC.2018.8629805","DOIUrl":"https://doi.org/10.1109/R10-HTC.2018.8629805","url":null,"abstract":"Plastic waste is filling our oceans at an exponential rate. The situation is catastrophic and has now garnered worldwide attention. Despite the catastrophic conditions, little to no robotics research is conducted in the identification, collection, sorting, and removal of plastic waste from oceans and rivers and at the macro-and micro-scale. Only a scarce amount of individual efforts can be found from private sources. This paper presents a cursory view of the current plastic water waste catastrophe, associated robot research, and other efforts currently underway to address the issue. As well as the call that as a community, we must wait no longer to address the problem. Surely there is much potential for robots to help meet the challenges posed by the enormity of this problem.","PeriodicalId":404432,"journal":{"name":"2018 IEEE Region 10 Humanitarian Technology Conference (R10-HTC)","volume":"95 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-09-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133108059","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 9