2022 IEEE Sensors Applications Symposium (SAS): Latest Publications

Pupil Detection for Augmented and Virtual Reality based on Images with Reduced Bit Depths
2022 IEEE Sensors Applications Symposium (SAS) | Pub Date: 2022-08-01 | DOI: 10.1109/SAS54819.2022.9881378
Authors: Gernot Fiala, Zhenyu Ye, C. Steger
Abstract: Future augmented reality (AR) and virtual reality (VR) applications will use several kinds of sensors, for example for gesture recognition, head pose tracking, and pupil tracking. All of these sensors send data to a host platform, where the data must be processed in real time. This requires high processing power, which leads to higher energy consumption; lowering that consumption calls for optimizations of the image processing system. This paper investigates pupil detection for AR/VR applications based on images with reduced bit depths. It shows that images reduced even to 3 or 2 bits can be used for pupil detection with almost the same average detection rate. Reducing the bit depth of an image shrinks its memory footprint, which makes in-sensor processing feasible for future image sensors and provides the foundation for future in-sensor processing architectures.
Citations: 2
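The paper's exact quantization scheme is not spelled out in the abstract; a minimal sketch of one common way to reduce bit depth (dropping the least-significant bits of an 8-bit image) shows why the dark-pupil/bright-background contrast that drives detection can survive. The synthetic "eye" image here is purely illustrative.

```python
import numpy as np

def reduce_bit_depth(image: np.ndarray, bits: int) -> np.ndarray:
    """Quantize an 8-bit grayscale image to `bits` bits by discarding
    the least-significant bits (a simple uniform quantizer)."""
    if not 1 <= bits <= 8:
        raise ValueError("bits must be between 1 and 8")
    shift = 8 - bits
    return (image >> shift) << shift  # keep only the top `bits` bits

# Synthetic "eye" image: a dark pupil region on a brighter background.
img = np.full((8, 8), 200, dtype=np.uint8)
img[3:5, 3:5] = 20  # dark pupil

img3 = reduce_bit_depth(img, 3)
# After 3-bit quantization only two levels remain, but the pupil is
# still clearly darker than the background:
print(np.unique(img3))  # → [  0 192]
```

A 3-bit image also needs only 3/8 of the memory of the original, which is the footprint argument the abstract makes for in-sensor processing.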
Active magnetic ranging while drilling: A down-hole surroundings mapping
2022 IEEE Sensors Applications Symposium (SAS) | Pub Date: 2022-08-01 | DOI: 10.1109/SAS54819.2022.9881354
Authors: K. Husby, A. Saasen, J. D. Ytrehus, M. Hjelstuen, T. Eriksen, A. Liberale
Abstract: Active magnetic ranging (AMR) while drilling is an electromagnetic method used to map the subsurface by its conductivity. Subsurface mapping is needed both in the oil and gas industry and in the geothermal drilling industry: in both cases, several wells are drilled close to each other to exploit the full potential of an oil or geothermal reservoir. The challenge with subsurface mapping, compared to through-air radar mapping, is the very low skin depth caused by the high conductivity of the ground; for that reason, existing systems are often limited to very short-range operations. In this paper, methods for range improvement are presented. To maximize the range potential, the operating frequency is reduced, and the efficiency and size of the antennas are increased as much as possible.
Citations: 0
Effects of Lighting and Window Length on Heart Rate Assessment through Video Magnification
2022 IEEE Sensors Applications Symposium (SAS) | Pub Date: 2022-08-01 | DOI: 10.1109/SAS54819.2022.9881347
Authors: L. Kassab, Andrew J. Law, Bruce Wallace, J. Larivière-Chartier, R. Goubran, F. Knoefel
Abstract: Screening people for signs of illness through contactless measurement of vital signs could be beneficial in public transportation settings or long-term care facilities. One solution toward this goal is to use Red/Green/Blue (RGB) video cameras to measure heart rate. In this work, we present results for the assessment of heart rate through Video Magnification (VM) techniques applied to RGB face video recordings from 19 subjects. The work specifically explores (1) the effect of two illumination levels and (2) the effect of window length on the accuracy of heart rate extraction via VM. The results show that higher illumination, obtained by combining halogen light with LED, yielded lower average errors in the VM-measured heart rate. Additionally, increasing the window length from 10 seconds up to 30 seconds improves VM heart rate accuracy when there are small, frequent head movements in the video, but decreases accuracy in the absence of head motion.
Citations: 4
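The abstract does not give the authors' estimator, but the role of window length can be illustrated with a generic sketch: estimate heart rate as the dominant FFT frequency of a pulse-like trace within the physiological band, computed over windows of different lengths. The 0.7–3.0 Hz band and the simulated 72 bpm trace are assumptions for illustration only.

```python
import numpy as np

def estimate_heart_rate(signal, fs, window_s):
    """Estimate heart rate (bpm) from a pulse-like trace by taking the
    dominant FFT frequency within a plausible physiological band."""
    n = int(window_s * fs)
    seg = signal[:n] - np.mean(signal[:n])          # remove DC offset
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(seg))
    band = (freqs >= 0.7) & (freqs <= 3.0)           # ~42–180 bpm
    peak = freqs[band][np.argmax(spectrum[band])]    # strongest component
    return peak * 60.0

fs = 30.0                                # typical RGB camera frame rate
t = np.arange(0, 30, 1 / fs)
trace = np.sin(2 * np.pi * 1.2 * t)      # clean simulated 72 bpm pulse
print(estimate_heart_rate(trace, fs, window_s=10))
print(estimate_heart_rate(trace, fs, window_s=30))
```

A longer window gives finer frequency resolution (fs/n), which is one mechanism behind the window-length trade-off the paper measures; on real, noisy video the head-motion effects the authors report also come into play.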
Towards lightweight deep neural network for smart agriculture on embedded systems
2022 IEEE Sensors Applications Symposium (SAS) | Pub Date: 2022-08-01 | DOI: 10.1109/SAS54819.2022.9881382
Authors: Pengwei Du, T. Polonelli, M. Magno, Zhiyuan Cheng
Abstract: Agriculture is a pillar industry for human survival, yet various diseases threaten crop health and reduce yield. Industry 4.0 is making strides in plant disease prevention and detection, in addition to helping farmers improve plantation income. To prevent crop diseases in time, this paper proposes, implements, and evaluates a low-power smart camera featuring a lightweight neural network that verifies and monitors the growth status of crops. The proposed tiny model combines high accuracy with a complexity optimized for deployment on milliwatt-power microcontrollers. Experimental results show that the model reaches 99% accuracy on a 4-class dataset and more than 96% on a 10-class dataset. The compact model size (139 kB) and low complexity enable ultra-low power consumption (2.63 mW per hour) on the battery-powered Sony Spresense platform, which features a six-core ARM Cortex-M4F.
Citations: 2
A simple and highly sensitive Force Sensor based on modified plastic optical fibers and cantilevers
2022 IEEE Sensors Applications Symposium (SAS) | Pub Date: 2022-08-01 | DOI: 10.1109/SAS54819.2022.9881346
Authors: N. Cennamo, F. Arcadio, V. Marletta, D. D. Prete, B. Andò, L. Zeni, Mario Cesaro, Alfredo De Matteis
Abstract: In this work, a force sensor based on plastic optical fibers (POFs) is realized and tested. The optical sensor system consists of a cantilever made from a spring-steel beam with a modified POF glued to its underside. One end of the cantilever is fixed to the optical desk using a purpose-built support, while a weight is applied at the other end to generate the applied force. The POF is modified with notches to improve the optical performance of the force sensor. A characterization of the sensor system shows a linear behaviour from 50 mN to 300 mN, with a sensitivity of 53.43 mV/N and a resolution of 0.01 N.
Citations: 1
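A linear sensor with a stated sensitivity can be inverted directly: with response V = V_offset + S·F, the force is F = (V − V_offset)/S. The sensitivity below is the 53.43 mV/N from the abstract; the zero-force offset parameter is a hypothetical placeholder, since the abstract does not report one.

```python
def force_from_voltage(v_out_mV, v_offset_mV=0.0, sensitivity_mV_per_N=53.43):
    """Invert the reported linear response V = V_offset + S * F to recover
    the applied force (N) from the sensor output (mV).
    The offset default of 0.0 mV is an assumption, not a reported value."""
    return (v_out_mV - v_offset_mV) / sensitivity_mV_per_N

# 300 mN, the top of the reported linear range, corresponds to about
# 53.43 mV/N * 0.3 N ≈ 16.03 mV above the offset:
print(round(force_from_voltage(16.029), 3))  # → 0.3
```

Note that the full reported linear range (50–300 mN) spans only about 13.4 mV of output, which is why the 0.01 N resolution figure matters in practice.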
Assessment of UWB RTLS for Proximity Hazards Management in Construction Sites
2022 IEEE Sensors Applications Symposium (SAS) | Pub Date: 2022-08-01 | DOI: 10.1109/SAS54819.2022.9881376
Authors: P. Bellagente
Abstract: According to statistics, construction is one of the most dangerous economic sectors in the world. Construction workers are continuously exposed to moving materials and machinery, often in constrained spaces, raising the risk of collision accidents. In this paper, an Ultra-Wide Band (UWB) Real Time Location System (RTLS) designed in a previous work for proximity hazards management in construction sites is described in detail. An extensive outdoor measurement campaign has been carried out on a square grid (15 m x 15 m) with a 1 m step, for a total of 225 positions. For each position, 1000 location measures were collected and the two-dimensional localization resolution was estimated. Results show that the location resolution remains similar across the considered area and could be verified manually by construction workers. In optimal conditions, the resolution lies between 0.01 m and 0.05 m. The results also highlight a major error contribution from radio-frequency reflection interference, which makes it impossible to measure positions under some conditions.
Citations: 0
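The abstract does not define how the two-dimensional resolution was computed from the 1000 fixes per grid point; one plausible estimator, shown here as an illustrative sketch only, is the standard deviation of the radial error around the mean position. The 2 cm noise level in the example is an assumption chosen to fall inside the 0.01–0.05 m range the paper reports.

```python
import numpy as np

def localization_resolution(xy):
    """Estimate 2-D localization resolution from repeated position fixes
    as the standard deviation of the radial error around the mean fix."""
    xy = np.asarray(xy, dtype=float)
    center = xy.mean(axis=0)                       # best estimate of position
    radial = np.linalg.norm(xy - center, axis=1)   # per-fix radial error
    return radial.std()

# 1000 simulated fixes at one grid point with ~2 cm Gaussian noise per axis:
rng = np.random.default_rng(0)
fixes = rng.normal(loc=[7.0, 7.0], scale=0.02, size=(1000, 2))
print(round(localization_resolution(fixes), 4))
```

Running such an estimator per grid point over the 15 m x 15 m, 225-position campaign would yield a resolution map like the one the paper evaluates.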
Development of a neural network to identify plastics using Fluorescence Lifetime Imaging Microscopy
2022 IEEE Sensors Applications Symposium (SAS) | Pub Date: 2022-08-01 | DOI: 10.1109/SAS54819.2022.9881372
Authors: Georgekutty Jose Maniyattu, Eldho Geegy, N. Leiter, Maximilian Wohlschlager, M. Versen, C. Laforsch
Abstract: Plastics have become a major part of daily life. Uncontrolled usage of plastic leads to accumulation in the environment, posing a threat to flora and fauna if it is not recycled correctly. Correctly sorting and recycling the most common plastic types, and identifying plastic in the environment, are therefore important. Fluorescence lifetime imaging microscopy shows high potential for sorting and identifying plastic types. A data-based and an image-based classification are investigated, using the Python programming language, to demonstrate the potential of a neural network based on fluorescence lifetime images to identify plastic types. The results indicate that the data-based classification achieves higher identification accuracy than the image-based classification.
Citations: 2
Feasibility of Measuring Shot Group Using LoRa Technology and YOLO V5
2022 IEEE Sensors Applications Symposium (SAS) | Pub Date: 2022-08-01 | DOI: 10.1109/SAS54819.2022.9881356
Authors: Sanghyun Park, Dongheon Lee, Jisoo Choi, Dohyeon Ko, Minji Lee, Zack Murphy, Nowf Binhowidy, Anthony H. Smith
Abstract: Shooting is a common activity all over the world for both military and recreational purposes. Shooting performance can be measured from the size of the shot group (grouping), which shooters have traditionally calculated by measuring the distances between bullet impacts by hand. This paper aims to create a practical automated shot-group size measuring module that can be used from several kilometers away. It includes an IoT (Internet of Things) system and a mobile application that users can access. LoRa technology is adopted to cover long distances, and YOLO V5 is implemented to detect bullet impacts. Mathematical methods for calculating accurate distances, and the engineering techniques needed to meet the requirements, are described along with experiments under various parameters and conditions. In indoor tests, the proposed module measured the shot group with a mean accuracy of 91.8%. For future work, outdoor tests, which were affected by environmental variables, are expected to give better accuracy.
Citations: 0
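The abstract does not state which group-size metric the module computes; a common convention, used here as an assumed illustration, is the "extreme spread": the largest center-to-center distance between any two bullet impacts (which a detector such as YOLO V5 would supply as coordinates).

```python
from itertools import combinations
from math import dist

def extreme_spread(impacts):
    """Shot-group size as the largest center-to-center distance between
    any two bullet impacts, given their (x, y) coordinates."""
    return max(dist(a, b) for a, b in combinations(impacts, 2))

# Four detected impact centers (units arbitrary, e.g. centimetres):
group = [(0.0, 0.0), (3.0, 4.0), (1.0, 1.0), (2.0, 0.5)]
print(extreme_spread(group))  # → 5.0
```

The pairwise scan is O(n²), which is negligible for the handful of impacts in a typical shot group.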
Live Migration of a 3D Flash LiDAR System between two Independent Data Processing Systems with Redundant Design
2022 IEEE Sensors Applications Symposium (SAS) | Pub Date: 2022-08-01 | DOI: 10.1109/SAS54819.2022.9881255
Authors: Philipp Stelzer, Sebastian Reicher, Georg Macher, C. Steger, Raphael Schermann
Abstract: Self-driving and self-flying vehicles can drive or fly independently without the intervention of an operator. To do so, these vehicles need sensors for environment perception and safety-critical data processing systems to process the raw data obtained from those sensors. If such safety-critical systems fail, the consequences can be fatal for human lives and/or the environment, especially in highly automated vehicles; a total failure of these systems is one of the worst scenarios in an automated vehicle. Such safety-critical systems are therefore often designed redundantly to prevent a total failure of environment perception. To ensure that the vehicle can continue to operate safely, however, the live migration from one system to the other must be carried out with as little downtime as possible. In this publication, we present a concept for live migration of a 3D Flash LiDAR between two independent data processing systems with a redundant design. The concept provides a solution for highly automated vehicles to remain fail-operational in case one of the redundant data processing systems fails. Results obtained from the implemented concept, without specifically addressing performance, are also provided to demonstrate feasibility.
Citations: 0
Evaluation of the quality of LiDAR data in the varying ambient light
2022 IEEE Sensors Applications Symposium (SAS) | Pub Date: 2022-08-01 | DOI: 10.1109/SAS54819.2022.9881373
Authors: Bhaskar Anand, Harshal Verma, A. Thakur, Parvez Alam, P. Rajalakshmi
Abstract: Light detection and ranging (LiDAR) is a widely used sensor for intelligent transportation systems (ITS); it precisely determines the depth of the objects around a vehicle. In this paper, the effect of ambient light on the quality of acquired LiDAR data is presented. Data was captured at different times of day with varied light conditions: partial light in the early morning and evening, no light at night, and full light at mid-day. On the point clouds acquired at these four times, segmentation of an object (a person, in this experiment) was performed. The number of object points and the point density were observed to examine whether light affects the quality of LiDAR data. The results of the experiments suggest that variation in ambient light has little or no effect on the quality of LiDAR data.
Citations: 5
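The paper's segmentation method is not described in the abstract; the quality metrics it compares (object point count and point density) can be illustrated with a minimal sketch in which an axis-aligned bounding box stands in for the real segmentation step. The toy cloud and box dimensions are assumptions for illustration.

```python
import numpy as np

def object_point_density(points, bbox_min, bbox_max):
    """Count LiDAR returns inside an axis-aligned bounding box (a crude
    stand-in for object segmentation) and report the count and the
    density in points per cubic metre."""
    pts = np.asarray(points, dtype=float)
    lo = np.asarray(bbox_min, dtype=float)
    hi = np.asarray(bbox_max, dtype=float)
    inside = np.all((pts >= lo) & (pts <= hi), axis=1)
    n = int(inside.sum())
    return n, n / float(np.prod(hi - lo))

# Toy cloud: two returns on the "person", one background return.
cloud = [[0.5, 0.5, 1.0], [0.2, 0.3, 0.9], [5.0, 5.0, 5.0]]
count, density = object_point_density(cloud, (0, 0, 0), (1, 1, 2))
print(count, density)  # → 2 1.0
```

Comparing these two numbers for the same object across the four lighting conditions is the essence of the evaluation the paper reports.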