UHead: Driver Attention Monitoring System Using UWB Radar
Chongzhi Xu, Xiaolong Zheng, Z. Ren, Liang Liu, Huadong Ma
Proc. ACM Interact. Mob. Wearable Ubiquitous Technol., March 2024. DOI: 10.1145/3643551
Abstract
The focus of advanced driver-assistance systems (ADAS) is extending from vehicle and road conditions to the driver, because driver attention is critical to driving safety. Although existing sensor- and camera-based methods can monitor driver attention, they rely on specialised hardware and favourable environmental conditions. In this paper, we aim to develop an effective and easy-to-use driver attention monitoring system based on UWB radar. We exploit the strong association between head motion and driver attention and propose UHead, a system that infers driver attention by monitoring the direction and angle of the driver's head rotation. The core idea is to extract a rotational time-frequency representation from the reflected signals and to estimate head rotation angles from the complex head reflections. To eliminate the dynamic noise generated by other body parts, UHead leverages the large magnitude and high velocity of head rotation to separate head motion information from the dynamically coupled reflections. UHead uses a bilinear joint time-frequency representation to avoid the loss of time and frequency resolution caused by the windowing used in traditional methods. We also design a head-structure-based rotation angle estimation algorithm that accurately estimates the rotation angle from the time-varying rotation information of multiple reflection points on the head. Experimental results show that UHead achieves a median 3D head rotation angle estimation error of 12.96° in real vehicle scenes.
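To illustrate the bilinear joint time-frequency representation idea mentioned in the abstract, the sketch below computes a discrete Wigner-Ville distribution, one standard bilinear representation, for a toy chirp standing in for the Doppler signature of a head rotation. This is only a minimal sketch under that assumption; the specific bilinear form, signal names, and parameters are illustrative and are not taken from the paper.

```python
# Minimal sketch (assumption): a discrete Wigner-Ville distribution, one common
# bilinear joint time-frequency representation. The paper's exact bilinear form
# and preprocessing are not specified here; signal names are illustrative.
import numpy as np
from scipy.signal import hilbert

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of a real 1-D signal.

    Returns an (N, N) array whose rows are frequency bins and columns are time.
    Unlike an STFT, no window is applied, so time and frequency resolution are
    not traded against each other (at the cost of cross terms).
    """
    z = hilbert(np.asarray(x, dtype=float))  # analytic signal
    n = len(z)
    acf = np.zeros((n, n), dtype=complex)    # instantaneous autocorrelation
    for t in range(n):
        tau_max = min(t, n - 1 - t)          # largest symmetric lag at time t
        tau = np.arange(-tau_max, tau_max + 1)
        # r(t, tau) = z(t + tau) * conj(z(t - tau))
        acf[tau % n, t] = z[t + tau] * np.conj(z[t - tau])
    # FFT over the lag axis turns lags into frequency bins
    return np.real(np.fft.fft(acf, axis=0))

# Toy usage: a chirp whose frequency rises over time, loosely mimicking the
# Doppler shift of an accelerating head rotation.
fs = 500.0
t = np.arange(0, 1, 1 / fs)
sig = np.cos(2 * np.pi * (20 * t + 60 * t ** 2))
tfr = wigner_ville(sig)
print(tfr.shape)  # (500, 500)
```

In practice a smoothed variant (e.g., a pseudo- or smoothed-pseudo-Wigner-Ville distribution) is often preferred, since pure bilinear distributions introduce cross terms when several reflectors contribute to the same signal; whether UHead applies such smoothing is not stated in the abstract.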