{"title":"驾驶辅助中的驾驶员感知模型","authors":"Renzhi Tang, Zhihao Jiang","doi":"10.1109/QRS-C51114.2020.00047","DOIUrl":null,"url":null,"abstract":"Vision is the primary way to perceive the environment during driving. However, due to its low spatial and temporal resolution, a driver may fail to perceive agents on the road, which may lead to collisions. Modern vehicles are equipped with sensors that can better perceive the driving environment, as well as ADAS to provide driving assist. However, ADAS does not consider the driver's perception, which may result in unnecessary warnings or actions against the driver's will. These false-positives may cause distractions and confusions in complex driving scenarios, which pose safety threat. In this project, we proposed a driving assist system which can reduce the number of unnecessary warnings by taking into account the driver's perception of the driving environment. The driver's perception model combines estimation of driving environment update and driver's observation. The driver's observation is obtained from gaze tracking and the driving environment update is estimated based on the last observation. In this paper, we formulated inference problem on the driver's perception, and developed a virtual driving simulator to evaluate the feasibility of the system.","PeriodicalId":358174,"journal":{"name":"2020 IEEE 20th International Conference on Software Quality, Reliability and Security Companion (QRS-C)","volume":"59 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Driver's Perception Model in Driving Assist\",\"authors\":\"Renzhi Tang, Zhihao Jiang\",\"doi\":\"10.1109/QRS-C51114.2020.00047\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Vision is the primary way to perceive the environment during driving. However, due to its low spatial and temporal resolution, a driver may fail to perceive agents on the road, which may lead to collisions. Modern vehicles are equipped with sensors that can better perceive the driving environment, as well as ADAS to provide driving assist. However, ADAS does not consider the driver's perception, which may result in unnecessary warnings or actions against the driver's will. These false-positives may cause distractions and confusions in complex driving scenarios, which pose safety threat. In this project, we proposed a driving assist system which can reduce the number of unnecessary warnings by taking into account the driver's perception of the driving environment. The driver's perception model combines estimation of driving environment update and driver's observation. The driver's observation is obtained from gaze tracking and the driving environment update is estimated based on the last observation. 
In this paper, we formulated inference problem on the driver's perception, and developed a virtual driving simulator to evaluate the feasibility of the system.\",\"PeriodicalId\":358174,\"journal\":{\"name\":\"2020 IEEE 20th International Conference on Software Quality, Reliability and Security Companion (QRS-C)\",\"volume\":\"59 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 IEEE 20th International Conference on Software Quality, Reliability and Security Companion (QRS-C)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/QRS-C51114.2020.00047\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE 20th International Conference on Software Quality, Reliability and Security Companion (QRS-C)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/QRS-C51114.2020.00047","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: Vision is the primary way drivers perceive the environment. However, because of its limited spatial and temporal resolution, a driver may fail to perceive agents on the road, which can lead to collisions. Modern vehicles are equipped with sensors that perceive the driving environment more reliably, together with advanced driver-assistance systems (ADAS) that provide driving assistance. However, ADAS does not consider the driver's own perception, which can result in unnecessary warnings or in actions taken against the driver's will. These false positives can cause distraction and confusion in complex driving scenarios and therefore pose a safety threat. In this project, we propose a driving-assist system that reduces the number of unnecessary warnings by taking the driver's perception of the driving environment into account. The driver's perception model combines an estimate of how the driving environment has changed with the driver's observations: the observations are obtained from gaze tracking, and the environment update is estimated from the last observation. In this paper, we formulate an inference problem over the driver's perception and develop a virtual driving simulator to evaluate the feasibility of the system.
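
The abstract describes the core mechanism only at a high level: the driver's awareness of each road agent is inferred from gaze-tracking observations combined with a decayed estimate carried forward from the last observation, and a warning is issued only when the driver is unlikely to have perceived the hazard. The Python sketch below is one possible illustration of that idea, not the paper's actual formulation; the data structures, the exponential decay with a 3 s half-life, the 2 m gaze radius, and the 0.5 awareness threshold are all assumptions introduced here for concreteness.

    """Illustrative sketch (assumptions, not the authors' implementation) of a
    perception-aware warning filter: driver awareness of each agent is raised by
    gaze observations, decays over time, and gates ADAS warnings."""
    import math
    from dataclasses import dataclass, field

    @dataclass
    class Agent:
        agent_id: str
        position: tuple  # (x, y) in the vehicle frame, metres

    @dataclass
    class PerceptionState:
        # per-agent probability that the driver is currently aware of the agent
        awareness: dict = field(default_factory=dict)
        last_update: dict = field(default_factory=dict)

    def gaze_hits(gaze_point, agent, radius=2.0):
        """Assumed observation model: the agent counts as observed when the
        projected gaze point falls within `radius` metres of it."""
        gx, gy = gaze_point
        ax, ay = agent.position
        return math.hypot(gx - ax, gy - ay) <= radius

    def update_perception(state, agents, gaze_point, t, half_life=3.0):
        """Combine the gaze observation with a decayed estimate from the last
        observation (a stand-in for the paper's environment-update step)."""
        for agent in agents:
            prev = state.awareness.get(agent.agent_id, 0.0)
            dt = t - state.last_update.get(agent.agent_id, t)
            decayed = prev * 0.5 ** (dt / half_life)  # awareness fades between looks
            if gaze_hits(gaze_point, agent):
                decayed = 1.0  # fresh observation: driver has seen this agent
            state.awareness[agent.agent_id] = decayed
            state.last_update[agent.agent_id] = t

    def should_warn(state, hazard_agent, threshold=0.5):
        """Warn only when the ADAS-detected hazard is one the driver is
        unlikely to have perceived."""
        return state.awareness.get(hazard_agent.agent_id, 0.0) < threshold

In a simulator such as the one the authors describe, update_perception would run at each gaze sample and should_warn would be consulted before ADAS raises an alert, which is how the system could suppress warnings about agents the driver has already seen.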