{"title":"驾驶模拟器中同步生理和行为传感器","authors":"R. Taib, Benjamin Itzstein, Kun Yu","doi":"10.1145/2663204.2663262","DOIUrl":null,"url":null,"abstract":"Accurate and noise robust multimodal activity and mental state monitoring can be achieved by combining physiological, behavioural and environmental signals. This is especially promising in assistive driving technologies, because vehicles now ship with sensors ranging from wheel and pedal activity, to voice and eye tracking. In practice, however, multimodal user studies are confronted with challenging data collection and synchronisation issues, due to the diversity of sensing, acquisition and storage systems. Referencing current research on cognitive load measurement in a driving simulator, this paper describes the steps we take to consistently collect and synchronise signals, using the Orbit Measurement Library (OML) framework, combined with a multimodal version of a cinema clapperboard. The resulting data is automatically stored in a networked database, in a structured format, including metadata about the data and experiment. Moreover, fine-grained synchronisation between all signals is provided without additional hardware, and clock drift can be corrected post-hoc.","PeriodicalId":389037,"journal":{"name":"Proceedings of the 16th International Conference on Multimodal Interaction","volume":"75 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Synchronising Physiological and Behavioural Sensors in a Driving Simulator\",\"authors\":\"R. Taib, Benjamin Itzstein, Kun Yu\",\"doi\":\"10.1145/2663204.2663262\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Accurate and noise robust multimodal activity and mental state monitoring can be achieved by combining physiological, behavioural and environmental signals. This is especially promising in assistive driving technologies, because vehicles now ship with sensors ranging from wheel and pedal activity, to voice and eye tracking. In practice, however, multimodal user studies are confronted with challenging data collection and synchronisation issues, due to the diversity of sensing, acquisition and storage systems. Referencing current research on cognitive load measurement in a driving simulator, this paper describes the steps we take to consistently collect and synchronise signals, using the Orbit Measurement Library (OML) framework, combined with a multimodal version of a cinema clapperboard. The resulting data is automatically stored in a networked database, in a structured format, including metadata about the data and experiment. 
Moreover, fine-grained synchronisation between all signals is provided without additional hardware, and clock drift can be corrected post-hoc.\",\"PeriodicalId\":389037,\"journal\":{\"name\":\"Proceedings of the 16th International Conference on Multimodal Interaction\",\"volume\":\"75 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2014-11-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 16th International Conference on Multimodal Interaction\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2663204.2663262\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 16th International Conference on Multimodal Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2663204.2663262","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Synchronising Physiological and Behavioural Sensors in a Driving Simulator
Accurate and noise-robust multimodal activity and mental state monitoring can be achieved by combining physiological, behavioural and environmental signals. This is especially promising for assistive driving technologies, because vehicles now ship with sensors ranging from wheel and pedal activity to voice and eye tracking. In practice, however, multimodal user studies face challenging data collection and synchronisation issues due to the diversity of sensing, acquisition and storage systems. Drawing on current research on cognitive load measurement in a driving simulator, this paper describes the steps we take to consistently collect and synchronise signals, using the Orbit Measurement Library (OML) framework combined with a multimodal version of a cinema clapperboard. The resulting data is automatically stored in a networked database in a structured format, including metadata about the data and the experiment. Moreover, fine-grained synchronisation between all signals is provided without additional hardware, and clock drift can be corrected post hoc.
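The abstract describes post-hoc clock drift correction only at a high level. As an illustration of the general idea, the sketch below (not the authors' implementation; all names and numbers are hypothetical) fits a linear mapping between a sensor's local clock and a reference clock using shared clapperboard-style events visible in both streams, then re-expresses the sensor's timestamps on the reference clock.

```python
# Hypothetical sketch of post-hoc clock drift correction between a sensor
# stream and a reference clock, anchored on shared "clapperboard" events.
# Illustrative only; not the paper's OML-based implementation.
import numpy as np

def fit_clock_mapping(anchor_times_sensor, anchor_times_reference):
    """Least-squares linear fit mapping the sensor's local clock onto the
    reference clock: t_ref ~= slope * t_sensor + offset.
    A slope different from 1 captures drift; the offset captures skew."""
    slope, offset = np.polyfit(anchor_times_sensor, anchor_times_reference, deg=1)
    return slope, offset

def correct_timestamps(timestamps_sensor, slope, offset):
    """Re-express sensor timestamps on the reference clock."""
    return slope * np.asarray(timestamps_sensor, dtype=float) + offset

if __name__ == "__main__":
    # Clapperboard events as seen on the reference clock (seconds)...
    t_ref = np.array([10.0, 310.0, 610.0, 910.0])
    # ...and by a drifting sensor that runs ~50 ppm fast with a +2.5 s skew.
    t_sensor = np.array([12.5005, 312.5155, 612.5305, 912.5455])

    slope, offset = fit_clock_mapping(t_sensor, t_ref)
    corrected = correct_timestamps(t_sensor, slope, offset)
    print("slope=%.6f offset=%.3f" % (slope, offset))
    print("residuals (s):", corrected - t_ref)  # near zero after correction
```

With several anchor events spread across a session, the linear fit separates constant offset from gradual drift, which is why the correction can be applied after the recording rather than requiring synchronisation hardware at capture time.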