{"title":"“好了,DeMille先生,我准备好特写了:”为沉浸式分析的视频添加用户操作的意义","authors":"A. Batch, N. Elmqvist","doi":"10.1109/MLUI52769.2019.10075557","DOIUrl":null,"url":null,"abstract":"While the use of machine learning and computer vision to classify human behavior has grown into a large, well-established, interdisciplinary area of research, one area that is somewhat overlooked is the intersection of computer vision as a tool for evaluating user behavior in Virtual Reality, particularly in the context of immersive analytics and visualization. We draw on the literature from pattern recognition, computer vision, and machine learning to compose a simple, comparatively resource-cheap pipeline for camera-based extraction of features of professional analyst users and of their sessions in an existing VR visualization system, ImAxes. Our results show high accuracy in predicting self-reported features of the users, even as survey responses about user experience with the immersive interface are somewhat ambiguous in varying based on these features.","PeriodicalId":297242,"journal":{"name":"2019 IEEE Workshop on Machine Learning from User Interaction for Visualization and Analytics (MLUI)","volume":"272 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"“All Right, Mr. DeMille, I’m Ready for My Closeup:” Adding Meaning to User Actions from Video for Immersive Analytics\",\"authors\":\"A. Batch, N. Elmqvist\",\"doi\":\"10.1109/MLUI52769.2019.10075557\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"While the use of machine learning and computer vision to classify human behavior has grown into a large, well-established, interdisciplinary area of research, one area that is somewhat overlooked is the intersection of computer vision as a tool for evaluating user behavior in Virtual Reality, particularly in the context of immersive analytics and visualization. We draw on the literature from pattern recognition, computer vision, and machine learning to compose a simple, comparatively resource-cheap pipeline for camera-based extraction of features of professional analyst users and of their sessions in an existing VR visualization system, ImAxes. 
Our results show high accuracy in predicting self-reported features of the users, even as survey responses about user experience with the immersive interface are somewhat ambiguous in varying based on these features.\",\"PeriodicalId\":297242,\"journal\":{\"name\":\"2019 IEEE Workshop on Machine Learning from User Interaction for Visualization and Analytics (MLUI)\",\"volume\":\"272 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-10-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 IEEE Workshop on Machine Learning from User Interaction for Visualization and Analytics (MLUI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/MLUI52769.2019.10075557\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE Workshop on Machine Learning from User Interaction for Visualization and Analytics (MLUI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MLUI52769.2019.10075557","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
While the use of machine learning and computer vision to classify human behavior has grown into a large, well-established, interdisciplinary area of research, one somewhat overlooked area is the use of computer vision as a tool for evaluating user behavior in Virtual Reality, particularly in the context of immersive analytics and visualization. We draw on the literature from pattern recognition, computer vision, and machine learning to compose a simple, comparatively resource-cheap pipeline for camera-based extraction of features of professional analyst users and of their sessions in an existing VR visualization system, ImAxes. Our results show high accuracy in predicting self-reported features of the users, even though survey responses about user experience with the immersive interface remain somewhat ambiguous, varying with these features.
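The pipeline described in the abstract follows a general pattern: extract features from camera recordings of analyst sessions, then use a supervised model to predict self-reported attributes of the users. As a purely illustrative sketch of that pattern (not the authors' actual pipeline), the following assumes OpenCV and scikit-learn, with hypothetical video file names, labels, and feature choices:

```python
# Illustrative only: crude motion-energy features per session video, then a
# classifier predicting a hypothetical self-reported attribute (e.g., prior VR use).
import cv2
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def session_features(video_path, max_frames=1000):
    """Summarize one session video as a fixed-length vector of
    frame-to-frame motion-energy statistics (a proxy for user activity)."""
    cap = cv2.VideoCapture(video_path)
    prev_gray, energies = None, []
    while len(energies) < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev_gray is not None:
            energies.append(float(np.mean(cv2.absdiff(gray, prev_gray))))
        prev_gray = gray
    cap.release()
    e = np.array(energies) if energies else np.zeros(1)
    return np.array([e.mean(), e.std(), e.max(), np.percentile(e, 90)])

# Hypothetical data: one recorded video per analyst session, one
# self-reported binary label per session.
videos = ["session_01.mp4", "session_02.mp4", "session_03.mp4", "session_04.mp4"]
labels = [0, 1, 1, 0]

X = np.vstack([session_features(v) for v in videos])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print(clf.predict(X))  # in practice, evaluate on held-out sessions
```

Richer pipelines would typically replace the motion-energy proxy with pose- or gesture-level features and evaluate with cross-validation over sessions, but the overall structure (per-session feature vector in, predicted self-reported attribute out) stays the same.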