{"title":"360度图像的头部和眼睛运动数据集","authors":"Yashas Rai, Jesús Gutiérrez, P. Callet","doi":"10.1145/3083187.3083218","DOIUrl":null,"url":null,"abstract":"Understanding how observers watch visual stimuli like Images and Videos has helped the multimedia encoding, transmission, quality assessment and rendering communities immensely, to learn the regions important to an observer and provide to him/her an optimum quality of experience. The problem is even more paramount in case of 360 degree stimuli considering that most/a part of the content might not be seen by the observers at all, while other regions maybe extraordinarily important. Attention studies in this area has however been missing, mainly due to the lack of a dataset and guidelines to evaluate and compare visual attention/saliency in such scenarios. In this work, we present a dataset of sixty different 360 degree images, each watched by at-least 40 observers. Additionally, we also provide guidelines and tools to the community regarding the procedure to evaluate and compare saliency in omni-directional images. Some basic image/ observer agnostic viewing characteristics, like variation of exploration strategies with time and expertise, and also the effect of eye-movement within the view-port are explored. The dataset and tools are made available for free use by the community and is expected to promote Reproducible Research for all future work on computational modeling of attention in 360 scenarios.","PeriodicalId":123321,"journal":{"name":"Proceedings of the 8th ACM on Multimedia Systems Conference","volume":"60 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-06-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"170","resultStr":"{\"title\":\"A Dataset of Head and Eye Movements for 360 Degree Images\",\"authors\":\"Yashas Rai, Jesús Gutiérrez, P. Callet\",\"doi\":\"10.1145/3083187.3083218\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Understanding how observers watch visual stimuli like Images and Videos has helped the multimedia encoding, transmission, quality assessment and rendering communities immensely, to learn the regions important to an observer and provide to him/her an optimum quality of experience. The problem is even more paramount in case of 360 degree stimuli considering that most/a part of the content might not be seen by the observers at all, while other regions maybe extraordinarily important. Attention studies in this area has however been missing, mainly due to the lack of a dataset and guidelines to evaluate and compare visual attention/saliency in such scenarios. In this work, we present a dataset of sixty different 360 degree images, each watched by at-least 40 observers. Additionally, we also provide guidelines and tools to the community regarding the procedure to evaluate and compare saliency in omni-directional images. Some basic image/ observer agnostic viewing characteristics, like variation of exploration strategies with time and expertise, and also the effect of eye-movement within the view-port are explored. 
The dataset and tools are made available for free use by the community and is expected to promote Reproducible Research for all future work on computational modeling of attention in 360 scenarios.\",\"PeriodicalId\":123321,\"journal\":{\"name\":\"Proceedings of the 8th ACM on Multimedia Systems Conference\",\"volume\":\"60 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-06-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"170\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 8th ACM on Multimedia Systems Conference\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3083187.3083218\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 8th ACM on Multimedia Systems Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3083187.3083218","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A Dataset of Head and Eye Movements for 360 Degree Images
Understanding how observers watch visual stimuli such as images and videos has helped the multimedia encoding, transmission, quality assessment and rendering communities immensely, by identifying the regions that matter most to an observer and thereby providing an optimal quality of experience. The problem is even more pressing for 360 degree stimuli, since a large part of the content may never be seen by observers at all, while other regions may be extraordinarily important. Attention studies in this area have, however, been missing, mainly due to the lack of a dataset and of guidelines to evaluate and compare visual attention/saliency in such scenarios. In this work, we present a dataset of sixty different 360 degree images, each watched by at least 40 observers. Additionally, we provide guidelines and tools to the community regarding the procedure for evaluating and comparing saliency in omnidirectional images. Some basic image- and observer-agnostic viewing characteristics are also explored, such as the variation of exploration strategies with time and expertise, and the effect of eye movement within the viewport. The dataset and tools are made freely available to the community and are expected to promote reproducible research for all future work on the computational modeling of attention in 360 degree scenarios.
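The released tools implement the paper's actual evaluation procedure; purely as a rough, unofficial sketch of the kind of correction omnidirectional saliency evaluation requires, the Python snippet below compares a predicted saliency map against a ground-truth map in equirectangular form. Because an equirectangular projection heavily oversamples the poles, each pixel is weighted by the solid angle it covers (proportional to the cosine of its latitude) before a Pearson correlation is computed. The function names solid_angle_weights and weighted_cc are illustrative assumptions, not names from the authors' toolset.

    import numpy as np

    def solid_angle_weights(height, width):
        # Latitude of each row's center, from +pi/2 (top) to -pi/2 (bottom).
        lat = np.pi / 2 - (np.arange(height) + 0.5) / height * np.pi
        # A pixel's solid angle on the sphere is proportional to cos(latitude).
        return np.tile(np.cos(lat)[:, None], (1, width))

    def weighted_cc(pred, gt):
        # Solid-angle-weighted Pearson correlation between two saliency maps,
        # given as 2-D arrays in equirectangular projection (hypothetical metric
        # sketch, not the paper's released implementation).
        w = solid_angle_weights(*pred.shape)
        w = w / w.sum()
        mean_p = (w * pred).sum()
        mean_g = (w * gt).sum()
        cov = (w * (pred - mean_p) * (gt - mean_g)).sum()
        std_p = np.sqrt((w * (pred - mean_p) ** 2).sum())
        std_g = np.sqrt((w * (gt - mean_g) ** 2).sum())
        return cov / (std_p * std_g)

An unweighted correlation over the raw pixel grid would let the stretched polar rows dominate the score, which is exactly why metrics for omnidirectional images need projection-aware weighting of this kind.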