CoSense: Creating Shared Emotional Experiences
S. Ayyagari, Kunal Gupta, Matthew Tait, M. Billinghurst
Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, April 18, 2015
DOI: 10.1145/2702613.2732839
In this paper we describe a prototype wearable interface that shares a user's first-person view and current emotional state with a remote user in order to create a shared emotional experience. A user evaluation was conducted to explore which interface cues best helped the remote user understand what the local user was feeling. The results showed that simple visual cues provided a significantly better experience than either no cues at all or a more detailed data representation.