{"title":"Sensitive suggestion and perception of climatic effects in virtual urban environments","authors":"Toinon Vigier, D. Siret, G. Moreau, L. Lescop","doi":"10.1145/2492494.2501896","DOIUrl":"https://doi.org/10.1145/2492494.2501896","url":null,"abstract":"Climatic effects represent the physical signals (temperature, wind, humidity, light) which can have a perceptual impact on human beings. They significantly influence the perception and use of urban spaces. Nevertheless, while virtual environments have proven to be good tools for assessing urban projects with non-expert people or for studying perception in cities, climatic atmospheres are very seldom integrated into virtual urban models [Drettakis et al. 2007; Tahrani and Moreau 2007]. Indeed, virtual urban environments are represented in ideal situations: the sky is blue, there are a few white clouds, the trees are leafy or flowering and the sun is shining. Thus, participants can rarely experience various seasons, times of day or weather conditions, or evaluate the urban design under different climatic or lighting situations.","PeriodicalId":102213,"journal":{"name":"Proceedings of the ACM Symposium on Applied Perception","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114958093","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Auditory distance perception in real and virtual environments","authors":"S. Moulin, R. Nicol, Lætitia Gros","doi":"10.1145/2492494.2501876","DOIUrl":"https://doi.org/10.1145/2492494.2501876","url":null,"abstract":"The digital entertainment industry has recently undergone a major change with the generalisation of 3D video technologies (movies, TV, smartphones). Adding visual depth significantly impacts our multimedia experience. However, little is known about the sound technology to associate with this new dimension. Interactions between audio and visual spatial information are poorly investigated with respect to the distance dimension. Before dealing with this bimodal (audio-visual) problem, we need to study each modality independently.","PeriodicalId":102213,"journal":{"name":"Proceedings of the ACM Symposium on Applied Perception","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126148519","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Understanding viewers' involuntary behaviors for adaptive digital signage","authors":"Kenichi Nagao, I. Fujishiro","doi":"10.1145/2492494.2501903","DOIUrl":"https://doi.org/10.1145/2492494.2501903","url":null,"abstract":"Digital signage has been getting more popular due to the recent development of underlying hardware technology and improvements in installation environments. In order to deliver a sender's message effectively, it is necessary to gauge viewers' evaluations of the content actually displayed by the digital signage and use those evaluations to better adapt it to local viewers' tastes. However, previous survey methods such as interviewing require considerable human resources, and thus there have been research attempts to gather viewers' evaluations of digital signage automatically. For example, some digital signage systems use a fixed video camera for the evaluation: they recognize viewers' faces and infer their demographic segments, such as gender and age, though these attributes are not related to viewers' actual feelings towards the signage.","PeriodicalId":102213,"journal":{"name":"Proceedings of the ACM Symposium on Applied Perception","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132215711","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Contextual temporal features of the flash illusion","authors":"P. Csibri, P. Kaposvári, G. Sáry","doi":"10.1145/2492494.2501879","DOIUrl":"https://doi.org/10.1145/2492494.2501879","url":null,"abstract":"Simultaneous processing of pieces of information spread in space and/or time depends on multiple factors [Chatterjee et al. 2011]. Studies of contextual effects on visual target perception generally address spatial variables. For a deeper understanding of contextual effects in the temporal domain, this study aims to gain information about one of the unimodal illusions, the phantom flash. Here, a single target flash accompanied by multiple inducer flashes can be perceived as several flashes.","PeriodicalId":102213,"journal":{"name":"Proceedings of the ACM Symposium on Applied Perception","volume":"47 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114834943","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"HapticLib: a haptic feedback library for embedded platforms","authors":"Leonardo Guardati, Silvio Vallorani, B. Milosevic, Elisabetta Farella, L. Benini","doi":"10.1145/2492494.2501882","DOIUrl":"https://doi.org/10.1145/2492494.2501882","url":null,"abstract":"Mobile and wearable embedded devices connect the user with digital information in a continuous and pervasive way. A key benefit is given by the possibility to exploit multi-modal interaction capabilities that can dynamically act on different human senses and the cooperative capabilities of the small and pervasive devices. In this scenario we present HapticLib, a software library for the development and implementation of vibro-tactile feedback on resource-constrained embedded devices. It was designed to offer a high-level programming interface for the rendering of haptic patterns, accurately modeling the nature of vibro-tactile actuators and different touch experiences.","PeriodicalId":102213,"journal":{"name":"Proceedings of the ACM Symposium on Applied Perception","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125480034","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluating the effect of emotion on gender recognition in virtual humans","authors":"Katja Zibrek, Ludovic Hoyet, K. Ruhland, R. Mcdonnell","doi":"10.1145/2492494.2492510","DOIUrl":"https://doi.org/10.1145/2492494.2492510","url":null,"abstract":"In this paper, we investigate the ability of humans to determine the gender of conversing characters, based on facial and body cues for emotion. We used a corpus of simultaneously captured facial and body motions from four male and four female actors. In our Gender Rating task, participants were asked to rate how male or female they considered the motions to be, under different emotional states. In our Emotion Recognition task, participants were asked to classify the emotions, in order to determine how accurately perceived those emotions were. We found that gender perception was affected by emotion, where certain emotions facilitated gender determination while others masked it. We also found that there was no correlation between how accurate an emotion was portrayed and how much gender information was present in that motion. Finally, we found that the model used to display the motion did not affect gender perception of motion but did alter emotion recognition.","PeriodicalId":102213,"journal":{"name":"Proceedings of the ACM Symposium on Applied Perception","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114935842","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Recognizing your own motions on virtual avatars: is it me or not?","authors":"Anna C. Wellerdiek, Markus Leyrer, Ekaterina P. Volkova, D-S Chang, B. Mohler","doi":"10.1145/2492494.2501895","DOIUrl":"https://doi.org/10.1145/2492494.2501895","url":null,"abstract":"Point-light figures, which present motions by displaying only the moving joints of the actor, are most often used for motion recognition. In this study, we were interested in whether self-recognition of motion changes with different representations. First, we captured participants' motions and remapped them onto a point-light figure and onto a male and a female virtual avatar. In the second part, the same participants were asked to recognize their own motions on all three representations. We found that the recognition rate for one's own motions is high across all representations and actions. The recognition rate was higher on the point-light figure, despite it being perceived as the most difficult representation by the participants. The gender of the virtual avatar did not affect self-recognition.","PeriodicalId":102213,"journal":{"name":"Proceedings of the ACM Symposium on Applied Perception","volume":"102 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130634813","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Perception of emotional body expressions in narrative scenarios","authors":"Ekaterina P. Volkova, B. Mohler, Trevor J. Dodds, J. Tesch, H. Bülthoff","doi":"10.1145/2492494.2501892","DOIUrl":"https://doi.org/10.1145/2492494.2501892","url":null,"abstract":"People use body motion to express and recognise emotions. We investigated whether emotional body expressions can be recognised when they are recorded during natural narration, where actors freely express the emotional colouring of a story told. We then took only the upper body motion trajectories and presented them to participants in the form of animated stick figures. The observers were asked to categorise the emotions expressed in short motion sequences. The results show that recognition level of eleven emotions shown via upper body is significantly above chance level and the responses to motion sequences are consistent across observers.","PeriodicalId":102213,"journal":{"name":"Proceedings of the ACM Symposium on Applied Perception","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122019300","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Effects of ageing and sound on perceived timing of human interactions","authors":"E. Roudaia, Ludovic Hoyet, David P. McGovern, C. O'Sullivan, F. Newell","doi":"10.1145/2492494.2501898","DOIUrl":"https://doi.org/10.1145/2492494.2501898","url":null,"abstract":"Variations in the timing and speed of movements in human interactions carry important social information. For example, seeing one player push another player at a football game, we can deduce whether the player being pushed resisted or anticipated the oncoming push based on subtle differences in the timing and velocity of movements of both players. With the development of computer animations of human characters, it is important to understand the sensitivity and limits of human perception in such interactions to accurately portray human interactions.","PeriodicalId":102213,"journal":{"name":"Proceedings of the ACM Symposium on Applied Perception","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125168621","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The meaning profiles of sensory information and experiences","authors":"S. Kreitler, M. Kreitler","doi":"10.1145/2492494.2501901","DOIUrl":"https://doi.org/10.1145/2492494.2501901","url":null,"abstract":"The objectives of the study were to assess the meanings of sensory inputs in terms of the meaning system (Kreitler & Kreitler, 1990) and to examine the extent to which these inputs share the meanings assigned to them or represent unique meaning profiles. Meaning is defined as a referent-based pattern of cognitive contents, characterized in terms of meaning variables representing contents, forms and types of relation, referent shifts and different forms of expression. The present study was based on using only the part of the system of meaning which refers to meaning dimensions, namely, categories of contents, such as the characteristics of the shape, location, time, function and emotions related to the stimulus.","PeriodicalId":102213,"journal":{"name":"Proceedings of the ACM Symposium on Applied Perception","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129110557","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}