I-Perception. Pub Date: 2023-03-01. DOI: 10.1177/20416695231163473
Crossmodal correspondences and interactions between texture and taste perception.
Eleftheria Pistolas, Johan Wagemans
Abstract: In recent years, awareness of the influence of different modalities on taste perception has grown. Although previous research in crossmodal taste perception has touched upon the bipolar distinction between softness/smoothness and roughness/angularity, ambiguity largely remains surrounding other crossmodal correspondences between taste and the specific textures we regularly use to describe our food, such as crispy or crunchy. Sweetness has previously been found to be associated with soft textures, but our current understanding does not go beyond the basic distinction between roughness and smoothness, and the role of texture in taste perception remains relatively understudied. The current study consisted of two parts. First, because of the lack of clarity concerning specific associations between basic tastes and textures, an online questionnaire served to assess whether consistent associations between texture words and taste words exist and how these arise intuitively. The second part consisted of a taste experiment with factorial combinations of four tastes and four textures. The questionnaire study showed that consistent associations are made between soft and sweet and between crispy and salty at the conceptual level, and the taste experiment largely supported these findings at the perceptual level. In addition, the experiment allowed a closer look at the complexity of the associations between sour and crunchy and between bitter and sandy.
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10069003/pdf/
I-Perception. Pub Date: 2023-03-01. DOI: 10.1177/20416695231165182
The knobby ball illusion.
Peter U Tse, Vincent Hayward
Abstract: A novel haptic illusion is described in which deformations of the fingertip skin lead to subsequent misperceptions of an object's shape.
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10034292/pdf/
I-Perception. Pub Date: 2023-02-23. eCollection Date: 2023-01-01. DOI: 10.1177/20416695231157348
Sustained visual attentional load modulates audiovisual integration in older and younger adults.
Yanna Ren, Hannan Li, Yan Li, Zhihan Xu, Rui Luo, Hang Ping, Xuan Ni, Jiajia Yang, Weiping Yang
Abstract: Previous studies have shown that attention influences audiovisual integration (AVI) at multiple stages, but it remains unclear how AVI interacts with attentional load. In addition, while aging has been associated with sensory-functional decline, little is known about how older individuals integrate cross-modal information under attentional load. To investigate these issues, 20 older adults and 20 younger adults were recruited for a dual task combining a multiple object tracking (MOT) task, which manipulated sustained visual attentional load, with an audiovisual discrimination task, which assessed AVI. Response times were shorter and hit rates higher for audiovisual stimuli than for auditory or visual stimuli alone, and for younger adults than for older adults. The race model analysis showed that AVI was higher under the load_3 condition (monitoring two targets of the MOT task) than under any other load condition (no load [NL], or monitoring one or three targets), regardless of age. However, AVI was lower in older adults than in younger adults under the NL condition. Moreover, the peak latency was longer and the time window of AVI was delayed in older adults compared to younger adults under all conditions. These results suggest that a slight sustained visual attentional load increased AVI, whereas a heavy load decreased it, supporting the claim that attentional resources are limited; we further propose that AVI is positively modulated by available attentional resources. Finally, aging had a substantial impact on AVI, which was delayed in older adults.
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9950617/pdf/
I-Perception. Pub Date: 2023-02-09. eCollection Date: 2023-01-01. DOI: 10.1177/20416695231152144
Emotional response evoked by viewing facial expression pictures leads to higher temporal resolution.
Misa Kobayashi, Makoto Ichikawa
Abstract: We examined the effects of emotional responses with different levels of valence and arousal on the temporal resolution of visual processing, using photographs of various facial expressions. As an index of temporal resolution, we measured the minimum duration at which a desaturated photograph became noticeable, using the method of constant stimuli, by switching a color facial expression photograph to a desaturated version of the same photograph. Experiments 1 and 2 used facial photographs that evoke various degrees of arousal and valence. The photographs were prepared not only in an upright orientation but also in an inverted orientation, which reduces the emotional response without changing the images' properties. The minimum duration needed to notice the monochrome photographs of anger, fear, and joy was shorter than that for a neutral face when viewing upright photographs, but not when viewing inverted ones. In Experiment 3, we used facial expression photographs that evoke various degrees of arousal, and found that the temporal resolution of visual processing increased with the degree of arousal. These results suggest that the arousal of emotional responses evoked by viewing facial expressions might increase the temporal resolution of visual processing.
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9943968/pdf/
I-Perception. Pub Date: 2023-01-01. DOI: 10.1177/20416695221144732
Illusory perception of visual patterns in pure noise is associated with COVID-19 conspiracy beliefs.
Matthias Hartmann, Petra Müller
Abstract: Just as perceptual heuristics can lead to visual illusions, cognitive heuristics can lead to biased judgements, such as "illusory pattern perception" (i.e., seeing patterns in unrelated events). Here we further investigated the common underlying mechanism behind irrational beliefs and illusory pattern perception in visual images, asking participants to report whether an object was present in noisy images. For trials in which no object was present in the noise, the tendency to report seeing an object was positively correlated with the endorsement of both COVID-19-specific conspiracy theories and paranormal beliefs. The present results suggest that the cognitive bias to see meaningful connections in noise can affect socio-political cognition as well as perceptual decision making.
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9893368/pdf/
I-Perception. Pub Date: 2023-01-01. DOI: 10.1177/20416695221149638
Monaural auditory spatial abilities in early blind individuals.
Sara Finocchietti, Davide Esposito, Monica Gori
Abstract: Early blind individuals can localize single sound sources better than sighted participants, even under monaural conditions. Yet, in binaural listening, they struggle to judge the distances between three different sounds, and this ability has never been tested under monaural conditions. We investigated the performance of eight early blind and eight blindfolded healthy individuals in monaural and binaural listening during two audio-spatial tasks. In the localization task, a single sound was played in front of the participants, who had to localize it. In the auditory bisection task, three consecutive sounds were played from different spatial positions, and participants reported whether the second sound was closer to the first or the third. Only early blind individuals improved their performance in monaural bisection, while no statistical difference was present for the localization task. We conclude that early blind individuals show a superior ability to use spectral cues under monaural conditions.
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9969445/pdf/
I-Perception. Pub Date: 2023-01-01. DOI: 10.1177/20416695221148039
Eye size recognition of self and others among people with self-face dissatisfaction.
Izumi Ayase, Masaki Mori, Takaaki Kato
Abstract: Previous studies have shown that individuals visually recognize their own eye size as larger than it actually is. However, it is unclear whether this cognitive tendency occurs in people with high self-face dissatisfaction. This study therefore investigated whether the cognitive size of one's own and others' eyes differs according to the degree of self-face dissatisfaction. Participants were 32 college students (5 males, 27 females; age 21.3 ± 2.11 years) who completed the Face Dissatisfaction Scale (FDS) and a face recognition memory task. The task was to judge whether the eyes in photographs of their own or their friends' faces, manipulated in eye size, were larger or smaller than the actual eyes. The eye size cognitively equivalent to the actual one was estimated from a psychophysical function, and we correlated the total FDS score with the point of subjective equality (PSE) for eye size. We found a high negative correlation between the FDS and the PSE for one's own eye size, and a high positive correlation between the FDS and the PSE for all others' faces. Thus, high self-face dissatisfaction is differentially associated with cognitive distortions of the face, depending on whether the face is one's own or another's.
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9900673/pdf/
I-Perception. Pub Date: 2023-01-01. DOI: 10.1177/20416695221142537
A balanced view of impossible aesthetics: An empirical investigation of how impossibility relates to our enjoyment of magic tricks.
Steven E Bagienski, Gustav Kuhn
Abstract: The performance art of magic allows us to experience the impossible, and this study used a balancing magic trick to investigate the relationship between participants' enjoyment and perceived impossibility. Participants watched a live performance of a magic trick in which the magician balanced objects in progressively more impossible configurations. At seven different time points, observers rated their enjoyment and the extent to which they believed what they saw was impossible. Regression analysis revealed that participants' enjoyment of the magical effect was related to the perceived impossibility of the magic trick, and this relationship was independent of how much they enjoyed magic in general. Moreover, a one-way within-subjects analysis of variance showed that participants enjoyed the performance more as the trick became more impossible. However, once the magical effect was anticipated, enjoyment began to plateau while perceived impossibility continued to increase. These results are discussed in the context of people's aesthetic appreciation of magic and current models of arts appreciation.
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9829883/pdf/
I-Perception. Pub Date: 2023-01-01. DOI: 10.1177/20416695231157349
A two-stage spectral model for sound texture perception: Synthesis and psychophysics.
Hironori Maruyama, Kosuke Okada, Isamu Motoyoshi
Abstract: The natural environment is filled with a variety of auditory events such as wind blowing, water flowing, and fire crackling. It has been suggested that the perception of such textural sounds is based on the statistics of natural auditory events. Inspired by a recent spectral model of visual texture perception, we propose a model that describes perceived sound texture using only the linear spectrum and the energy spectrum. We tested the validity of the model with synthetic noise sounds that preserve the two-stage amplitude spectra of the original sound. A psychophysical experiment showed that our synthetic noises were perceived as similar to the original sounds for 120 real-world auditory events, and performance was comparable to that of synthetic sounds produced by McDermott and Simoncelli's model, which considers various classes of auditory statistics. The results support the notion that the perception of natural sound textures is predictable from two-stage spectral signals.
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9950610/pdf/
I-Perception. Pub Date: 2022-11-28. eCollection Date: 2022-11-01. DOI: 10.1177/20416695221116653
Audiovisual interaction with rate-varying signals.
Long Yi, Robert Sekuler
Abstract: A task-irrelevant, amplitude-modulating sound influences perception of a size-modulating visual stimulus. To probe the limits of this audiovisual interaction, we vary the second temporal derivative of object size and of sound amplitude. In the study's first phase, subjects see a visual stimulus size-modulating with f″(x) > 0, f″(x) = 0, or f″(x) < 0, and judge each one's rate as increasing, constant, or decreasing. Visual stimuli are accompanied by a steady, non-modulated auditory stimulus. The novel combination of multiple stimuli and multi-alternative responses allows subjects' similarity space to be estimated from the stimulus-response confusion matrix. In the study's second phase, rate-varying visual stimuli are presented in concert with auditory stimuli whose second derivative also varies. Subjects identify each visual stimulus as one of the three types while trying to ignore the accompanying sound. Unlike some previous results with f″(x) fixed at 0, performance benefits relatively little when visual and auditory stimuli share the same directional change in modulation. However, performance does drop when visual and auditory stimuli differ in their directions of rate change. Our task's computational demands may make it particularly vulnerable to the effects of a dynamic, task-irrelevant stimulus.
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9716610/pdf/