{"title":"Investigating Hemodynamic Patterns During Beat Processing in Cochlear Implant Users: Insights from a Finger Tapping Study.","authors":"Samantha Reina O'Connell, Ali Rahimpour Jounghani, Julianne Marie Papadopoulos, Heather Bortfeld, Raymond Lee Goldsworthy","doi":"10.1080/25742442.2025.2510182","DOIUrl":"10.1080/25742442.2025.2510182","url":null,"abstract":"<p><strong>Introduction: </strong>Individuals with cochlear implants often struggle with melody and timbre perception in music, leading to diminished music appreciation. While they demonstrate proficiency in recognizing beat and rhythm, it remains unclear whether beat information is processed similarly in their brains compared to those with normal hearing.</p><p><strong>Methods: </strong>In this study, adapted from Rahimpour et al. (2020), both cochlear implant users and normal hearing listeners engaged in finger tapping tasks that synchronized or syncopated with isochronous beats. Participants were asked to align their taps with an auditory metronome (pacing) and then maintain tapping pace after the metronome attenuation (continuation). Hemodynamic responses were recorded using functional near-infrared spectroscopy (fNIRS) during tapping.</p><p><strong>Results: </strong>Results revealed comparable performance between cochlear implant users and normal hearing listeners in the finger tapping task, with both groups finding the syncopated continuation task particularly challenging for maintaining consistent tapping. 
Despite similar tapping performance, cochlear implant users exhibited more widespread hemodynamic activation than normal hearing listeners in temporal, frontal, motor, and parietal regions.</p><p><strong>Discussion: </strong>Cochlear implant users engage auditory-motor networks during beat processing akin to normal hearing listeners; however, factors such as neural adaptation post-cochlear implantation and heightened listening effort may contribute to the observed widespread activation.</p>","PeriodicalId":72332,"journal":{"name":"Auditory perception & cognition","volume":"8 2","pages":"132-156"},"PeriodicalIF":0.0,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12419772/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145042386","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Frequency Modulation Detection Thresholds are Unrelated to Individual Differences in Verbal Memory Capacity.","authors":"A K Bosen, B C Kohlmeier, S E Harris, S T Neely, A M Kamerer","doi":"10.1080/25742442.2025.2489912","DOIUrl":"10.1080/25742442.2025.2489912","url":null,"abstract":"<p><strong>Purpose: </strong>Psychophysical measures of auditory sensitivity are often used to explain speech recognition outcomes. However, interpretation of performance on these tasks assumes that they are insensitive to other factors, such as cognitive ability. Recent studies have cast doubt on this assumption by observing relationships between cognition and performance on psychoacoustic tasks. Here, we examined the relationship between memory tasks and two tasks designed to measure frequency modulation (FM) detection to determine whether FM detection task performance reflects individual differences in memory capacity.</p><p><strong>Method: </strong>To test for a relationship between FM thresholds and memory capacity, young adults with normal hearing (N = 31, ages 19 - 40 years) completed FM detection tasks using two different designs (three-alternative forced choice and Yes/No) and memory tasks (auditory digit span and visual free recall).</p><p><strong>Results: </strong>Psychometric functions differed across the two FM detection task designs, and individual differences in performance were reliable, but no significant correlations were found between memory capacity and FM thresholds.</p><p><strong>Conclusions: </strong>In young adults with normal hearing, encoding of temporal fine structure and memory capacity are distinct constructs. 
Thus, previously observed associations between psychophysical and cognitive measures may reflect the shared effects of age- or hearing-related declines.</p>","PeriodicalId":72332,"journal":{"name":"Auditory perception & cognition","volume":"8 2","pages":"113-131"},"PeriodicalIF":0.0,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12396839/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144980572","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluating aperiodic and periodic neural activity as markers of listening effort in speech perception.","authors":"Sarah J Woods, Jack W Silcox, Brennan R Payne","doi":"10.1080/25742442.2024.2395217","DOIUrl":"10.1080/25742442.2024.2395217","url":null,"abstract":"<p><p>Listening effort (LE) is critical to understanding speech perception in acoustically challenging environments. EEG alpha power has emerged as a potential neural correlate of LE. However, the magnitude and direction of the relationship between acoustic challenge and alpha power have been inconsistent in the literature. In the current study, a secondary data analysis of Silcox and Payne (2021), we examine the broadband 1/f-like exponent and offset of the EEG power spectrum as measures of aperiodic neural activity during effortful speech perception, and the influence of this aperiodic activity on reliable estimation of periodic (i.e., alpha) neural activity. EEG was continuously recorded during sentence listening, and the broadband (1-40 Hz) EEG power spectrum was computed for each participant for quiet and noise trials separately. Using the specparam algorithm, we decomposed the power spectrum into both aperiodic and periodic components and found that broadband aperiodic activity was sensitive to background noise during speech perception and additionally impacted the measurement of noise-induced changes in alpha oscillations. 
We discuss the implications of these results for the LE and neural speech processing literatures.</p>","PeriodicalId":72332,"journal":{"name":"Auditory perception & cognition","volume":"7 3","pages":"203-218"},"PeriodicalIF":0.0,"publicationDate":"2024-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11469580/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142482121","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
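The aperiodic component that specparam summarizes is, at its core, a line fit in log-log coordinates: power ≈ 10^offset / f^exponent. The following is a minimal pure-Python sketch of that idea on a synthetic noiseless spectrum; it is a stand-in illustration only, not the specparam algorithm itself, which additionally models periodic peaks and spectral knees.

```python
import math

def fit_aperiodic(freqs, powers):
    """Least-squares line fit of log10(power) = offset - exponent * log10(freq).

    Mirrors the aperiodic (1/f-like) component that specparam summarizes
    with an offset and an exponent.
    """
    xs = [math.log10(f) for f in freqs]
    ys = [math.log10(p) for p in powers]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, -slope  # (offset, exponent)

# Synthetic noiseless 1/f spectrum over the study's 1-40 Hz analysis band
true_offset, true_exponent = 2.0, 1.5
freqs = list(range(1, 41))
powers = [10 ** true_offset / f ** true_exponent for f in freqs]
offset, exponent = fit_aperiodic(freqs, powers)
print(offset, exponent)  # recovers 2.0 and 1.5 (up to float rounding)
```

A flatter fitted exponent or raised offset in noise trials is the kind of broadband change the study reports as contaminating naive alpha-band power estimates.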
{"title":"Influence of talker and accent variability on rapid adaptation and generalization to non-native accented speech in younger and older adults.","authors":"R E Bieber, S Gordon-Salant","doi":"10.1080/25742442.2024.2345568","DOIUrl":"10.1080/25742442.2024.2345568","url":null,"abstract":"<p><strong>Introduction: </strong>Listeners can rapidly adapt to English speech produced by non-native speakers of English with unfamiliar accents. Prior work has shown that the type and number of talkers contained within a stimulus set may impact rate and magnitude of learning, as well as any generalization of learning. However, findings across the literature have been inconsistent, with relatively little study of these effects in populations of older listeners.</p><p><strong>Methods: </strong>In this study, adaptation and generalization to unfamiliar talkers with familiar and unfamiliar accents are studied in younger normal-hearing adults and older adults with and without hearing loss. Rate and magnitude of adaptation are modelled using both generalized linear mixed effects regression and generalized additive mixed effects modelling.</p><p><strong>Results: </strong>Rate and magnitude of adaptation were not impacted by increasing the number of talkers and/or varying the consistency of non-native English accents across talkers. Increasing the number of talkers did strengthen generalization of learning for a talker with a familiar non-native accent, but not for an unfamiliar accent. 
Aging alone did not diminish adaptation or generalization.</p><p><strong>Discussion: </strong>These findings support prior evidence of a limited benefit for talker variability in facilitating generalization of learning for non-native accented speech, and extend the findings to older adults.</p>","PeriodicalId":72332,"journal":{"name":"Auditory perception & cognition","volume":"7 2","pages":"110-139"},"PeriodicalIF":0.0,"publicationDate":"2024-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11323066/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141989687","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Psychophysiological Markers of Auditory Distraction: A Scoping Review","authors":"Alexandre Marois, François Vachon","doi":"10.1080/25742442.2023.2274270","DOIUrl":"https://doi.org/10.1080/25742442.2023.2274270","url":null,"abstract":"Short-term memory can be disrupted by task-irrelevant sound. Auditory distraction has mainly been studied through the lens of two phenomena: the deviation effect and the changing-state effect. Yet, it remains unclear whether they rely on common cerebral mechanisms and, concomitantly, what psychophysiological responses they can trigger. This scoping review provides an overview of the state of knowledge regarding psychophysiological indices of auditory distraction. Records published between 2001 and 2021 on the deviation effect and the changing-state effect with psychophysiological measures were extracted from PubMed, ERIC, PsycNet, Web of Science, and ScienceDirect. Records investigating task-relevant sounds, as well as those that failed to observe performance disruption, or to include a control condition or a concurrent cognitive task, were excluded from the review. The Revised Cochrane risk-of-bias tool for randomized trials was used for bias evaluation. Fifteen records were reviewed; these were mainly characterized by biases in randomization, measurement, and selection of results. Some markers were specific to the distraction type, but nonspecific responses were also found. 
Overall, we outline the main markers used to index auditory distraction, explain what they reveal about its underlying mechanisms, and discuss implications and knowledge gaps that must be filled to fully exploit psychophysiology in auditory distraction research.","PeriodicalId":72332,"journal":{"name":"Auditory perception & cognition","volume":" 2","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-11-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135285995","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Temporal Summation for Noise Stimuli","authors":"Chanit Cohen, Leah Fostick","doi":"10.1080/25742442.2023.2258759","DOIUrl":"https://doi.org/10.1080/25742442.2023.2258759","url":null,"abstract":"ABSTRACT Introduction: Temporal summation describes a relationship between loudness and duration: increasing the duration also increases loudness. However, this relationship has been studied mainly for pure tones (PTs) and not for complex sounds such as speech. Methods: Twenty-four young adults with normal hearing participated in the study. Hearing thresholds were measured for five steady-state speech-shaped noises (SSNs), derived from /a/, /i/, /u/, /sh/, and /m/, three PTs (0.5, 1, and 4 kHz), and white noise (WN). Thresholds were measured separately for five durations: 1, 5, 20, 50, and 100 ms. Results: As hypothesized, temporal summation for PTs was greater than for complex sounds. PTs showed greater temporal summation at 500 and 1,000 Hz than at 4,000 Hz, but SSNs had the reverse pattern, with less temporal summation for sounds with a lower first formant than for those with a higher one. Discussion: Noise stimuli differ in energy and spectral range, and these two factors seem to affect temporal summation. 
A wider frequency range shows lower sensitivity to changes in duration, and thus less temporal summation, than a narrower spectrum. KEYWORDS: temporal summation; noise stimuli; speech-shaped noise. Disclosure statement: No potential conflict of interest was reported by the author(s). Funding: This work was supported by Ariel University internal fund RA1900000488.","PeriodicalId":72332,"journal":{"name":"Auditory perception & cognition","volume":"213 3","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135371348","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
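For context, the classical energy-integration account of temporal summation predicts that the detection threshold improves by 10·log10(T_long/T_short) dB when duration increases; real listeners typically show less than this ideal. A minimal sketch of the idealized prediction, using the study's extreme durations (this is the textbook model, not the authors' analysis):

```python
import math

def summation_db(short_ms, long_ms):
    """Threshold improvement (in dB) predicted by perfect energy integration:
    a tenfold increase in duration lowers the detection threshold by 10 dB."""
    return 10 * math.log10(long_ms / short_ms)

# Predicted shift between the shortest and longest durations in the study
shift = summation_db(1, 100)  # 1 ms vs 100 ms
print(shift)  # 20.0 dB under the idealized model
```

Deviations from this prediction, such as the spectrum-dependent differences reported above, are what make temporal summation diagnostically interesting.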
{"title":"Neural Responses to Repeated Noise Structure in Sounds Are Invariant to Temporal Interruptions","authors":"Björn Herrmann","doi":"10.1080/25742442.2023.2248849","DOIUrl":"https://doi.org/10.1080/25742442.2023.2248849","url":null,"abstract":"ABSTRACT The ability to extract meaning from acoustic environments requires sensitivity to repeating sound structures. Yet, how events that repeat are encoded and maintained in the brain, and how the brain responds to events that reoccur at later points in time, are not well understood. In two electroencephalography experiments, participants listened to a longer, ongoing white-noise sound which comprised shorter, frozen noise snippets that repeated at a regular 2-Hz rate. In several conditions, the snippet repetition discontinued for a brief period after which the noise snippet reoccurred. The experiments aimed to answer whether neural activity becomes entrained by the regular repetition of noise snippets, whether entrained neural activity self-sustains during the discontinuation period, and how the brain responds to a reoccurring noise snippet. Results show that neural activity is entrained by the snippet repetition, but there was no evidence for self-sustained neural activity during the discontinuation period. However, the auditory cortex responded with similar magnitude to a noise snippet reoccurring after a brief discontinuation as it responded to a noise snippet for which the snippet repetition had not been discontinued. This response invariance was observed for different onset times of the reoccurring noise snippet relative to the previously established regularity. 
The results thus demonstrate that the auditory cortex sensitively responds to, and therefore maintains a memory trace of, previously learned acoustic noise independent of temporal interruptions. KEYWORDS: electroencephalography; frozen noise; temporal regularity; auditory perception; neural synchronization. Acknowledgments: We thank Christie Tsagopoulos for her help with data collection for both experiments. The research was supported by the Canada Research Chair program (CRC-2019-00156, 232733) and the Natural Sciences and Engineering Research Council of Canada (Discovery Grant: RGPIN-2021-02602). Disclosure statement: No potential conflict of interest was reported by the author(s). Funding: This work was supported by the Canada Research Chairs [CRC-2019-00156, 232733]; Natural Sciences and Engineering Research Council of Canada [RGPIN-2021-02602].","PeriodicalId":72332,"journal":{"name":"Auditory perception & cognition","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-08-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134927895","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
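The frozen-noise design rests on a simple signal property: a snippet repeated at a fixed rate makes the stimulus perfectly correlated with itself at the snippet lag, which is the regularity neural activity can entrain to. A small pure-Python sketch of that property (the sampling rate and snippet length here are hypothetical, not the study's stimulus parameters):

```python
import random

random.seed(0)

# Frozen noise snippet repeated at a regular rate: the signal is perfectly
# correlated with itself at a lag of exactly one snippet length.
snippet = [random.gauss(0, 1) for _ in range(500)]  # e.g. 0.5 s at 1 kHz
signal = snippet * 8                                # 2-Hz repetition for 4 s

def autocorr(x, lag):
    """Normalized autocorrelation of x at the given sample lag."""
    a, b = x[:-lag], x[lag:]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    den = (sum((u - ma) ** 2 for u in a) * sum((v - mb) ** 2 for v in b)) ** 0.5
    return num / den

print(autocorr(signal, 500))  # 1.0: exact repetition at the snippet lag
print(autocorr(signal, 250))  # small: no exact repetition at other lags
```

Discontinuing the repetition removes this correlation structure, which is what lets the study test whether entrained activity self-sustains without it.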
{"title":"Correlation between the effect of orofacial somatosensory inputs in speech perception and speech production performance.","authors":"Monica Ashokumar, Clément Guichet, Jean-Luc Schwartz, Takayuki Ito","doi":"10.1080/25742442.2022.2134674","DOIUrl":"10.1080/25742442.2022.2134674","url":null,"abstract":"<p><strong>Introduction: </strong>Orofacial somatosensory inputs modify the perception of speech sounds. Such auditory-somatosensory integration likely develops alongside speech production acquisition. We examined whether the somatosensory effect in speech perception varies depending on individual characteristics of speech production.</p><p><strong>Methods: </strong>The somatosensory effect in speech perception was assessed as the change in the category boundary between /e/ and /ø/ in a vowel identification test when somatosensory stimulation, a rearward facial skin deformation corresponding to the articulatory movement for /e/, was applied together with the auditory input. Speech production performance was quantified by the acoustic distances between the average first, second and third formants of /e/ and /ø/ utterances recorded in a separate test.</p><p><strong>Results: </strong>The category boundary between /e/ and /ø/ was significantly shifted towards /ø/ due to the somatosensory stimulation, which is consistent with previous research. The amplitude of the category boundary shift was significantly correlated with the acoustic distance between the mean second (and, marginally, third) formants of /e/ and /ø/ productions, with no correlation with the first formant distance.</p><p><strong>Discussion: </strong>Greater acoustic distances can be related to larger contrasts between the articulatory targets of vowels in speech production. 
These results suggest that the somatosensory effect in speech perception can be linked to speech production performance.</p>","PeriodicalId":72332,"journal":{"name":"Auditory perception & cognition","volume":"6 1-2","pages":"97-107"},"PeriodicalIF":0.0,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10229140/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9571256","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
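A category boundary such as the /e/-/ø/ boundary here is commonly located as the 50% point of the identification function, and a boundary shift is the difference between that point across conditions. A minimal sketch using linear interpolation; the continuum steps and identification rates below are made up for illustration, and the paper's actual fitting method may differ:

```python
def boundary_50(stim, p_e):
    """Locate the 50% category boundary along a vowel continuum by linear
    interpolation between the two steps that straddle p = 0.5."""
    for i in range(len(stim) - 1):
        x0, x1 = stim[i], stim[i + 1]
        y0, y1 = p_e[i], p_e[i + 1]
        if (y0 - 0.5) * (y1 - 0.5) <= 0:
            return x0 + (0.5 - y0) * (x1 - x0) / (y1 - y0)
    raise ValueError('no 50% crossing in the data')

# Hypothetical /e/-identification rates along a 7-step continuum,
# without and with somatosensory stimulation
steps = [1, 2, 3, 4, 5, 6, 7]
baseline = [0.98, 0.95, 0.85, 0.55, 0.20, 0.05, 0.02]
stimulated = [0.99, 0.97, 0.90, 0.70, 0.35, 0.10, 0.03]

shift = boundary_50(steps, stimulated) - boundary_50(steps, baseline)
print(round(shift, 2))  # 0.43: boundary moved towards the /ø/ end
```

In the study, a per-participant shift of this kind is the quantity correlated with the formant distances measured in production.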
{"title":"The Role of Joint Influence on the Cross-Modal Stroop Effect: Investigating Time Course and Asymmetry","authors":"Vincent A. Medina, Danielle A. Lutfi-Proctor, E. Elliott","doi":"10.1080/25742442.2022.2034394","DOIUrl":"https://doi.org/10.1080/25742442.2022.2034394","url":null,"abstract":"","PeriodicalId":72332,"journal":{"name":"Auditory perception & cognition","volume":"29 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2022-02-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73887587","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Emotions and Consciousness Alterations in Music-color Synesthesia","authors":"Cathy Lebeau, F. Richer","doi":"10.1080/25742442.2022.2041971","DOIUrl":"https://doi.org/10.1080/25742442.2022.2041971","url":null,"abstract":"","PeriodicalId":72332,"journal":{"name":"Auditory perception & cognition","volume":"27 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2022-02-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"72639484","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}