{"title":"Motor styles in action: Developing a computational framework for operationalization of motor distances.","authors":"Jordi Manuello, Camilla Maronati, Matilde Rocca, Riccardo Guidotti, Tommaso Costa, Andrea Cavallo","doi":"10.3758/s13428-024-02530-0","DOIUrl":"10.3758/s13428-024-02530-0","url":null,"abstract":"<p><p>Aside from some common movement regularities, significant inter-individual and inter-trial variation within the same individual exists in motor system output. However, there is still a lack of a robust and widely adopted solution for quantifying the degree of similarity between movements. We therefore developed an innovative approach based on the Procrustes transformation to compute 'motor distance' between pairs of kinematic data. As a proof of concept, we tested this on a dataset of reach-to-grasp movements performed by 16 participants while acting with the same confederate. Using the information of wrist velocity, acceleration, and jerk, the proposed technique was able to correctly estimate smaller distances between movements performed by the confederate compared with those of participants. Moreover, the reconstructed pattern of inter-subject distances was consistent when computed either on precision grip prehension or whole hand prehension, suggesting its suitability for the investigation of 'motor styles'. 
The definition of a solid approach to 'motor distance' computation, therefore, opens the way to new research lines in the field of movement kinematics.</p>","PeriodicalId":8717,"journal":{"name":"Behavior Research Methods","volume":"57 1","pages":"13"},"PeriodicalIF":4.6,"publicationDate":"2024-12-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11634918/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142812108","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
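The core of the approach above, superimposing two kinematic profiles via the Procrustes transformation and taking the residual misfit as a 'motor distance', can be sketched as follows. This is a minimal illustration of the general technique, not the authors' published pipeline; the trajectory layout (rows = time samples, columns = kinematic features such as wrist velocity, acceleration, and jerk) is an assumption.

```python
import numpy as np

def procrustes_distance(traj_a, traj_b):
    """Procrustes distance between two kinematic trajectories.

    Each trajectory is an (n_samples, n_features) array. Both are centred,
    scaled to unit Frobenius norm, and optimally rotated onto each other;
    the residual sum of squares (the Procrustes disparity) is returned.
    """
    a = np.asarray(traj_a, dtype=float)
    b = np.asarray(traj_b, dtype=float)
    # Remove location and scale differences.
    a = a - a.mean(axis=0)
    b = b - b.mean(axis=0)
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    # Optimal rotation + scale via SVD of the cross-covariance
    # (orthogonal Procrustes); for unit-norm inputs the residual
    # after superimposition is 1 - (sum of singular values)^2.
    s = np.linalg.svd(a.T @ b, compute_uv=False)
    return float(1.0 - s.sum() ** 2)
```

By construction the distance is 0 for trajectories that differ only by translation, uniform scaling, or rotation, which is what makes it usable for comparing movements across trials and actors.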
{"title":"The PSR corpus: A Persian sentence reading corpus of eye movements.","authors":"Zohre Soleymani Tekbudak, Mehdi Purmohammad, Ayşegül Özkan, Cengiz Acartürk","doi":"10.3758/s13428-024-02517-x","DOIUrl":"10.3758/s13428-024-02517-x","url":null,"abstract":"<p><p>The present study introduces the Persian Sentence Reading (PSR) Corpus, aiming to expand empirical data for Persian, an under-investigated language in research on oculomotor control in reading. Reading research has largely focused on Latin script languages with a left-to-right reading direction. However, languages with different reading directions, such as right-to-left and top-to-bottom, and particularly Persian script-based languages like Farsi and Dari, have remained understudied. This study is the first to provide an eye movement dataset for reading Persian sentences, enabling further exploration of the influences of unique Persian characteristics on eye movement patterns during sentence reading. The core objective of the study is to provide data about how word characteristics impact eye movement patterns. The research also investigates the interplay between the characteristics of neighboring words and the eye movements on them. 
By broadening the scope of reading research beyond commonly studied languages, the study aims to contribute to an interdisciplinary approach to reading research, exemplifying investigations through various theoretical and methodological perspectives.</p>","PeriodicalId":8717,"journal":{"name":"Behavior Research Methods","volume":"57 1","pages":"14"},"PeriodicalIF":4.6,"publicationDate":"2024-12-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11634938/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142812109","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Dance with me? Analyzing interpersonal synchrony and quality of interaction during joint dance.","authors":"Noemí Grinspun, Eden Landesman, Yonnhatan García, Tal-Chen Rabinowitch","doi":"10.3758/s13428-024-02563-5","DOIUrl":"10.3758/s13428-024-02563-5","url":null,"abstract":"<p><p>This methodological paper examines the assessment of interpersonal synchrony during a joint dancing task between mothers and their children (aged 4 to 5 years) using OpenPose. This pose estimation tool captures movement in naturalistic settings. The study analyzes 45 mother-child dyads, comparing two analytical methods for assessing synchrony, and examines their correlation with the Coding Interactive Behavior (CIB) measure of interaction quality. The first method employs cross-wavelet transform (CWT) coherence to assess synchrony based on vertical head movement. This straightforward and computationally efficient approach reveals a significant correlation between interpersonal synchrony and CIB scores, suggesting its potential as a reliable indicator of interaction quality. The second method, the generalized cross-wavelet transform (GCWT), analyzes synchrony across multiple body parts, offering a more complex and detailed analysis of interpersonal dynamics. However, it did not significantly correlate with the CIB scores. Our findings suggest that focusing on head movement using CWT can effectively capture critical elements of interpersonal synchrony linked to interaction quality. In contrast, despite its richness, the more complex GCWT approach may not align as closely with observed interactive behaviors as the CIB scores indicate. This study underscores the need to balance methodological complexity and ecological validity in research, offering insights into selecting analytical techniques based on research objectives and the nuances of interpersonal dynamics. 
Our results contribute to the field of interpersonal synchrony research, emphasizing the benefits of efficient methods in understanding mother-child interactions and interpersonal interaction in general.</p>","PeriodicalId":8717,"journal":{"name":"Behavior Research Methods","volume":"57 1","pages":"12"},"PeriodicalIF":4.6,"publicationDate":"2024-12-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11634922/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142812107","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
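The head-movement synchrony idea can be illustrated with a deliberately simplified stand-in: instead of full cross-wavelet coherence, slide a window over the two vertical head-movement series and average the within-window correlations into one synchrony score. All names and parameters here are illustrative, not from the paper.

```python
import numpy as np

def windowed_synchrony(x, y, win=64, step=8):
    """Mean absolute windowed correlation between two movement signals.

    A simplified stand-in for cross-wavelet coherence: correlate the two
    time series within each sliding window and average the absolute
    correlations into a single synchrony score in [0, 1].
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    scores = []
    for start in range(0, len(x) - win + 1, step):
        xs, ys = x[start:start + win], y[start:start + win]
        if xs.std() > 0 and ys.std() > 0:  # skip flat windows
            scores.append(abs(np.corrcoef(xs, ys)[0, 1]))
    return float(np.mean(scores)) if scores else 0.0
```

Two dancers moving in (near) phase score close to 1; unrelated movement scores near 0. Wavelet coherence refines this by resolving synchrony per frequency band, at the cost of the extra complexity the abstract discusses.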
{"title":"Predicting high variability in imageability ratings across age groups and its influence on visual word recognition.","authors":"Sandra Aka, Stéphanie Mathey","doi":"10.3758/s13428-024-02520-2","DOIUrl":"https://doi.org/10.3758/s13428-024-02520-2","url":null,"abstract":"<p><p>Imageability, an important word characteristic in the psycholinguistic literature, is typically assessed by asking participants to estimate the ease with which a word can evoke a mental image. Our aim was to explore inter-rater disagreement in normative imageability ratings. We examined the predictors of variability around average imageability ratings for young, middle-aged and older adults (Study 1) and assessed its impact on visual word recognition performance in young adults (Study 2). Analyses of French age-related imageability ratings (Ballot et al., Behavior Research Methods, 54, 196-215, 2022) revealed that inter-rater disagreement around the average imageability value was critically high for most words within the imageability norms, thus questioning the construct validity of the average rating for the most variable items. Variability in ratings changed between age groups (18-25, 26-40, 41-59, and over 60 years) and was associated with words that are longer, less frequent, learnt later in life and less emotional (Study 1). To examine the consequences of elevated standard deviations around the average imageability rating on visual word recognition, we entered this factor in a hierarchical regression alongside classic lexico-semantic predictors. The effect of word-imageability on young adults' lexical decision times (Ferrand et al., Behavior Research Methods, 50, 1285-1307, 2018) remained significant after accounting for inter-rater disagreement in imageability ratings, even when considering the least consensual words (Study 2). 
We conclude that imageability ratings reliably predict visual word recognition performance in young adults for large datasets, but might require caution for smaller ones. Given imageability rating differences across adulthood, further research investigating age-related differences in language processing is necessary.</p>","PeriodicalId":8717,"journal":{"name":"Behavior Research Methods","volume":"57 1","pages":"8"},"PeriodicalIF":4.6,"publicationDate":"2024-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142799357","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
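The hierarchical-regression logic of Study 2, testing whether imageability still explains variance in decision times after inter-rater disagreement is entered as a control, can be sketched on synthetic data. The data below are invented for illustration (they are not the Ferrand et al. norms), and the effect sizes are arbitrary.

```python
import numpy as np

def r_squared(X, y):
    """R-squared of an ordinary least-squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return float(1.0 - np.var(y - X1 @ beta) / np.var(y))

# Illustrative synthetic lexicon: decision times driven by frequency and
# imageability; the SD of imageability ratings (inter-rater disagreement)
# is entered first as a control predictor.
rng = np.random.default_rng(0)
n = 500
frequency = rng.normal(size=n)
imageability = rng.normal(size=n)
rating_sd = np.abs(rng.normal(size=n))          # inter-rater disagreement
rt = -0.5 * frequency - 0.3 * imageability + rng.normal(scale=0.5, size=n)

r2_controls = r_squared(np.column_stack([frequency, rating_sd]), rt)
r2_full = r_squared(np.column_stack([frequency, rating_sd, imageability]), rt)
```

If `r2_full` exceeds `r2_controls`, the imageability effect survives the disagreement control, which mirrors the pattern the study reports for lexical decision times.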
{"title":"DerLex: An eye-movement database of derived word reading in English.","authors":"Daniel Schmidtke, Julie A Van Dyke, Victor Kuperman","doi":"10.3758/s13428-024-02565-3","DOIUrl":"https://doi.org/10.3758/s13428-024-02565-3","url":null,"abstract":"<p><p>This paper introduces a new database of eye-tracking data on English derived words, DerLex. A total of 598 unique derived suffixed words were embedded in sentences and read by 357 participants representing both university convenience pools and community pools of non-college-bound adults. Besides the eye-movement record of reading derived suffixed words, the DerLex database provides the author recognition test (ART) scores for each participant, tapping into their reading proficiency, as well as multiple lexical variables reflecting distributional, orthographic, phonological, and semantic features of the words, their constituent morphemes, and morphological families. The paper additionally reports the main effects of select lexical variables and their interactions with the ART scores. It also produces estimates of statistical power and sample sizes required to reliably detect those lexical effects. While some effects are robust and can be readily detected even in a typical small-scale experiment, the highly powered DerLex database still does not offer sufficient power to detect many other effects, including those of theoretical importance for existing accounts of morphological processing. 
We believe that both the availability of the new data resource and the limitations it reveals for the planning and design of upcoming experiments are useful for future research on morphological complexity.</p>","PeriodicalId":8717,"journal":{"name":"Behavior Research Methods","volume":"57 1","pages":"11"},"PeriodicalIF":4.6,"publicationDate":"2024-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142806083","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Toward an Asian-based bodily movement database for emotional communication.","authors":"Miao Cheng, Chia-Huei Tseng, Ken Fujiwara, Shoi Higashiyama, Abby Weng, Yoshifumi Kitamura","doi":"10.3758/s13428-024-02558-2","DOIUrl":"10.3758/s13428-024-02558-2","url":null,"abstract":"<p><p>Most current databases for bodily emotion expression are created in Western countries, resulting in culturally skewed representations. To address the obvious risk this bias poses to academic comprehension, we attempted to expand the current repertoire of human bodily emotions by recruiting Asian professional performers to wear whole-body suits with 57 retroreflective markers attached to major joints and body segments, and express seven basic emotions with whole-body movements in a motion-capture lab. For each emotion, actors performed three self-created scenarios that covered a broad range of real-life events to elicit the target emotion within 2-5 seconds. Subsequently, a separate group of participants was invited to judge the perceived emotional category from the extracted biological motions (point-light displays with 18 or 57 markers). The results demonstrated that the emotion discrimination accuracy was comparable to Western databases containing standardized performance scenarios. The results provide a significant step toward establishing a database using a novel emotional induction approach based on personalized scenarios. 
This database will contribute to a more comprehensive understanding of emotional expression across diverse contexts.</p>","PeriodicalId":8717,"journal":{"name":"Behavior Research Methods","volume":"57 1","pages":"10"},"PeriodicalIF":4.6,"publicationDate":"2024-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11632091/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142799363","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The power of effect size stabilization.","authors":"Benjamin Kowialiewski","doi":"10.3758/s13428-024-02549-3","DOIUrl":"https://doi.org/10.3758/s13428-024-02549-3","url":null,"abstract":"<p><p>Determining an appropriate sample size in psychological experiments is a common challenge, requiring a balance between maximizing the chance of detecting a true effect (minimizing false negatives) and minimizing the risk of observing an effect where none exists (minimizing false positives). A recent study proposes using effect size stabilization, a form of optional stopping, to define sample size without increasing the risk of false positives. In effect size stabilization, researchers monitor the effect size of their samples throughout the sampling process and stop sampling when the effect no longer varies beyond predefined thresholds. This study aims to improve our understanding of effect size stabilization properties. Simulations involving effect size stabilization are presented, with parametric modulation of the true effect in the population and the strictness of the stabilization rule. As previously demonstrated, the results indicate that optional stopping based on effect-size stabilization consistently yields unbiased samples over the long run. However, simulations also reveal that effect size stabilization does not guarantee the detection of a true effect in the population. Consequently, researchers adopting effect size stabilization put themselves at risk of increasing type 2 error probability. 
Instead of using effect-size stabilization procedures for testing, researchers should use them to reach accurate parameter estimates.</p>","PeriodicalId":8717,"journal":{"name":"Behavior Research Methods","volume":"57 1","pages":"7"},"PeriodicalIF":4.6,"publicationDate":"2024-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142799359","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
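The stopping rule under discussion is easy to simulate. The sketch below, with assumed parameter names and a one-sample design, draws observations until Cohen's d has stayed within a corridor for a fixed number of consecutive observations, the kind of simulation the study uses to probe bias and Type 2 error.

```python
import numpy as np

def sample_until_stable(true_d, corridor=0.1, window=20,
                        n_min=20, n_max=5000, rng=None):
    """Optional stopping by effect-size stabilization (one-sample design).

    Draw observations one at a time from N(true_d, 1), track Cohen's d
    (mean / sd), and stop once the last `window` d-values span less than
    `corridor`. Returns (final_d, sample_size).
    """
    rng = np.random.default_rng() if rng is None else rng
    data = list(rng.normal(true_d, 1.0, size=n_min))
    history = []
    while len(data) < n_max:
        d = np.mean(data) / np.std(data, ddof=1)
        history.append(d)
        recent = history[-window:]
        if len(recent) == window and max(recent) - min(recent) < corridor:
            return float(d), len(data)
        data.append(rng.normal(true_d, 1.0))
    return float(np.mean(data) / np.std(data, ddof=1)), len(data)
```

Averaged over many simulated runs the stopped estimates cluster around the true effect, consistent with the unbiasedness result, while a stricter corridor mainly buys larger (more precise) samples, not a guaranteed significant test.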
{"title":"Cross-species real-time detection of trends in pupil size fluctuation.","authors":"Sharif I Kronemer, Victoria E Gobo, Catherine R Walsh, Joshua B Teves, Diana C Burk, Somayeh Shahsavarani, Javier Gonzalez-Castillo, Peter A Bandettini","doi":"10.3758/s13428-024-02545-7","DOIUrl":"10.3758/s13428-024-02545-7","url":null,"abstract":"<p><p>Pupillometry is a popular method because pupil size is easily measured and sensitive to central neural activity linked to behavior, cognition, emotion, and perception. Currently, there is no method for online monitoring phases of pupil size fluctuation. We introduce rtPupilPhase-an open-source software that automatically detects trends in pupil size in real time. This tool enables novel applications of real-time pupillometry for achieving numerous research and translational goals. We validated the performance of rtPupilPhase on human, rodent, and monkey pupil data, and we propose future implementations of real-time pupillometry.</p>","PeriodicalId":8717,"journal":{"name":"Behavior Research Methods","volume":"57 1","pages":"9"},"PeriodicalIF":4.6,"publicationDate":"2024-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11632003/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142799354","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
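The core idea of real-time pupil-phase detection can be illustrated with a minimal sketch: fit a line to the most recent pupil-size samples in a buffer and classify the ongoing phase from the slope sign. This is an illustration of the concept only, not the rtPupilPhase implementation; the threshold and labels are assumptions.

```python
import numpy as np

def pupil_phase(samples, min_slope=1e-3):
    """Classify the current pupil phase from a short buffer of samples.

    Fits a first-degree polynomial to the most recent pupil-size samples
    and labels the ongoing trend as 'dilation', 'constriction', or
    'plateau' depending on the sign and magnitude of the slope.
    """
    t = np.arange(len(samples), dtype=float)
    slope = np.polyfit(t, np.asarray(samples, dtype=float), 1)[0]
    if slope > min_slope:
        return "dilation"
    if slope < -min_slope:
        return "constriction"
    return "plateau"
```

In an online setting this function would be called on each refresh of a rolling buffer, so that stimuli can be triggered at a chosen phase of the pupil cycle.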
{"title":"Affective Norms for German as a Second Language (ANGL2).","authors":"Zeming Xu, Jia Liu, Lin Fan","doi":"10.3758/s13428-024-02539-5","DOIUrl":"https://doi.org/10.3758/s13428-024-02539-5","url":null,"abstract":"<p><p>The present study introduces affective norms for a set of 880 German words rated by learners of German as a second language (L2), i.e., the Affective Norms for German as a Second Language (ANGL2). The database provides ratings across affective and subjective psycholinguistic dimensions. Besides valence and arousal ratings, ANGL2 features data on emotional prototypicality, which helps to identify emotion-label words and emotion-laden words. Moreover, the database includes two additional semantic variables: concreteness and familiarity. We observed similarities with previous studies, and the ratings provided by L2 speakers demonstrate characteristics that should be noted in studies involving bilinguals, including more moderate valence ratings, and a stronger correlation between valence and arousal, specifically for positive words. ANGL2 is the first set of affective norms that has been rated by L2 speakers for a language other than English. The set of norms is aimed to function as a resource for psycholinguistic experimental studies on the intersection between emotion and language among L2 speakers.</p>","PeriodicalId":8717,"journal":{"name":"Behavior Research Methods","volume":"57 1","pages":"6"},"PeriodicalIF":4.6,"publicationDate":"2024-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142779283","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Continuous motion tracking for accurate and efficient color vision assessment.","authors":"Chenxi Liang, Jing Chen, Zhongting Chen","doi":"10.3758/s13428-024-02518-w","DOIUrl":"https://doi.org/10.3758/s13428-024-02518-w","url":null,"abstract":"<p><p>The assessment of color vision is crucial in both fundamental visual research and clinical diagnosis. However, existing tools for color vision assessment are limited by various factors. This study introduces a novel, efficient method for color vision assessment, which is based on a continuous motion tracking task and a Kalman filter model. The effectiveness of this new method was evaluated by assessing the color vision of both color-deficient observers and normal controls. The results from both a small sample (N = 29, Experiment 1) and a large sample (N = 171, Experiment 2) showed that color-deficient observers could be perfectly identified within 20 s using the tracking performance. We also compared the new method with a traditional psychophysical detection task to examine the consistency of perceptual noise estimation between the two methods, and the results showed a moderate correlation (Pearson's r = .59 ~ .64). The results also demonstrated that the new method could measure individuals' contrast response functions of both red-green and blue-yellow colors (e.g., the L-M and S-(L + M) axes in DKL color space) in just a few minutes, showing much higher efficiency than traditional methods. 
All the findings from this study indicate that the continuous motion tracking method is a promising tool for both rapid screening of color vision deficiencies and fundamental research on color vision.</p>","PeriodicalId":8717,"journal":{"name":"Behavior Research Methods","volume":"57 1","pages":"3"},"PeriodicalIF":4.6,"publicationDate":"2024-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142779287","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
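The tracking-plus-Kalman-filter idea can be sketched generically: an observer sees the target position corrupted by perceptual noise and updates a position estimate each frame, so tracking error grows with the noise level, which is what lets tracking performance index color sensitivity. This is a generic 1-D Kalman tracker under assumed process/observation noise parameters, not the authors' model.

```python
import numpy as np

def kalman_track(target, obs_noise_sd, q=0.01, rng=None):
    """Track a 1-D moving target through noisy observations.

    A scalar Kalman filter with random-walk dynamics: each frame the
    observer receives the true position plus Gaussian perceptual noise
    (obs_noise_sd) and updates its estimate. Returns the RMS tracking
    error, which increases with the perceptual noise level.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x_est, p = 0.0, 1.0          # state estimate and its variance
    r = obs_noise_sd ** 2        # observation-noise variance
    errors = []
    for true_pos in np.asarray(target, dtype=float):
        obs = true_pos + rng.normal(0.0, obs_noise_sd)
        p = p + q                # predict: uncertainty accumulates
        k = p / (p + r)          # Kalman gain
        x_est = x_est + k * (obs - x_est)
        p = (1.0 - k) * p        # update estimate variance
        errors.append(x_est - true_pos)
    return float(np.sqrt(np.mean(np.square(errors))))
```

Inverting this relationship, fitting the noise level that best explains an observer's tracking error, is the kind of model-based estimate of perceptual noise the abstract describes.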