{"title":"Time-Related Considerations for Modeling Event-Based Data Collected via Ecological Momentary Assessment.","authors":"Lizbeth Benson, Emily T Hébert, Nicholas Hartman, Sarah H Sperry, Walter Dempsey, Darla E Kendzor, Michael S Businelle, Nilam Ram","doi":"10.1177/25152459251401191","DOIUrl":"https://doi.org/10.1177/25152459251401191","url":null,"abstract":"<p><p>Ecological momentary assessments (EMAs) and wearable devices afford opportunities to collect real-time data on events experienced in daily life. Examples of event-based data in the psychological and behavioral sciences include smoking a cigarette, experiencing a stressor, having a disruption to sleep, experiencing a depressive or manic episode, drinking an alcoholic beverage, or engaging in a bout of exercise. The increasing availability of dense sampling approaches allows for the measurement of such events at relatively fast timescales (e.g., occurring across minutes, hours, days, or weeks), expanding the possibilities for how time can be conceptualized and modeled. Survival analysis is a modeling approach that allows researchers to address scientific questions regarding whether and when events occur in time. Although not often applied to EMA data, there are myriad research questions relevant to psychosocial and behavioral scientists that can be addressed using survival analysis. In this article, we provide an overview of survival analysis, describe several time-based considerations for modeling event-based EMA data using survival analysis, and provide several illustrative examples of the different time-based considerations. Altogether, the goals of this article are to enhance knowledge of the types of research questions that can be examined using survival analysis, illustrate nuances of applying the method to EMA data, and spark ideas for future empirical and methodological research.</p>","PeriodicalId":55645,"journal":{"name":"Advances in Methods and Practices in Psychological Science","volume":"9 1","pages":"25152459251401191"},"PeriodicalIF":13.4,"publicationDate":"2026-03-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC13133541/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147823835","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
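The core survival-analysis idea in the abstract above — estimating whether and when an event occurs from a mix of observed and censored event times — can be sketched with a toy Kaplan-Meier estimator. This is a minimal illustration, not code from the article; the EMA scenario and all data values are invented.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  : observed time for each unit (event time, or censoring time)
    events : 1 if the event occurred, 0 if the observation was censored
    Returns (unique event times, survival probability just after each).
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    t_uniq = np.unique(times[events == 1])
    surv, s = [], 1.0
    for t in t_uniq:
        at_risk = np.sum(times >= t)                  # still event-free just before t
        d = np.sum((times == t) & (events == 1))      # events observed at t
        s *= 1.0 - d / at_risk
        surv.append(s)
    return t_uniq, np.array(surv)

# Hypothetical EMA example: hours from waking until the first cigarette,
# with participants who did not smoke that day censored at their last prompt.
times  = [1, 2, 2, 3, 5, 5, 8, 8]
events = [1, 1, 1, 0, 1, 1, 0, 0]
t, s = kaplan_meier(times, events)   # s declines from 0.875 to 0.3125
```

Censored observations (events = 0) still contribute to the risk set until they drop out, which is exactly the feature that ordinary regression on event times cannot reproduce.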
{"title":"When Do Interaction/Moderation Effects Stabilize in Linear Regression?","authors":"Andrew Castillo, Joshua D Miller, Colin Vize, David A A Baranger, Donald R Lynam","doi":"10.1177/25152459251407860","DOIUrl":"10.1177/25152459251407860","url":null,"abstract":"<p><p>Two-way interaction effects in linear regression occur when the relation between two variables changes depending on the level of a third. Despite their frequent use, interactions are notoriously difficult to estimate accurately and test for statistical significance because of small effect sizes and low reliability. In this study, we used Monte Carlo simulations to establish stability thresholds for two-way interactions between continuous variables across combinations of reliability (0.7-1.0), main effect size (0.1-0.5), collinearity (0.1-0.5), and interaction effect size (0.05-0.2). Stability was defined as the consistency of estimated effect sizes across repeated samples of the same size from the same population and operationalized using modified definitions of the corridor of stability and point of stability from Schönbrodt and Perugini. Results show that the stability of interaction estimates is primarily determined by sample size and predictor reliability. The case representing a realistic psychology field study, in which researchers have limited control over variables, stabilized at <math><mi>n</mi> <mo>=</mo> <mn>3,800</mn></math> , requiring 72% statistical power. At <math><mi>n</mi> <mo>≤</mo> <mn>100</mn></math> , 11% to 45% of the estimates were incorrectly signed (i.e., negative when the true effect was positive). Most psychology studies enroll far fewer than 500 participants, and our results indicate many published interactions may be unstable. Analyses involving highly reliable predictors, such as group assignment in experimental designs, may stabilize at lower sample sizes because they attenuate the expected effect size less than variables with more measurement error. Researchers are encouraged to avoid routine tests of two-way interactions unless sample size and reliability are adequate and hypotheses are specified a priori.</p>","PeriodicalId":55645,"journal":{"name":"Advances in Methods and Practices in Psychological Science","volume":"9 1","pages":""},"PeriodicalIF":13.4,"publicationDate":"2026-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12975032/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147437875","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
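The instability the abstract above describes — interaction estimates varying wildly, and sometimes flipping sign, at small n — is easy to reproduce in a few lines. This sketch is not the article's simulation design; the parameter values (main effects 0.3, interaction 0.1, predictor correlation 0.3, reliability 0.8) are illustrative choices within the ranges the abstract mentions.

```python
import numpy as np

rng = np.random.default_rng(0)

def interaction_estimates(n, n_sims=500, b_int=0.1, r=0.3, rel=0.8):
    """Distribution of the estimated interaction slope across repeated samples.

    True model: y = 0.3*x1 + 0.3*x2 + b_int*x1*x2 + N(0,1); the predictors
    are observed with reliability `rel` (classical measurement error).
    """
    cov = np.array([[1.0, r], [r, 1.0]])
    err_sd = np.sqrt(1.0 / rel - 1.0)        # error variance implied by reliability
    est = np.empty(n_sims)
    for i in range(n_sims):
        x = rng.multivariate_normal([0.0, 0.0], cov, size=n)
        y = 0.3 * x[:, 0] + 0.3 * x[:, 1] + b_int * x[:, 0] * x[:, 1] \
            + rng.standard_normal(n)
        xo = x + rng.standard_normal((n, 2)) * err_sd   # unreliable observed scores
        X = np.column_stack([np.ones(n), xo[:, 0], xo[:, 1], xo[:, 0] * xo[:, 1]])
        est[i] = np.linalg.lstsq(X, y, rcond=None)[0][3]
    return est

small = interaction_estimates(n=100)
large = interaction_estimates(n=3000)
# The estimate is far more variable, and more often wrongly signed,
# at n = 100 than at n = 3000.
```

Note also that the average estimate sits below the true 0.1 at both sample sizes: measurement error in both predictors attenuates the interaction term roughly by the product of the two reliabilities.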
{"title":"Conducting Research With People in Lower-Socioeconomic-Status Contexts","authors":"Lydia F. Emery, David M. Silverman, Rebecca M. Carey","doi":"10.1177/25152459231193044","DOIUrl":"https://doi.org/10.1177/25152459231193044","url":null,"abstract":"In recent years, the field of psychology has increasingly recognized the importance of conducting research with lower-socioeconomic-status (SES) participants. Given that SES can powerfully shape people’s thoughts and actions, socioeconomically diverse samples are necessary for rigorous, generalizable research. However, even when researchers aim to collect data with these samples, they often encounter methodological and practical challenges to recruiting and retaining lower-SES participants in their studies. We propose that there are two key factors to consider when trying to recruit and retain lower-SES participants—trust and accessibility. Researchers can build trust by creating personal connections with participants and communities, paying participants fairly, and considering how participants will view their research. Researchers can enhance accessibility by recruiting in participants’ own communities, tailoring study administration to participants’ circumstances, and being flexible in payment methods. Our goal is to provide recommendations that can help to build a more inclusive science.","PeriodicalId":55645,"journal":{"name":"Advances in Methods and Practices in Psychological Science","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136152373","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluating the Pedagogical Effectiveness of Study Preregistration in the Undergraduate Dissertation","authors":"Madeleine Pownall, Charlotte R. Pennington, Emma Norris, Marie Juanchich, David Smailes, Sophie Russell, Debbie Gooch, T. Evans, Sofia Persson, Matthew H. C. Mak, L. Tzavella, R. Monk, Thomas Gough, Christopher S. Y. Benwell, M. Elsherif, Emily Farran, Thomas Gallagher-Mitchell, Luke T. Kendrick, Julia Bahnmueller, E. Nordmann, Mirela Zaneva, K. Gilligan-Lee, Marina Bazhydai, Andrew Jones, Jemma Sedgmond, Iris Holzleitner, James Reynolds, Jo Moss, Daniel Farrelly, A. J. Parker, Kait Clark","doi":"10.1177/25152459231202724","DOIUrl":"https://doi.org/10.1177/25152459231202724","url":null,"abstract":"Research shows that questionable research practices (QRPs) are present in undergraduate final-year dissertation projects. One entry-level Open Science practice proposed to mitigate QRPs is “study preregistration,” through which researchers outline their research questions, design, method, and analysis plans before data collection and/or analysis. In this study, we aimed to empirically test the effectiveness of preregistration as a pedagogic tool in undergraduate dissertations using a quasi-experimental design. A total of 89 UK psychology students were recruited, including students who preregistered their empirical quantitative dissertation (n = 52; experimental group) and students who did not (n = 37; control group). Attitudes toward statistics, acceptance of QRPs, and perceived understanding of Open Science were measured both before and after dissertation completion. Exploratory measures included capability, opportunity, and motivation to engage with preregistration, measured at Time 1 only. This study was conducted as a Registered Report; Stage 1 protocol: https://osf.io/9hjbw (date of in-principle acceptance: September 21, 2021). Study preregistration did not significantly affect attitudes toward statistics or acceptance of QRPs. However, students who preregistered reported greater perceived understanding of Open Science concepts from Time 1 to Time 2 compared with students who did not preregister. Exploratory analyses indicated that students who preregistered reported significantly greater capability, opportunity, and motivation to preregister. Qualitative responses revealed that preregistration was perceived to improve clarity and organization of the dissertation, prevent QRPs, and promote rigor. Disadvantages and barriers included time, perceived rigidity, and need for training. These results contribute to discussions surrounding embedding Open Science principles into research training.","PeriodicalId":55645,"journal":{"name":"Advances in Methods and Practices in Psychological Science","volume":"176 1","pages":""},"PeriodicalIF":13.6,"publicationDate":"2023-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139327274","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Open-Science Guidance for Qualitative Research: An Empirically Validated Approach for De-Identifying Sensitive Narrative Data","authors":"Rebecca Campbell, McKenzie Javorka, Jasmine Engleton, Kathryn Fishwick, Katie Gregory, Rachael Goodman-Williams","doi":"10.1177/25152459231205832","DOIUrl":"https://doi.org/10.1177/25152459231205832","url":null,"abstract":"The open-science movement seeks to make research more transparent and accessible. To that end, researchers are increasingly expected to share de-identified data with other scholars for review, reanalysis, and reuse. In psychology, open-science practices have been explored primarily within the context of quantitative data, but demands to share qualitative data are becoming more prevalent. Narrative data are far more challenging to de-identify fully, and because qualitative methods are often used in studies with marginalized, minoritized, and/or traumatized populations, data sharing may pose substantial risks for participants if their information can be later reidentified. To date, there has been little guidance in the literature on how to de-identify qualitative data. To address this gap, we developed a methodological framework for remediating sensitive narrative data. This multiphase process is modeled on common qualitative-coding strategies. The first phase includes consultations with diverse stakeholders and sources to understand reidentifiability risks and data-sharing concerns. The second phase outlines an iterative process for recognizing potentially identifiable information and constructing individualized remediation strategies through group review and consensus. The third phase includes multiple strategies for assessing the validity of the de-identification analyses (i.e., whether the remediated transcripts adequately protect participants’ privacy). We applied this framework to a set of 32 qualitative interviews with sexual-assault survivors. We provide case examples of how blurring and redaction techniques can be used to protect names, dates, locations, trauma histories, help-seeking experiences, and other information about dyadic interactions.","PeriodicalId":55645,"journal":{"name":"Advances in Methods and Practices in Psychological Science","volume":"4 1","pages":""},"PeriodicalIF":13.6,"publicationDate":"2023-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139331023","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Keeping Meta-Analyses Alive and Well: A Tutorial on Implementing and Using Community-Augmented Meta-Analyses in PsychOpen CAMA","authors":"Lisa Bucher, Tanja Burgard, Ulrich S. Tran, Gerhard M. Prinz, Michael Bosnjak, Martin Voracek","doi":"10.1177/25152459231197611","DOIUrl":"https://doi.org/10.1177/25152459231197611","url":null,"abstract":"Newly developed, web-based, open-repository concepts, such as community-augmented meta-analysis (CAMA), provide open access to fulfill the needs for transparency and timeliness of synthesized evidence. The main idea of CAMA is to keep meta-analyses up-to-date by allowing the research community to include new evidence continuously. In 2021, the Leibniz Institute for Psychology released a platform, PsychOpen CAMA, which serves as a publication format for CAMAs in all fields of psychology. The present work serves as a tutorial on implementing and using a CAMA in PsychOpen CAMA from a data-provider perspective, using six large-scale meta-analytic data sets on the dark triad of personality as a working example. First, the processes of data contribution and implementation of either new or updated existing data sets are summarized. Furthermore, a step-by-step tutorial on using and interpreting CAMAs guides the reader through the web application. Finally, the tutorial outlines the major benefits and the remaining challenges of CAMAs in PsychOpen CAMA.","PeriodicalId":55645,"journal":{"name":"Advances in Methods and Practices in Psychological Science","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136152683","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Practical Guide to Conversation Research: How to Study What People Say to Each Other","authors":"Michael Yeomans, F. Katelynn Boland, Hanne K. Collins, Nicole Abi-Esber, Alison Wood Brooks","doi":"10.1177/25152459231183919","DOIUrl":"https://doi.org/10.1177/25152459231183919","url":null,"abstract":"Conversation—a verbal interaction between two or more people—is a complex, pervasive, and consequential human behavior. Conversations have been studied across many academic disciplines. However, advances in recording and analysis techniques over the last decade have allowed researchers to more directly and precisely examine conversations in natural contexts and at a larger scale than ever before, and these advances open new paths to understand humanity and the social world. Existing reviews of text analysis and conversation research have focused on text generated by a single author (e.g., product reviews, news articles, and public speeches) and thus leave open questions about the unique challenges presented by interactive conversation data (i.e., dialogue). In this article, we suggest approaches to overcome common challenges in the workflow of conversation science, including recording and transcribing conversations, structuring data (to merge turn-level and speaker-level data sets), extracting and aggregating linguistic features, estimating effects, and sharing data. This practical guide is meant to shed light on current best practices and empower more researchers to study conversations more directly—to expand the community of conversation scholars and contribute to a greater cumulative scientific understanding of the social world.","PeriodicalId":55645,"journal":{"name":"Advances in Methods and Practices in Psychological Science","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136247077","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Impossible Hypotheses and Effect-Size Limits","authors":"Wijnand A. P. van Tilburg, Lennert J A van Tilburg","doi":"10.1177/25152459231197605","DOIUrl":"https://doi.org/10.1177/25152459231197605","url":null,"abstract":"Psychological science is moving toward further specification of effect sizes when formulating hypotheses, performing power analyses, and considering the relevance of findings. This development has sparked an appreciation for the wider context in which such effect sizes are found because the importance assigned to specific sizes may vary from situation to situation. We add to this development a crucial contingency that has hitherto been underappreciated in psychology: There are mathematical limits to the magnitudes that population effect sizes can take within the common multivariate context in which psychology is situated, and these limits can be far more restrictive than typically assumed. The implication is that some hypothesized or preregistered effect sizes may be impossible. At the same time, these restrictions offer a way of statistically triangulating the plausible range of unknown effect sizes. We explain the reason for the existence of these limits, illustrate how to identify them, and offer recommendations and tools for improving hypothesized effect sizes by exploiting the broader multivariate context in which they occur.","PeriodicalId":55645,"journal":{"name":"Advances in Methods and Practices in Psychological Science","volume":"1 1","pages":""},"PeriodicalIF":13.6,"publicationDate":"2023-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139326360","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
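One standard instance of the mathematical limits this abstract refers to (offered here as general background, not as the article's own derivation) is the constraint that a correlation matrix must be positive semidefinite: given r(X,Z) and r(Y,Z), the correlation r(X,Y) is confined to an interval that can be much narrower than [-1, 1].

```python
import numpy as np

def r_xy_limits(r_xz, r_yz):
    """Admissible range of r(X,Y) given r(X,Z) and r(Y,Z).

    Follows from requiring the 3x3 correlation matrix of (X, Y, Z)
    to be positive semidefinite (determinant >= 0).
    """
    centre = r_xz * r_yz
    half_width = np.sqrt((1 - r_xz**2) * (1 - r_yz**2))
    return centre - half_width, centre + half_width

lo, hi = r_xy_limits(0.8, 0.8)   # -> (0.28, 1.0)
```

So if two variables each correlate .8 with a third, hypothesizing (or preregistering) a negative correlation between them is mathematically impossible, and even r(X,Y) = .2 cannot occur in the population.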
{"title":"Bayesian Analysis of Cross-Sectional Networks: A Tutorial in R and JASP","authors":"Karoline B. S. Huth, Jill de Ron, Anneke E. Goudriaan, Judy Luigjes, Reza Mohammadi, Ruth J. van Holst, Eric-Jan Wagenmakers, Maarten Marsman","doi":"10.1177/25152459231193334","DOIUrl":"https://doi.org/10.1177/25152459231193334","url":null,"abstract":"Network psychometrics is a new direction in psychological research that conceptualizes psychological constructs as systems of interacting variables. In network analysis, variables are represented as nodes, and their interactions yield (partial) associations. Current estimation methods mostly use a frequentist approach, which does not allow for proper uncertainty quantification of the model and its parameters. Here, we outline a Bayesian approach to network analysis that offers three main benefits. In particular, applied researchers can use Bayesian methods to (1) determine structure uncertainty, (2) obtain evidence for edge inclusion and exclusion (i.e., distinguish conditional dependence or independence between variables), and (3) quantify parameter precision. In this article, we provide a conceptual introduction to Bayesian inference, describe how researchers can facilitate the three benefits for networks, and review the available R packages. In addition, we present two user-friendly software solutions: a new R package, easybgm, for fitting, extracting, and visualizing the Bayesian analysis of networks and a graphical user interface implementation in JASP. The methodology is illustrated with a worked-out example of a network of personality traits and mental health.","PeriodicalId":55645,"journal":{"name":"Advances in Methods and Practices in Psychological Science","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136093306","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
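The "(partial) associations" that serve as edges in the networks described above can be illustrated without any Bayesian machinery: in a Gaussian graphical model they are the off-diagonal entries of the standardized inverse covariance matrix. This frequentist sketch (not the easybgm/JASP workflow the tutorial presents, and with invented chain-structured data) shows why a partial-correlation edge vanishes under conditional independence.

```python
import numpy as np

rng = np.random.default_rng(1)

def partial_correlations(data):
    """Edge weights of a Gaussian graphical model: partial correlations
    obtained by standardizing the inverse of the sample covariance."""
    prec = np.linalg.inv(np.cov(data, rowvar=False))
    d = np.sqrt(np.diag(prec))
    pcor = -prec / np.outer(d, d)
    np.fill_diagonal(pcor, 1.0)
    return pcor

# Hypothetical chain x -> y -> z: x and z are marginally correlated,
# but conditionally independent given y, so their edge is (near) zero.
n = 5000
x = rng.standard_normal(n)
y = x + rng.standard_normal(n)
z = y + rng.standard_normal(n)
pcor = partial_correlations(np.column_stack([x, y, z]))
```

The Bayesian approach in the article adds what this point estimate lacks: posterior evidence for including or excluding each edge, rather than a single number per pair.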
{"title":"Improving Statistical Analysis in Team Science: The Case of a Bayesian Multiverse of Many Labs 4","authors":"S. Hoogeveen, S. Berkhout, Q. Gronau, E. Wagenmakers, J. Haaf","doi":"10.1177/25152459231182318","DOIUrl":"https://doi.org/10.1177/25152459231182318","url":null,"abstract":"Team-science projects have become the “gold standard” for assessing the replicability and variability of key findings in psychological science. However, we believe the typical meta-analytic approach in these projects fails to match the wealth of collected data. Instead, we advocate the use of Bayesian hierarchical modeling for team-science projects, potentially extended to a multiverse analysis. We illustrate this full-scale analysis by applying it to the recently published Many Labs 4 project. This project aimed to replicate the mortality-salience effect—that being reminded of one’s own death strengthens one’s own cultural identity. In a multiverse analysis, we assess the robustness of the results with varying data-inclusion criteria and prior settings. Bayesian model comparison results largely converge to a common conclusion: The data provide evidence against a mortality-salience effect across the majority of our analyses. We issue general recommendations to facilitate full-scale analyses in team-science projects.","PeriodicalId":55645,"journal":{"name":"Advances in Methods and Practices in Psychological Science","volume":" ","pages":""},"PeriodicalIF":13.6,"publicationDate":"2023-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"43159473","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}