Assessing construct reliability through open-ended survey response analysis
Katherine E Koralesky, Marina A G von Keyserlingk, Daniel M Weary
PLoS ONE, 20(4): e0320570. Published 2025-04-01 (eCollection). DOI: 10.1371/journal.pone.0320570
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11960925/pdf/
Citations: 0
Abstract
Online surveys often include quantitative attention checks, but inattentive participants might also be identified using their qualitative responses. We used the software Turnitin™ to assess the originality of open-ended responses in four mixed-method surveys that included validated multi-item rating scales (i.e., constructs). Across surveys, 18-35% of participants (n = 3,771) were identified as having copied responses from online sources. We assessed indicator reliability and internal consistency reliability and found that both were lower for participants identified as using copied text versus those who wrote more original responses. Those who provided more original responses also provided more consistent responses to the validated scales, suggesting that these participants were more attentive. We conclude that this process can be used to screen open-ended responses from online surveys. We encourage future research to replicate this screening process using similar tools, investigate strategies to reduce copying behaviour, and explore the motivation of participants to search for information online, including what sources they find compelling.
About the journal:
PLOS ONE is an international, peer-reviewed, open-access, online publication. PLOS ONE welcomes reports on primary research from any scientific discipline. It provides:
* Open access: freely accessible online, authors retain copyright
* Fast publication times
* Peer review by expert, practicing researchers
* Post-publication tools to indicate quality and impact
* Community-based dialogue on articles
* Worldwide media coverage