Selin Gurgun, Emily Arden-Close, Keith Phalp, Raian Ali
Motivated by Design: A Codesign Study to Promote Challenging Misinformation on Social Media

Journal: Human Behavior and Emerging Technologies (JCR Q1, Psychology, Multidisciplinary; Impact Factor 4.3)
DOI: 10.1155/2024/5595339
Publication date: 2024-10-14
Publication type: Journal Article
Article URL: https://onlinelibrary.wiley.com/doi/10.1155/2024/5595339
Open access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1155/2024/5595339
Citations: 0
Abstract
The spread of misinformation on social media is a critical issue. One potential solution to mitigate the spread is user corrections; however, users often refrain from making them due to various concerns. Leveraging the established influence of user interface design (UID) on how users interact with and respond to misinformation, this study investigates how user interface features can be designed to motivate users to challenge misinformation. It aims to gain insights into users’ needs and the UID requirements that encourage this behaviour. We conducted four codesign sessions with 18 social media users (age range 20–60 years, M = 39.1; 10 female, 8 male). We applied the unified theory of acceptance and use of technology (UTAUT) as a theoretical framework and analysed our data based on its core constructs: performance expectancy, effort expectancy, social influence, and facilitating conditions. Our findings reveal four design considerations: creating secure and supportive environments, facilitating informed discussions through easy confrontation and access to reliable resources, leveraging recognition and social proof, and providing user support infrastructure. Together with users, we also identified specific design elements, including indirection, semianonymity and privacy, simplicity, one-click challenging, easy access to reliable sources, recognition, displaying social proof, and platform support. These elements aim to reduce social discomfort and make the process of correcting misinformation more approachable for users. Our findings offer actionable insights for social media platform designers to reduce the spread of misinformation by creating environments that encourage constructive dialogue and allow users to challenge misinformation without fear of conflict.
About the journal:
Human Behavior and Emerging Technologies is an interdisciplinary journal dedicated to publishing high-impact research that enhances understanding of the complex interactions between diverse human behavior and emerging digital technologies.