Technology in Suicide Prevention: Fears and Functionality for Crisis Supporters

IF 4.3 · Q1 · Psychology, Multidisciplinary
Danielle Hopkins, Kelly Mazzer, Debra Rickwood
{"title":"预防自杀的技术:危机支持人员的恐惧与功能","authors":"Danielle Hopkins,&nbsp;Kelly Mazzer,&nbsp;Debra Rickwood","doi":"10.1155/2024/6625037","DOIUrl":null,"url":null,"abstract":"<p><b>Background:</b> Crisis supporters at Lifeline Australia consistently engage with distressed and often suicidal help-seekers. The development of technological methods, such as machine learning (ML), in suicide prevention may complement their support work. Investigating attitudes towards the use of ML in crisis support is an important first step.</p><p><b>Aims:</b> The current study is aimed at investigating crisis supporters’ attitudes towards ML in crisis support/suicide prevention, beliefs about the effect of technology on the service and help-seeking, and concerns/opinions about any future technology implementation.</p><p><b>Methods:</b> Two hundred fifty-five crisis supporters aged 20–84 years were recruited through Lifeline Australia. Participants voluntarily completed an anonymous questionnaire, including measures of attitudes towards technology as well as open-text options, which provided the data for a thematic analysis.</p><p><b>Results:</b> Crisis supporters were neutral to negative on an adapted measure of ML use in crisis support. Less than one-third held the belief that technology would enhance Lifeline services, and over half of the participants felt help-seekers would be less likely to contact Lifeline if technology was implemented. Thematic analysis of the open-text questions revealed loss of human connection and mistrust of algorithms to be the most prominent barriers to future technological adoption by Lifeline crisis supporters.</p><p><b>Limitations:</b> Clearly defining terms of ML and technology was difficult to do in this hypothetical context, potentially impacting the attitudes expressed.</p><p><b>Conclusions:</b> Any new technology to support crisis supporters needs to be carefully codesigned with the workforce to ensure effective implementation and avoid any potential or perceived negative impacts on help-seekers.</p>","PeriodicalId":36408,"journal":{"name":"Human Behavior and Emerging Technologies","volume":null,"pages":null},"PeriodicalIF":4.3000,"publicationDate":"2024-08-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1155/2024/6625037","citationCount":"0","resultStr":"{\"title\":\"Technology in Suicide Prevention: Fears and Functionality for Crisis Supporters\",\"authors\":\"Danielle Hopkins,&nbsp;Kelly Mazzer,&nbsp;Debra Rickwood\",\"doi\":\"10.1155/2024/6625037\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><b>Background:</b> Crisis supporters at Lifeline Australia consistently engage with distressed and often suicidal help-seekers. The development of technological methods, such as machine learning (ML), in suicide prevention may complement their support work. Investigating attitudes towards the use of ML in crisis support is an important first step.</p><p><b>Aims:</b> The current study is aimed at investigating crisis supporters’ attitudes towards ML in crisis support/suicide prevention, beliefs about the effect of technology on the service and help-seeking, and concerns/opinions about any future technology implementation.</p><p><b>Methods:</b> Two hundred fifty-five crisis supporters aged 20–84 years were recruited through Lifeline Australia. 
Participants voluntarily completed an anonymous questionnaire, including measures of attitudes towards technology as well as open-text options, which provided the data for a thematic analysis.</p><p><b>Results:</b> Crisis supporters were neutral to negative on an adapted measure of ML use in crisis support. Less than one-third held the belief that technology would enhance Lifeline services, and over half of the participants felt help-seekers would be less likely to contact Lifeline if technology was implemented. Thematic analysis of the open-text questions revealed loss of human connection and mistrust of algorithms to be the most prominent barriers to future technological adoption by Lifeline crisis supporters.</p><p><b>Limitations:</b> Clearly defining terms of ML and technology was difficult to do in this hypothetical context, potentially impacting the attitudes expressed.</p><p><b>Conclusions:</b> Any new technology to support crisis supporters needs to be carefully codesigned with the workforce to ensure effective implementation and avoid any potential or perceived negative impacts on help-seekers.</p>\",\"PeriodicalId\":36408,\"journal\":{\"name\":\"Human Behavior and Emerging Technologies\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":4.3000,\"publicationDate\":\"2024-08-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://onlinelibrary.wiley.com/doi/epdf/10.1155/2024/6625037\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Human Behavior and Emerging Technologies\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://onlinelibrary.wiley.com/doi/10.1155/2024/6625037\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"PSYCHOLOGY, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Human Behavior and Emerging Technologies","FirstCategoryId":"1085","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1155/2024/6625037","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0

Abstract


Background: Crisis supporters at Lifeline Australia consistently engage with distressed and often suicidal help-seekers. The development of technological methods, such as machine learning (ML), in suicide prevention may complement their support work. Investigating attitudes towards the use of ML in crisis support is an important first step.
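
The abstract refers to ML in suicide prevention only in general terms. As a purely hypothetical sketch of the kind of tool at issue (not the study's method or any system used by Lifeline), a risk-flagging aid might be a simple text classifier that highlights possibly high-risk contacts for a human crisis supporter to review; the data, names, and decision threshold below are placeholder assumptions.

```python
# Hypothetical illustration only: a minimal text classifier that flags possibly
# high-risk messages for human review. Training data, labels, and threshold are
# placeholders; this is not the study's method or any Lifeline system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder examples: 1 = elevated risk, 0 = lower risk.
messages = [
    "I can't see a way out anymore",
    "I just need someone to talk to about work stress",
    "I have been thinking about ending my life",
    "Feeling a bit low today but coping okay",
]
labels = [1, 0, 1, 0]

# TF-IDF features feeding a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(messages, labels)

def flag_for_review(text: str, threshold: float = 0.5) -> bool:
    """Return True when the estimated risk probability exceeds the threshold,
    so a human supporter can prioritise the contact; the supporter, not the
    model, makes the decision."""
    risk = model.predict_proba([text])[0][1]
    return risk >= threshold

print(flag_for_review("I don't want to be here anymore"))
```

In a design like this the model only triages; the attitudes reported in this study, such as loss of human connection and mistrust of algorithms, bear directly on whether supporters would accept even this limited role for automation.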

Aims: The current study aimed to investigate crisis supporters’ attitudes towards ML in crisis support/suicide prevention, their beliefs about the effect of technology on the service and on help-seeking, and their concerns/opinions about any future technology implementation.

Methods: Two hundred and fifty-five crisis supporters aged 20–84 years were recruited through Lifeline Australia. Participants voluntarily completed an anonymous questionnaire that included measures of attitudes towards technology as well as open-text questions, which provided the data for a thematic analysis.

Results: Crisis supporters were neutral to negative on an adapted measure of ML use in crisis support. Fewer than one-third believed that technology would enhance Lifeline services, and over half of the participants felt that help-seekers would be less likely to contact Lifeline if technology was implemented. Thematic analysis of the open-text questions revealed loss of human connection and mistrust of algorithms to be the most prominent barriers to future technological adoption by Lifeline crisis supporters.

Limitations: Clearly defining the terms ML and technology was difficult in this hypothetical context, which may have affected the attitudes expressed.

Conclusions: Any new technology to support crisis supporters needs to be carefully codesigned with the workforce to ensure effective implementation and avoid any potential or perceived negative impacts on help-seekers.

Source Journal
Human Behavior and Emerging Technologies (Social Sciences, all)
CiteScore: 17.20
Self-citation rate: 8.70%
Articles published: 73
Journal description: Human Behavior and Emerging Technologies is an interdisciplinary journal dedicated to publishing high-impact research that enhances understanding of the complex interactions between diverse human behavior and emerging digital technologies.