Anthony McCosker, Peter Kamstra, Jane Farmer
New Media & Society, published 2023-08-14
DOI: 10.1177/14614448231186800
Moderating mental health: Addressing the human–machine alignment problem through an adaptive logic of care
Covid-19 deepened the need for digital-based support for people experiencing mental ill-health. Discussion platforms have long filled gaps in health service provision and access, offering peer-based support usually maintained by a mix of professional and volunteer peer moderators. Even on dedicated support platforms, however, mental health content poses difficulties for human and machine moderation. While automated systems are considered essential for maintaining safety, research is lagging in understanding how human and machine moderation interacts when addressing mental health content. Working with three digital mental health services, we examine the interaction between human and automated moderation of discussion platforms, contrasting ‘reactive’ and ‘adaptive’ moderation practices. Presenting ways forward for improving digital mental health services, we argue that an integrated ‘adaptive logic of care’ can help manage the interaction between human and machine moderators as they address a tacit ‘risk matrix’ when dealing with sensitive mental health content.