Pim Haselager, Hanna Schraffenberger, Serge Thill, Simon Fischer, Pablo Lanillos, Sebastiaan van de Groes, Miranda van Hooff
{"title":"反思机器:支持人类对医疗决策支持系统的有效监督。","authors":"Pim Haselager, Hanna Schraffenberger, Serge Thill, Simon Fischer, Pablo Lanillos, Sebastiaan van de Groes, Miranda van Hooff","doi":"10.1017/S0963180122000718","DOIUrl":null,"url":null,"abstract":"<p><p>Human decisions are increasingly supported by decision support systems (DSS). Humans are required to remain \"on the loop,\" by monitoring and approving/rejecting machine recommendations. However, use of DSS can lead to overreliance on machines, reducing human oversight. This paper proposes \"reflection machines\" (RM) to increase meaningful human control. An RM provides a medical expert not with suggestions for a decision, but with questions that stimulate reflection about decisions. It can refer to data points or suggest counterarguments that are less compatible with the planned decision. RMs think against the proposed decision in order to increase human resistance against automation complacency. Building on preliminary research, this paper will (1) make a case for deriving a set of design requirements for RMs from EU regulations, (2) suggest a way how RMs could support decision-making, (3) describe the possibility of how a prototype of an RM could apply to the medical domain of chronic low back pain, and (4) highlight the importance of exploring an RM's functionality and the experiences of users working with it.</p>","PeriodicalId":55300,"journal":{"name":"Cambridge Quarterly of Healthcare Ethics","volume":" ","pages":"380-389"},"PeriodicalIF":1.5000,"publicationDate":"2024-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Reflection Machines: Supporting Effective Human Oversight Over Medical Decision Support Systems.\",\"authors\":\"Pim Haselager, Hanna Schraffenberger, Serge Thill, Simon Fischer, Pablo Lanillos, Sebastiaan van de Groes, Miranda van Hooff\",\"doi\":\"10.1017/S0963180122000718\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Human decisions are increasingly supported by decision support systems (DSS). Humans are required to remain \\\"on the loop,\\\" by monitoring and approving/rejecting machine recommendations. However, use of DSS can lead to overreliance on machines, reducing human oversight. This paper proposes \\\"reflection machines\\\" (RM) to increase meaningful human control. An RM provides a medical expert not with suggestions for a decision, but with questions that stimulate reflection about decisions. It can refer to data points or suggest counterarguments that are less compatible with the planned decision. RMs think against the proposed decision in order to increase human resistance against automation complacency. 
Building on preliminary research, this paper will (1) make a case for deriving a set of design requirements for RMs from EU regulations, (2) suggest a way how RMs could support decision-making, (3) describe the possibility of how a prototype of an RM could apply to the medical domain of chronic low back pain, and (4) highlight the importance of exploring an RM's functionality and the experiences of users working with it.</p>\",\"PeriodicalId\":55300,\"journal\":{\"name\":\"Cambridge Quarterly of Healthcare Ethics\",\"volume\":\" \",\"pages\":\"380-389\"},\"PeriodicalIF\":1.5000,\"publicationDate\":\"2024-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Cambridge Quarterly of Healthcare Ethics\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://doi.org/10.1017/S0963180122000718\",\"RegionNum\":4,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2023/1/10 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"Q3\",\"JCRName\":\"HEALTH CARE SCIENCES & SERVICES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cambridge Quarterly of Healthcare Ethics","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1017/S0963180122000718","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2023/1/10 0:00:00","PubModel":"Epub","JCR":"Q3","JCRName":"HEALTH CARE SCIENCES & SERVICES","Score":null,"Total":0}
Reflection Machines: Supporting Effective Human Oversight Over Medical Decision Support Systems.
Human decisions are increasingly supported by decision support systems (DSS). Humans are required to remain "on the loop" by monitoring and approving/rejecting machine recommendations. However, use of DSS can lead to overreliance on machines, reducing human oversight. This paper proposes "reflection machines" (RMs) to increase meaningful human control. An RM provides a medical expert not with suggestions for a decision, but with questions that stimulate reflection about decisions. It can refer to data points or suggest counterarguments that are less compatible with the planned decision. RMs think against the proposed decision in order to increase human resistance to automation complacency. Building on preliminary research, this paper will (1) make a case for deriving a set of design requirements for RMs from EU regulations, (2) suggest how RMs could support decision-making, (3) describe how a prototype of an RM could be applied in the medical domain of chronic low back pain, and (4) highlight the importance of exploring an RM's functionality and the experiences of users working with it.
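The abstract describes the core mechanism of an RM: rather than recommending a decision, it surfaces data points that fit the clinician's planned decision less well and turns them into reflection-stimulating questions. The sketch below is a minimal, hypothetical illustration of that idea; it is not the authors' prototype, and all names, scores, and thresholds (e.g., DataPoint, compatibility_with_plan, the 0.5 cut-off, and the example low-back-pain values) are illustrative assumptions.

```python
# Hypothetical sketch of a "reflection machine": it asks questions about data
# points that are less compatible with the planned decision, instead of
# issuing a recommendation. All fields and values here are assumptions.

from dataclasses import dataclass


@dataclass
class DataPoint:
    name: str
    value: float
    # Assumed precomputed score in [0, 1]: how well this data point supports
    # the clinician's planned decision.
    compatibility_with_plan: float


def reflection_questions(planned_decision: str,
                         data_points: list[DataPoint],
                         threshold: float = 0.5) -> list[str]:
    """Return reflection-stimulating questions, not a recommendation.

    Data points whose assumed compatibility with the planned decision falls
    below the threshold are phrased as questions that invite the expert to
    justify or reconsider the plan.
    """
    questions = []
    for dp in sorted(data_points, key=lambda d: d.compatibility_with_plan):
        if dp.compatibility_with_plan < threshold:
            questions.append(
                f"How does '{dp.name} = {dp.value}' fit with your plan to "
                f"'{planned_decision}'? What would make you reconsider?"
            )
    return questions


if __name__ == "__main__":
    # Hypothetical chronic low back pain case (values are made up).
    case = [
        DataPoint("pain duration (months)", 18, compatibility_with_plan=0.3),
        DataPoint("MRI disc degeneration grade", 2, compatibility_with_plan=0.8),
        DataPoint("psychosocial risk score", 7, compatibility_with_plan=0.2),
    ]
    for q in reflection_questions("refer for spinal fusion surgery", case):
        print("-", q)
```

In this toy setup, the system never states whether surgery is advisable; it only asks the expert to account for the least compatible data points, which is the kind of "thinking against" the proposed decision the abstract describes.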
Journal introduction:
The Cambridge Quarterly of Healthcare Ethics is designed to address the challenges of biology, medicine and healthcare and to meet the needs of professionals serving on healthcare ethics committees in hospitals, nursing homes, hospices and rehabilitation centres. The aim of the journal is to serve as the international forum for the wide range of serious and urgent issues faced by members of healthcare ethics committees, physicians, nurses, social workers, clergy, lawyers and community representatives.