Pursuing a saboteur of patient safety: The hidden curriculum

A. Wu

Editorial · Journal of Patient Safety and Risk Management (IF 0.6, Q4, Health Care Sciences & Services)
Published 2022-10-01 · pp. 199–200 · DOI: 10.1177/25160435221129332

Abstract

There is a fundamental problem at the heart of health care: the problem of human error. Health care relies primarily on humans taking care of other humans. Because we are fallible, there will always be errors in clinical practice. James Reason, generally regarded as the father of safety science, saw the issue as being viewed in two ways: the person approach and the system approach. The person approach focuses on the errors and violations of individuals, and aims remedial efforts at frontline workers. The system approach traces causal factors to the system as a whole, and prescribes remedies at multiple levels. Reason explained persuasively that “we cannot change the human condition, but we can change the conditions under which people work.” By designing systems to avert errors and allow recovery when they do occur, we can create the conditions to reduce harm from health care.

Ironically, humans in the system are stubbornly resistant to accepting the more effective system approach. They seem blinded to just how common errors are in healthcare, and are often in denial about the frequency of adverse events. This is in part because elements of the system itself perpetuate the idea that a few unreliable “bad apples” are responsible for safety problems in medicine. This example of failing to see the forest for the trees – the system of healthcare for the actions of individual frontline providers – frustrates efforts to improve patient safety.

Today, healthcare organizations aspire to achieve high reliability, to reduce errors and recover from their effects. But designing high-reliability systems first requires being constantly aware of the possibility of failure. A difficult challenge in medical education and training is to conquer the invisible forces that prevent leaders, managers, frontline clinicians, and patients from understanding the inevitability of medical errors.

There is a “hidden curriculum” that prevents us from seeing with system lenses. The formal curriculum is what is consciously intended, endorsed, and taught. But medical education is more than the transmission of knowledge and skills. It is also a socialization process. The hidden curriculum is a set of influences that function at the level of organizational culture. It includes norms and values that can undermine the messages of the stated curriculum. These are taught implicitly and daily – in the halls and elevators, and through other channels. For example, while we are taught that some patients need more time and attention, the productivity measures employed by institutions signal that increasing the volume of services is the priority.

In patient safety, a pervasive message in the hidden curriculum is that errors are uncommon, and are caused by a small number of individuals who should be blamed for inattention, carelessness, moral weakness, or incompetence. This message is especially difficult to root out, as it is founded upon a universal cognitive bias referred to as the “fundamental attribution error.” This bias, or “cognitive disposition to respond,” is the tendency of humans to overestimate the effect of the personality of others while underestimating the effect of the situation in explaining behavior. Frankly, it is natural to blame other people when things go wrong.

However, this bias has consequences for the treatment of individual providers, and for actions taken at the institutional level. Armed with the misconception that individuals are responsible for errors, the solutions are discipline or retraining. Lucian Leape, a leader in the field of patient safety, once declared that “The single greatest impediment to error prevention in the medical industry is that we punish people for making mistakes.” This approach inspires fear and teaches that the safest thing to do when an error occurs is to keep quiet. “Don’t talk about your mistakes – it can only hurt you” is the message, along with its corollaries: “it is acceptable to cover up minor mistakes” and “don’t report on your colleagues.” These messages convince workers to disclose as little as possible to patients, colleagues, managers, and even themselves. They also discourage the reporting of safety incidents, frustrating efforts to learn from mistakes. The unfortunate consequence for organizations is that this can convince leaders they need not focus on systemwide improvement.

The hidden curriculum can also create direct safety hazards for patients. For example, Liao and colleagues reported an obstetric case in which pressure on a trainee to hurry up – to avoid making a supervising physician angry – led to a dangerous maternal injury. Additional messages, such as “asking for help is a sign of weakness” and “an apology to a patient is tantamount to an admission of guilt,” deserve their own expanded discussion.

A related channel operates at the policy level. An insidious example is the practice of many medical licensure boards in the US of asking physicians whether they have been treated for psychiatric illness. This policy creates a strong disincentive to seek medical care for anxiety or depression, which are as common among health workers as they are throughout society.