Pursuing a saboteur of patient safety: The hidden curriculum

A. Wu

Editorial. Journal of Patient Safety and Risk Management, 2022, pages 199–200. DOI: 10.1177/25160435221129332

There is a fundamental problem at the heart of health care: the problem of human error. Health care relies primarily on humans taking care of other humans. Because we are fallible, there will always be errors in clinical practice. James Reason, generally regarded as the father of safety science, observed that the issue can be viewed in two ways: the person approach and the system approach. The person approach focuses on the errors and violations of individuals, and aims remedial efforts at frontline workers. The system approach traces causal factors to the system as a whole, and prescribes remedies at multiple levels. Reason explained persuasively that "we cannot change the human condition, but we can change the conditions under which people work." By designing systems to avert errors and to allow recovery when errors do occur, we can create the conditions to reduce harm from health care.

Ironically, humans in the system are stubbornly resistant to accepting the more effective system approach. They seem blind to just how common errors are in healthcare, and are often in denial about the frequency of adverse events. This is in part because elements of the system itself perpetuate the idea that a few unreliable "bad apples" are responsible for safety problems in medicine. This failure to see the forest for the trees – the system of healthcare for the actions of individual frontline providers – frustrates efforts to improve patient safety. Today, healthcare organizations aspire to high reliability, seeking to reduce errors and to recover from their effects. But designing high-reliability systems first requires being constantly aware of the possibility of failure.
A difficult challenge in medical education and training is to conquer the invisible forces that prevent leaders, managers, frontline clinicians, and patients from understanding the inevitability of medical errors. There is a "hidden curriculum" that prevents us from seeing through a systems lens. The formal curriculum is what is consciously intended, endorsed, and taught. But medical education is more than the transmission of knowledge and skills; it is also a socialization process. The hidden curriculum is a set of influences that function at the level of organizational culture. It includes norms and values that can undermine the messages of the stated curriculum. These are taught implicitly and daily – in the halls and elevators, and through other channels. For example, while we are taught that some patients need more time and attention, the productivity measures employed by institutions signal that increasing the volume of services is the priority.

In patient safety, a pervasive message in the hidden curriculum is that errors are uncommon, and are caused by a small number of individuals who should be blamed for inattention, carelessness, moral weakness, or incompetence. This message is especially difficult to root out, as it is founded upon a universal cognitive bias referred to as the "fundamental attribution error." This bias, or "cognitive disposition to respond," is the tendency of humans to overestimate the effect of personality, while underestimating the effect of the situation, in explaining the behavior of others. Frankly, it is natural to blame other people when things go wrong. However, this bias has consequences for the treatment of individual providers, and for actions taken at the institutional level. Armed with the misconception that individuals are responsible for errors, institutions reach for discipline or retraining as solutions.
Lucian Leape, a leader in the field of patient safety, once declared that "the single greatest impediment to error prevention in the medical industry is that we punish people for making mistakes." This approach inspires fear and teaches that the safest thing to do when an error occurs is to keep quiet. "Don't talk about your mistakes – it can only hurt you" is the message, along with its corollaries: "it is acceptable to cover up minor mistakes" and "don't report on your colleagues." These messages convince workers to disclose as little as possible to patients, colleagues, managers, and even themselves. They also discourage the reporting of safety incidents, frustrating efforts to learn from mistakes. The unfortunate consequence for organizations is that these messages can convince leaders that they need not focus on systemwide improvement.

The hidden curriculum can also create direct safety hazards for patients. For example, Liao and colleagues reported an obstetric case in which pressure on a trainee to hurry up – to avoid making a supervising physician angry – led to a dangerous maternal injury. Additional messages that "asking for help is a sign of weakness" and "an apology to a patient is tantamount to an admission of guilt" deserve their own expanded discussion. A related channel operates at the policy level. An insidious example is the requirement by many US medical licensure boards that physicians disclose whether they have been treated for psychiatric illness. This policy creates a strong disincentive to seek medical care for anxiety or depression, which are as common among health workers as they are throughout society.