{"title":"Patient Safety in the Cardiac Operating Room: What Can, Will, and Might Make Patients Safer and You Happier?","authors":"J. Abernathy","doi":"10.1097/ASA.0000000000000031","DOIUrl":null,"url":null,"abstract":"The cardiac operating room (OR) is a complex environment consisting of four teams of providers—surgeons, nurses, perfusionists, and anesthesiologists—and where a myriad of complicated equipment is often crammed into a space that might not have been designed for this purpose. Despite the obstacles, mortality and morbidity from cardiac surgery have steadily decreased over the past decade. Inevitably, however, humans continue to make errors. Gawande and colleagues found that adverse events occurred in 12% of cardiac surgical operations, compared with only 3% in a general surgery population. Some 28,000 of the 350,000 cardiac surgical patients in the United States each year will have an adverse, preventable event. Preventable errors are not related to failure of technical skill, training, or knowledge, but represent cognitive, system, or teamwork failures (Supplemental Digital Content 1, http://links.lww.com/ASA/A558). Jim Reason, the renowned human factors engineer, was the first to propose a simplified model of error, now referred to as the ‘‘Swiss cheese’’ model (Figure 1). This model eloquently describes how hidden—or, in human factors terminology, latent— errors can line up to create actual errors or patient harm. In one example, originally outlined by Pronovost et al., a patient suffered from a venous air embolism not because a doctor was careless, but because there were many hidden failures, often termed latent failures, that added up to create a catastrophe. In this example, components of latent error included poor communication, lack of protocols or lack of knowledge of protocols, inadequate training, and fear of retribution if the nurse spoke up. Resilient systems are designed to reduce the number of latent errors. If there are fewer latent errors, the holes in the Swiss cheese for an error to pass through are harder to align.","PeriodicalId":91163,"journal":{"name":"Refresher courses in anesthesiology","volume":"43 1","pages":"1-6"},"PeriodicalIF":0.0000,"publicationDate":"2015-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1097/ASA.0000000000000031","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Refresher courses in anesthesiology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1097/ASA.0000000000000031","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
The cardiac operating room (OR) is a complex environment in which four teams of providers (surgeons, nurses, perfusionists, and anesthesiologists) work amid a myriad of complicated equipment, often crammed into a space that might not have been designed for the purpose. Despite these obstacles, mortality and morbidity from cardiac surgery have steadily decreased over the past decade. Inevitably, however, humans continue to make errors. Gawande and colleagues found that adverse events occurred in 12% of cardiac surgical operations, compared with only 3% in a general surgery population. Some 28,000 of the 350,000 cardiac surgical patients in the United States each year will have an adverse, preventable event. These preventable errors are not failures of technical skill, training, or knowledge; they represent cognitive, system, or teamwork failures (Supplemental Digital Content 1, http://links.lww.com/ASA/A558). Jim Reason, the renowned human factors expert, was the first to propose a simplified model of error, now referred to as the "Swiss cheese" model (Figure 1). This model eloquently describes how hidden (in human factors terminology, latent) errors can line up to create actual errors or patient harm. In one example, originally outlined by Pronovost et al., a patient suffered a venous air embolism not because a doctor was careless, but because many hidden failures, often termed latent failures, added up to create a catastrophe. In this example, the latent errors included poor communication, absent protocols or lack of knowledge of existing protocols, inadequate training, and a nurse's fear of retribution for speaking up. Resilient systems are designed to reduce the number of latent errors: with fewer latent errors, the holes in the Swiss cheese are less likely to align and let an error pass through to the patient.
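To make the alignment intuition concrete, the following is a minimal toy simulation, not from the article: harm reaches the patient only when every defensive layer fails on the same case, so shrinking the "holes" in each layer (the latent errors) multiplicatively reduces the harm rate. The layer names and failure probabilities below are invented purely for illustration.

```python
import random

def simulate_harm_rate(layer_failure_probs, n_cases=1_000_000, seed=0):
    """Fraction of cases in which every defensive layer fails (holes align)."""
    rng = random.Random(seed)
    harmed = 0
    for _ in range(n_cases):
        # Harm only occurs if the hole in *every* slice of cheese lines up.
        if all(rng.random() < p for p in layer_failure_probs):
            harmed += 1
    return harmed / n_cases

# Hypothetical per-case failure probabilities for four defenses
# (communication, protocols, training, speaking up); values are made up.
baseline = [0.20, 0.15, 0.10, 0.25]   # analytic harm rate = 0.00075
improved = [0.10, 0.05, 0.05, 0.10]   # analytic harm rate = 0.000025

print(f"baseline harm rate: {simulate_harm_rate(baseline):.6f}")
print(f"improved harm rate: {simulate_harm_rate(improved):.6f}")
```

Under these assumed numbers, halving or better each layer's latent-error rate cuts the simulated harm rate by roughly a factor of thirty, which is the quantitative version of "the holes are harder to align."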