{"title":"认知、技术和组织限制:法航447空难的教训","authors":"N. Oliver, T. Calvard, K. Potočnik","doi":"10.1287/ORSC.2017.1138","DOIUrl":null,"url":null,"abstract":"Organizations, particularly those for whom safety and reliability are crucial, develop routines to protect them from failure. But even highly reliable organizations are not immune to disaster and prolonged periods of safe operation are punctuated by occasional catastrophes. Scholars of safety science label this the “paradox of almost totally safe systems,” noting that systems that are very safe under normal conditions may be vulnerable under unusual ones. In this paper, we explain, develop, and apply the concept of “organizational limits” to this puzzle through an analysis of the loss of Air France 447. We show that an initial, relatively minor limit violation set in train a cascade of human and technological limit violations, with catastrophic consequences. Focusing on cockpit automation, we argue that the same measures that make a system safe and predictable may introduce restrictions on cognition, which over time, inhibit or erode the disturbance-handling capability of the actors involved. We also note...","PeriodicalId":93599,"journal":{"name":"Organization science (Providence, R.I.)","volume":"17 3","pages":"729-743"},"PeriodicalIF":0.0000,"publicationDate":"2017-06-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1287/ORSC.2017.1138","citationCount":"46","resultStr":"{\"title\":\"Cognition, Technology, and Organizational Limits: Lessons from the Air France 447 Disaster\",\"authors\":\"N. Oliver, T. Calvard, K. Potočnik\",\"doi\":\"10.1287/ORSC.2017.1138\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Organizations, particularly those for whom safety and reliability are crucial, develop routines to protect them from failure. But even highly reliable organizations are not immune to disaster and prolonged periods of safe operation are punctuated by occasional catastrophes. Scholars of safety science label this the “paradox of almost totally safe systems,” noting that systems that are very safe under normal conditions may be vulnerable under unusual ones. In this paper, we explain, develop, and apply the concept of “organizational limits” to this puzzle through an analysis of the loss of Air France 447. We show that an initial, relatively minor limit violation set in train a cascade of human and technological limit violations, with catastrophic consequences. Focusing on cockpit automation, we argue that the same measures that make a system safe and predictable may introduce restrictions on cognition, which over time, inhibit or erode the disturbance-handling capability of the actors involved. 
We also note...\",\"PeriodicalId\":93599,\"journal\":{\"name\":\"Organization science (Providence, R.I.)\",\"volume\":\"17 3\",\"pages\":\"729-743\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-06-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1287/ORSC.2017.1138\",\"citationCount\":\"46\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Organization science (Providence, R.I.)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1287/ORSC.2017.1138\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Organization science (Providence, R.I.)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1287/ORSC.2017.1138","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Cognition, Technology, and Organizational Limits: Lessons from the Air France 447 Disaster
Organizations, particularly those for which safety and reliability are crucial, develop routines to protect themselves from failure. But even highly reliable organizations are not immune to disaster, and prolonged periods of safe operation are punctuated by occasional catastrophes. Scholars of safety science label this the “paradox of almost totally safe systems,” noting that systems that are very safe under normal conditions may be vulnerable under unusual ones. In this paper, we explain, develop, and apply the concept of “organizational limits” to this puzzle through an analysis of the loss of Air France 447. We show how an initial, relatively minor limit violation set in train a cascade of human and technological limit violations, with catastrophic consequences. Focusing on cockpit automation, we argue that the same measures that make a system safe and predictable may introduce restrictions on cognition that, over time, inhibit or erode the disturbance-handling capability of the actors involved. We also note...