The Activated Failures of Human-Automation Interactions on the Flight Deck

Wen-Chin Li, M. Greaves, Davide Durando, John J. H. Lin
{"title":"驾驶舱人机交互的激活故障","authors":"Wen-Chin Li, M. Greaves, Davide Durando, John J. H. Lin","doi":"10.6125/16-0524-890","DOIUrl":null,"url":null,"abstract":"Cockpit automation has been developed to reduce pilots' workload and increase pilots' performance. However, previous studies have demonstrated that failures of automated systems have significantly impaired pilots' situational awareness. The increased application of automation and the trend of pilots to rely on automation have changed pilot's role from an operator to a supervisor in the cockpit. Based on the analysis of 257 ASRS reports, the result demonstrated that pilots represent the last line of defense during automation failures, though sometimes pilots did commit active failures combined with automation-induced human errors. Current research found that automation breakdown has direct associated with 4 categories of precondition of unsafe acts, including 'adverse mental states', 'CRM', 'personal readiness', and 'technology environment'. Furthermore, the presence of 'CRM' almost 3.6 times, 12.7 times, 2.9 times, and 4 times more likely to occur concomitant failures in the categories of 'decision-errors', 'skill-based error', 'perceptual errors', and 'violations'. Therefore, CRM is the most critical category for developing intervention of Human-Automation Interaction (HAI) issues to improve aviation safety. The study of human factors in automated cockpit is critical to understand how incidents/accidents had developed and how they could be prevented. 
Future HAI research should continue to increase the reliability of automation on the flight deck, develop backup systems for the occasional failures of cockpit automation, and train flight crews with competence of CRM skills in response to automation breakdowns.","PeriodicalId":335344,"journal":{"name":"Journal of aeronautics, astronautics and aviation, Series A","volume":"10 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"The Activated Failures of Human-Automation Interactions on the Flight Deck\",\"authors\":\"Wen-Chin Li, M. Greaves, Davide Durando, John J. H. Lin\",\"doi\":\"10.6125/16-0524-890\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Cockpit automation has been developed to reduce pilots' workload and increase pilots' performance. However, previous studies have demonstrated that failures of automated systems have significantly impaired pilots' situational awareness. The increased application of automation and the trend of pilots to rely on automation have changed pilot's role from an operator to a supervisor in the cockpit. Based on the analysis of 257 ASRS reports, the result demonstrated that pilots represent the last line of defense during automation failures, though sometimes pilots did commit active failures combined with automation-induced human errors. Current research found that automation breakdown has direct associated with 4 categories of precondition of unsafe acts, including 'adverse mental states', 'CRM', 'personal readiness', and 'technology environment'. Furthermore, the presence of 'CRM' almost 3.6 times, 12.7 times, 2.9 times, and 4 times more likely to occur concomitant failures in the categories of 'decision-errors', 'skill-based error', 'perceptual errors', and 'violations'. 
Therefore, CRM is the most critical category for developing intervention of Human-Automation Interaction (HAI) issues to improve aviation safety. The study of human factors in automated cockpit is critical to understand how incidents/accidents had developed and how they could be prevented. Future HAI research should continue to increase the reliability of automation on the flight deck, develop backup systems for the occasional failures of cockpit automation, and train flight crews with competence of CRM skills in response to automation breakdowns.\",\"PeriodicalId\":335344,\"journal\":{\"name\":\"Journal of aeronautics, astronautics and aviation, Series A\",\"volume\":\"10 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of aeronautics, astronautics and aviation, Series A\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.6125/16-0524-890\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of aeronautics, astronautics and aviation, Series A","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.6125/16-0524-890","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2

Abstract

Cockpit automation has been developed to reduce pilots' workload and increase their performance. However, previous studies have demonstrated that failures of automated systems significantly impair pilots' situational awareness. The increased application of automation, and pilots' growing reliance on it, have changed the pilot's role in the cockpit from operator to supervisor. Analysis of 257 ASRS reports demonstrated that pilots represent the last line of defense during automation failures, though pilots sometimes did commit active failures combined with automation-induced human errors. The research found that automation breakdown is directly associated with four categories of preconditions for unsafe acts: 'adverse mental states', 'CRM', 'personal readiness', and 'technology environment'. Furthermore, when 'CRM' failures were present, concomitant failures in the categories of 'decision errors', 'skill-based errors', 'perceptual errors', and 'violations' were almost 3.6, 12.7, 2.9, and 4 times more likely to occur, respectively. CRM is therefore the most critical category for developing interventions for Human-Automation Interaction (HAI) issues to improve aviation safety. The study of human factors in the automated cockpit is critical to understanding how incidents and accidents develop and how they can be prevented. Future HAI research should continue to increase the reliability of automation on the flight deck, develop backup systems for the occasional failures of cockpit automation, and train flight crews in the CRM skills needed to respond to automation breakdowns.
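The "X times more likely" figures in the abstract are the kind of association measure that can be computed as an odds ratio from a 2x2 contingency table of incident reports. The sketch below is illustrative only: the counts are invented for demonstration (the abstract does not give the paper's actual cross-tabulation), and only the total of 257 reports is taken from the source.

```python
# Hypothetical illustration of an odds ratio over ASRS-style report counts.
# The individual cell counts below are invented; they do NOT come from the paper.

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table of report counts:
        a: CRM failure present, decision error present
        b: CRM failure present, decision error absent
        c: CRM failure absent,  decision error present
        d: CRM failure absent,  decision error absent
    """
    return (a / b) / (c / d)

# Invented split of 257 reports (demonstration only):
a, b, c, d = 40, 50, 30, 137
assert a + b + c + d == 257  # matches the report total from the abstract

# Odds of a decision error are (a/b) with a CRM failure and (c/d) without;
# their ratio expresses "how many times more likely".
print(round(odds_ratio(a, b, c, d), 2))  # -> 3.65 for these invented counts
```

With these hypothetical counts the odds of a concomitant decision error are roughly 3.65 times higher when a CRM failure is present, which is the form of claim the abstract's 3.6/12.7/2.9/4 figures make for the four error categories.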