Rethinking Six Sigma: Learning from practice in a digital age

Impact Factor: 6.5 | CAS Zone 2 (Management) | JCR Q1 (MANAGEMENT)
Suzanne de Treville, Tyson R. Browning, Matthias Holweg, Rachna Shah
{"title":"Rethinking Six Sigma: Learning from practice in a digital age","authors":"Suzanne de Treville,&nbsp;Tyson R. Browning,&nbsp;Matthias Holweg,&nbsp;Rachna Shah","doi":"10.1002/joom.1284","DOIUrl":null,"url":null,"abstract":"<p>As scholars in the field of operations management (OM), we would like to suggest that our field fell short in terms of due diligence when transitioning from statistical process control (SPC) to Six Sigma—accepting without scrutiny, building theory around, and teaching heuristics and algorithms without recognizing its underlying statistical inaccuracies. It is our view that these incorrect heuristics and algorithms have introduced bias and inefficiencies in process improvement throughout the OM field, contributing to a disconnect between OM and knowledge development in data science more generally. We call for a return to first principles and the establishment of formal conceptual definitions for the theory and methods underlying Six Sigma. We urge the OM academic community to embrace the lessons from SPC and Six Sigma so that we prioritize our due-diligence role, beginning with a requirement that all algorithms and tools be vetted before entering our curricula and case-study repertoires, especially as we move forward into an age of big data and potentially further opaque algorithms and tools. We propose that our top journals be open to research that scrutinizes methods developed in practice, so that OM will continue to be the focal field for quality assurance—even when the “product” of a process is data.</p><p>The application of statistical methods to quality management has been a central theme in OM since Shewhart's seminal work at Western Electric's Hawthorne Works nearly a century ago (Shewhart, <span>1925</span>; Shewhart, <span>1926</span>). He studied process variation to determine “how and under what conditions observations may contribute to a rational decision to change or not a process to accomplish improvements” (W. Edwards Deming, p.i, in the foreword of the 1986 edition of Shewhart, <span>1931</span>). His work fostered methods and tools to monitor, diagnose, measure, reduce, and control variation in the output of a process to increase its consistency and capability.</p><p>The capability of a process with respect to a process parameter hence can be defined as the number of standard deviations (“sigmas”) of the parameter that fit between the mean for that parameter and its specification limits. If the process is not centered between the specification limits, then the process capability is set using whichever specification limit is closer to the process center. Six Sigma moves the concept of process capability from descriptive to prescriptive. At the time that Motorola's process-improvement ideas were proposed by Bill Smith in 1986 (Harry, <span>1994</span>), process capability was defined in terms of specification limits set three standard deviations from the process mean (a “3<i>σ</i> process”). Following Shewhart's logic, a centered and normally distributed 3<i>σ</i> process is expected to produce 2700 parts-per-million (ppm) pieces that are more than three standard deviations from the process mean.</p><p>Motorola quantified the “zero defect” philosophy at the heart of the “quality is free” argument proposed by popular and influential authors like Crosby (<span>1979</span>) by translating the number of standard deviations that fit between the process center and the tightest specification limit into ppm defects. 
Historically it was assumed that a 3<i>σ</i> process capability was good enough. The 2700 ppm out-of-specification pieces produced by a 3<i>σ</i> process capability, however, is too high in many contexts. Harry described Bill Smith as demanding a higher standard: “Bill's proposition was eloquently simple. He suggested that Motorola should require a 50% design margin for all its key product performance characteristics” (Harry, <span>2003</span>, p. 1). This buffer, which adds another three standard deviations between either side of the mean and the specification limits (as illustrated in Figure 1), marked the inception of the “Six Sigma” concept.</p><p>Up to this point, Motorola's approach can be seen as a protocol that adds intuition and goal setting to a simple application of the theory of process capability. Had the Motorola project stopped there, a refined and stricter protocol might have seen the light of day, resulting in an incremental but eminently sensible extension to SPC as applied to high volume, repetitive-manufacturing contexts. Unfortunately, this is not what happened. Several approaches to the application of these core concepts—artifacts of the limitations of practical application at the time—are difficult to justify today. In the following subsections, we will address a few of the more notable of these legacy issues, and how they might be re-evaluated and addressed in current practice, teaching, and research.</p><p>Our intent is not to deny that implementing Six Sigma has had a positive impact in many companies over the past decades, nor do we wish to discredit the tried-and-tested heuristics that underpin SPC. We seek to draw attention to and rectify the persistent failure of our field to use the peer-review process to clarify Six Sigma's statistical claims, and to update process capability and SPC to allow decision-makers to use these tools with full comprehension. Peer-reviewed research is intended to eliminate this combination of misunderstanding and mystique, facilitating the transformation of interesting ideas emerging from the world of practice into a solid increase in knowledge. Before peer review can function, it is necessary for top journals to be open to submissions whose contribution is this type of due diligence. Back in the mid-1990s when Six Sigma emerged to great excitement, top OM journals were not perceived as being open to this kind of submission. Suri and de Treville (<span>1986</span>)—a conceptual article published in <i>JOM</i>—gives an idea of the kind of research contribution that could open the way to the fact checking that we are calling for. In the early 1980s, the idea of “rocks and stream” became popular: The claim was that as inventory was removed from a system, the resulting line stoppages would cause learning. Suri and de Treville explored in detail what happens between two workstations as intermediate inventory is reduced. Learning can occur when one workstation blocks or starves another, but such blockage and starvation can also result in a failure to learn. This academic exploration contributed to a more nuanced understanding of the relationship between inventory reduction and learning. 
Along similar lines, we suggest that <i>JOM</i> should be open to relatively technical, conceptual submissions that permit an in-depth exploration of a phenomenon that has emerged from practice to great enthusiasm.</p><p>Allowing these SPC and Six-Sigma methods to stand unchallenged and unchanged has caused confusion and kept OM theory and tools from evolving to be able to address today's data-rich environment. Rather than analyzing small samples of manually collected data from a production line, modern statistical quality monitoring systems produce high frequency, real-time data that is automatically captured in digital form. By linking SPC and process capability to sound statistical principles, we become able to evaluate systems like internet of things (IoT) and real-time location and sensing using OM thinking and tools. Effective use of advanced tools like machine learning requires a solid statistical foundation in a framework that has undergone peer review, not opaque algorithms and arcane heuristics.</p><p>We call for the OM community to insist that process capability and SPC be returned to sound statistical roots and formal conceptual definitions, recognizing Six Sigma as a protocol from the world of practice that prompted important discussions, but whose underlying assumptions are fundamentally flawed. This return is essential if we are to remain relevant to what is now happening in the world of quality assurance and data science more generally. We further call for <i>JOM</i> to be a journal that welcomes the manuscripts that bring ideas from practice to peer review, thus replacing blind faith in magic sauces with solid increases in knowledge.</p><p>Suzanne de Treville, Tyson R. Browning, Matthias Holweg, and Rachna Shah.</p>","PeriodicalId":51097,"journal":{"name":"Journal of Operations Management","volume":"69 8","pages":"1371-1376"},"PeriodicalIF":6.5000,"publicationDate":"2023-12-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1002/joom.1284","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Operations Management","FirstCategoryId":"91","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/joom.1284","RegionNum":2,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MANAGEMENT","Score":null,"Total":0}
引用次数: 0

Abstract

As scholars in the field of operations management (OM), we would like to suggest that our field fell short in terms of due diligence when transitioning from statistical process control (SPC) to Six Sigma—accepting without scrutiny, building theory around, and teaching heuristics and algorithms without recognizing their underlying statistical inaccuracies. It is our view that these incorrect heuristics and algorithms have introduced bias and inefficiencies in process improvement throughout the OM field, contributing to a disconnect between OM and knowledge development in data science more generally. We call for a return to first principles and the establishment of formal conceptual definitions for the theory and methods underlying Six Sigma. We urge the OM academic community to embrace the lessons from SPC and Six Sigma so that we prioritize our due-diligence role, beginning with a requirement that all algorithms and tools be vetted before entering our curricula and case-study repertoires, especially as we move forward into an age of big data and potentially further opaque algorithms and tools. We propose that our top journals be open to research that scrutinizes methods developed in practice, so that OM will continue to be the focal field for quality assurance—even when the “product” of a process is data.

The application of statistical methods to quality management has been a central theme in OM since Shewhart's seminal work at Western Electric's Hawthorne Works nearly a century ago (Shewhart, 1925; Shewhart, 1926). He studied process variation to determine “how and under what conditions observations may contribute to a rational decision to change or not a process to accomplish improvements” (W. Edwards Deming, p.i, in the foreword of the 1986 edition of Shewhart, 1931). His work fostered methods and tools to monitor, diagnose, measure, reduce, and control variation in the output of a process to increase its consistency and capability.

The capability of a process with respect to a process parameter can hence be defined as the number of standard deviations (“sigmas”) of the parameter that fit between the mean for that parameter and its specification limits. If the process is not centered between the specification limits, then the process capability is set using whichever specification limit is closer to the process center. Six Sigma moves the concept of process capability from descriptive to prescriptive. At the time that Motorola's process-improvement ideas were proposed by Bill Smith in 1986 (Harry, 1994), process capability was defined in terms of specification limits set three standard deviations from the process mean (a “3σ process”). Following Shewhart's logic, a centered and normally distributed 3σ process is expected to produce 2700 parts per million (ppm) falling more than three standard deviations from the process mean, that is, outside the specification limits.
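To make the definition concrete, the following minimal sketch (our illustration; the function names and example values are assumptions, not from the article) computes the capability in sigmas and the expected out-of-specification rate for a centered, normally distributed process:

```python
from scipy.stats import norm

def capability_sigmas(mean, std, lsl, usl):
    """Capability in 'sigmas': how many standard deviations fit between
    the process mean and the *nearest* specification limit."""
    return min(usl - mean, mean - lsl) / std

def out_of_spec_ppm(k):
    """Expected ppm outside the limits for a centered, normally
    distributed process with specification limits k sigmas from the mean."""
    return 2 * norm.sf(k) * 1e6  # both tails

# Hypothetical centered process: mean 10.0, sigma 0.5, specs at 8.5 / 11.5.
k = capability_sigmas(mean=10.0, std=0.5, lsl=8.5, usl=11.5)   # 3.0 sigmas
print(out_of_spec_ppm(k))  # ~2700 ppm, the figure cited in the text
```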

Motorola quantified the “zero defect” philosophy at the heart of the “quality is free” argument proposed by popular and influential authors like Crosby (1979) by translating the number of standard deviations that fit between the process center and the tightest specification limit into ppm defects. Historically, it was assumed that a 3σ process capability was good enough. The 2700 ppm of out-of-specification pieces produced by a 3σ process capability, however, is too high in many contexts. Harry described Bill Smith as demanding a higher standard: “Bill's proposition was eloquently simple. He suggested that Motorola should require a 50% design margin for all its key product performance characteristics” (Harry, 2003, p. 1). This buffer, which adds another three standard deviations between either side of the mean and the specification limits (as illustrated in Figure 1), marked the inception of the “Six Sigma” concept.
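Under the same normality assumption, the effect of Smith's buffer can be checked directly (again a sketch of ours, not the authors' calculation): moving the specification limits from three to six standard deviations shrinks the expected defect rate of a perfectly centered process from roughly 2700 ppm to about 0.002 ppm.

```python
from scipy.stats import norm

# Expected out-of-spec ppm for a perfectly centered normal process
# with specification limits at k standard deviations from the mean.
for k in (3, 6):
    print(f"{k} sigma: {2 * norm.sf(k) * 1e6:.4g} ppm")
# 3 sigma: 2700 ppm; 6 sigma: 0.001973 ppm.
# (Six Sigma's widely quoted 3.4 ppm figure additionally assumes a
# 1.5-sigma drift of the process mean, a convention not discussed here.)
```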

Up to this point, Motorola's approach can be seen as a protocol that adds intuition and goal setting to a simple application of the theory of process capability. Had the Motorola project stopped there, a refined and stricter protocol might have seen the light of day, resulting in an incremental but eminently sensible extension to SPC as applied to high-volume, repetitive-manufacturing contexts. Unfortunately, this is not what happened. Several approaches to the application of these core concepts—artifacts of the limitations of practical application at the time—are difficult to justify today. In the following subsections, we will address a few of the more notable of these legacy issues, and how they might be re-evaluated and addressed in current practice, teaching, and research.

Our intent is not to deny that implementing Six Sigma has had a positive impact in many companies over the past decades, nor do we wish to discredit the tried-and-tested heuristics that underpin SPC. We seek to draw attention to and rectify the persistent failure of our field to use the peer-review process to clarify Six Sigma's statistical claims, and to update process capability and SPC to allow decision-makers to use these tools with full comprehension. Peer-reviewed research is intended to eliminate this combination of misunderstanding and mystique, facilitating the transformation of interesting ideas emerging from the world of practice into a solid increase in knowledge. Before peer review can function, it is necessary for top journals to be open to submissions whose contribution is this type of due diligence. Back in the mid-1990s when Six Sigma emerged to great excitement, top OM journals were not perceived as being open to this kind of submission. Suri and de Treville (1986)—a conceptual article published in JOM—gives an idea of the kind of research contribution that could open the way to the fact checking that we are calling for. In the early 1980s, the idea of “rocks and stream” became popular: The claim was that as inventory was removed from a system, the resulting line stoppages would cause learning. Suri and de Treville explored in detail what happens between two workstations as intermediate inventory is reduced. Learning can occur when one workstation blocks or starves another, but such blockage and starvation can also result in a failure to learn. This academic exploration contributed to a more nuanced understanding of the relationship between inventory reduction and learning. Along similar lines, we suggest that JOM should be open to relatively technical, conceptual submissions that permit an in-depth exploration of a phenomenon that has emerged from practice to great enthusiasm.

Allowing these SPC and Six Sigma methods to stand unchallenged and unchanged has caused confusion and kept OM theory and tools from evolving to address today's data-rich environment. Rather than analyzing small samples of manually collected data from a production line, modern statistical quality monitoring systems produce high-frequency, real-time data that is automatically captured in digital form. By linking SPC and process capability to sound statistical principles, we become able to evaluate systems like the internet of things (IoT) and real-time location and sensing using OM thinking and tools. Effective use of advanced tools like machine learning requires a solid statistical foundation in a framework that has undergone peer review, not opaque algorithms and arcane heuristics.
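As one concrete bridge between classical SPC and such data streams, consider a minimal sketch (our own, with simulated data and arbitrary parameters) of a Shewhart-style individuals chart applied to a high-frequency sensor stream; the ±3σ limits, and the roughly 0.27% false-alarm rate they imply, follow directly from the statistical principles the authors want restored:

```python
import numpy as np

def shewhart_limits(baseline):
    """Shewhart individuals chart: control limits at +/- 3 estimated
    standard deviations around the baseline mean."""
    mu, sigma = np.mean(baseline), np.std(baseline, ddof=1)
    return mu - 3 * sigma, mu + 3 * sigma

rng = np.random.default_rng(0)
baseline = rng.normal(10.0, 0.5, size=500)      # in-control calibration data
lcl, ucl = shewhart_limits(baseline)

stream = rng.normal(10.0, 0.5, size=10_000)     # simulated real-time stream
stream[7_000:] += 1.0                           # inject a mean shift
alarms = np.flatnonzero((stream < lcl) | (stream > ucl))

print(f"first signal after the shift at 7000: {alarms[alarms >= 7_000][0]}")
print(f"false alarms before the shift: {np.sum(alarms < 7_000)} "
      f"(about 0.27% per observation is expected with 3-sigma limits)")
```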

We call for the OM community to insist that process capability and SPC be returned to sound statistical roots and formal conceptual definitions, recognizing Six Sigma as a protocol from the world of practice that prompted important discussions, but whose underlying assumptions are fundamentally flawed. This return is essential if we are to remain relevant to what is now happening in the world of quality assurance and data science more generally. We further call for JOM to be a journal that welcomes manuscripts that bring ideas from practice to peer review, thus replacing blind faith in magic sauces with solid increases in knowledge.

Suzanne de Treville, Tyson R. Browning, Matthias Holweg, and Rachna Shah.

Source journal: Journal of Operations Management
Category: Management Science (Operations Research & Management Science)
CiteScore: 11.00
Self-citation rate: 15.40%
Articles per year: 62
Review time: 24 months

About the journal: The Journal of Operations Management (JOM) is a leading academic publication dedicated to advancing the field of operations management (OM) through rigorous and original research. The journal's primary audience is the academic community, although it also values contributions that attract the interest of practitioners. However, it does not publish articles that are primarily aimed at practitioners, as academic relevance is a fundamental requirement. JOM focuses on the management aspects of various types of operations, including manufacturing, service, and supply chain operations. The journal's scope is broad, covering both profit-oriented and non-profit organizations. The core criterion for publication is that the research question must be centered around operations management, rather than merely using operations as a context. For instance, a study on charismatic leadership in a manufacturing setting would only be within JOM's scope if it directly relates to the management of operations; the mere setting of the study is not enough. Published papers in JOM are expected to address real-world operational questions and challenges. While not all research must be driven by practical concerns, there must be a credible link to practice that is considered from the outset of the research, not as an afterthought. Authors are cautioned against assuming that academic knowledge can be easily translated into practical applications without proper justification. JOM's articles are abstracted and indexed by several prestigious databases and services, including Engineering Information, Inc.; Executive Sciences Institute; INSPEC; International Abstracts in Operations Research; Cambridge Scientific Abstracts; SciSearch/Science Citation Index; CompuMath Citation Index; Current Contents/Engineering, Computing & Technology; Information Access Company; and Social Sciences Citation Index. This ensures that the journal's research is widely accessible and recognized within the academic and professional communities.