A Framework of Best Practices for Delivering Successful Artificial Intelligence Projects. A Case Study Demonstration

A. Popa, Ben A. Amaba, Jeff Daniels
{"title":"A Framework of Best Practices for Delivering Successful Artificial Intelligence Projects. A Case Study Demonstration","authors":"A. Popa, Ben A. Amaba, Jeff Daniels","doi":"10.2118/206014-ms","DOIUrl":null,"url":null,"abstract":"\n A practical framework that outlines the critical steps of a successful process that uses data, machine learning (Ml), and artificial intelligence (AI) is presented in this study. A practical case study is included to demonstrate the process. The use of artificial intelligent and machine learning has not only enhanced but also sped up problem-solving approaches in many domains, including the oil and gas industry. Moreover, these technologies are revolutionizing all key aspects of engineering including; framing approaches, techniques, and outcomes. The proposed framework includes key components to ensure integrity, quality, and accuracy of data and governance centered on principles such as responsibility, equitability, and reliability. As a result, the industry documentation shows that technology coupled with process advances can improve productivity by 20%.\n A clear work-break-down structure (WBS) to create value using an engineering framework has measurable outcomes. The AI and ML technologies enable the use of large amounts of information, combining static & dynamic data, observations, historical events, and behaviors. The Job Task Analysis (JTA) model is a proven framework to manage processes, people, and platforms. JTA is a modern data-focused approach that prioritizes in order: problem framing, analytics framing, data, methodology, model building, deployment, and lifecycle management. The case study exemplifies how the JTA model optimizes an oilfield production plant, similar to a manufacturing facility. A data-driven approach was employed to analyze and evaluate the production fluid impact during facility-planned or un-planned system disruptions. The workflows include data analytics tools such as ML&AI for pattern recognition and clustering for prompt event mitigation and optimization.\n The paper demonstrates how an integrated framework leads to significant business value. The study integrates surface and subsurface information to characterize and understand the production impact due to planned and unplanned plant events. The findings led to designing a relief system to divert the back pressure during plant shutdown. The study led to cost avoidance of a new plant, saving millions of dollars, environment impact, and safety considerations, in addition to unnecessary operating costs and maintenance. Moreover, tens of millions of dollars value per year by avoiding production loss of plant upsets or shutdown was created. The study cost nothing to perform, about two months of not focused time by a team of five engineers and data scientists. The work provided critical steps in \"creating a trusting\" model and \"explainability’. The methodology was implemented using existing available data and tools; it was the process and engineering knowledge that led to the successful outcome. Having a systematic WBS has become vital in data analytics projects that use AI and ML technologies. An effective governance system creates 25% productivity improvement and 70% capital improvement. Poor requirements can consume 40%+ of development budget. 
The process, models, and tools should be used on engineering projects where data and physics are present.\n The proposed framework demonstrates the business impact and value creation generated by integrating models, data, AI, and ML technologies for modeling and optimization. It reflects the collective knowledge and perspectives of diverse professionals from IBM, Lockheed Martin, and Chevron, who joined forces to document a standard framework for achieving success in data analytics/AI projects.","PeriodicalId":10928,"journal":{"name":"Day 2 Wed, September 22, 2021","volume":"106 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2021-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Day 2 Wed, September 22, 2021","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2118/206014-ms","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

This study presents a practical framework that outlines the critical steps of a successful process using data, machine learning (ML), and artificial intelligence (AI), and includes a case study to demonstrate the process. The use of AI and ML has both enhanced and accelerated problem-solving approaches in many domains, including the oil and gas industry. Moreover, these technologies are transforming key aspects of engineering, including framing approaches, techniques, and outcomes. The proposed framework includes key components to ensure the integrity, quality, and accuracy of data, together with governance centered on principles such as responsibility, equitability, and reliability. Industry documentation shows that technology coupled with process advances can improve productivity by 20%.

A clear work breakdown structure (WBS) for creating value with an engineering framework produces measurable outcomes. AI and ML technologies enable the use of large amounts of information, combining static and dynamic data, observations, historical events, and behaviors. The Job Task Analysis (JTA) model is a proven framework for managing processes, people, and platforms. JTA is a modern, data-focused approach that prioritizes, in order: problem framing, analytics framing, data, methodology, model building, deployment, and lifecycle management. The case study illustrates how the JTA model optimizes an oilfield production plant, which operates much like a manufacturing facility. A data-driven approach was employed to analyze and evaluate the impact on produced fluids during planned or unplanned facility disruptions. The workflows include data analytics tools such as ML/AI for pattern recognition and clustering for prompt event mitigation and optimization.

The paper demonstrates how an integrated framework leads to significant business value. The study integrates surface and subsurface information to characterize and understand the production impact of planned and unplanned plant events. The findings led to the design of a relief system that diverts back pressure during plant shutdowns. The study avoided the cost of a new plant, saving millions of dollars and reducing environmental impact, safety exposure, and unnecessary operating and maintenance costs. It also created tens of millions of dollars of value per year by avoiding production losses from plant upsets and shutdowns. The study required no incremental cost to perform, only about two months of non-dedicated time from a team of five engineers and data scientists. The work provided critical steps in building a trusted model and establishing explainability. The methodology was implemented using existing data and tools; it was the process and engineering knowledge that led to the successful outcome. A systematic WBS has become vital in data analytics projects that use AI and ML technologies. An effective governance system can yield a 25% productivity improvement and a 70% capital improvement, while poor requirements can consume more than 40% of a development budget. The process, models, and tools should be used on engineering projects where data and physics are present.

The proposed framework demonstrates the business impact and value creation generated by integrating models, data, AI, and ML technologies for modeling and optimization.
It reflects the collective knowledge and perspectives of diverse professionals from IBM, Lockheed Martin, and Chevron, who joined forces to document a standard framework for achieving success in data analytics/AI projects.
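The prioritized JTA sequence named in the abstract (problem framing through lifecycle management) can be pictured as an ordered checklist. The sketch below is a minimal illustration of that ordering only; all class, function, and field names are assumptions for illustration and do not represent the authors' implementation.

```python
# Minimal, hypothetical sketch of the seven prioritized JTA stages named in the
# abstract, organized as an ordered checklist. All names are illustrative.
from dataclasses import dataclass, field
from enum import IntEnum


class JTAStage(IntEnum):
    """The seven JTA stages in priority order, as listed in the abstract."""
    PROBLEM_FRAMING = 1
    ANALYTICS_FRAMING = 2
    DATA = 3
    METHODOLOGY = 4
    MODEL_BUILDING = 5
    DEPLOYMENT = 6
    LIFECYCLE_MANAGEMENT = 7


@dataclass
class JTAProject:
    """Tracks which JTA stages a data analytics / AI project has completed."""
    name: str
    completed: set = field(default_factory=set)

    def complete(self, stage: JTAStage) -> None:
        # Enforce the ordering: every earlier stage must already be done.
        missing = {s for s in JTAStage if s < stage} - self.completed
        if missing:
            names = ", ".join(s.name for s in sorted(missing))
            raise ValueError(f"cannot complete {stage.name}; still missing: {names}")
        self.completed.add(stage)


if __name__ == "__main__":
    project = JTAProject("plant-shutdown-impact-study")
    project.complete(JTAStage.PROBLEM_FRAMING)
    project.complete(JTAStage.ANALYTICS_FRAMING)
    project.complete(JTAStage.DATA)
    # Skipping METHODOLOGY and jumping straight to MODEL_BUILDING would raise ValueError.
```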
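The case-study workflow applies clustering for pattern recognition on plant data to detect and mitigate upset or shutdown events. The following is a minimal sketch of that idea using synthetic historian data with hypothetical column names (back pressure and oil rate); the paper does not specify the algorithm or features, so k-means on standardized signals is used purely for illustration.

```python
# Hedged sketch: cluster synthetic plant sensor data to flag shutdown-like states.
# Column names, cluster count, and injected event values are all assumptions.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for plant historian data: back pressure and oil rate.
rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "back_pressure_psi": rng.normal(250, 15, n),
    "oil_rate_bopd": rng.normal(5000, 300, n),
})
# Inject a block of shutdown-like readings: high back pressure, low rate.
df.loc[800:850, ["back_pressure_psi", "oil_rate_bopd"]] = [420.0, 500.0]

# Standardize so both signals contribute equally to the distance metric.
X = StandardScaler().fit_transform(df)

# Group operating states; the number of clusters is an assumption.
df["state"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Flag the cluster with the lowest mean rate as the candidate upset/shutdown state.
upset = df.groupby("state")["oil_rate_bopd"].mean().idxmin()
print(f"candidate upset cluster: {upset}; "
      f"{int((df['state'] == upset).sum())} samples flagged")
```

In practice, the flagged cluster would be reviewed against known plant events before being used for prompt mitigation, consistent with the abstract's emphasis on trusted models and explainability.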