Using Hypothesis Testing to Evaluate Key Performance Indicators in Real Time: An Edge Computing Use Case

P. Kowalchuk
DOI: 10.2118/195023-MS
Journal: Day 4 Thu, March 21, 2019
Publication date: 2019-03-15
Citation count: 0

Abstract

Harvesting vast amounts of data has long been identified as an enabler of operational performance. The measurement of key performance indicators is a routine practice in well construction, but a systematic way of statistically analyzing performance against a large data bank of offset wells is not a common practice. Performing that statistical analysis in real time is even rarer. With the introduction of edge computing (devices capable of complex analytical functions in physical proximity to sensors and operations), this practice can be realized. Two case studies are presented: rate of penetration (ROP) and amount of vibration per run.

Hypothesis testing is a statistical method in which a sampled dataset is compared against an idealized or status quo model. This model is built using many samples from a population, and the characteristics of the population are then inferred from these samples. The model is built in data centers where large amounts of data are available, and is then transferred to an edge device in the field. The device collects real-time data and compares results to the status quo model. In the two cases presented, hypothesis testing was used to determine maximum and minimum levels of ROP and downhole vibration. This information is used to determine the effectiveness of new drilling practices, technologies, or methodologies. Because calculations are performed in real time, changes to drilling practices can be adopted quickly.

The ROP case was performed at a US operating unit; the vibration case was performed in a Middle East unit. The models showed what improvement values should be. It was revealing to find that wells thought to be poor performers were actually well within the population norm. Other wells that were thought to be good performers because new drilling methods had been used actually fell within the population model, suggesting the new methods had not affected performance. By performing this analysis on the edge device, operations can make changes early in such a way that results fall outside the status quo model and deliver real performance improvements.

The paper presents the novel use of statistical models calculated in data centers in conjunction with real-time operations. Similar approaches exist in technical and physics modeling, in which models are produced in the office and used in the field. However, building models for operations management from a large bank of offset data, and performing analysis in the field with real-time data, is not a common practice. This paper presents both the technology and the statistical methods that provide a valid scientific framework for operational performance improvement.