Efficient Adaptation and Calibration of Adjoint-Based Reduced-Order Coarse-Grid Network Models

S. Krogstad, Ø. Klemetsdal, Knut-Andreas Lie
{"title":"Efficient Adaptation and Calibration of Ad joint-Based Reduced-Order Coarse-Grid Network Models","authors":"S. Krogstad, Ø. Klemetsdal, Knut-Andreas Lie","doi":"10.2118/212207-ms","DOIUrl":null,"url":null,"abstract":"\n Network models have proved to be an efficient tool for building data-driven proxy models that match observed production data or reduced-order models that match simulated data. A particularly versatile approach is to construct the network topology so that it mimics the intercell connection in a volumetric grid. That is, one first builds a network of \"reservoir nodes\" to which wells can be subsequently connected. The network model is realized inside a fully differentiable simulator. To train the model, we use a standard mismatch minimization formulation, optimized by a Gauss-Newton method with mismatch Jacobians obtained by solving adjoint equations with multiple right-hand sides. One can also use a quasi-Newton method, but Gauss-Newton is significantly more efficient as long as the number of wells is not too high. A practical challenge in setting up such network models is to determine the granularity of the network. Herein, we demonstrate how this can be mitigated by using a dynamic graph adaption algorithm to find a good granularity that improves predictability both inside and slightly outside the range of the training data.","PeriodicalId":225811,"journal":{"name":"Day 1 Tue, March 28, 2023","volume":"5 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-03-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Day 1 Tue, March 28, 2023","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2118/212207-ms","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Network models have proved to be an efficient tool for building data-driven proxy models that match observed production data or reduced-order models that match simulated data. A particularly versatile approach is to construct the network topology so that it mimics the intercell connections in a volumetric grid. That is, one first builds a network of "reservoir nodes" to which wells can subsequently be connected. The network model is realized inside a fully differentiable simulator. To train the model, we use a standard mismatch-minimization formulation, optimized by a Gauss-Newton method with mismatch Jacobians obtained by solving adjoint equations with multiple right-hand sides. One can also use a quasi-Newton method, but Gauss-Newton is significantly more efficient as long as the number of wells is not too high. A practical challenge in setting up such network models is to determine the granularity of the network. Herein, we demonstrate how this can be mitigated by using a dynamic graph-adaptation algorithm to find a good granularity that improves predictability both inside and slightly outside the range of the training data.
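To make the training formulation concrete: the mismatch minimization described above is a nonlinear least-squares problem, and a Gauss-Newton update needs only the residual vector and its Jacobian. The sketch below is an illustrative Python/NumPy assumption, not the authors' implementation; the names `gauss_newton_step`, `residual_fn`, and `jacobian_fn` are hypothetical, and `jacobian_fn` merely stands in for the mismatch Jacobian that the paper obtains by solving adjoint equations with multiple right-hand sides.

```python
import numpy as np

def gauss_newton_step(theta, residual_fn, jacobian_fn, damping=1e-6):
    """One damped Gauss-Newton update for J(theta) = 0.5 * ||r(theta)||^2.

    In the paper's setting, r stacks the well-data mismatches and the
    Jacobian would come from adjoint solves; here jacobian_fn is simply
    a user-supplied callable (an assumption for illustration).
    """
    r = residual_fn(theta)                       # stacked mismatch vector
    J = jacobian_fn(theta)                       # (n_data, n_params) sensitivities
    H = J.T @ J + damping * np.eye(theta.size)   # Gauss-Newton Hessian approximation
    g = J.T @ r                                  # gradient of the objective
    return theta - np.linalg.solve(H, g)

# Toy usage: calibrate two parameters of a simple proxy to synthetic data.
t = np.linspace(0.0, 1.0, 20)
data = 2.0 * t + 0.5 * t**2

def residual_fn(theta):
    return theta[0] * t + theta[1] * t**2 - data

def jacobian_fn(theta):
    return np.column_stack([t, t**2])

theta = np.zeros(2)
for _ in range(5):
    theta = gauss_newton_step(theta, residual_fn, jacobian_fn)
print(theta)  # converges toward [2.0, 0.5]
```

Because each column of the Jacobian corresponds to one adjoint solve with its own right-hand side, the per-iteration cost grows with the number of observed data series (e.g., wells), which is why the abstract notes that Gauss-Newton outpaces quasi-Newton only while the number of wells stays moderate.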