3D Reservoir Model History Matching Based on Machine Learning Technology

E. Illarionov, Pavel Temirchev, D. Voloskov, A. Gubanova, D. Koroteev, M. Simonov, A. Akhmetov, A. Margarit

DOI: 10.2118/201924-ms · Day 2 Tue, October 27, 2020 · Published 2020-10-26
Abstract
In the adaptation of reservoir models, direct gradient backpropagation through the forward model is often intractable or requires enormous computational cost. Thus one has to construct separate models that estimate the gradients implicitly, e.g., via stochastic sampling or the solution of adjoint systems. We demonstrate that if the forward model is a neural network, gradient backpropagation becomes naturally available both in model training and in adaptation. In our research we compare three adaptation strategies: variation of reservoir model variables, neural network adaptation, and latent space adaptation, and we discuss to what extent each preserves the geological content. We exploit a real-world reservoir model to investigate the problem in a practical case. The numerical experiments demonstrate that latent space adaptation provides the most stable and accurate results.
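The latent space adaptation idea described in the abstract can be sketched as follows: a trained (and then frozen) neural-network forward model maps a latent vector to simulated production, and the latent variables are adjusted by backpropagating the history-matching misfit through the network. The snippet below is a minimal illustrative sketch in PyTorch, not the authors' implementation; the placeholder forward model, the dimensions, and the synthetic "observed" data are all assumptions made for demonstration.

```python
# Minimal sketch of latent space adaptation (illustrative assumptions throughout):
# a frozen neural-network forward model decodes a latent vector z into a
# production response; z is optimized by backpropagating the misfit to the
# observed production history through the fixed network.
import torch
import torch.nn as nn

latent_dim, n_timesteps, n_wells = 32, 50, 4

# Stand-in forward model. In the paper this role is played by a trained
# surrogate of the reservoir simulator; here it is only a placeholder with
# the same interface.
forward_model = nn.Sequential(
    nn.Linear(latent_dim, 128),
    nn.ReLU(),
    nn.Linear(128, n_timesteps * n_wells),
)
for p in forward_model.parameters():
    p.requires_grad_(False)  # network weights stay fixed during adaptation

observed = torch.randn(n_timesteps * n_wells)  # historical rates (synthetic here)

z = torch.zeros(latent_dim, requires_grad=True)  # latent variables to adapt
optimizer = torch.optim.Adam([z], lr=1e-2)

for step in range(500):
    optimizer.zero_grad()
    simulated = forward_model(z)
    loss = torch.mean((simulated - observed) ** 2)  # history-matching misfit
    loss.backward()   # gradients flow through the frozen forward model to z
    optimizer.step()
```

The other two strategies mentioned in the abstract fit the same loop by changing which tensors carry `requires_grad=True`: the reservoir model variables themselves (variation of model variables) or the network weights (neural network adaptation).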