3D geophysical image translated into photorealistic virtual outcrop geology using generative adversarial networks
A. Ramdani, A. Perbawa, Andrey Bakulin, V. Vahrenkamp
The Leading Edge, 2024-02-01. DOI: 10.1190/tle43020102.1
Outcrop analogues play a pivotal role in resolving meter-scale depositional facies heterogeneity of carbonate strata. Two-dimensional outcrops are insufficient to decipher the 3D heterogeneity of carbonate facies. Near-surface geophysical methods, notably ground-penetrating radar (GPR), can be employed to step into 3D and extend the dimensionality of the outcrop to the volume behind it. However, interpreting geophysical images requires specific geophysical expertise that is often unfamiliar to field geologists, who know the actual rock better than the geophysical data. A novel generative adversarial network (GAN) application is presented that constructs a photorealistic 3D virtual behind-the-outcrop model. The method combines GPR forward modeling with a conditional generative adversarial network (CGAN) and exploits the apparent similarities between outcrop expressions of lithofacies and their radargram counterparts. We exemplify the methodology by applying it to open-source GPR data acquired from a Late Oxfordian-Early Kimmeridgian Arabian carbonate outcrop. We interpret a 4 km long outcrop photomosaic from a digital outcrop model (DOM) for its lithofacies, populate the DOM with GPR properties, and forward model the synthetic GPR response of these lithofacies. We pair the synthetic GPR sections with the DOM lithofacies and train a CGAN on these pairs. Similarly, we pair the DOM lithofacies with outcrop photos and train a second CGAN. We chain the two trained networks and apply them to construct an approximately 2 km long 2D section and an approximately 60 m2 3D volume of a photorealistic artificial outcrop model. This model operates in a visual medium familiar to outcrop geologists, providing a complementary instrument to visualize and interpret rock formations rather than geophysical signals. This virtual outcrop replicates the visual character of outcrop-scale lithofacies features, such as intricate bedding contacts and the outlines of reef geobodies.
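The core idea of the workflow is a chain of two conditional image-to-image translators: the first maps a (synthetic or field) GPR radargram to a lithofacies map, and the second maps that lithofacies map to a photorealistic outcrop image. The sketch below illustrates only this chaining structure and is not the authors' implementation: the two generator functions are hypothetical stand-ins for trained pix2pix-style CGAN generators, and all array shapes, facies classes, and color values are assumptions made for the example.

```python
import numpy as np

# Stand-ins for two trained CGAN (pix2pix-style) generators.
# In the paper's workflow each stage would be a generator trained
# adversarially on paired images:
#   G1: synthetic GPR radargram  -> lithofacies label map
#   G2: lithofacies label map    -> photorealistic outcrop image
# Here they are deterministic placeholders so the chain is runnable.

def gpr_to_lithofacies(radargram: np.ndarray) -> np.ndarray:
    """Hypothetical generator 1: (H, W) radargram -> (H, W) facies labels."""
    # Placeholder logic: bin reflection amplitude into 3 facies classes 0..2.
    return np.digitize(radargram, bins=[-0.3, 0.3]).astype(np.int64)

def lithofacies_to_photo(facies: np.ndarray) -> np.ndarray:
    """Hypothetical generator 2: (H, W) facies labels -> (H, W, 3) RGB image."""
    # Placeholder logic: paint each facies class with an assumed rock color.
    palette = np.array([[120, 100, 80],   # class 0
                        [180, 160, 130],  # class 1
                        [90,  90, 100]],  # class 2
                       dtype=np.uint8)
    return palette[facies]

def virtual_outcrop(radargram: np.ndarray) -> np.ndarray:
    """Chain the two generators: radargram -> facies -> photorealistic image."""
    return lithofacies_to_photo(gpr_to_lithofacies(radargram))

rng = np.random.default_rng(0)
radargram = rng.standard_normal((64, 256))  # hypothetical GPR section size
image = virtual_outcrop(radargram)
print(image.shape)  # (64, 256, 3)
```

The design point the chain illustrates is that each stage is trained on its own paired dataset (synthetic GPR with DOM lithofacies, and DOM lithofacies with outcrop photos); composing them lets a geophysical section be viewed in the photographic medium outcrop geologists work in, without ever training a direct GPR-to-photo mapping.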