Mohammad H. Taufik, Xinquan Huang, Tariq Alkhalifah
{"title":"全波形反演中物理信息神经网络的潜在表征学习","authors":"Mohammad H. Taufik, Xinquan Huang, Tariq Alkhalifah","doi":"10.1029/2024EA004107","DOIUrl":null,"url":null,"abstract":"<p>Full waveform inversion (FWI), a state-of-the-art seismic inversion algorithm, comprises an iterative data-fitting process to recover high-resolution Earth's properties (e.g., velocity). At the heart of this process lies the numerical wave equation solver, which necessitates discretization. To perform efficient discretization-free FWI for large-scale problems, we introduce physics-informed neural networks (PINNs) as surrogates for conventional numerical solvers. The original PINN implementation requires additional training for the new velocity model when used in the forward simulation. To make PINNs more suitable for such scenarios, we introduce latent representation learning to PINNs. We first append the input with the encoded velocity vectors, which are the latent representation of the velocity models using an autoencoder model. Unlike the original implementation, the trained PINN model can instantly produce different wavefield solutions without retraining with this additional information. To further improve the FWI efficiency, instead of computing the FWI updates on the original velocity dimension, we resort to updating in its latent representation dimension. Specifically, we only update the latent representation vectors and freeze the weights of the autoencoder and the PINN models during FWI. Through a series of numerical tests on synthetic data, the proposed framework shows a significant increase in accuracy and computational efficiency compared to the conventional FWI. The improved performance of our framework can be attributed to implicit regularization introduced by the velocity encoding and physics-informed training procedures. 
The proposed framework presents a significant step forward in utilizing a discretization-free wave equation solver for a more efficient and accurate FWI application.</p>","PeriodicalId":54286,"journal":{"name":"Earth and Space Science","volume":"12 9","pages":""},"PeriodicalIF":2.6000,"publicationDate":"2025-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://agupubs.onlinelibrary.wiley.com/doi/epdf/10.1029/2024EA004107","citationCount":"0","resultStr":"{\"title\":\"Latent Representation Learning in Physics-Informed Neural Networks for Full Waveform Inversion\",\"authors\":\"Mohammad H. Taufik, Xinquan Huang, Tariq Alkhalifah\",\"doi\":\"10.1029/2024EA004107\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Full waveform inversion (FWI), a state-of-the-art seismic inversion algorithm, comprises an iterative data-fitting process to recover high-resolution Earth's properties (e.g., velocity). At the heart of this process lies the numerical wave equation solver, which necessitates discretization. To perform efficient discretization-free FWI for large-scale problems, we introduce physics-informed neural networks (PINNs) as surrogates for conventional numerical solvers. The original PINN implementation requires additional training for the new velocity model when used in the forward simulation. To make PINNs more suitable for such scenarios, we introduce latent representation learning to PINNs. We first append the input with the encoded velocity vectors, which are the latent representation of the velocity models using an autoencoder model. Unlike the original implementation, the trained PINN model can instantly produce different wavefield solutions without retraining with this additional information. To further improve the FWI efficiency, instead of computing the FWI updates on the original velocity dimension, we resort to updating in its latent representation dimension. 
Specifically, we only update the latent representation vectors and freeze the weights of the autoencoder and the PINN models during FWI. Through a series of numerical tests on synthetic data, the proposed framework shows a significant increase in accuracy and computational efficiency compared to the conventional FWI. The improved performance of our framework can be attributed to implicit regularization introduced by the velocity encoding and physics-informed training procedures. The proposed framework presents a significant step forward in utilizing a discretization-free wave equation solver for a more efficient and accurate FWI application.</p>\",\"PeriodicalId\":54286,\"journal\":{\"name\":\"Earth and Space Science\",\"volume\":\"12 9\",\"pages\":\"\"},\"PeriodicalIF\":2.6000,\"publicationDate\":\"2025-09-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://agupubs.onlinelibrary.wiley.com/doi/epdf/10.1029/2024EA004107\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Earth and Space Science\",\"FirstCategoryId\":\"89\",\"ListUrlMain\":\"https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2024EA004107\",\"RegionNum\":3,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ASTRONOMY & ASTROPHYSICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Earth and Space Science","FirstCategoryId":"89","ListUrlMain":"https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2024EA004107","RegionNum":3,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ASTRONOMY & ASTROPHYSICS","Score":null,"Total":0}
Latent Representation Learning in Physics-Informed Neural Networks for Full Waveform Inversion
Full waveform inversion (FWI), a state-of-the-art seismic inversion algorithm, comprises an iterative data-fitting process to recover high-resolution Earth properties (e.g., velocity). At the heart of this process lies the numerical wave equation solver, which necessitates discretization. To perform efficient discretization-free FWI for large-scale problems, we introduce physics-informed neural networks (PINNs) as surrogates for conventional numerical solvers. The original PINN implementation requires additional training for each new velocity model when used in the forward simulation. To make PINNs more suitable for such scenarios, we introduce latent representation learning to PINNs. We first augment the input with encoded velocity vectors, which are the latent representations of the velocity models obtained with an autoencoder. Unlike the original implementation, with this additional information the trained PINN model can instantly produce wavefield solutions for different velocity models without retraining. To further improve FWI efficiency, instead of computing the FWI updates in the original velocity dimension, we update in the latent representation dimension. Specifically, we only update the latent representation vectors and freeze the weights of the autoencoder and the PINN models during FWI. Through a series of numerical tests on synthetic data, the proposed framework shows a significant increase in accuracy and computational efficiency compared to conventional FWI. The improved performance of our framework can be attributed to the implicit regularization introduced by the velocity encoding and the physics-informed training procedure. The proposed framework presents a significant step forward in utilizing a discretization-free wave equation solver for more efficient and accurate FWI applications.
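The key mechanic described above — freezing the autoencoder and PINN weights and running the inversion only over the latent vector — can be sketched in miniature. The sketch below is purely illustrative and uses hypothetical stand-ins: `decode` plays the role of the frozen autoencoder decoder (latent code to velocity model), `pinn_wavefield` plays the role of the frozen, velocity-conditioned PINN surrogate, and the gradient with respect to the latent vector is taken by finite differences rather than automatic differentiation. None of these names, parameterizations, or dimensions come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
W_dec = rng.normal(size=(16, 4))  # frozen toy "decoder" weights (never updated)

def decode(z):
    """Frozen decoder stand-in: 4-D latent code -> 16-point velocity model (km/s)."""
    return 1.5 + 0.1 * np.tanh(W_dec @ z)

def pinn_wavefield(v):
    """Frozen surrogate stand-in: velocity model -> wavefield samples."""
    return np.sin(2.0 * np.pi / v)

def misfit(z, d_obs):
    """FWI data misfit, evaluated through the frozen decoder and surrogate."""
    return 0.5 * np.sum((pinn_wavefield(decode(z)) - d_obs) ** 2)

def latent_fwi(d_obs, z0, lr=0.05, n_iter=200, eps=1e-6):
    """Inversion over the latent vector only; network weights stay fixed."""
    z = z0.copy()
    for _ in range(n_iter):
        # Central-difference gradient of the misfit with respect to z.
        g = np.zeros_like(z)
        for i in range(z.size):
            dz = np.zeros_like(z)
            dz[i] = eps
            g[i] = (misfit(z + dz, d_obs) - misfit(z - dz, d_obs)) / (2 * eps)
        # Backtracking step: only accept updates that reduce the misfit.
        f0, step = misfit(z, d_obs), lr
        while step > 1e-8:
            z_new = z - step * g
            if misfit(z_new, d_obs) < f0:
                z = z_new
                break
            step *= 0.5
    return z

z_true = np.array([0.5, -0.3, 0.2, 0.1])
d_obs = pinn_wavefield(decode(z_true))   # synthetic "observed" data
z0 = np.zeros(4)
z_inv = latent_fwi(d_obs, z0)
print(misfit(z_inv, d_obs) < misfit(z0, d_obs))  # misfit reduced by latent updates
```

Because the search space is the low-dimensional latent code rather than the full velocity grid, each update touches only a handful of parameters, and the decoder constrains the recovered model to the learned velocity manifold — the implicit regularization effect the abstract refers to.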
Journal introduction:
Earth and Space Science, AGU's second new open-access journal in the last 12 months, is the only journal that reflects the expansive range of science represented by AGU's 62,000 members, including all of the Earth, planetary, and space sciences, and related fields in environmental science, geoengineering, space engineering, and biogeochemistry.