Interpretable clustering of PS-InSAR time series for ground deformation detection
Claudia Masciulli, Giacomo Guiduzzi, Donato Tiano, Marta Zocchi, Francesco Guerra, Paolo Mazzanti, Gabriele Scarascia Mugnozza
Computers & Geosciences, vol. 203, Article 105959. DOI: 10.1016/j.cageo.2025.105959. Published 2025-05-16.

Abstract: Persistent Scatterer Interferometry Synthetic Aperture Radar (PS-InSAR) provides high-precision ground deformation measurements over wide areas. However, analyzing PS time series remains challenging due to complex temporal patterns and the need to consider comprehensive displacement fields to fully characterize ground deformation processes. This study evaluates and compares unsupervised clustering approaches for PS time series analysis, contrasting feature extraction techniques against raw time series methods. We developed an online optimization algorithm for cluster number determination and introduced a custom density-based score (MLRD) for evaluating clustering quality in sparse geospatial datasets. The approaches were tested on Sentinel-1-derived PS data from the landslide-prone Offida municipality (Marche region, Italy), where feature-based methodologies demonstrated superior performance, achieving improvements of one to two orders of magnitude in clustering quality metrics compared to conventional approaches. The multivariate analysis notably outperformed univariate methods, with an optimal MLRD of 2.59 × 10⁻⁵ and a Calinski–Harabasz score of 194.73 at 50% explained variance, while preserving the physical interpretability of the results. This comprehensive analysis identified coherent deformation clusters extending beyond previously mapped landslide boundaries, demonstrating the effectiveness of multivariate clustering in detecting potentially unstable areas. The methodological framework advances PS time series analysis through robust pattern recognition while enhancing geohazard assessment capabilities, offering a sound foundation for identifying unstable areas and quantitative support for improving our understanding of complex ground deformation mechanisms.

Spatial bagging for predictive machine learning uncertainty quantification
Fehmi Özbayrak, John T. Foster, Michael J. Pyrcz
Computers & Geosciences, vol. 203, Article 105947. DOI: 10.1016/j.cageo.2025.105947. Published 2025-05-14.

Abstract: Uncertainty quantification is a critical component in the interpretation of spatial phenomena, particularly within the geosciences, where incomplete subsurface data leads to various possible scenarios, making it crucial for risk assessment and decision-making. Traditional geostatistical methods have served as the cornerstone for uncertainty analysis; however, the incorporation of machine learning, particularly ensemble methods, offers a compelling augmentation, especially in handling complex and noisy datasets. Building on our previous work, which introduced a spatial bagging technique for enhancing prediction accuracy, this study extends the method to uncertainty quantification by applying a widely used UQ metric from geostatistics. Our approach employs a bootstrap method adjusted for the effective sample size derived from spatial statistics, addressing the common issue of overfitting when dealing with dependent data. We demonstrate, through a series of synthetic datasets with varied noise levels and spatial structures, that our spatial bagging method not only outperforms standard bagging techniques in prediction accuracy but also provides superior uncertainty quantification. The robustness of the method against noise and its computational efficiency, particularly in spatially correlated data, position it as a promising tool for geoscientists and others who require reliable uncertainty measures in spatial analysis.

{"title":"Advancing raster DEM generalization with a quadric error metric approach","authors":"Richard Feciskanin, Jozef Minár","doi":"10.1016/j.cageo.2025.105963","DOIUrl":"10.1016/j.cageo.2025.105963","url":null,"abstract":"<div><div>Generalizing Digital Elevation Models (DEMs)—a process that simplifies data while preserving essential features—is crucial for efficient land surface analysis and revealing hierarchical structures of landforms. However, traditional methods often struggle to balance simplification with feature preservation. This paper presents a novel approach for generalizing raster-based DEMs using Quadric Error Metrics (QEM). Traditionally used for polygonal simplification, QEM has been uniquely adapted to operate directly on gridded data, which is required by most geomorphometric calculation and analysis tools. By minimizing geometric distortion, QEM effectively maintains significant land surface features, even at high levels of generalization, where the limitations of existing methods become evident. This was confirmed through a methods comparison, evaluating the generalization level using local roughness measurements based on the circular variance of aspect on four distinct areas that vary considerably in terms of landform type. The QEM approach's implicit evaluation of local surface properties ensures that significant features are preserved without the need for explicit feature detection or extensive parameter tuning. The method employs an adaptive error threshold to progressively remove smaller, non-essential landforms, providing flexible control over the generalization process. The proposed method has significant implications for various applications utilizing DEMs, particularly for analyses for which micro-scale features are undesirable noise, but preservation of the terrain skeleton is especially important. By offering a robust tool for DEM generalization, this research aims to enhance support for digital geomorphological mapping, but it can also be useful for a wider range of geoscientific research.</div></div>","PeriodicalId":55221,"journal":{"name":"Computers & Geosciences","volume":"202 ","pages":"Article 105963"},"PeriodicalIF":4.2,"publicationDate":"2025-05-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144068268","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Auto-Tuning for OpenMP Dynamic Scheduling applied to Full Waveform Inversion
Felipe H. Santos-da-Silva, João B. Fernandes, Idalmis M. Sardina, Tiago Barros, Samuel Xavier-de-Souza, Italo A.S. Assis
Computers & Geosciences, vol. 202, Article 105932. DOI: 10.1016/j.cageo.2025.105932. Published 2025-05-13.

Abstract: Full Waveform Inversion (FWI) is a widely used method in seismic data processing, capable of estimating models that represent the characteristics of the geological layers of the subsurface. Because it works with a massive amount of data, the method requires considerable time and computational resources, restricting its use to large-scale systems such as supercomputers. FWI adapts well to parallel computing and can be parallelized on shared-memory systems using the OpenMP application programming interface (API). Parallel tasks can be managed through the loop schedulers provided by OpenMP. The dynamic scheduler stands out for distributing chunks of a predefined fixed size to idle processing cores at runtime, which suits FWI, where data processing can be irregular. However, the relationship between chunk size and runtime is unknown. Optimization techniques can employ meta-heuristics to explore the parameter search space, avoiding testing all possible solutions. Here, we propose a strategy that uses Parameter Auto-Tuning for Shared Memory Algorithms (PATSMA), with Coupled Simulated Annealing (CSA) as its optimization method, to automatically adjust the chunk size for the dynamic scheduling of wave propagation, one of the most expensive steps in FWI. Since testing each candidate chunk size in the complete FWI is impractical, our approach consists of running PATSMA with an objective function defined as the runtime of the first time iteration of the first seismic shot of the first FWI iteration. The resulting chunk size is then employed in all wave propagations involved in the FWI. We conducted tests measuring the runtime of an FWI using the proposed auto-tuning, varying the problem size and running on different computational environments, such as supercomputers and cloud computing instances. The results show that applying the proposed auto-tuning to an FWI reduces its runtime by up to 70.46% compared to standard OpenMP schedulers.

{"title":"PyInvGeo: An open-source Python package for regularized linear inversion in geophysics","authors":"Naveen Gupta , Nasser Kazemi","doi":"10.1016/j.cageo.2025.105948","DOIUrl":"10.1016/j.cageo.2025.105948","url":null,"abstract":"<div><div>We developed several algorithms to solve the generalized linear inversion problem. In real-world problems, the datasets are huge and direct inversion of data matrix is not possible. Iterative algorithms can provide the desired solution by iteratively updating the solution along the opposite direction of the gradient. Hence, we develop steepest descent with <span><math><msub><mrow><mi>ℓ</mi></mrow><mrow><mn>2</mn></mrow></msub></math></span>, Huber, Cauchy, and hybrid <span><math><mrow><msub><mrow><mi>ℓ</mi></mrow><mrow><mn>1</mn></mrow></msub><mo>/</mo><msub><mrow><mi>ℓ</mi></mrow><mrow><mn>2</mn></mrow></msub></mrow></math></span> norms regularization, conjugate gradient with smoothness and sparsity constraints, FISTA, and alternating minimization algorithms. L-curve for the <span><math><mrow><msub><mrow><mi>ℓ</mi></mrow><mrow><mn>2</mn></mrow></msub><mo>−</mo><msub><mrow><mi>ℓ</mi></mrow><mrow><mn>2</mn></mrow></msub></mrow></math></span> minimization and Generalized Cross Validation function for the <span><math><mrow><msub><mrow><mi>ℓ</mi></mrow><mrow><mn>2</mn></mrow></msub><mo>−</mo><msub><mrow><mi>ℓ</mi></mrow><mrow><mn>1</mn></mrow></msub></mrow></math></span> minimization are used to provide the optimum regularization parameter. The numerical seismic deconvolution tests on synthetic single-channel data show the performances of the different algorithms and the parameter selections. Then, based on the performances of the algorithms on single channel data, we select the conjugate gradient with sparsity constraint and FISTA for deconvolution of Teapot dome 2D real data. We find that on 2D data, the FISTA method provides sparser solutions. However, through deconvolution of 3D seismic data, by increasing the dimensions and complexity of signals of interest, we show that the FISTA algorithm struggles to provide continuous and interpretable results. To address this issue, we introduce the Hoyer-squared norm to promote sparsity. Hoyer-squared norm is almost everywhere differentiable, scale-invariant, and contrary to <span><math><msub><mrow><mi>ℓ</mi></mrow><mrow><mn>1</mn></mrow></msub></math></span> norm it does not equally shrink all the coefficients. The 3D deconvolution shows that the Hoyer-squared method outperforms FISTA and provides a continuous and interpretable solution. Finally, we develop a Hoyer-squared-based multiple suppression in the Radon domain and successfully test the algorithm on synthetic and real marine Gulf of Mexico data. The multiple suppression algorithm is based on the parabolic Radon transform. 
The Python package for the algorithms and numerical testes is included for reproducibility purposes and to facilitate the use of the algorithms on different problems.</div></div>","PeriodicalId":55221,"journal":{"name":"Computers & Geosciences","volume":"202 ","pages":"Article 105948"},"PeriodicalIF":4.2,"publicationDate":"2025-05-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144068267","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
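For readers unfamiliar with the algorithms named above, here is a generic textbook FISTA for the ℓ₁-regularized least-squares problem applied to a synthetic single-channel deconvolution; it is not the PyInvGeo API, only an illustration of the technique.

```python
# Minimal FISTA for   min_x 0.5 * ||A x - b||_2^2 + lam * ||x||_1,
# the kind of sparse-spike deconvolution problem discussed above.
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(A, b, lam, n_iter=200):
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - grad / L, lam / L)
        t_new = (1 + np.sqrt(1 + 4 * t ** 2)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Synthetic single-channel deconvolution: convolve a sparse reflectivity with
# a wavelet, add noise, and recover the spikes.
rng = np.random.default_rng(3)
n = 200
lags = np.arange(-20, 21)
wavelet = np.exp(-0.5 * (lags / 4.0) ** 2) * np.cos(lags / 3.0)
refl = np.zeros(n)
refl[rng.choice(n, 8, replace=False)] = rng.normal(0, 1, 8)
A = np.array([np.convolve(np.eye(n)[i], wavelet, mode="same") for i in range(n)]).T
b = A @ refl + rng.normal(0, 0.01, n)
x_hat = fista(A, b, lam=0.05)
print(f"nonzeros recovered: {(np.abs(x_hat) > 1e-3).sum()} (true: 8)")
```
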
{"title":"Expert K-means reconstruction method: a novel image processing approach for mesostructure reconstruction of crystalline rocks","authors":"Haoyu Pan , Cheng Zhao , Jialun Niu , Jinquan Xing , Huiguan Chen , Rui Zhang","doi":"10.1016/j.cageo.2025.105957","DOIUrl":"10.1016/j.cageo.2025.105957","url":null,"abstract":"<div><div>Crystalline rocks exhibit pronounced heterogeneity, making the accurate reconstruction of their mesostructures a fundamental prerequisite for mesomechanical analysis. Current methods for reconstructing the mesostructures of crystalline rocks can be broadly categorized into statistical reconstruction methods and digital image processing methods. This paper systematically reviews these approaches and innovatively integrates expert systems with unsupervised machine learning, proposing the Expert K-Means Reconstruction Method (EKRM). EKRM combines the accuracy of expert systems with the objectivity of unsupervised machine learning, enabling highly precise reconstruction of rock mesostructures. Additionally, this study delves into the identification of grain boundaries in rocks, introducing a probabilistic approach to delineate mesostructural boundaries. The results demonstrate that EKRM significantly outperforms existing methods in terms of reconstruction accuracy and reusability. Furthermore, numerical simulations of the mesostructures reconstructed using EKRM were conducted and compared with laboratory experiments. The findings confirm that EKRM-reconstructed mesostructures effectively capture the influence of rock mesostructures on their mesomechanical behavior. The related code has been shared on GitHub.</div></div>","PeriodicalId":55221,"journal":{"name":"Computers & Geosciences","volume":"202 ","pages":"Article 105957"},"PeriodicalIF":4.2,"publicationDate":"2025-05-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143942816","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Manifold embedding of geological and geophysical observations for non-stationary subsurface property estimation using geodesic Gaussian processes
Eungyu Park, Jize Piao, Hyunggu Jun, Yong-Sung Kim, Heejun Suk, Weon Shik Han
Computers & Geosciences, vol. 202, Article 105958. DOI: 10.1016/j.cageo.2025.105958. Published 2025-05-03.

Abstract: Traditional methods for geological characterization often overlook or oversimplify the challenge of subsurface non-stationarity. This study introduces an innovative methodology that uses ancillary data, such as geological insights and geophysical exploration, to accurately delineate the spatial distribution of subsurface petrophysical properties in large, non-stationary geological fields. The approach leverages geodesic distance on an embedded manifold, with the level-set curve linking observed geological structures to intrinsic non-stationarity. Critical parameters ρ and β were identified, influencing the strength and dependence of estimates on secondary data. Comparative evaluations demonstrated that this method outperforms traditional kriging, particularly in representing complex subsurface structures. This enhanced accuracy is crucial for applications such as contaminant remediation and underground repository design. While the study focused on two-dimensional models, future work should explore three-dimensional applications across diverse geological structures. This research provides novel strategies for estimating non-stationary geologic media, advancing subsurface characterization.

IntersecT: a Python script for quantitative isopleth thermobarometry of equilibrium and disequilibrium systems
Sara Nerone, Pierre Lanari, Hugo Dominguez, Jacob B. Forshaw, Chiara Groppo, Franco Rolfo
Computers & Geosciences, vol. 202, Article 105949. DOI: 10.1016/j.cageo.2025.105949. Published 2025-05-02.

Abstract: Isopleth thermobarometry involves comparing compositional isopleths generated from forward thermodynamic models with the measured mineral compositions in a specific assemblage to retrieve the pressure and temperature conditions of equilibration. This technique has been used extensively in the last two decades to constrain the conditions of metamorphism for natural rock samples. However, it is often applied qualitatively, relying on the intersection of a limited number of isopleths for a few selected phases. Recent works have introduced software solutions with more quantitative approaches; these use statistical methods to derive optimal P–T conditions and provide a more accurate interpretation of forward modelling results. Despite these advances, such methods are not commonly used. IntersecT aims to provide a tool for statistically quantifying the quality of fit using the WERAMI output of Perple_X, applying multiple approaches, including the quality factor concept from Bingo-Antidote. This formulation allows the propagation of measurement uncertainty in isopleth thermobarometry. In addition, IntersecT applies reduced χ² statistics to assess the weight of the considered phases, enabling the down-weighting of outlier data due to model inaccuracies or incorrect assumptions, such as disequilibrium features. The quality factor approach helps to visualize discrepancies resulting from these issues. IntersecT provides a quantitative framework to improve the interpretation of Perple_X isopleth thermobarometry results, allowing compositional uncertainties in the measured mineral composition to be considered. This approach can also help interpret how well phase equilibrium experiments reproduce the observed compositions for magmatic and metamorphic systems.

{"title":"PyHawk: An efficient gravity recovery solver for low–low satellite-to-satellite tracking gravity missions","authors":"Yi Wu , Fan Yang , Shuhao Liu , Ehsan Forootan","doi":"10.1016/j.cageo.2025.105934","DOIUrl":"10.1016/j.cageo.2025.105934","url":null,"abstract":"<div><div>The low–low satellite-to-satellite tracking (LL-SST) gravity missions, such as the Gravity Recovery and Climate Experiment (GRACE) and its Follow-On (GRACE-FO), provide an important space-based Essential Climate Variable (ECV) to measure changes in the Terrestrial Water Storage (TWS). Due to the high-precision Global Navigation Satellite System (GNSS) receiver, accelerometers, and inter-satellite ranging instrument, these LL-SST missions are able to sense extremely tiny perturbations on both the orbit and inter-satellite ranges, which can project into the Earth’s time-variable gravity fields. The measurement systems of these LL-SST missions are highly complex; therefore, a data processing chain is required to exploit the potential of their high-precision measurements, which challenges both general and expert users. In this study, we present an open-source, user-friendly, cross-platform and integrated toolbox “PyHawk”, which is the first Python-based software in relevant field, to address the complete data processing chain of LL-SST missions including GRACE, GRACE-FO and probably the future gravity missions. This toolbox provides non-expert users an easy access to the payload data pre-processing, background force modeling, orbit integration, ranging calibration, as well as the ability for temporal gravity field recovery using LL-SST measurements. In addition, a series of high-standard benchmark tests have been provided to evaluate PyHawk, confirming its performance to be comparable with those used to provide the official Level-2 time-variable gravity field solutions of GRACE. Researchers working with orbit determination and gravity field modeling can benefit from this toolbox.</div></div>","PeriodicalId":55221,"journal":{"name":"Computers & Geosciences","volume":"201 ","pages":"Article 105934"},"PeriodicalIF":4.2,"publicationDate":"2025-04-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143863847","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Why the relational data model matters for climate data management","authors":"Ezequiel Cimadevilla","doi":"10.1016/j.cageo.2025.105931","DOIUrl":"10.1016/j.cageo.2025.105931","url":null,"abstract":"<div><div>Efficient data management of climate data banks, in particular those generated by Global or Regional Climate Models, is an important requirement for precise understanding of current changes in the climate system. Current data management practices in the climate community are based on the analysis of binary files for storage of multidimensional arrays that require ad hoc software libraries for accessing the data. Several approaches are being developed to ease and facilitate climate data management and data analysis. However, the theoretical foundations that cause climate data manipulation difficulties remain unchallenged. The Relational Data Model was proposed as a formal solution for database management based on mathematical logic. It has been widely accepted in the industry and has survived the test of time. However, the foundational principles of the Relational Data Model have been overlooked by the climate data management community, mostly due to a lack of emphasis in the relevance of mathematical logic for database management and misunderstanding between physical and logical levels of abstraction. As a result, climate data management workflows lack the rigor and formality provided by the Relational Data Model. This work explains the Relational Data Model at the logical level of abstraction and provides the arguments, clarifies the misconceptions, and justifies its adoption for climate data management in the context of gridded data generated by climate models.</div></div>","PeriodicalId":55221,"journal":{"name":"Computers & Geosciences","volume":"201 ","pages":"Article 105931"},"PeriodicalIF":4.2,"publicationDate":"2025-04-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143844826","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}