{"title":"Application of Hollow Fiber Membrane Contacting System for CO2/CH4 Separation","authors":"Jalil Ghobadi, David Ramirez, S. Khoramfar, R. Jerman, M. Crane, Olufemi Oladosu","doi":"10.2118/191729-MS","DOIUrl":"https://doi.org/10.2118/191729-MS","url":null,"abstract":"\u0000 Carbon dioxide separation using a membrane-based contacting system is a reliable alternative to traditional gas absorbent techniques such as wet scrubbers. The main objective of this research was to design, develop and implement a hollow-fiber membrane-based contactor system to absorb and separate CO2 from CH4 in a simulated flare gas stream.\u0000 The gas-liquid contacting system was constructed using microporous polytetrafluoroethylene (PTFE) hollow fibers as a highly hydrophobic membrane. The module used for the experimental studies has a 51 mm diameter and a 200 mm effective length. The membrane module had a packing density of 60%, and the PTFE hollow fibers employed in this module had a mean pore size of 0.48 μm. Experiments were conducted in a laboratory-scale plant fed with a simulated flare gas mixture containing 2.5% CO2 balanced with CH4; varying inlet gas concentrations were produced using a mass flow controller.\u0000 CO2 separation experiments were performed, and the effect of operational variables on the separation efficiency of the system was studied. In order to optimize the gas separation performance of the membrane module, the effects of gas and liquid flow rates, absorbent-phase concentration, and the nature of the scrubbing liquid were examined. The absorption efficiencies of deionized water and aqueous solutions of sodium hydroxide (NaOH) and diethanolamine (DEA), as physical and chemical absorbents, were compared. 
Results indicated that increasing the flow rate and concentration of scrubbing liquid can enhance the separation efficiency; however, increasing the flow rates of the gas-phase has a negative impact on the CO2 absorption performance of the system.\u0000 The traditional CO2 separation process suffers from many limitations, such as high capital and operational costs, and potential of equipment corrosion. Membrane processes offer attractive opportunities for gas treatment applications including removal of CO2, H2S, and SO2 from flare gas mixtures. This technology offers a variety of practical benefits including low energy and operation costs and at the same time it can help to mitigate the adverse health effects associated with burning the waste gases.","PeriodicalId":441169,"journal":{"name":"Day 3 Wed, September 26, 2018","volume":"138 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124363222","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
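The separation efficiency reported in such contactor experiments reduces to a simple inlet/outlet mass balance. A minimal sketch, assuming dilute CO2 so the total gas rate is roughly constant; the 2.5% inlet fraction echoes the abstract, all other numbers are hypothetical.

```python
# Sketch of the CO2 mass balance behind a reported separation efficiency.
# Only the 2.5% inlet CO2 fraction comes from the abstract; the outlet
# fraction, gas rate, and membrane area below are illustrative.

def separation_efficiency(y_in: float, y_out: float) -> float:
    """Fraction of inlet CO2 removed, from inlet/outlet mole fractions."""
    return (y_in - y_out) / y_in

def co2_flux(q_gas: float, y_in: float, y_out: float, area: float) -> float:
    """Average CO2 absorption flux (mol/m^2/s) over the membrane area.

    q_gas : total molar gas flow rate (mol/s), taken as constant for a
            dilute (2.5%) CO2 stream.
    """
    return q_gas * (y_in - y_out) / area

eta = separation_efficiency(0.025, 0.005)                  # 2.5% in, 0.5% out
flux = co2_flux(q_gas=0.01, y_in=0.025, y_out=0.005, area=0.05)
```

Raising the liquid rate or absorbent concentration lowers `y_out`, which increases `eta`; raising the gas rate increases `q_gas` faster than the membrane can absorb, so `y_out` and hence `eta` suffer, consistent with the trends reported above.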
{"title":"Uncertainty Quantification of the Fracture Network with a Novel Fractured Reservoir Forward Model","authors":"Z. Chai, Hewei Tang, Youwei He, J. Killough, Yuhe Wang","doi":"10.2118/191395-MS","DOIUrl":"https://doi.org/10.2118/191395-MS","url":null,"abstract":"\u0000 A major part of the uncertainty for shale reservoirs comes from the distribution and properties of the fracture network. However, explicit fracture models are rarely used in uncertainty quantification due to their high computational cost. This paper presents a workflow to match the history of reservoirs with complex fracture network with a novel forward model. By taking advantage of the efficiency of the model, fractures can be explicitly characterized, and the corresponding uncertainty about the distribution and properties of fractures can be evaluated. No upscaling of the fracture properties is necessary, which is usually a required step in a traditional workflow.\u0000 The embedded discrete fracture model (EDFM) has recently been studied by many researchers due to its high efficiency compared to other explicit fracture models. By assuming a linearly distributed pressure near fractures, EDFM can provide a sub-grid resolution that lifts the requirement to refine near the fractures to a comparable size as the fracture aperture. Although efficient, considerable error is reported when applying this method to simulate flow barriers, especially when dominant flux direction is across instead of along the fractures. In this work, a novel discrete fracture model, compartmental EDFM (cEDFM) is developed based on the original EDFM framework. However, different from the original method, in cEDFM the fracture would split matrix grid blocks when intersecting them. The new model is benchmarked for single phase as well as multi-phase cases, and the accuracy is evaluated by comparing to fine explicit cases. 
Results indicate the improved model yields much better accuracy even for multi-phase flow simulation with flow barriers.\u0000 In the second part of the work, we applied the model in history matching and performed uncertainty quantification of the fracture network for two synthetic cases. We used the Ensemble Kalman Filter (EnKF) as the data assimilation algorithm due to its robustness for cases with large uncertainty. The initial state does not need to be close to the truth to achieve convergence. Also, EnKF performs well for the history matching of reservoirs with complex fracture networks, where the number of parameters can be large. Therefore, it is advantageous compared to using an Ensemble Smoother (ES) or Markov Chain Monte Carlo (MCMC) for fractured reservoirs. After the final step of data assimilation, a good match is obtained that can predict the production reasonably well. The proposed cEDFM model proves robust enough to be incorporated into the EnKF workflow, and the model's efficiency makes it practical to perform history matching with explicit fracture models.","PeriodicalId":441169,"journal":{"name":"Day 3 Wed, September 26, 2018","volume":"4 6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123467033","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
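The EnKF analysis step referenced in this abstract can be sketched with numpy; a minimal perturbed-observation implementation with a toy linear forward model standing in for the reservoir simulator (all dimensions and values below are illustrative, not from the paper).

```python
import numpy as np

def enkf_analysis(ens, d_obs, forward, obs_std, rng):
    """One perturbed-observation EnKF analysis step.

    ens     : (n_param, n_mem) ensemble of uncertain parameter vectors
    d_obs   : (n_obs,) observed data (e.g. production history)
    forward : maps (n_param, n_mem) -> (n_obs, n_mem) predicted data
    obs_std : (n_obs,) observation-error standard deviations
    """
    n_mem = ens.shape[1]
    pred = forward(ens)
    A = ens - ens.mean(axis=1, keepdims=True)     # parameter anomalies
    Y = pred - pred.mean(axis=1, keepdims=True)   # predicted-data anomalies
    C_dd = Y @ Y.T / (n_mem - 1) + np.diag(obs_std ** 2)
    K = (A @ Y.T / (n_mem - 1)) @ np.linalg.inv(C_dd)   # Kalman gain
    d_pert = d_obs[:, None] + obs_std[:, None] * rng.standard_normal(pred.shape)
    return ens + K @ (d_pert - pred)

# Toy check: linear forward model, true parameters [1, -2] (illustrative).
rng = np.random.default_rng(0)
H = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
truth = np.array([1.0, -2.0])
obs_std = np.full(3, 0.05)
d_obs = H @ truth
ens = rng.standard_normal((2, 200)) * 2.0         # wide, uninformed prior
for _ in range(3):                                # a few assimilation steps
    ens = enkf_analysis(ens, d_obs, lambda e: H @ e, obs_std, rng)
post_mean = ens.mean(axis=1)
```

Note how the initial ensemble need not be near the truth, matching the robustness claim in the abstract; each update only needs forward-model runs for the ensemble members, which is where an efficient model like cEDFM pays off.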
{"title":"Experimental Proppant Conductivity Observations: Evaluating Impact of Proppant Size and Fluid Chemistry on Long-Term Production in Shales","authors":"Abhinav Mittal, C. Rai, C. Sondergeld","doi":"10.2118/191741-MS","DOIUrl":"https://doi.org/10.2118/191741-MS","url":null,"abstract":"\u0000 Propped hydraulic fractures have enabled economic hydrocarbon production from organic rich shales. Laboratory testing of proppants can help in systematic evaluation of different factors that can affect proppant performance. This study is focused on long-term conductivity measurements of proppant-packs at simulated reservoir pressure and temperature conditions. Mechanisms like proppant crushing, embedment, and diagenesis are investigated.\u0000 Testing was done using a conductivity cell made of Hastelloy; allowing simultaneous measurement of fracture compaction and permeability. The proppant filled fracture (concentration: 0.75-3 lb/ft2) is subjected to axial load (5000 psi) to simulate closure stress. Brine is flowed through the pack at a constant rate (3 ml/min) at elevated temperature (250° F) over an extended duration of time (from 10-60 days). 20/40 and 60/100 mesh Ottawa sand were used in this study. The proppant-pack performance is evaluated between shale platens fabricated from Eagle Ford rock (58% clay by wt.; Nanoindentation Young's modulus - 16 GPa).\u0000 Experiments on the 20/40 and 60/100 Ottawa sand (1.5 lb/ft2 proppant concentration) at elevated pressure (5000 psi) and temperature (250° F), spanning 10 days demonstrate that proppant size strongly impacts proppant performance. The proppant-pack permeability for 60/100 sand drops dramatically within a few hours. The 20/40 proppant permeability is double the permeability of 60/100 sand even after 10 days of testing. Approximately 60% compaction is observed over the test duration, with 28% contribution from proppant crushing and rearrangement, and 32% contribution from embedment. 
Particle size analysis of proppant grains and SEM images verify proppant crushing, fines migration and embedment as dominant damage mechanisms. Proppant embedment and crushing are observed to be dependent on the shales being tested.\u0000 Fracturing jobs involve maintaining a basic pH environment so that fluid additives perform optimally, controlling viscosity for better proppant placement. A second study was conducted to compare performance on similar Eagle Ford shale by altering the fluid chemistry (pH ~ 10.5) to understand the impact on permeability and compaction over time. Over a duration of 20 days, the permeability dropped from 120 darcy to 200 md. After 8 days, the pH:10 brine permeability was 10 times lower than pH:7 brine permeability. After 18 days, the fracture width reduced by 90%, indicating creep behavior. High silica content (>20 ppm) was observed in the outlet brine. The proppant and rock surface were studied under SEM to investigate the role of secondary mineral growth during the drastic reduction of permeability.\u0000 This study is focused on understanding fracture conductivity under experimental conditions as close to in-situ as practical. Testing between shale platens at reservoir temperature and pressure conditions is more representative of the subsurface environment. Dynamic measurements in the current study w","PeriodicalId":441169,"journal":{"name":"Day 3 Wed, September 26, 2018","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122612236","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
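The permeability values quoted from such conductivity-cell tests come from Darcy's law applied to the brine flow through the proppant pack. A minimal sketch in lab units; only the 3 ml/min rate echoes the abstract, the geometry and pressure drop are hypothetical.

```python
# Darcy's-law reduction of conductivity-cell data. The 3 ml/min flow rate
# matches the abstract; cell geometry and pressure drop are made up.

def pack_permeability_darcy(q_ml_min, mu_cp, length_cm, area_cm2, dp_atm):
    """Proppant-pack permeability in darcy.

    The darcy is defined so lab units line up directly: 1 darcy passes
    1 cm^3/s of a 1 cP fluid through 1 cm^2 under 1 atm/cm of gradient.
    """
    q_cm3_s = q_ml_min / 60.0
    return q_cm3_s * mu_cp * length_cm / (area_cm2 * dp_atm)

def fracture_conductivity_md_ft(perm_darcy, width_in):
    """Fracture conductivity in md-ft: pack permeability times propped width."""
    return (perm_darcy * 1000.0) * (width_in / 12.0)

k = pack_permeability_darcy(q_ml_min=3.0, mu_cp=1.0,
                            length_cm=12.7, area_cm2=38.7, dp_atm=0.02)
cond = fracture_conductivity_md_ft(k, width_in=0.25)
```

With fixed flow rate, the measured pressure drop rises as the pack crushes and embeds, so `k` (and with it conductivity) falls over the test duration, which is the trend the long-term measurements track.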
{"title":"Perfecting Operational Execution: The Journey to High Reliability in the Oilfield","authors":"Patrick Tower, A. Williams, Sam Sakievich, Tony Howdeshell, Patrick Kearley, Lonny Motruk, Tony McCullough","doi":"10.2118/191754-MS","DOIUrl":"https://doi.org/10.2118/191754-MS","url":null,"abstract":"\u0000 High Reliability Organizations (HROs) are organizations with systems that maintain exceptionally low failure rates while operating in environments where, given the nature of the risk and complexity, serious incidents would otherwise be anticipated. HROs such as nuclear submarines, aircraft carriers, and the fire service have no-fail missions because the costs of failure are extremely high. By applying the same systems and tools that created the passionate commitment to excellence permeating every aspect of HRO operations, other industries can produce their own culture of high reliability, where nearly perfect safety and service quality are the norm.\u0000 The aim of this paper is to illustrate how the principles, concepts, and processes of HROs, including Crew Resource Management (CRM), can optimize the operational execution of a pressure pumping company through increased levels of safety and quality. 
The contemporary oilfield operating environment requires teams to be proactively aware of emerging threats and to trap errors or incidents to prevent them from escalating into significant incidents.\u0000 The paper includes the following themes: A literature review of the principles and concepts of traditional and established HROs from other industries;The process and systems to operationalize HRO principles and concepts within the energy industry; andTwo case studies demonstrating how HRO principles and concepts allow teams in the oilfield to reduce system failures and to notice, confront, resolve, and learn from unforeseen problems and failures when they do occur.","PeriodicalId":441169,"journal":{"name":"Day 3 Wed, September 26, 2018","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129411310","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Public Engagement: New Approaches to Enhance Field Planning and Environmental Compliance for O&G Development","authors":"M. Higgins, D. Burnett, U. Kreuter, G. Theodori, R. Haut","doi":"10.2118/191739-MS","DOIUrl":"https://doi.org/10.2118/191739-MS","url":null,"abstract":"\u0000 Studies have shown the need to engage the public early in planning new energy development ventures. Many times, this may be critical to the ultimate success of that program. Based on our previous studies of community acceptance of shale development projects, we have undertaken new studies to better understand the communities’ views of development in environmentally sensitive areas. In earlier work, interviews with key stakeholders and specially conducted focus group meetings helped to identify the public's perceptions of such technology and showed that favorable or unfavorable views were based mostly on pre-conceived notions of potential effects rather than factual data. Those findings have shown that a developer should actively work to increase the public's awareness about a potential project. There is a strong need for those who plan new energy development to be completely transparent and to give the public knowledgeable (and credible) information about events and developments that will impact them, not just financial gains, but environmental impacts and overall effects on the community.\u0000 Recognizing that sustainable development of energy resources is a sound business strategy, some O&G companies are now incorporating public engagement as part of their business strategy. However, some companies do not. 
Recent activity in West Texas provides an example of how companies should and should not go about energy development projects.","PeriodicalId":441169,"journal":{"name":"Day 3 Wed, September 26, 2018","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129778990","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Robust Uncertainty Quantification through Integration of Distributed Gauss-Newton Optimization with Gaussian Mixture Model and Parallelized Sampling Algorithms","authors":"G. Gao, J. Vink, Chaohui Chen, M. Araujo, Benjamin Ramirez, J. W. Jennings, Y. E. Khamra, J. Ita","doi":"10.2118/191516-MS","DOIUrl":"https://doi.org/10.2118/191516-MS","url":null,"abstract":"\u0000 Uncertainty quantification of production forecasts is crucially important for business planning of hydrocarbon field developments. This is still a very challenging task, especially when subsurface uncertainties must be conditioned to production data. Many different approaches have been proposed, each with their strengths and weaknesses. In this work, we develop a robust uncertainty quantification workflow by seamless integration of a distributed Gauss-Newton (DGN) optimization method with Gaussian Mixture Model (GMM) and parallelized sampling algorithms. Results are compared with those obtained from other approaches.\u0000 Multiple local maximum-a-posteriori (MAP) estimates are located with the local-search DGN optimization method. A GMM is constructed to approximate the posterior probability density function, by fitting simulation results generated during the DGN minimization process. The traditional acceptance-rejection (AR) algorithm is parallelized and applied to improve the quality of GMM samples by rejecting unqualified samples. AR-GMM samples are independent, identically-distributed (i.i.d.) samples that can be directly used for uncertainty quantification of model parameters and production forecasts.\u0000 The proposed method is first validated with 1-D nonlinear synthetic problems having multiple MAP points. The AR-GMM samples are better than the original GMM samples. Then, it is tested with a synthetic history-matching problem using the SPE-1 reservoir model with 8 uncertain parameters. 
The proposed method generates conditional samples that are better than or equivalent to those generated by other methods, e.g., Markov chain Monte Carlo (MCMC) and global search DGN combined with the Randomized Maximum Likelihood (RML) approach, but have a much lower computational cost (by a factor of 5 to 100). Finally, it is applied to a real field reservoir model with synthetic data, having 235 uncertain parameters. A GMM with 27 Gaussian components is constructed to approximate the actual posterior PDF. 105 AR-GMM samples are accepted from the 1000 original GMM samples, and are used to quantify uncertainty of production forecasts. The proposed method is further validated by the fact that production forecasts for all AR-GMM samples are quite consistent with the production data observed after the history matching period.\u0000 The newly proposed approach for history matching and uncertainty quantification is quite efficient and robust. The DGN optimization method can efficiently identify multiple local MAP points in parallel. The GMM yields proposal candidates with sufficiently high acceptance ratios for the AR algorithm. Parallelization makes the AR algorithm much more efficient, which further enhances the efficiency of the integrated workflow.","PeriodicalId":441169,"journal":{"name":"Day 3 Wed, September 26, 2018","volume":"73 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129178706","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
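The acceptance-rejection step applied to the GMM proposal can be sketched in one dimension; a minimal illustration in which a single wide Gaussian stands in for the fitted GMM and the target posterior is a standard normal (the densities, envelope constant, and sample counts below are all illustrative, not from the paper).

```python
import numpy as np

def ar_sample(n, proposal_draw, proposal_pdf, target_pdf, M, rng):
    """Acceptance-rejection sampling.

    Accept a candidate x ~ proposal with probability target(x) / (M * proposal(x)),
    which requires target_pdf(x) <= M * proposal_pdf(x) everywhere. Accepted
    draws are i.i.d. samples from the target, and each accept test is
    independent of the others, which is what makes the step parallelizable.
    """
    out = []
    while len(out) < n:
        x = proposal_draw(rng)
        if rng.uniform() * M * proposal_pdf(x) < target_pdf(x):
            out.append(x)
    return np.array(out)

# 1-D toy: target N(0, 1), proposal N(0, 2) standing in for the GMM.
# The density ratio is 2*exp(-3x^2/8) <= 2, so M = 2 is a valid envelope.
sqrt2pi = np.sqrt(2.0 * np.pi)
target = lambda x: np.exp(-0.5 * x * x) / sqrt2pi
prop_pdf = lambda x: np.exp(-x * x / 8.0) / (2.0 * sqrt2pi)
prop_draw = lambda rng: rng.normal(0.0, 2.0)
rng = np.random.default_rng(1)
samples = ar_sample(2000, prop_draw, prop_pdf, target, M=2.0, rng=rng)
```

The better the GMM approximates the posterior, the closer the density ratio stays to 1 and the higher the acceptance rate; the 105-of-1000 acceptance reported for the field case reflects how sharply a 235-parameter posterior can deviate from its GMM fit.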
{"title":"Data-Driven In-Situ Geomechanical Characterization in Shale Reservoirs","authors":"Hao Li, Jiabo He, S. Misra","doi":"10.2118/191400-MS","DOIUrl":"https://doi.org/10.2118/191400-MS","url":null,"abstract":"\u0000 Compressional and shear travel time logs (DTC and DTS) acquired using sonic logging tools are crucial for subsurface geomechanical characterization. In this study, 13 ‘easy-to-acquire’ conventional logs were processed using 6 shallow learning models, namely ordinary least squares (OLS), partial least squares (PLS), elastic net (EN), LASSO, multivariate adaptive regression splines (MARS), and artificial neural network (ANN), to successfully synthesize DTC and DTS logs. Among the 6 models, ANN outperforms other models with R2 of 0.87 and 0.85 for the syntheses of DTC and DTS logs, respectively. The 6 shallow learning models are trained and tested with 8481 data points acquired from a 4240-feet depth interval of a shale reservoir in Well 1, and the trained models are deployed in Well 2 for purposes of blind testing against 2920 data points from 1460-feet depth interval. Following that, 5 clustering algorithms are applied on the 13 ‘easy-to-acquire’ logs to identify clusters and compare them with the prediction performance of the shallow learning models used for log synthesis. Dimensionality reduction algorithm is used to visualize the characteristics of the clustering algorithm. Hierarchical clustering, DBSCAN, and self-organizing map (SOM) algorithms are sensitive to outliers and did not effectively differentiate the input data into consistent clusters. Gaussian mixture model can well differentiate the various formations, but the clusters do not have a strong correlation with the prediction performance of the log-synthesis models. Clusters identified using K-means method have a strong correlation with the prediction performance of the shallow learning models. 
By combining the predictive shallow learning models for log synthesis with the K-means clustering algorithm, we propose a reliable workflow that can synthesize the DTC and DTS logs, as well as generate a reliability indicator for the predicted logs to help a user better understand the performance of the shallow learning models during deployment.","PeriodicalId":441169,"journal":{"name":"Day 3 Wed, September 26, 2018","volume":"74 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130347856","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
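The train-on-one-well, blind-test-on-another workflow can be sketched with the simplest of the six models (OLS) plus a plain k-means pass over the input logs; everything below, from the synthetic "logs" to the cluster count, is an illustrative stand-in for the paper's 13-log, two-well setup.

```python
import numpy as np

def ols_fit(X, y):
    """Fit an OLS synthesis model (the simplest of the six shallow models)."""
    Xb = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return beta

def ols_predict(beta, X):
    return np.column_stack([np.ones(len(X)), X]) @ beta

def kmeans(X, k, rng, iters=50):
    """Plain k-means on the input logs; the cluster id of each depth point
    serves as a coarse reliability flag for the synthesized log there."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels, centers

# Synthetic stand-ins: X mimics 'easy-to-acquire' logs, y mimics DTC.
rng = np.random.default_rng(2)
X = rng.normal(size=(800, 3))
y = 1.5 * X[:, 0] - 0.7 * X[:, 2] + 0.1 * rng.normal(size=800)
beta = ols_fit(X[:600], y[:600])            # "Well 1": training interval
y_hat = ols_predict(beta, X[600:])          # "Well 2": blind test
r2 = 1 - ((y[600:] - y_hat) ** 2).sum() / ((y[600:] - y[600:].mean()) ** 2).sum()
labels, _ = kmeans(X[600:], k=3, rng=rng)   # per-cluster R2 -> reliability map
```

Computing R2 per cluster rather than once per well is the essence of the proposed reliability indicator: clusters whose training-interval R2 was poor flag depth points where the synthesized DTC/DTS should be distrusted.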
{"title":"A Method for Improving the Evaluation of Elemental Concentration in Neutron-Induced Gamma-Ray Spectroscopy Logging","authors":"Juntao Liu, Shucai Liu, C. Yuan, Feng Zhang, Huizhong Yan, B. Miao, Hu Li","doi":"10.2118/191438-MS","DOIUrl":"https://doi.org/10.2118/191438-MS","url":null,"abstract":"\u0000 The determination of elemental concentrations, mineralogy and lithology is essential for the evaluation of unconventional plays. Several instruments utilizing a D-T generator and gamma ray detectors have been developed to determine elemental weight fractions. The measured gamma ray energy spectrum can be approximated by a linear combination of the standard spectra of individual elements. Each geochemical well logging instrument has its unique set of elemental standards, which can be obtained by experimental approaches. The variations of the shapes of elemental standards are typically not considered under different conditions, which affects the accuracy of the elemental concentrations.\u0000 A decomposition of the measured spectrum is carried out based on the elemental standard spectra to determine the relative elemental yields, which represent the contribution of gamma rays emitted by each element to the measured spectrum. Then the elemental yields can be converted to elemental weight fractions by using an oxide closure model. The standard gamma ray spectra of individual elements are calculated by using Monte Carlo numerical simulation. The responses of standard spectra under different conditions, such as different borehole sizes and drilling mud types, are simulated. The reasons for the change of the standard spectra shape are analyzed. A model based on gamma ray attenuation is introduced to compensate for the effects of logging conditions on the changes of standard spectra.\u0000 The simulation results show that variations of logging conditions have an impact on the shapes of the standard libraries. 
Therefore, if only one set of fixed standard spectra is utilized in processing measured spectra, the accuracy of elemental concentrations is affected. To address this problem, a model for compensating for the variations of the shapes of standard spectra is proposed. A field example of pulsed neutron spectroscopy logging is illustrated to validate the proposed method and results show that the agreement between computed elemental weight fractions and core analysis values is improved.\u0000 In the processing of geochemical logging data, changes in the shapes of the standard spectra were not considered. One set of fixed elemental standards are always utilized. In our study, a specific Monte Carlo simulation code is used to study the changes of elemental standards due to the variation of logging conditions and a method is proposed to improve the accuracy of elemental concentration measured by geochemical well logging.","PeriodicalId":441169,"journal":{"name":"Day 3 Wed, September 26, 2018","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115290056","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
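The spectral-decomposition step described above (measured spectrum as a linear combination of elemental standards, then yields converted to weight fractions) can be sketched with a least-squares fit; the Gaussian-peak "standards", sensitivity factors, and three-element mix below are toy stand-ins, and the oxide-closure step is simplified to a normalization.

```python
import numpy as np

def elemental_yields(spectrum, standards):
    """Least-squares decomposition: measured spectrum ~= standards @ yields.

    standards : (n_channels, n_elements) matrix whose columns are the
                standard spectra of individual elements.
    """
    y, *_ = np.linalg.lstsq(standards, spectrum, rcond=None)
    return y

def yields_to_weight_fractions(yields, sensitivities):
    """Simplified oxide-closure conversion: w_i proportional to y_i / S_i,
    normalized so the weight fractions sum to one."""
    rel = np.clip(yields, 0.0, None) / sensitivities
    return rel / rel.sum()

# Toy 3-element problem with made-up Gaussian-peak standard spectra.
chan = np.arange(100.0)
peak = lambda c, w: np.exp(-0.5 * ((chan - c) / w) ** 2)
S = np.column_stack([peak(20, 4), peak(50, 5), peak(75, 3)])
true_yields = np.array([0.5, 0.3, 0.2])
measured = S @ true_yields                      # noiseless synthetic spectrum
y = elemental_yields(measured, S)
w = yields_to_weight_fractions(y, sensitivities=np.array([1.0, 0.8, 1.2]))
```

The paper's point drops straight out of this formulation: if borehole size or mud type reshapes the true standards while `S` stays fixed, the fitted yields `y` are biased, which is why a shape-compensation model for `S` improves the computed weight fractions.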
{"title":"Understanding Stress Effects on Borehole Acoustic Waves for Unconventional Shale Reservoirs","authors":"Ting Lei, S. Zeroug, B. Sinha, S. Bose","doi":"10.2118/191404-ms","DOIUrl":"https://doi.org/10.2118/191404-ms","url":null,"abstract":"\u0000 Acoustic velocities in many sedimentary rocks exhibit stress sensitivity. This behaviour has been validated through experiments, observed from field measurements, and is described by the acoustoelastic model. Inversion methods based on this model have been developed to characterize stresses, and provide the basis for non-destructive means to calibrate stress profiles. Observation of borehole sonic dipole dispersion cross-over signatures serves as an indicator of stress-induced anisotropy – an effect that has been validated theoretically through 3D numerical modeling. Such modeling has been carried out for sandstone and carbonate rock and less so for shale rock. To understand the stress effects on sonic measurements in wells traversing unconventional reservoirs, we carry out simulations of the borehole sonic measurement in shale formations subjected to subsurface stresses.\u0000 To this end, we have developed and used a new 3D modeling code based on the finite-difference time domain scheme in a cylindrical coordinate borehole system. The linear and nonlinear elastic constants of shale core samples from laboratory experiments are used as inputs to the modeling. Synthetic waveforms are processed using a modified matrix pencil algorithm to estimate the borehole sonic mode dispersions and their sensitivities to the stress-induced anisotropy.\u0000 For a vertical well, our modeling results demonstrate new dispersion signatures associated with certain shale formations. The borehole flexural dispersions at the two canonical horizontal stress directions split at high frequencies whereas they overlay at low frequencies. 
The split at high frequencies is caused by near-wellbore stress concentrations and the overlay at low frequencies is owing to the typical shale laminated lithology. The modelled dispersion signatures were also observed from processing of field data acquired with both sonic and ultrasonic tools in a vertical well in a laminated unconventional shale formation. The ultrasonic tool measures compressional and shear slownesses azimuthally at radial depth of about 1 in. from the borehole surface. The presence of imbalanced stresses is confirmed in adjacent intervals from symmetric breakouts. In a 20-ft interval not exhibiting breakouts but surrounded by intervals with breakouts, the ultrasonic tool also measures the compressional and shear slownesses with an azimuthal quasi-sinusoidal variation caused by stress concentrations around the borehole. On the other hand, sonic waveforms recorded by cross-dipole measurements in the same interval show high-frequency splitting dispersions as reproduced by the modeling. Taken together, these results confirm the existence of a new sonic dipole signature caused by the subsurface stresses in vertical wells traversing unconventional shale reservoirs.","PeriodicalId":441169,"journal":{"name":"Day 3 Wed, September 26, 2018","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123493736","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Developing a New Technique for Calculating Accurate Water Saturations from Well Logs in Source Rocks","authors":"Ryan Hillier, E. Vosburgh, D. Warrington, M. Ibrahim","doi":"10.2118/191453-MS","DOIUrl":"https://doi.org/10.2118/191453-MS","url":null,"abstract":"\u0000 Determining log-based water saturation using Archie's (1942) equation, or any derivative shaly sand method, requires correct inputs to produce valid results. In resource plays, the rock matrix is composed of water wet and oil wet constituents, therefore, correct values of Archie's cementation factor (m) and saturation exponent (n) are critical. In practice, it is pragmatic to use the Pickett plot (Pickett, 1973) to set connate water resistivity (Rw) and Archie's ‘m’. However, it is difficult, if not impossible, to derive Archie's ‘n’ parameter without additional information. Research combining core and log data shows evidence of a positive correlation between Archie's saturation exponent and the total organic content (TOC) in a given unit volume. Using this relationship, Archie's equation may be used to define a variable ‘n’. It is hypothesized that ‘n’ increases with increasing TOC volume as a result of an interruption of electrical pathways that resistivity tools exploit. This disruption results in an increase in the apparent value of ‘n’ required to compute correct water saturations. Due to the apparent excess resistivity in organic-rich rocks, an increase in ‘n’ values or kerogen corrected resistivity is needed to produce a fit to core-derived water saturations. 
This article will demonstrate the methodology used to derive a variable ‘n’ parameter and kerogen corrected resistivity in an organic-rich interval.","PeriodicalId":441169,"journal":{"name":"Day 3 Wed, September 26, 2018","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125962278","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
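The variable-'n' idea reduces to evaluating Archie's (1942) equation with a saturation exponent that grows with TOC. A minimal sketch; the linear form of the TOC correlation and every numeric value below are illustrative placeholders, not the paper's calibration.

```python
# Archie (1942) water saturation with a TOC-dependent saturation exponent.
# The linear TOC correction and all input values are hypothetical; the
# paper only establishes that 'n' correlates positively with TOC.

def archie_sw(rw, rt, phi, m=2.0, n=2.0, a=1.0):
    """Archie water saturation: Sw = (a*Rw / (phi^m * Rt))^(1/n)."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

def variable_n(toc_vol_frac, n_base=2.0, slope=4.0):
    """Hypothetical positive TOC correlation for 'n' (placeholder form)."""
    return n_base + slope * toc_vol_frac

# Same log inputs, differing TOC: the organic-rich case uses a larger 'n'.
sw_lean = archie_sw(rw=0.05, rt=10.0, phi=0.10, n=variable_n(0.00))
sw_rich = archie_sw(rw=0.05, rt=10.0, phi=0.10, n=variable_n(0.10))
```

Because the resistivity ratio inside the parentheses is below one in hydrocarbon-bearing rock, raising 'n' raises the computed Sw; this is exactly the direction needed to correct the apparent excess resistivity of organic-rich intervals toward core-derived saturations.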