On the Selection of Sensitivity Analysis Methods in the Context of Tolerance Management
Björn Heling, Thomas Oberleiter, B. Schleich, K. Willner, S. Wartzack
DOI: 10.1115/1.4043912
Journal of Verification, Validation and Uncertainty Quantification, March 2019.

Abstract: Although mass-produced parts look the same at first sight, every manufactured part is unique, at least on closer inspection. The reason is that every manufactured part is inevitably subjected to scattering influences and variation in the manufacturing process, such as varying temperatures or tool wear. Products built from these deviation-afflicted parts consequently deviate from their ideal properties. To ensure that every single product nevertheless meets its technical requirements, it is necessary to specify the permitted deviations and to estimate their consequences, which is done via tolerance analysis. During this process, the imperfect parts are assembled virtually and the effects of the geometric deviations are calculated. Since tolerance analysis enables engineers to identify weak points in an early design stage, it is important to know how much each individual tolerance contributes to a given quality-relevant characteristic, so that the right tolerances can be tightened or widened. In this paper, four different methods to calculate this sensitivity are introduced and compared, and guidelines are derived from the comparison that are intended to facilitate selecting among them. In particular, a newly developed approach based on fuzzy arithmetic is compared to the established high-low-median method, a variance-based method, and a density-based approach. Since all these methods rest on different assumptions, their advantages and disadvantages are critically discussed based on two case studies.
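The variance-based method mentioned in this abstract can be illustrated with a first-order Sobol index estimate. The sketch below uses a Saltelli-style pick-and-freeze estimator on a hypothetical linear two-tolerance response (a stand-in, not the paper's case study); for this model the analytic indices are S1 = 0.8 and S2 = 0.2.

```python
import random
import statistics

def response(x1, x2):
    # Hypothetical assembly response: a closing dimension driven by two tolerances.
    # Purely illustrative; the paper's case studies are not reproduced here.
    return 2.0 * x1 + 1.0 * x2

def sobol_first_order(n=20000, seed=1):
    """Saltelli-style Monte Carlo estimate of first-order sensitivity indices."""
    rng = random.Random(seed)
    A = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(n)]
    B = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(n)]
    yA = [response(*a) for a in A]
    yB = [response(*b) for b in B]
    var = statistics.pvariance(yA)
    f0sq = statistics.mean(yA) * statistics.mean(yB)  # estimate of (mean output)^2
    indices = []
    for i in (0, 1):
        # C_i: sample B, but with column i taken from A ("pick and freeze")
        yC = [response(a[0], b[1]) if i == 0 else response(b[0], a[1])
              for a, b in zip(A, B)]
        s_i = (statistics.mean(ya * yc for ya, yc in zip(yA, yC)) - f0sq) / var
        indices.append(s_i)
    return indices

S1, S2 = sobol_first_order()
```

For a linear model with independent inputs the first-order indices sum to one; the gap between that sum and one is a quick diagnostic for interaction effects in nonlinear responses.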
{"title":"Estimating Physics Models and Quantifying Their Uncertainty Using Optimization With a Bayesian Objective Function","authors":"Stephen A. Andrews, A. Fraser","doi":"10.1115/1.4043807","DOIUrl":"https://doi.org/10.1115/1.4043807","url":null,"abstract":"This paper reports a verification study for a method that fits functions to sets of data from several experiments simultaneously. The method finds a maximum a posteriori probability estimate of a function subject to constraints (e.g., convexity in the study), uncertainty about the estimate, and a quantitative characterization of how data from each experiment constrains that uncertainty. While this work focuses on a model of the equation of state (EOS) of gasses produced by detonating a high explosive, the method can be applied to a wide range of physics processes with either parametric or semiparametric models. As a verification exercise, a reference EOS is used and artificial experimental data sets are created using numerical integration of ordinary differential equations and pseudo-random noise. The method yields an estimate of the EOS that is close to the reference and identifies how each experiment most constrains the result.","PeriodicalId":52254,"journal":{"name":"Journal of Verification, Validation and Uncertainty Quantification","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2019-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"49586187","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Towards Estimating the Uncertainty Associated with Three-Dimensional Geometry Reconstructed from Medical Image Data
Marc Horner, Stephen M Luke, Kerim O Genc, Todd M Pietila, Ross T Cotton, Benjamin A Ache, Zachary H Levine, Kevin C Townsend
Journal of Verification, Validation and Uncertainty Quantification, 4(4), 2019.
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7448268/pdf/nihms-1572949.pdf

Abstract: Patient-specific computational modeling is increasingly used to assist with visualization, planning, and execution of medical treatments. This trend is placing more reliance on medical imaging to provide accurate representations of anatomical structures. Digital image analysis is used to extract anatomical data for use in clinical assessment/planning. However, the presence of image artifacts, whether due to interactions between the physical object and the scanning modality or the scanning process, can degrade image accuracy. The process of extracting anatomical structures from the medical images introduces additional sources of variability, e.g., when thresholding or when eroding along apparent edges of biological structures. An estimate of the uncertainty associated with extracting anatomical data from medical images would therefore assist with assessing the reliability of patient-specific treatment plans. To this end, two image datasets were developed and analyzed using standard image analysis procedures. The first dataset was developed by performing a "virtual voxelization" of a CAD model of a sphere, representing the idealized scenario of no error in the image acquisition and reconstruction algorithms (i.e., a perfect scan). The second dataset was acquired by scanning three spherical balls using a laboratory-grade CT scanner. For the idealized sphere, the error in sphere diameter was less than or equal to 2% if 5 or more voxels were present across the diameter. The measurement error degraded to approximately 4% for a similar degree of voxelization of the physical phantom. The adaptation of established thresholding procedures to improve segmentation accuracy was also investigated.
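The "virtual voxelization" step can be reproduced in miniature: voxelize an ideal sphere, recover the diameter from the voxel count, and compare against the true value. This sketch mimics only the idealized perfect-scan scenario (a voxel counts as inside if its center is), not the CT acquisition or the paper's actual pipeline.

```python
import math

def voxelized_diameter_error(radius_in_voxels):
    """Voxelize a sphere on an integer grid, estimate the diameter from the
    total voxel volume, and return the relative diameter error in percent."""
    r = radius_in_voxels
    n = int(math.ceil(r)) + 1
    count = sum(1
                for x in range(-n, n + 1)
                for y in range(-n, n + 1)
                for z in range(-n, n + 1)
                if x * x + y * y + z * z <= r * r)
    # Diameter of the sphere whose volume equals the voxel volume
    d_est = (6.0 * count / math.pi) ** (1.0 / 3.0)
    return abs(d_est - 2.0 * r) / (2.0 * r) * 100.0

# 10 voxels across the diameter (the paper's <= 2% regime), then a finer case
err_coarse = voxelized_diameter_error(5.0)
err_fine = voxelized_diameter_error(20.0)
```

Consistent with the abstract's idealized-sphere result, the volume-based diameter error stays under 2% once 5 or more voxels span the diameter, and it shrinks further as the voxelization is refined.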
{"title":"Epistemic Uncertainty Stemming From Measurement Processing—A Case Study of Multiphase Shock Tube Experiments","authors":"Chanyoung Park, J. Matthew, N. Kim, R. Haftka","doi":"10.1115/1.4042814","DOIUrl":"https://doi.org/10.1115/1.4042814","url":null,"abstract":"Experiments of a shock hitting a curtain of particles were conducted at the multiphase shock tube facility at Sandia National Laboratories. These are studied in this paper for quantifying the epistemic uncertainty in the experimental measurements due to processing via measurement models. Schlieren and X-ray imaging techniques were used to obtain the measurements that characterize the particle curtain with particle volume fraction and curtain edge locations. The epistemic uncertainties in the experimental setup and image processing methods were identified and measured. The effects of these uncertainties on the uncertainty in the extracted experimental measurements were quantified. The influence of the epistemic uncertainty was significantly higher than the experimental variability that has been previously considered as the most important uncertainty of experiments.","PeriodicalId":52254,"journal":{"name":"Journal of Verification, Validation and Uncertainty Quantification","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1115/1.4042814","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46609727","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Forensic Uncertainty Quantification for Experiments on the Explosively Driven Motion of Particles
K. Hughes, S. Balachandar, N. Kim, Chanyoung Park, R. Haftka, A. Diggs, D. Littrell, Jason Darr
DOI: 10.1115/1.4043478
Journal of Verification, Validation and Uncertainty Quantification, December 2018.

Abstract: Six explosive experiments were performed in October 2014 and February 2015 at the Munitions Directorate of the Air Force Research Laboratory with the goal of providing validation-quality data for particle drag models in the extreme regime of detonation: three repeated single-particle experiments and three particle-array experiments. The time-varying positions of the particles within the explosive products were captured by X-ray imaging, and the contact front and shock locations were captured by high-speed photography to provide information on the early-time gas behavior. Since these experiments were performed in the past and could not be repeated, we faced the interesting challenge of quantifying and reducing uncertainty through a detailed investigation of the experimental setup and operating conditions. This paper presents the results of these unique experiments, which can serve as benchmarks for future modeling, along with our effort to reduce uncertainty, which we dub forensic uncertainty quantification (FUQ).
Grid-Induced Numerical Errors for Shear Stresses and Essential Flow Variables in a Ventricular Assist Device: Crucial for Blood Damage Prediction?
Lucas Konnigk, B. Torner, Sebastian Hallier, M. Witte, F. Wurm
DOI: 10.1115/1.4042989
Journal of Verification, Validation and Uncertainty Quantification, December 2018.

Abstract: Adverse events due to flow-induced blood damage remain a serious problem for blood pumps used as cardiac support systems. The numerical prediction of blood damage via computational fluid dynamics (CFD) is a helpful tool for the design and optimization of reliable pumps. Blood damage prediction models are primarily based on the acting shear stresses, which are calculated by solving the Navier–Stokes equations on computational grids. The purpose of this paper is to analyze the influence of the spatial discretization, and the associated discretization error, on the calculated shear stresses in a blood pump, in comparison to other important flow quantities such as the pump's pressure head. To this end, a CFD analysis using seven unsteady Reynolds-averaged Navier–Stokes (URANS) simulations was performed. Two simple stress calculation indicators were applied to estimate the influence of the discretization on the results, using an approach that calculates numerical uncertainties to indicate discretization errors. For the finest grid, with 19 × 10^6 elements, numerical uncertainties of up to 20% were determined for the shear stresses, while the pressure head showed smaller uncertainties with a maximum of 4.8%. No grid-independent solution for velocity-gradient-dependent variables could be obtained on a grid size comparable to the mesh sizes in state-of-the-art blood pump studies. It can be concluded that the grid size has a major influence on the shear stress calculation, and therefore on the potential blood damage prediction, and that this error should always be quantified.
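Numerical-uncertainty estimates of the kind reported above are commonly computed from a three-grid convergence study via Richardson extrapolation and a grid convergence index (GCI). The sketch below is a generic ASME-GCI-style calculation on synthetic second-order data, not the pump simulations themselves.

```python
import math

def observed_order_and_gci(f_fine, f_med, f_coarse, r=2.0, safety=1.25):
    """Observed order of accuracy and fine-grid grid convergence index (GCI)
    from three solutions obtained with a constant grid refinement ratio r."""
    # Observed order from the ratio of successive solution changes
    p = math.log(abs(f_coarse - f_med) / abs(f_med - f_fine)) / math.log(r)
    rel_err = abs((f_med - f_fine) / f_fine)
    gci_fine = safety * rel_err / (r ** p - 1.0)  # fractional numerical uncertainty
    return p, gci_fine

# Synthetic second-order data: f(h) = 1.0 + 0.3 * h^2 at h = 0.1, 0.2, 0.4
f1, f2, f3 = 1.0 + 0.3 * 0.01, 1.0 + 0.3 * 0.04, 1.0 + 0.3 * 0.16
p, gci = observed_order_and_gci(f1, f2, f3)
```

For exactly second-order data the observed order comes out as 2 and the GCI gives a small fractional uncertainty band around the fine-grid value; in practice, velocity-gradient-dependent quantities such as shear stress typically show larger GCIs than integral quantities like pressure head, which is the pattern the paper reports.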
{"title":"Assessment of Model Confidence of a Laser Source Model in xRAGE Using Omega Direct-Drive Implosion Experiments","authors":"B. Wilson, A. Koskelo","doi":"10.1115/1.4043370","DOIUrl":"https://doi.org/10.1115/1.4043370","url":null,"abstract":"Los Alamos National Laboratory is interested in developing high-energy-density physics validation capabilities for its multiphysics code xRAGE. xRAGE was recently updated with the laser package Mazinisin to improve predictability. We assess the current implementation and coupling of the laser package via validation of laser-driven, direct-drive spherical capsule experiments from the Omega laser facility. The ASME V&V 20-2009 standard is used to determine the model confidence of xRAGE, and considerations for high-energy-density physics are identified. With current modeling capabilities in xRAGE, the model confidence is overwhelmed by significant systematic errors from the experiment or model. Validation evidence suggests cross-beam energy transfer as a dominant source of the systematic error.","PeriodicalId":52254,"journal":{"name":"Journal of Verification, Validation and Uncertainty Quantification","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1115/1.4043370","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45412044","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Implementation and Assessment of a Residual-Based r-Adaptation Technique on Structured Meshes
A. Choudhary, William C. Tyson, Christopher J. Roy
DOI: 10.1115/1.4043652
Journal of Verification, Validation and Uncertainty Quantification, December 2018.

Abstract: In this study, an r-adaptation technique is employed to reduce the solution discretization error, i.e., the error introduced by the spatial and temporal discretization of the continuous governing equations in numerical simulations. In r-adaptation, the mesh is modified by relocating nodes from one region to another without introducing additional nodes. The truncation error (TE), or discrete residual, is the difference between the continuous and discrete forms of the governing equations. Because the discrete residual acts as the source of discretization error in the domain, this study uses it as the adaptation driver. The technique is applied on structured meshes and is verified using a series of one-dimensional (1D) and two-dimensional (2D) benchmark problems for which exact solutions are readily available: the 1D Burgers equation, quasi-1D nozzle flow, 2D compression/expansion turns, and 2D incompressible flow past a Karman–Trefftz airfoil. The effectiveness of the proposed technique is evident for these problems, with approximately an order of magnitude reduction in discretization error achieved compared with uniform mesh results. For all problems, mesh modification is compared across different schemes from the literature, including an adaptive Poisson grid generator (APGG), a variational grid generator (VGG), a scheme based on a center-of-mass (COM) analogy, and a scheme based on deforming maps. In addition, several challenges in applying the proposed technique to real-world problems are outlined.
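In one dimension, the node-relocation idea behind r-adaptation reduces to equidistributing a weight function along the domain: nodes move so that each cell carries an equal share of the weight's integral. The weight below is a hypothetical residual-like indicator peaked mid-domain, not the paper's discrete residual.

```python
import math

def equidistribute(weight_at, n_nodes=11, n_quad=1000):
    """Relocate n_nodes over [0, 1] so each cell carries an equal share of the
    integral of the weight (r-adaptation: same node count, new locations)."""
    # Cumulative integral of the weight on a fine background grid (trapezoid rule)
    xs = [i / n_quad for i in range(n_quad + 1)]
    ws = [weight_at(x) for x in xs]
    cum = [0.0]
    for i in range(n_quad):
        cum.append(cum[-1] + 0.5 * (ws[i] + ws[i + 1]) / n_quad)
    total = cum[-1]
    # Invert the cumulative distribution at equally spaced levels
    nodes = []
    for k in range(n_nodes):
        target = total * k / (n_nodes - 1)
        j = next(i for i, c in enumerate(cum) if c >= target)
        nodes.append(xs[j])
    return nodes

# A weight peaked near x = 0.5 draws nodes toward the middle of the domain
nodes = equidistribute(lambda x: 1.0 + 50.0 * math.exp(-200.0 * (x - 0.5) ** 2))
```

The node count is preserved while spacing tightens where the weight (and hence, in the residual-driven method, the local error source) is large, which is exactly the trade the abstract describes.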
{"title":"Verification and Validation of the FLAG Hydrocode for Impact Cratering Simulations","authors":"W. Caldwell, A. Hunter, C. Plesko, S. Wirkus","doi":"10.1115/1.4042516","DOIUrl":"https://doi.org/10.1115/1.4042516","url":null,"abstract":"Verification and validation (V&V) are necessary processes to ensure accuracy of the computational methods used to solve problems key to vast numbers of applications and industries. Simulations are essential for addressing impact cratering problems, because these problems often exceed experimental capabilities. Here, we show that the free Lagrange (FLAG) hydrocode, developed at Los Alamos National Laboratory (Los Alamos, NM), can be used for impact cratering simulations by verifying FLAG against two analytical models of aluminum-on-aluminum impacts at different impact velocities and validating FLAG against a glass-into-water laboratory impact experiment. Our verification results show good agreement with the theoretical maximum pressures, with relative errors as low in magnitude as 1.00%. Our validation results demonstrate FLAG's ability to model various stages of impact cratering, with crater radius relative errors as low as 3.48% and crater depth relative errors as low as 0.79%. Our mesh resolution study shows that FLAG converges at resolutions low enough to reduce the required computation time from about 28 h to about 25 min. We anticipate that FLAG can be used to model larger impact cratering problems with increased accuracy and decreased computational cost on current systems relative to other hydrocodes tested by Pierazzo et al. (2008, “Validation of Numerical Codes for Impact and Explosion Cratering: Impacts on Strengthless and Metal Targets,” MAPS, 43(12), pp. 
1917–1938).","PeriodicalId":52254,"journal":{"name":"Journal of Verification, Validation and Uncertainty Quantification","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2018-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1115/1.4042516","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48368305","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Verification of Advective Bar Elements Implemented in the sierra/aria Thermal Response Code","authors":"Brantley Mills, Adam C. Hetzler, Oscar Deng","doi":"10.1115/1.4041837","DOIUrl":"https://doi.org/10.1115/1.4041837","url":null,"abstract":"A thorough code verification effort has been performed on a reduced order, finite element model for one-dimensional (1D) fluid flow convectively coupled with a three-dimensional (3D) solid, referred to as the “advective bar” model. The purpose of this effort was to provide confidence in the proper implementation of this model within the sierra/aria thermal response code at Sandia National Laboratories. The method of manufactured solutions (MMS) is applied so that the order of convergence in error norms for successively refined meshes and timesteps is investigated. Potential pitfalls that can lead to a premature evaluation of the model's implementation are described for this verification approach when applied to this unique model. Through observation of the expected order of convergence, these verification tests provide evidence of proper implementation of the model within the codebase.","PeriodicalId":52254,"journal":{"name":"Journal of Verification, Validation and Uncertainty Quantification","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2018-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1115/1.4041837","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45678023","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}