{"title":"Beyond Data Validation – Advanced Strategies for Assessing Data Quality for Oil Spill Investigations","authors":"L. Cook, Laurie D. Benton, Melanie Edwards","doi":"10.7901/2169-3358-2021.1.689558","DOIUrl":null,"url":null,"abstract":"\n Field sampling investigations in response to oil spill incidents are growing increasingly more complex with analytical data collected by a variety of interested parties over many years and with different investigative purposes. For the Deepwater Horizon (DWH) Oil Spill, the analytical chemistry data and toxicity study data were required to be validated in accordance with U.S. Environmental Protection Agency's (EPA's) data validation for Superfund program methods. The process of validating data according to EPA guidelines is a manual and time-consuming process focused on chemistry results for individual samples within a single data package to assess if data meet quality control criteria. In hindsight, the burden of validating all of the chemistry data appears to be excessive, and for some parameters unnecessary, which was costly and slowed the process of disseminating data. Depending on the data use (e.g., assessing human and ecological risk, qualitative oil tracking, or forensic fingerprinting), data validation may not be needed in every circumstance or for every data type.\n Publicly available water column, sediment, and oil chemistry analytical data associated with the DWH Oil Spill, obtained from the Gulf of Mexico Research Initiative Information and Data Cooperative data portal were evaluated to understand the impact, effort, accuracy, and benefit of the data validation process. Questions explored include: What data changed based on data validation reviews?How would these changes affect the associated data evaluation findings?Did data validation introduce additional errors?What data quality issues did the data validation process miss?What statistical and data analytical approaches would more efficiently identify potential data quality issues?\n Based on our evaluation of the chemical data associated with the DWH Oil Spill, new strategies to assess the quality of data associated with oil spill investigations will be presented.","PeriodicalId":14447,"journal":{"name":"International Oil Spill Conference Proceedings","volume":"231 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2021-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Oil Spill Conference Proceedings","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.7901/2169-3358-2021.1.689558","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Field sampling investigations in response to oil spill incidents are growing increasingly complex, with analytical data collected by a variety of interested parties over many years and for different investigative purposes. For the Deepwater Horizon (DWH) Oil Spill, the analytical chemistry data and toxicity study data were required to be validated in accordance with the U.S. Environmental Protection Agency's (EPA's) data validation methods for the Superfund program. Validating data according to EPA guidelines is a manual, time-consuming process focused on chemistry results for individual samples within a single data package, assessing whether the data meet quality control criteria. In hindsight, the burden of validating all of the chemistry data appears excessive, and for some parameters unnecessary; it was costly and slowed the dissemination of data. Depending on the data use (e.g., assessing human and ecological risk, qualitative oil tracking, or forensic fingerprinting), data validation may not be needed in every circumstance or for every data type.
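To make the per-sample character of this review concrete, the following is a minimal Python sketch of one check that validators perform by hand: comparing surrogate recoveries against laboratory control limits and qualifying results that fall outside them. The column names, the 70–130% window, and the "J" (estimated) qualifier convention are illustrative assumptions, not the criteria applied to the DWH data.

```python
# Minimal illustrative sketch (assumed column names and limits, not the EPA
# procedure itself) of one per-sample check that validators perform manually:
# comparing surrogate recoveries against laboratory control limits.
import pandas as pd

def flag_surrogate_recoveries(results: pd.DataFrame,
                              lower: float = 70.0,
                              upper: float = 130.0) -> pd.DataFrame:
    """Qualify results whose surrogate recovery falls outside control limits.

    Assumes a 'surrogate_recovery_pct' column; the 70-130% window is a
    generic default, not the limit set used for DWH data packages.
    """
    out = results.copy()
    outside = ~out["surrogate_recovery_pct"].between(lower, upper)
    # "J" (estimated) is a common validation qualifier for out-of-control QC.
    out["qualifier"] = outside.map({True: "J", False: ""})
    return out

# Example: two results, one with a low surrogate recovery.
df = pd.DataFrame({"sample_id": ["S1", "S2"],
                   "surrogate_recovery_pct": [95.0, 42.0]})
print(flag_surrogate_recoveries(df))
```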
Publicly available water column, sediment, and oil chemistry analytical data associated with the DWH Oil Spill, obtained from the Gulf of Mexico Research Initiative Information and Data Cooperative data portal, were evaluated to understand the impact, effort, accuracy, and benefit of the data validation process. Questions explored include:
- What data changed based on data validation reviews?
- How would these changes affect the associated data evaluation findings?
- Did data validation introduce additional errors?
- What data quality issues did the data validation process miss?
- What statistical and data analytical approaches would more efficiently identify potential data quality issues? (See the sketch after this list.)
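As one example of the kind of dataset-wide statistical screen the last question contemplates, below is a hedged Python sketch that flags analyte results that are extreme relative to all other samples of the same matrix, using a robust z-score on log-transformed concentrations. The column names (matrix, analyte, result) and the z-score cutoff are assumptions for illustration; the flag is a screening aid for follow-up review, not a validation verdict.

```python
# Hedged sketch of a dataset-wide statistical screen: flag results that are
# extreme within each matrix/analyte group. Column names are assumptions.
import numpy as np
import pandas as pd

def flag_statistical_outliers(df: pd.DataFrame, z_max: float = 4.0) -> pd.DataFrame:
    """Add a robust z-score (log scale, MAD-based) per matrix/analyte group
    and flag |z| > z_max for follow-up review, not automatic rejection."""
    def robust_z(x: pd.Series) -> pd.Series:
        logs = np.log10(x.clip(lower=np.finfo(float).tiny))  # guard zeros
        med = logs.median()
        mad = (logs - med).abs().median()
        scale = 1.4826 * mad if mad > 0 else np.nan  # MAD -> sigma estimate
        return (logs - med) / scale

    out = df.copy()
    out["robust_z"] = (out.groupby(["matrix", "analyte"])["result"]
                          .transform(robust_z))
    out["review_flag"] = out["robust_z"].abs() > z_max
    return out
```

Unlike package-by-package validation, a screen like this sees the whole dataset at once, so it can surface unit errors or transcription mistakes that look normal within a single data package.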
Based on our evaluation of the chemical data associated with the DWH Oil Spill, we present new strategies for assessing the quality of data from oil spill investigations.