{"title":"Quantifying uncertainty: calculating interval estimates using quality control results.","authors":"J A Schofield, K Miller, L Blume","doi":"10.1080/713844031","DOIUrl":"https://doi.org/10.1080/713844031","url":null,"abstract":"<p><p>EPA's Great Lakes National Program Office (GLNPO) is leading one of the most extensive studies of a lake ecosystem ever undertaken. The Lake Michigan Mass Balance Study (LMMB Study) is a coordinated effort among state, federal, and academic scientists to monitor tributary and atmospheric pollutant loads, develop source inventories of toxic substances, and evaluate the fate and effects of these pollutants in Lake Michigan. A key objective of the LMMB Study is to construct a mass balance model for several important contaminants in the environment: PCBs, atrazine, mercury, and trans-nonachlor. The mathematical mass balance models will provide a state-of-the-art tool for evaluating management scenarios and options for control of toxics in Lake Michigan. At the outset of the LMMB Study, managers recognized that the data gathered and the model developed from the study would be used extensively by data users responsible for making environmental, economic, and policy decisions. Environmental measurements are never true values and always contain some level of uncertainty. Decision makers, therefore, must recognize and be sufficiently comfortable with the uncertainty associated with data on which their decisions are based. The quality of data gathered in the LMMB was defined, controlled, and assessed through a variety of quality assurance (QA) activities, including QA program planning, development of QA project plans, implementation of a QA workgroup, training, data verification, and implementation of a standardized data reporting format. As part of this QA program, GLNPO has been developing quantitative assessments that define data quality at the data set level. GLNPO is also developing approaches to derive estimated concentration ranges (interval estimates) for specific field sample results (single study results) based on uncertainty. The interval estimates must be used with consideration of their derivation and the types of variability that are and are not included in the interval.</p>","PeriodicalId":77339,"journal":{"name":"Quality assurance (San Diego, Calif.)","volume":"9 3-4","pages":"229-37"},"PeriodicalIF":0.0,"publicationDate":"2001-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1080/713844031","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"22216676","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Establishing sensitivity requirements for environmental analyses from project data quality objectives.","authors":"T Georgian, C R Mao","doi":"10.1080/713844022","DOIUrl":"https://doi.org/10.1080/713844022","url":null,"abstract":"<p><p>This article proposes a simple strategy for establishing sensitivity requirements (quantitation limits) for environmental chemical analyses when the primary data quality objective is to determine if a contaminant of concern is greater or less than an action level (e.g., an environmental \"cleanup goal,\" regulatory limit, or risk-based decision limit). The approach assumes that the contaminant concentrations are normally distributed with constant variance (i.e., the variance is not significantly dependent upon concentration near the action level). When the total or \"field\" portion of the measurement uncertainty can be estimated, the relative uncertainty at the laboratory's quantitation limit can be used to determine requirements for analytical sensitivity. If only the laboratory component of the total uncertainty is known, the approach can be used to identify analytical methods or laboratories that will not satisfy objectives for sensitivity (e.g., when selecting methodology during project planning).</p>","PeriodicalId":77339,"journal":{"name":"Quality assurance (San Diego, Calif.)","volume":"9 3-4","pages":"191-8"},"PeriodicalIF":0.0,"publicationDate":"2001-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1080/713844022","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"22216782","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Required steps for the validation of a Laboratory Information Management System.","authors":"E Turner, J Bolton","doi":"10.1080/713844028","DOIUrl":"https://doi.org/10.1080/713844028","url":null,"abstract":"<p><p>The task of managing laboratory data is not a new one. Over the past two decades, the use of Laboratory Information Management Systems (LIMS) has revolutionized how laboratories manage their data. A LIMS is more than software; it has become the workhorse of the laboratory, encompassing laboratory work-flow combined with user input, data collection, instrument integration, data analysis, user notification, and delivery of information and reporting. Types of organizations that utilize LIMS vary greatly from research laboratories to manufacturing laboratories to environmental testing laboratories. Commercially-available LIMS have been around since the 1980s. In addition, many laboratories have designed, implemented, and maintained in-house LIMS. The heart of any LIMS is the software. Like other laboratory systems, the LIMS software is subject to quality control and quality assurance checks. In regulatory environments this associated QA/QC is referred to as \"system validation.\" The primary purpose of system validation is to ensure that the software is performing in a manner for which it was designed. For example, the system acceptance criteria should be established and tested against quantifiable tasks to determine if the desired outcome has been achieved. LIMS features, such as autoreporting, reproducibility, throughput, and accuracy must be quantifiable and verifiable. System validation ensures that the entire system has been properly tested, incorporates required controls, and maintains and will continue to maintain data integrity. Laboratories must establish protocols and standards for the validation process and associated documentation. Although vendors of commercial LIMS perform initial internal system validations, the system must be revalidated whenever the end user, vendor or third party adds modifications or customizations to the LIMS. Currently, detailed guidance regarding system validation of LIMS is not available to the user. The issue is addressed in Good Automated Laboratory Practices (GALP) and National Environmental Laboratory Accreditation Conference (NELAC) documents which indicate specific requirements or recommendations for operational checks and periodic testing; however, it is up to the laboratory to determine suitable methods to accomplish these tasks. Proper validation of a LIMS will allow a laboratory to comply with regulations and also provide comprehensive documentation on the system that is necessary to troubleshoot future problems.</p>","PeriodicalId":77339,"journal":{"name":"Quality assurance (San Diego, Calif.)","volume":"9 3-4","pages":"217-24"},"PeriodicalIF":0.0,"publicationDate":"2001-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1080/713844028","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"22216787","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Adventures in environmental data reporting: high tech, low tech, and everything in between or Wisconsin DNR's reporting systems move toward the future.","authors":"D Dinsmore","doi":"","DOIUrl":"","url":null,"abstract":"<p><p>Electronic data transmittal and data warehouses seem like obvious solutions for streamlining reporting systems and managing large bodies of data; however, regulatory agencies like Wisconsin Department of Natural Resources (DNR) face significant barriers in implementation. In addition to the development costs to the Agency, regulators may be limited by the capabilities of the regulated community and the perceived burden for small businesses and communities. Electronic systems can be implemented incrementally if supported by state regulations and processes for ensuring data integrity.</p>","PeriodicalId":77339,"journal":{"name":"Quality assurance (San Diego, Calif.)","volume":"9 3-4","pages":"145-54"},"PeriodicalIF":0.0,"publicationDate":"2001-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"22216778","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The importance of a successful Quality Assurance (QA) program from a research manager's perspective.","authors":"W H Ponder","doi":"10.1080/713844023","DOIUrl":"https://doi.org/10.1080/713844023","url":null,"abstract":"<p><p>One responsibility of an EPA research manager is to ensure that data from research projects are acquired, processed, and reported in accordance with Quality Assurance (QA) requirements established by the Agency. To meet this responsibility, the research manager should understand Agency QA requirements, require an appropriate, effective Quality Assurance program to ensure that data are of known and acceptable quality for the intended use of the data, and provide support, guidance, and oversight to principal investigators in meeting QA requirements. In addition, the effectiveness of the QA effort can be enhanced if the research manager 1) ensures that principal investigators and other managers are aware that QA is viewed as an essential, integrated component of the research programs; 2) provides adequate resources (people and money) to support an effective Quality Assurance program; 3) encourages cooperative, productive interactions between researchers and Quality Assurance professionals; and 4) maintains oversight so that issues that have the potential for adversely affecting research and Quality Assurance objectives can be negotiated and corrected quickly. This presentation will discuss the Air Pollution Prevention and Control Division's Quality Assurance program and the approaches used to meet Quality Assurance requirements in the Division. The presentation will be a technical manager's perspective of the Division's requirements for and approach to Quality Assurance in its research programs. The presentation will include the design of the QA Team, the roles of members of the QA Team, training and technical aids provided by the QA Team to promote understanding of and adherence to Agency QA requirements, the interactions of the QA Team members with principal investigators, and examples of effective conflict resolution.</p>","PeriodicalId":77339,"journal":{"name":"Quality assurance (San Diego, Calif.)","volume":"9 3-4","pages":"173-8"},"PeriodicalIF":0.0,"publicationDate":"2001-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1080/713844023","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"22216781","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"In search of representativeness: evolving the environmental data quality model.","authors":"D M Crumbling","doi":"10.1080/713844024","DOIUrl":"https://doi.org/10.1080/713844024","url":null,"abstract":"<p><p>Environmental regulatory policy states a goal of \"sound science.\" The practice of good science is founded on the systematic identification and management of uncertainties; i.e., knowledge gaps that compromise our ability to make accurate predictions. Predicting the consequences of decisions about risk and risk reduction at contaminated sites requires an accurate model of the nature and extent of site contamination, which in turn requires measuring contaminant concentrations in complex environmental matrices. Perfecting analytical tests to perform those measurements has consumed tremendous regulatory attention for the past 20-30 years. Yet, despite great improvements in environmental analytical capability, complaints about inadequate data quality still abound. This paper argues that the first generation data quality model that equated environmental data quality with analytical quality was a useful starting point, but it is insufficient because it is blind to the repercussions of multifaceted issues collectively termed \"representativeness.\" To achieve policy goals of \"sound science\" in environmental restoration projects, the environmental data quality model must be updated to recognize and manage the uncertainties involved in generating representative data from heterogeneous environmental matrices.</p>","PeriodicalId":77339,"journal":{"name":"Quality assurance (San Diego, Calif.)","volume":"9 3-4","pages":"179-90"},"PeriodicalIF":0.0,"publicationDate":"2001-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1080/713844024","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"22216783","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"One fish, two fish, we QC fish: controlling data quality among more than 50 organizations over a four-year period.","authors":"L Riddick, C Simbanin","doi":"10.1080/713844027","DOIUrl":"https://doi.org/10.1080/713844027","url":null,"abstract":"<p><p>EPA is conducting a National Study of Chemical Residues in Lake Fish Tissue. The study involves five analytical laboratories, multiple sampling teams from each of the 47 participating states, several tribes, all 10 EPA Regions and several EPA program offices, with input from other federal agencies. To fulfill study objectives, state and tribal sampling teams are voluntarily collecting predator and bottom-dwelling fish from approximately 500 randomly selected lakes over a 4-year period. The fish will be analyzed for more than 300 pollutants. The long-term nature of the study, combined with the large number of participants, created several QA challenges: (1) controlling variability among sampling activities performed by different sampling teams from more than 50 organizations over a 4-year period; (2) controlling variability in lab processes over a 4-year period; (3) generating results that will meet the primary study objectives for use by OW statisticians; (4) generating results that will meet the undefined needs of more than 50 participating organizations; and (5) devising a system for evaluating and defining data quality and for reporting data quality assessments concurrently with the data to ensure that assessment efforts are streamlined and that assessments are consistent among organizations. This paper describes the QA program employed for the study and presents an interim assessment of the program's effectiveness.</p>","PeriodicalId":77339,"journal":{"name":"Quality assurance (San Diego, Calif.)","volume":"9 3-4","pages":"209-16"},"PeriodicalIF":0.0,"publicationDate":"2001-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1080/713844027","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"22216785","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Transforming an EPA QA/R-2 quality management plan into an ISO 9002 quality management system.","authors":"R A Kell, C M Hedin, G H Kassakhian, E S Reynolds","doi":"10.1080/713844026","DOIUrl":"https://doi.org/10.1080/713844026","url":null,"abstract":"<p><p>The Environmental Protection Agency's (EPA) Office of Emergency and Remedial Response (OERR) requires environmental data of known quality to support Superfund hazardous waste site projects. The Quality Assurance Technical Support (QATS) Program is operated by Shaw Environmental and Infrastructure, Inc. to provide EPA's Analytical Operations Center (AOC) with performance evaluation samples, reference materials, on-site laboratory auditing capabilities, data audits (including electronic media data audits), methods development, and other support services. The new QATS contract awarded in November 2000 required that the QATS Program become ISO 9000 certified. In a first for an EPA contractor, the QATS staff and management successfully transformed EPA's QA/R-2 type Quality Management Plan into a Quality Management System (QMS) that complies with the requirements of the internationally recognized ISO 9002 standard and achieved certification in the United States, Canada, and throughout Europe. The presentation describes how quality system elements of ISO 9002 were implemented on an already existing quality system. The psychological and organizational challenges of the culture change in QATS' day-to-day operations will be discussed for the benefit of other ISO 9000 aspirants.</p>","PeriodicalId":77339,"journal":{"name":"Quality assurance (San Diego, Calif.)","volume":"9 3-4","pages":"199-207"},"PeriodicalIF":0.0,"publicationDate":"2001-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1080/713844026","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"22216784","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Annual recertification program for audit standards used in the EPA PM2.5 Performance Evaluation Program.","authors":"R S Wright, J S Nichol, M L Papp, P W Groff, M W Tufts","doi":"","DOIUrl":"","url":null,"abstract":"<p><p>This paper describes procedures used to perform 152 annual recertifications of temperature, pressure, and flow rate audit standards. It discusses the metrology laboratories and the uncertainty of their recertifications. It describes the data base for the standards that tracks their recertifications and shipments. Finally, it presents some illustrative recertification results and describes what these results reveal about the audit standards and the recertifications.</p>","PeriodicalId":77339,"journal":{"name":"Quality assurance (San Diego, Calif.)","volume":"9 3-4","pages":"129-35"},"PeriodicalIF":0.0,"publicationDate":"2001-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"22216239","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Building the airplane in flight: an auditing approach to quality management system development.","authors":"M C Burson","doi":"10.1080/713844025","DOIUrl":"https://doi.org/10.1080/713844025","url":null,"abstract":"<p><p>In June of 2000, Maine DEP (in company with the other five New England states) found itself under EPA pressure to develop and document a quality management system by the end of the calendar year. In the frenzy that followed, the Department decided to use a private sector model for system development that called for a prospectively-focused QM plan that would be implemented through formal auditing. That is, instead of trying in advance to identify all the deficits in departmental quality management, and then assigning individuals and task groups to create structures to fill the gaps, Maine's QMP specifies the future desired system in broad terms. ME-DEP then uses its own cadre of trained auditors to assess current practice against the \"condition expected\" in the QMP, knowing that in many cases basic quality management practices will need to be developed. This approach assists program managers, particularly (but not exclusively) in areas sensitive to environmental data standards, in creating systems and practices that are rooted in reality, and that are perceived to add value to the Department's core work instead of just additional paperwork. Audit reports create a continuous feedback loop assuring that written procedures document actual operations. Finally, the results of auditing identify areas in which the QM system (and its plan) should be elaborated or refined, which leads to an iterative process by which quality approaches are infused in all areas of DEP operations. In the twelve months following EPA-Region I's initial approval of Maine's QMP, a total of seven audits were completed at various program levels, including two focused on critical QMP elements: Documents and Records; and Computer Hardware/Software. This paper will briefly describe the background and implementation of this approach; identify some of the factors which led to success; and describe, using selected examples, some of the early outcomes of the program.</p>","PeriodicalId":77339,"journal":{"name":"Quality assurance (San Diego, Calif.)","volume":"9 3-4","pages":"137-44"},"PeriodicalIF":0.0,"publicationDate":"2001-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1080/713844025","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"22216240","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}