{"title":"Minimizing Systematic Errors in Quantitative High Throughput Screening Data Using Standardization, Background Subtraction, and Non-Parametric Regression.","authors":"Mitas Ray, Keith Shockley, Grace Kissling","doi":"","DOIUrl":"","url":null,"abstract":"<p><p>Quantitative high throughput screening (qHTS) has the potential to transform traditional toxicological testing by greatly increasing throughput and lowering costs on a per chemical basis. However, before qHTS data can be utilized for toxicity assessment, systematic errors such as row, column, cluster, and edge effects in raw data readouts need to be removed. Normalization seeks to minimize effects of systematic errors. Linear (LN) normalization, such as standardization and background removal, minimizes row and column effects. Alternatively, local weighted scatterplot smoothing (LOESS or LO) minimizes cluster effects. Both approaches have been used to normalize large scale data sets in other contexts. A new method is proposed in this paper to combine these two approaches (LNLO) to account for systematic errors within and between experiments. Heat maps illustrate that the LNLO method is more effective in removing systematic error than either the LN or the LO approach alone. All analyses were performed on an estrogen receptor agonist assay data set generated as part of the Tox21 collaboration.</p>","PeriodicalId":91818,"journal":{"name":"The journal of experimental secondary science","volume":"3 2","pages":""},"PeriodicalIF":0.0,"publicationDate":"2014-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5102623/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144181474","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}