Rapid and automated interpretation of CRISPR-Cas13-based lateral flow assay test results using machine learning

Mengyuan Xue, Diego H Gonzalez, Emmanuel Osikpa, Xue Gao, Peter B Lillehoj

Sensors & Diagnostics, published 2024-12-26. DOI: 10.1039/d4sd00314d
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11726308/pdf/
Abstract
CRISPR-Cas-based lateral flow assays (LFAs) have emerged as a promising diagnostic tool for ultrasensitive detection of nucleic acids, offering improved speed, simplicity and cost-effectiveness compared to polymerase chain reaction (PCR)-based assays. However, visual interpretation of CRISPR-Cas-based LFA test results is prone to human error, potentially leading to false-positive or false-negative outcomes when analyzing test/control lines. To address this limitation, we have developed two neural network models: one based on a fully convolutional neural network and the other on a lightweight mobile-optimized neural network for automated interpretation of CRISPR-Cas-based LFA test results. To demonstrate proof of concept, these models were applied to interpret results from a CRISPR-Cas13-based LFA for the detection of the SARS-CoV-2 N gene, a key marker for COVID-19 infection. The models were trained, evaluated, and validated using smartphone-captured images of LFA devices in various orientations with different backgrounds, lighting conditions, and image qualities. A total of 3146 images (1569 negative, 1577 positive) captured using an iPhone 13 or Samsung Galaxy A52 Android smartphone were analyzed using the trained models, which classified the LFA results within 0.2 s with 96.5% accuracy compared to the ground truth. These results demonstrate the potential of machine learning to accurately interpret test results of CRISPR-Cas-based LFAs using smartphone-captured images in real-world settings, enabling the practical use of CRISPR-Cas-based diagnostic tools for self- and at-home testing.
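To make the classification step concrete, below is a minimal sketch of how a lightweight mobile-optimized classifier could label a smartphone photo of an LFA strip as positive or negative. The abstract does not specify the authors' exact architecture or weights, so this sketch assumes a MobileNetV3-Small backbone with a two-class head as a stand-in; the file path and the mapping of class index 1 to "positive" are likewise illustrative assumptions, not the paper's method.

```python
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

# Stand-in for the paper's "lightweight mobile-optimized neural
# network": MobileNetV3-Small with its final layer replaced by a
# binary (negative/positive) head. The authors' actual model and
# trained weights are not published in the abstract.
model = models.mobilenet_v3_small(weights="IMAGENET1K_V1")
model.classifier[-1] = nn.Linear(model.classifier[-1].in_features, 2)
model.eval()

# Standard ImageNet-style preprocessing for a smartphone photo
# of the LFA device (resize, center-crop, normalize).
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def classify_lfa(image_path: str) -> str:
    """Label a smartphone image of an LFA strip as positive or negative."""
    img = Image.open(image_path).convert("RGB")
    x = preprocess(img).unsqueeze(0)  # add batch dimension
    with torch.no_grad():
        logits = model(x)
    label = logits.argmax(dim=1).item()
    # Assumed label convention: 0 = negative, 1 = positive.
    return "positive" if label == 1 else "negative"

# Example usage (hypothetical image file):
# print(classify_lfa("lfa_photo.jpg"))
```

In practice the head would be fine-tuned on labeled LFA images such as the 3146-image smartphone dataset described above before inference; a small backbone like this is what makes sub-second, on-device classification of the kind the authors report (0.2 s per image) plausible.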