Automated Cell Segmentation for Phase-Contrast Images of Adhesion Cell Culture

Guochang Ye, Mehmet Kaya
DOI: 10.1109/BioSMART54244.2021.9677717
Published in: 2021 4th International Conference on Bio-Engineering for Smart Technologies (BioSMART), 2021-12-08
Citations: 0

Abstract

Cell segmentation is a critical step in image-based experimental analysis. This study proposes an efficient and accurate cell segmentation method: an image-processing pipeline built from simple morphological operations that automatically segments cells in phase-contrast images. Manual/visual cell segmentation serves as the control against which the proposed method is evaluated. On the manually labeled data (156 ground-truth images), the proposed method achieves an average Dice coefficient of 90.07%, an average intersection over union of 82.16%, and an average relative error of 6.52% in measuring cell growth area. Additionally, similar segmentation accuracy is observed when a modified U-Net model is trained (16,848 images) separately on the ground truth and on the data generated by the proposed method. These results demonstrate the accuracy and practicality of the proposed method, which can quantify cell growth area and generate labeled data for deep-learning cell segmentation techniques.
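The kind of pipeline and metrics the abstract describes can be sketched as follows. This is an illustrative reconstruction, not the authors' exact method: the texture-variance threshold, window size, and structuring elements are assumptions; only the evaluation metrics (Dice coefficient, intersection over union, relative area error) follow their standard definitions. The sketch assumes numpy and scipy are available.

```python
import numpy as np
from scipy import ndimage as ndi

def segment_cells(img, texture_win=5):
    """Illustrative morphological pipeline: in phase contrast, cells show
    high local intensity variation against a nearly flat background."""
    # Local standard deviation as a texture measure.
    mean = ndi.uniform_filter(img, size=texture_win)
    sqmean = ndi.uniform_filter(img ** 2, size=texture_win)
    local_std = np.sqrt(np.maximum(sqmean - mean ** 2, 0.0))
    # Crude automatic cutoff (an assumption, not the paper's threshold).
    mask = local_std > local_std.mean() + local_std.std()
    # Simple morphological cleanup: close gaps, fill holes, drop specks.
    se = np.ones((3, 3), dtype=bool)
    mask = ndi.binary_closing(mask, structure=se)
    mask = ndi.binary_fill_holes(mask)
    mask = ndi.binary_opening(mask, structure=se)
    return mask

def dice(pred, truth):
    inter = np.logical_and(pred, truth).sum()
    return 2.0 * inter / (pred.sum() + truth.sum())

def iou(pred, truth):
    inter = np.logical_and(pred, truth).sum()
    return inter / np.logical_or(pred, truth).sum()

def relative_area_error(pred, truth):
    # Relative error on the measured cell growth area (pixel counts).
    return abs(int(pred.sum()) - int(truth.sum())) / truth.sum()

# Synthetic check: a textured disk (the "cell") on a flat background.
rng = np.random.default_rng(0)
yy, xx = np.mgrid[:128, :128]
truth = (yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2
img = np.where(truth, rng.normal(0.5, 0.25, (128, 128)), 0.5)
pred = segment_cells(img)
```

The three metric functions match the quantities reported in the abstract; comparing a predicted mask against a manually labeled mask this way is how the 90.07% Dice, 82.16% IoU, and 6.52% relative-error figures would be computed.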