LYNSU: automated 3D neuropil segmentation of fluorescent images for Drosophila brains

Kai-Yi Hsu, Chi-Tin Shih, Nan-Yow Chen, Chung-Chuan Lo
{"title":"LYNSU:果蝇大脑荧光图像的自动三维神经纤层分割","authors":"Kai-Yi Hsu, Chi-Tin Shih, Nan-Yow Chen, Chung-Chuan Lo","doi":"10.3389/fninf.2024.1429670","DOIUrl":null,"url":null,"abstract":"The brain atlas, which provides information about the distribution of genes, proteins, neurons, or anatomical regions, plays a crucial role in contemporary neuroscience research. To analyze the spatial distribution of those substances based on images from different brain samples, we often need to warp and register individual brain images to a standard brain template. However, the process of warping and registration may lead to spatial errors, thereby severely reducing the accuracy of the analysis. To address this issue, we develop an automated method for segmenting neuropils in the <jats:italic>Drosophila</jats:italic> brain for fluorescence images from the <jats:italic>FlyCircuit</jats:italic> database. This technique allows future brain atlas studies to be conducted accurately at the individual level without warping and aligning to a standard brain template. Our method, LYNSU (Locating by YOLO and Segmenting by U-Net), consists of two stages. In the first stage, we use the YOLOv7 model to quickly locate neuropils and rapidly extract small-scale 3D images as input for the second stage model. This stage achieves a 99.4% accuracy rate in neuropil localization. In the second stage, we employ the 3D U-Net model to segment neuropils. LYNSU can achieve high accuracy in segmentation using a small training set consisting of images from merely 16 brains. We demonstrate LYNSU on six distinct neuropils or structures, achieving a high segmentation accuracy comparable to professional manual annotations with a 3D Intersection-over-Union (IoU) reaching up to 0.869. Our method takes only about 7 s to segment a neuropil while achieving a similar level of performance as the human annotators. To demonstrate a use case of LYNSU, we applied it to all female <jats:italic>Drosophila</jats:italic> brains from the <jats:italic>FlyCircuit</jats:italic> database to investigate the asymmetry of the mushroom bodies (MBs), the learning center of fruit flies. We used LYNSU to segment bilateral MBs and compare the volumes between left and right for each individual. Notably, of 8,703 valid brain samples, 10.14% showed bilateral volume differences that exceeded 10%. The study demonstrated the potential of the proposed method in high-throughput anatomical analysis and connectomics construction of the <jats:italic>Drosophila</jats:italic> brain.","PeriodicalId":2,"journal":{"name":"ACS Applied Bio Materials","volume":"33 1","pages":""},"PeriodicalIF":4.6000,"publicationDate":"2024-07-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"LYNSU: automated 3D neuropil segmentation of fluorescent images for Drosophila brains\",\"authors\":\"Kai-Yi Hsu, Chi-Tin Shih, Nan-Yow Chen, Chung-Chuan Lo\",\"doi\":\"10.3389/fninf.2024.1429670\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The brain atlas, which provides information about the distribution of genes, proteins, neurons, or anatomical regions, plays a crucial role in contemporary neuroscience research. To analyze the spatial distribution of those substances based on images from different brain samples, we often need to warp and register individual brain images to a standard brain template. 
However, the process of warping and registration may lead to spatial errors, thereby severely reducing the accuracy of the analysis. To address this issue, we develop an automated method for segmenting neuropils in the <jats:italic>Drosophila</jats:italic> brain for fluorescence images from the <jats:italic>FlyCircuit</jats:italic> database. This technique allows future brain atlas studies to be conducted accurately at the individual level without warping and aligning to a standard brain template. Our method, LYNSU (Locating by YOLO and Segmenting by U-Net), consists of two stages. In the first stage, we use the YOLOv7 model to quickly locate neuropils and rapidly extract small-scale 3D images as input for the second stage model. This stage achieves a 99.4% accuracy rate in neuropil localization. In the second stage, we employ the 3D U-Net model to segment neuropils. LYNSU can achieve high accuracy in segmentation using a small training set consisting of images from merely 16 brains. We demonstrate LYNSU on six distinct neuropils or structures, achieving a high segmentation accuracy comparable to professional manual annotations with a 3D Intersection-over-Union (IoU) reaching up to 0.869. Our method takes only about 7 s to segment a neuropil while achieving a similar level of performance as the human annotators. To demonstrate a use case of LYNSU, we applied it to all female <jats:italic>Drosophila</jats:italic> brains from the <jats:italic>FlyCircuit</jats:italic> database to investigate the asymmetry of the mushroom bodies (MBs), the learning center of fruit flies. We used LYNSU to segment bilateral MBs and compare the volumes between left and right for each individual. Notably, of 8,703 valid brain samples, 10.14% showed bilateral volume differences that exceeded 10%. The study demonstrated the potential of the proposed method in high-throughput anatomical analysis and connectomics construction of the <jats:italic>Drosophila</jats:italic> brain.\",\"PeriodicalId\":2,\"journal\":{\"name\":\"ACS Applied Bio Materials\",\"volume\":\"33 1\",\"pages\":\"\"},\"PeriodicalIF\":4.6000,\"publicationDate\":\"2024-07-29\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACS Applied Bio Materials\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://doi.org/10.3389/fninf.2024.1429670\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATERIALS SCIENCE, BIOMATERIALS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACS Applied Bio Materials","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.3389/fninf.2024.1429670","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATERIALS SCIENCE, BIOMATERIALS","Score":null,"Total":0}

Abstract

The brain atlas, which provides information about the distribution of genes, proteins, neurons, or anatomical regions, plays a crucial role in contemporary neuroscience research. To analyze the spatial distribution of those substances based on images from different brain samples, we often need to warp and register individual brain images to a standard brain template. However, the process of warping and registration may lead to spatial errors, thereby severely reducing the accuracy of the analysis. To address this issue, we develop an automated method for segmenting neuropils of the Drosophila brain in fluorescence images from the FlyCircuit database. This technique allows future brain atlas studies to be conducted accurately at the individual level without warping and aligning to a standard brain template. Our method, LYNSU (Locating by YOLO and Segmenting by U-Net), consists of two stages. In the first stage, we use the YOLOv7 model to quickly locate neuropils and extract small-scale 3D images as input for the second-stage model. This stage achieves a 99.4% accuracy rate in neuropil localization. In the second stage, we employ the 3D U-Net model to segment neuropils. LYNSU achieves high segmentation accuracy using a small training set consisting of images from merely 16 brains. We demonstrate LYNSU on six distinct neuropils or structures, achieving segmentation accuracy comparable to professional manual annotations, with a 3D Intersection-over-Union (IoU) reaching up to 0.869. Our method takes only about 7 s to segment a neuropil while performing at a level similar to that of human annotators. To demonstrate a use case of LYNSU, we applied it to all female Drosophila brains from the FlyCircuit database to investigate the asymmetry of the mushroom bodies (MBs), the learning center of fruit flies. We used LYNSU to segment the bilateral MBs and compare the left and right volumes for each individual. Notably, of 8,703 valid brain samples, 10.14% showed bilateral volume differences exceeding 10%. The study demonstrated the potential of the proposed method in high-throughput anatomical analysis and connectomics construction of the Drosophila brain.
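To make the two-stage design concrete, here is a minimal Python/NumPy sketch of the locate-then-segment idea. It is not the authors' implementation: `locate_fn` stands in for the trained YOLOv7 detector (which in practice operates on 2D images; here it is abstracted as returning one 3D bounding box), `segment_fn` stands in for the trained 3D U-Net, and the `margin` padding is our own assumption.

```python
import numpy as np

def lynsu_style_segment(volume, locate_fn, segment_fn, margin=8):
    """Locate-then-segment sketch: detect a neuropil's bounding box,
    crop a small subvolume, and run the segmenter only on the crop."""
    # Stage 1: hypothetical detector returns (z0, y0, x0, z1, y1, x1).
    z0, y0, x0, z1, y1, x1 = locate_fn(volume)
    # Pad the box so the neuropil boundary is not clipped at the crop edge.
    z0, y0, x0 = (max(0, c - margin) for c in (z0, y0, x0))
    z1, y1, x1 = (min(s, c + margin) for s, c in zip(volume.shape, (z1, y1, x1)))
    crop = volume[z0:z1, y0:y1, x0:x1]
    # Stage 2: hypothetical 3D U-Net returns a boolean mask for the crop.
    mask_crop = segment_fn(crop)
    # Paste the crop-level mask back into a full-size volume mask.
    mask = np.zeros(volume.shape, dtype=bool)
    mask[z0:z1, y0:y1, x0:x1] = mask_crop
    return mask
```

The point of the first stage is efficiency: the U-Net only ever sees a small crop around the detected neuropil rather than the full brain volume, which is what makes the ~7 s per-neuropil runtime plausible.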
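The 3D Intersection-over-Union quoted above (up to 0.869) is the standard voxel-wise overlap score between a predicted mask and a ground-truth mask. A minimal NumPy version, with our own convention for the degenerate empty-mask case:

```python
import numpy as np

def iou_3d(pred, target):
    """Voxel-wise 3D IoU between two binary masks of the same shape."""
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    # Convention (ours): two empty masks count as a perfect match.
    return float(intersection) / union if union > 0 else 1.0
```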
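Finally, the mushroom-body asymmetry analysis reduces to comparing left and right mask volumes per brain. A plausible sketch of the percent-difference computation follows; the abstract does not state which normalization the authors use, so dividing by the larger side is an assumption.

```python
def bilateral_volume_difference(mask_left, mask_right, voxel_volume=1.0):
    """Percent volume difference between left and right neuropil masks.
    Normalizing by the larger side is an assumed convention."""
    v_left = mask_left.sum() * voxel_volume
    v_right = mask_right.sum() * voxel_volume
    larger = max(v_left, v_right)
    return 100.0 * abs(v_left - v_right) / larger if larger > 0 else 0.0
```

Under this reading, a sample would count toward the reported 10.14% when the returned value exceeds 10.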