Predicting Fruit Fly Behaviour using TOLC device and DeepLabCut

Sanghoon Lee, Brayden Waugh, Garret O'Dell, Xiji Zhao, Wook-Sung Yoo, Dalhyung Kim
{"title":"Predicting Fruit Fly Behaviour using TOLC device and DeepLabCut","authors":"Sanghoon Lee, Brayden Waugh, Garret O'Dell, Xiji Zhao, Wook-Sung Yoo, Dalhyung Kim","doi":"10.1109/BIBE52308.2021.9635290","DOIUrl":null,"url":null,"abstract":"Animal behavior is an essential element in neuroscience study and noninvasive behavioral tracking of animals during experiments is crucial to many scientific pursuits. However, extracting detailed poses without markers in dynamically changing backgrounds has been a challenge. Transparent Omnidirectional Locomotion Compensator (TOLC), a tracking device, was recently developed to investigate longitudinal studies of a wide range of behavior in an unrestricted walking Drosophila without tethering and the conventional image segmentation method has been used to identify the centroids of the walking Drosophila. Since the shape or morphological features of the pixel-wise mask may vary depending on the captured images, however, the centroid calculation errors could occur when segmenting the walking Drosophila. To solve the problem, DeepLabCut, an open-source deep-learning toolbox performing markerless pose estimation on a sequence of images for quantitative behavioral analysis, was utilized to find the centroids of Drosophila melanogaster in a video recorded by TOLC. One hundred labeled images with centroids were created for the training of ResNet50 among 60,984 images and used for predicting 5,000 images in the experiment. The results of the experiment showed that the centroids predicted by the deep learning model are more accurate than the centroids from the morphological features in a specific part of the sequence of the images. Additionally, we created 200 labeled images with legs for the training of ResN et50 and predicted 5,000 images to investigate the difference between the centroids of a Drosophila melanogaster over the locations of the legs. 
The centroids generated from morphological features often provide incorrect information when the Drosophila melanogaster stretches out the front legs for some regions. Detailed analysis of experiment results and the future research direction with more extensive experiments are discussed.","PeriodicalId":343724,"journal":{"name":"2021 IEEE 21st International Conference on Bioinformatics and Bioengineering (BIBE)","volume":"49 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE 21st International Conference on Bioinformatics and Bioengineering (BIBE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/BIBE52308.2021.9635290","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

Animal behavior is an essential element of neuroscience studies, and noninvasive behavioral tracking of animals during experiments is crucial to many scientific pursuits. However, extracting detailed poses without markers against dynamically changing backgrounds has been a challenge. The Transparent Omnidirectional Locomotion Compensator (TOLC), a tracking device, was recently developed to enable longitudinal studies of a wide range of behaviors in unrestricted, untethered walking Drosophila; conventional image segmentation has been used to identify the centroids of the walking flies. However, since the shape and morphological features of the pixel-wise mask vary across captured images, centroid calculation errors can occur when segmenting the walking Drosophila. To address this problem, DeepLabCut, an open-source deep-learning toolbox that performs markerless pose estimation on image sequences for quantitative behavioral analysis, was used to find the centroids of Drosophila melanogaster in a video recorded by the TOLC. One hundred images labeled with centroids were selected from 60,984 images to train ResNet50, and the trained model was used to predict 5,000 images in the experiment. The results show that the centroids predicted by the deep learning model are more accurate than the centroids derived from morphological features in a specific part of the image sequence. Additionally, we created 200 images labeled with leg positions to train ResNet50 and predicted 5,000 images to investigate the relationship between the centroid of a Drosophila melanogaster and the locations of its legs. The centroids generated from morphological features often provide incorrect information when the Drosophila melanogaster stretches out its front legs in some regions. A detailed analysis of the experimental results and directions for future, more extensive experiments are discussed.
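To see why a morphology-based centroid drifts when the fly extends its legs, consider a minimal NumPy sketch (a toy binary mask, not the paper's actual segmentation code; the shapes and pixel values are hypothetical). The conventional approach reduces to first-order image moments, i.e. the mean coordinate of the foreground pixels, so any extra foreground pixels from outstretched legs pull the estimate away from the body center:

```python
import numpy as np

# Toy 9x9 binary mask standing in for the pixel-wise segmentation
# of a fly body (all shapes here are hypothetical).
body = np.zeros((9, 9), dtype=np.uint8)
body[3:6, 3:6] = 1  # compact 3x3 "body" blob

def mask_centroid(mask):
    """Centroid from morphological features: the mean of the
    foreground pixel coordinates (first-order image moments)."""
    ys, xs = np.nonzero(mask)
    return float(ys.mean()), float(xs.mean())

print(mask_centroid(body))  # (4.0, 4.0): the true body center

# Extending the "front legs" adds foreground pixels on one side,
# dragging the moment-based centroid away from the body center.
with_legs = body.copy()
with_legs[4, 6:9] = 1  # three leg pixels stretched to the right
cy, cx = mask_centroid(with_legs)
print(cy, cx)  # cy stays 4.0, but cx shifts to 4.75
```

A learned keypoint detector such as DeepLabCut sidesteps this bias because it predicts the body-center landmark directly from image appearance rather than averaging whatever pixels the segmentation happens to include.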