DeepEthoProfile: Rapid Behavior Recognition in Long-Term Recorded Home-Cage Mice
Andrei Istudor, Alexej Schatz, York Winter
eNeuro, vol. 12, no. 7, published 2025-07-15 (Print; ePub 2025-07-01). DOI: 10.1523/ENEURO.0369-24.2025
Journal impact factor: 2.7 (JCR Q3, Neurosciences)
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12265860/pdf/
Citations: 0
Abstract
Animal behavior is crucial for understanding both normal brain function and dysfunction. To facilitate behavior analysis of mice within their home environments, we developed DeepEthoProfile, open-source software powered by a deep convolutional neural network for efficient behavior classification. DeepEthoProfile requires no spatial cues for either training or processing and is designed to perform reliably under real laboratory conditions, tolerating variations in lighting and cage bedding. For data collection, we introduce EthoProfiler, a mobile cage rack system capable of simultaneously recording up to 10 singly housed mice. We used 36 h of manually annotated video data, sampled in 5 min clips from a 48 h video database of 10 mice. This published dataset provides a reference that can facilitate further research. DeepEthoProfile achieved an overall classification accuracy of over 83%, comparable to human-level accuracy. The model also performed on par with other state-of-the-art solutions on another published dataset (Jhuang et al., 2010). Designed for long-term experiments, DeepEthoProfile is highly efficient, annotating nearly 2,000 frames per second, and can be customized for various research needs.
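A classifier like the one described emits one behavior label per video frame; a typical downstream step is collapsing that frame-level stream into contiguous behavior bouts for ethogram and time-budget analysis. The sketch below illustrates that post-processing step only; `frames_to_bouts` and the example labels are hypothetical and are not part of the actual DeepEthoProfile API.

```python
# Hypothetical post-processing sketch (not the DeepEthoProfile API):
# collapse a per-frame behavior label sequence into contiguous bouts,
# each reported as (behavior, start time in seconds, duration in seconds).

def frames_to_bouts(labels, fps=30):
    """Run-length encode per-frame labels into timed behavior bouts."""
    bouts = []
    start = 0
    for i in range(1, len(labels) + 1):
        # Close the current bout at the end of the sequence or on a label change.
        if i == len(labels) or labels[i] != labels[start]:
            bouts.append((labels[start], start / fps, (i - start) / fps))
            start = i
    return bouts

if __name__ == "__main__":
    # 90 frames at 30 fps: 1 s of walking, 1 s of grooming, 1 s of walking.
    seq = ["walk"] * 30 + ["groom"] * 30 + ["walk"] * 30
    for behavior, start_s, dur_s in frames_to_bouts(seq):
        print(f"{behavior}: start {start_s:.2f} s, duration {dur_s:.2f} s")
```

At the reported throughput of nearly 2,000 frames per second, 48 h of 30 fps video (about 5.2 million frames) could be annotated in well under an hour, after which a pass like this yields the bout-level summary.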
Journal Introduction
An open-access journal from the Society for Neuroscience, eNeuro publishes high-quality, broad-based, peer-reviewed research focused solely on the field of neuroscience. eNeuro embodies an emerging scientific vision that offers a new experience for authors and readers, all in support of the Society’s mission to advance understanding of the brain and nervous system.