KineWheel-DeepLabCut Automated Paw Annotation Using Alternating Stroboscopic UV and White Light Illumination

Björn Albrecht, Alexej Schatz, Katja Frei, York Winter

eNeuro (Society for Neuroscience), Journal Article, published 2024-08-29 (print 2024/8/1). DOI: 10.1523/ENEURO.0304-23.2024. Impact factor 2.7; JCR Q3, Neurosciences; Region 3 (Medicine). Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11363514/pdf/

Abstract:
Uncovering the relationships between neural circuits, behavior, and neural dysfunction may require rodent pose tracking. While open-source toolkits such as DeepLabCut have revolutionized markerless pose estimation using deep neural networks, the training process still requires human intervention for annotating key points of interest in video data. To further reduce human labor in neural network training, we developed a method that automatically generates annotated image datasets of rodent paw placement in a laboratory setting. It uses invisible fluorescent markers that become temporarily visible under UV light. Through stroboscopic alternating illumination, adjacent video frames captured at 720 Hz are illuminated by either UV or white light. After color filtering the UV-exposed video frames, the UV markings are identified and the paw locations are deterministically mapped. This paw information is then transferred to automatically annotate paw positions in the next white-light-exposed frame, which is later used for training the neural network. We demonstrate the effectiveness of our method using a KineWheel-DeepLabCut setup for the markerless tracking of the four paws of a harness-fixed mouse running on top of a transparent wheel with a mirror. Our automated approach, made available open source, achieves high-quality position annotations and significantly reduces the need for human involvement in the neural network training process, paving the way for more efficient and streamlined rodent pose tracking in neuroscience research.
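The annotation-transfer idea described above can be illustrated with a minimal sketch. The paper does not publish this exact code; the function names, channel choice, and threshold below are assumptions for illustration only. The sketch detects the centroid of bright fluorescent pixels in each UV-illuminated frame via a simple color threshold, then assigns those coordinates as the label for the immediately following white-light frame (at 720 Hz, adjacent frames are only ~1.4 ms apart, so the paw has barely moved):

```python
import numpy as np

def detect_uv_marker(frame, channel=2, threshold=200):
    """Find the centroid of fluorescent marker pixels in a UV-illuminated frame.

    Assumes the marker fluoresces brightly in one color channel (here the
    third channel, a hypothetical choice). Returns (x, y) or None.
    """
    mask = frame[:, :, channel] > threshold
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return (float(xs.mean()), float(ys.mean()))

def annotate_white_frames(frames):
    """Transfer UV-detected paw coordinates to the adjacent white-light frame.

    Frames are assumed to alternate UV, white, UV, white, ... as produced by
    the stroboscopic illumination. The UV frame at index i yields the
    annotation for the white-light frame at index i + 1, which would then
    serve as a training image for the neural network.
    """
    annotations = []
    for i in range(0, len(frames) - 1, 2):
        coords = detect_uv_marker(frames[i])
        if coords is not None:
            annotations.append((i + 1, coords))  # label the white-light frame
    return annotations
```

In the real setup the same mapping would be applied per paw (four separately colored markers), but the single-marker version above captures the deterministic UV-to-white-frame transfer that removes the human from the annotation loop.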
Journal introduction:
An open-access journal from the Society for Neuroscience, eNeuro publishes high-quality, broad-based, peer-reviewed research focused solely on the field of neuroscience. eNeuro embodies an emerging scientific vision that offers a new experience for authors and readers, all in support of the Society’s mission to advance understanding of the brain and nervous system.