PMotion: an advanced markerless pose estimation approach based on novel deep learning framework used to reveal neurobehavior.

IF 3.7 · CAS Tier 3 (Medicine) · JCR Q2 (Engineering, Biomedical)
Xiaodong Lv, Haijie Liu, Luyao Chen, Chuankai Dai, Penghu Wei, Junwei Hao, Guoguang Zhao
DOI: 10.1088/1741-2552/acd603 · Journal of neural engineering · Published: 2023-07-06 · Journal Article · Citations: 0

Abstract

Objective. The evaluation of animals' motion behavior has played a vital role in neuromuscular biomedical research and clinical diagnostics, as it reflects the changes caused by neuromodulation or neural damage. Currently, existing animal pose estimation methods are unreliable, impractical, and inaccurate. Approach. Data augmentation (random scaling, random-standard-deviation Gaussian blur, random contrast, and random uniform color quantization) is adopted to augment the image dataset. For key-point recognition, we present a novel, efficient convolutional deep learning framework (PMotion), which combines a modified ConvNeXt using multi-kernel feature fusion with a self-defined stacked Hourglass block using the SiLU activation function. Main results. PMotion predicts the key points of unmarked animal body joints in real time with high spatial precision. Gait quantification (step length, step height, and joint angle) was performed to study lateral lower-limb movements of rats on a treadmill. Significance. The accuracy of PMotion on the rat joint dataset improved by 1.98, 1.46, and 0.55 pixels compared with DeepPoseKit, DeepLabCut, and stacked Hourglass, respectively. This approach may also be applied to neurobehavioral studies of freely moving animals in challenging environments (e.g., Drosophila melanogaster and openfield-Pranav) with high accuracy.
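The abstract names the four augmentation transforms but does not specify their parameters. A minimal NumPy sketch of what such a pipeline could look like (parameter ranges and function names here are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_scale(img, lo=0.8, hi=1.2):
    # Nearest-neighbor rescale by a random factor (assumed range).
    s = rng.uniform(lo, hi)
    h, w = img.shape[:2]
    ys = (np.arange(int(h * s)) / s).astype(int).clip(0, h - 1)
    xs = (np.arange(int(w * s)) / s).astype(int).clip(0, w - 1)
    return img[ys][:, xs]

def random_contrast(img, lo=0.7, hi=1.3):
    # Scale pixel deviations from the mean by a random factor.
    f = rng.uniform(lo, hi)
    return np.clip((img - img.mean()) * f + img.mean(), 0, 255).astype(img.dtype)

def random_gaussian_blur(img, max_sigma=2.0):
    # Separable Gaussian blur with a randomly drawn standard deviation.
    sigma = rng.uniform(0.1, max_sigma)
    r = int(3 * sigma) + 1
    k = np.exp(-0.5 * (np.arange(-r, r + 1) / sigma) ** 2)
    k /= k.sum()
    out = np.apply_along_axis(
        lambda m: np.convolve(m, k, mode="same"), 0, img.astype(float))
    out = np.apply_along_axis(
        lambda m: np.convolve(m, k, mode="same"), 1, out)
    return out.astype(img.dtype)

def random_quantize(img, lo=3, hi=7):
    # Uniform color quantization to a random number of levels per channel.
    levels = int(rng.integers(lo, hi))
    step = 256 // levels
    return (img // step) * step
```

Each transform preserves the label geometry up to a known mapping (scaling also rescales key-point coordinates), which is what makes these augmentations usable for pose-estimation training.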

Source journal: Journal of neural engineering (Engineering Technology – Engineering: Biomedical)
CiteScore: 7.80
Self-citation rate: 12.50%
Articles per year: 319
Review time: 4.2 months
Journal description: The goal of Journal of Neural Engineering (JNE) is to act as a forum for the interdisciplinary field of neural engineering where neuroscientists, neurobiologists and engineers can publish their work in one periodical that bridges the gap between neuroscience and engineering. The journal publishes articles in the field of neural engineering at the molecular, cellular and systems levels. The scope of the journal encompasses experimental, computational, theoretical, clinical and applied aspects of: Innovative neurotechnology; Brain-machine (computer) interface; Neural interfacing; Bioelectronic medicines; Neuromodulation; Neural prostheses; Neural control; Neuro-rehabilitation; Neurorobotics; Optical neural engineering; Neural circuits: artificial & biological; Neuromorphic engineering; Neural tissue regeneration; Neural signal processing; Theoretical and computational neuroscience; Systems neuroscience; Translational neuroscience; Neuroimaging.