Journal of Neuroscience Methods: Latest Articles

Cross-subject emotion recognition in brain-computer interface based on frequency band attention graph convolutional adversarial neural networks
IF 2.7, CAS Zone 4 (Medicine)
Journal of Neuroscience Methods, Pub Date: 2024-09-03, DOI: 10.1016/j.jneumeth.2024.110276
Shinan Chen, Yuchen Wang, Xuefen Lin, Xiaoyong Sun, Weihua Li, Weifeng Ma
Background: Emotion is an important area in neuroscience. Cross-subject emotion recognition based on electroencephalogram (EEG) data is challenging because of physiological differences between subjects. The domain gap, i.e. the differing distributions of EEG data across subjects, has therefore attracted great attention in cross-subject emotion recognition.
Comparison with existing methods: This study focuses on narrowing the domain gap between subjects through emotional frequency bands and the relationship information between EEG channels. Emotional frequency band features represent the energy distribution of EEG data in different frequency ranges, while relationship information between EEG channels provides spatial distribution information about the EEG data.
New method: The paper proposes the Frequency Band Attention Graph convolutional Adversarial neural Network (FBAGAN). The model comprises three components: a feature extractor, a classifier, and a discriminator. The feature extractor consists of a frequency band attention layer and a graph convolutional neural network: the attention mechanism extracts frequency band information by assigning weights, and the graph convolution extracts relationship information between EEG channels by modeling the channel graph structure. The discriminator then helps minimize the gap in frequency and relationship information between the source and target domains, improving the model's ability to generalize.
Results: FBAGAN is extensively tested on the SEED, SEED-IV, and DEAP datasets. Accuracy and standard deviation are 88.17% and 4.88 on SEED, and 77.35% and 3.72 on SEED-IV. On DEAP, the model achieves 69.64% for arousal and 65.18% for valence. These results outperform most existing models.
Conclusions: The experiments indicate that FBAGAN effectively addresses the challenge of transfer across EEG channel and frequency band domains, leading to improved performance.
Citations: 0
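The abstract above outlines an architecture (band attention, channel-graph convolution, and a domain discriminator) without code. The snippet below is a minimal, hypothetical PyTorch sketch of that general idea, not the authors' implementation: the band count, channel count, feature size, placeholder adjacency matrix, and layer sizes are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class BandAttention(nn.Module):
    """Learn one soft weight per EEG frequency band and rescale the band features."""
    def __init__(self, n_bands):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(n_bands))

    def forward(self, x):                      # x: (batch, bands, channels, features)
        w = torch.softmax(self.logits, dim=0)  # attention weights over bands
        return x * w.view(1, -1, 1, 1)

class GraphConv(nn.Module):
    """Single graph-convolution step over EEG channels: relu(A_norm @ X @ W)."""
    def __init__(self, in_dim, out_dim, adj):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        a = adj + torch.eye(adj.size(0))       # add self-loops
        d = a.sum(1).pow(-0.5)
        self.register_buffer("a_norm", d[:, None] * a * d[None, :])  # symmetric normalisation

    def forward(self, x):                      # x: (batch, channels, in_dim)
        return torch.relu(self.a_norm @ self.lin(x))

class EmotionNet(nn.Module):
    """Band attention -> channel GCN -> emotion classifier and domain discriminator heads."""
    def __init__(self, n_bands=5, n_channels=62, n_feat=4, n_classes=3, hidden=32):
        super().__init__()
        adj = torch.ones(n_channels, n_channels)        # placeholder channel graph
        self.att = BandAttention(n_bands)
        self.gcn = GraphConv(n_bands * n_feat, hidden, adj)
        self.classifier = nn.Linear(n_channels * hidden, n_classes)
        self.discriminator = nn.Linear(n_channels * hidden, 2)  # source vs. target domain

    def forward(self, x):                      # x: (batch, bands, channels, features)
        x = self.att(x)
        x = x.permute(0, 2, 1, 3).flatten(2)   # (batch, channels, bands*features)
        feats = self.gcn(x).flatten(1)
        # In adversarial training the feature extractor is additionally trained to fool
        # the domain head so that source- and target-subject features align.
        return self.classifier(feats), self.discriminator(feats)

# Toy forward pass: 8 EEG segments, 5 bands, 62 channels, 4 features per band/channel.
model = EmotionNet()
emotion_logits, domain_logits = model(torch.randn(8, 5, 62, 4))
print(emotion_logits.shape, domain_logits.shape)  # torch.Size([8, 3]) torch.Size([8, 2])
```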
Reconstruction of natural images from human fMRI using a three-stage multi-level deep fusion model
IF 2.7, CAS Zone 4 (Medicine)
Journal of Neuroscience Methods, Pub Date: 2024-08-31, DOI: 10.1016/j.jneumeth.2024.110269 (open access)
Lu Meng, Zhenxuan Tang, Yangqian Liu
Background: Image reconstruction is a critical task in brain decoding research, primarily utilizing functional magnetic resonance imaging (fMRI) data. However, owing to challenges such as the limited number of samples in fMRI datasets, reconstruction quality often remains poor.
New method: We proposed a three-stage multi-level deep fusion model (TS-ML-DFM). The model employs a three-stage training process built from image encoders, generators, discriminators, and fMRI encoders. The method incorporates distinct supplementary features derived separately from depth images and original images, and integrates several further components, including a random shift module, a dual attention module, and a multi-level feature fusion module.
Results: In both qualitative and quantitative comparisons on the Horikawa17 and VanGerven10 datasets, the method exhibited excellent performance.
Comparison with existing methods: On the primary Horikawa17 dataset, the method was compared with other leading methods on average hash value, histogram similarity, mutual information, structural similarity accuracy, AlexNet(2), AlexNet(5), and pairwise human perceptual similarity accuracy. Relative to the second-ranked result in each metric, the proposed method achieved improvements of 0.99%, 3.62%, 3.73%, 2.45%, 3.51%, 0.62%, and 1.03%, respectively. On the SwAV top-level semantic metric, an improvement of 10.53% was achieved over the second-ranked pixel-level reconstruction method.
Conclusions: Applied to decoding brain visual patterns from fMRI data, TS-ML-DFM outperformed previous algorithms, facilitating further advances in this field.
Citations: 0
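Several of the evaluation metrics listed above are standard image-comparison measures. As a small, hypothetical illustration (not the paper's evaluation code), the snippet below computes the structural similarity index, which underlies the structural-similarity-based metric named in the abstract, between a stand-in ground-truth image and a noisy stand-in reconstruction using scikit-image; the random arrays exist only for the example.

```python
import numpy as np
from skimage.metrics import structural_similarity as ssim

rng = np.random.default_rng(1)
truth = rng.random((64, 64))                                        # pretend ground-truth image
recon = np.clip(truth + 0.1 * rng.standard_normal((64, 64)), 0, 1)  # pretend reconstruction
print(f"SSIM between the two images: {ssim(truth, recon, data_range=1.0):.3f}")
```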
Novel comprehensive analysis of skilled reaching and grasping behavior in adult rats
IF 2.7, CAS Zone 4 (Medicine)
Journal of Neuroscience Methods, Pub Date: 2024-08-31, DOI: 10.1016/j.jneumeth.2024.110271
Pawan Sharma, Yixuan Du, Kripi Singapuri, Debbi Moalemi Delafraz, Prithvi K. Shah
Background: Reaching and grasping (R&G) in rats is commonly used as an outcome measure to investigate the effectiveness of rehabilitation or treatment strategies for recovering forelimb function after spinal cord injury. Kinematic analysis has been limited to wrist and digit movements; kinematic profiles of the more proximal body segments, which play an equally crucial role in successfully executing the task, remain unexplored. Additionally, understanding of the activity of different forelimb muscles, their interactions, and their correlation with the kinematics of R&G movement is scarce.
New method: In this work, novel methodologies to comprehensively assess and quantify the 3D kinematics of the proximal and distal forelimb joints, along with the associated muscle activity, during R&G movements in adult rats are developed and discussed.
Results: Our data show that the different phases of R&G identified using the novel kinematic and EMG-based approach correlate with the well-established descriptors of R&G stages derived from the Whishaw scoring system. Additionally, the developed methodology allows describing the temporal activity of individual muscles and the associated mechanical and physiological properties during different phases of the motor task.
Comparison with existing method(s): R&G phases and their sub-components are identified and quantified using the developed kinematic and EMG-based approach. Importantly, the identified R&G phases closely match the well-established qualitative descriptors of the R&G task proposed by Whishaw and colleagues.
Conclusions: The present work provides an in-depth objective analysis of the kinematics and EMG activity of R&G behavior, paving the way to a standardized approach for assessing this critical rodent motor function in future studies.
Citations: 0
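The abstract above describes segmenting R&G phases from combined kinematics and EMG. As background only (the paper's own pipeline is not reproduced here), the sketch below shows one conventional EMG preprocessing step often used for such phase and onset analysis: band-pass filtering, rectification, low-pass enveloping, and a baseline-plus-k·SD onset threshold. The filter orders, cutoffs, threshold factor, and synthetic signal are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def emg_envelope(emg, fs, band=(20.0, 450.0), lp_cut=6.0):
    """Band-pass, rectify, then low-pass an EMG trace to obtain its linear envelope."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    rectified = np.abs(filtfilt(b, a, emg))
    b, a = butter(4, lp_cut / (fs / 2), btype="low")
    return filtfilt(b, a, rectified)

def burst_onsets(envelope, fs, k=3.0, baseline_s=0.5):
    """Flag envelope samples exceeding baseline mean + k*SD (a common onset rule)."""
    base = envelope[: int(baseline_s * fs)]
    thresh = base.mean() + k * base.std()
    active = envelope > thresh
    onsets = np.flatnonzero(np.diff(active.astype(int)) == 1) + 1
    return onsets / fs, thresh

# Synthetic example: 2 s of noise with a burst between 1.0 and 1.3 s, sampled at 2 kHz.
fs = 2000
t = np.arange(0, 2, 1 / fs)
emg = 0.05 * np.random.randn(t.size)
burst = (t > 1.0) & (t < 1.3)
emg[burst] += 0.5 * np.random.randn(burst.sum())
env = emg_envelope(emg, fs)
onset_times, _ = burst_onsets(env, fs)
print("detected burst onset(s) near:", np.round(onset_times, 3), "s")
```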
High quality, high throughput, and low-cost simultaneous video recording of 60 animals in operant chambers using PiRATeMC
IF 2.7, CAS Zone 4 (Medicine)
Journal of Neuroscience Methods, Pub Date: 2024-08-31, DOI: 10.1016/j.jneumeth.2024.110270 (open access)
Jarryd Ramborger, Sumay Kalra, Joseph Mosquera, Alexander C.W. Smith, Olivier George
Background: Raspberry Pi-based recording devices for video analysis of drug self-administration studies have shown promise in terms of affordability, customizability, and capacity to extract in-depth behavioral patterns. Yet most video recording systems are limited to a few cameras, making them incompatible with large-scale studies.
New method: We expanded the PiRATeMC (Pi-based Remote Acquisition Technology for Motion Capture) recording system by increasing its scale, modifying its code, and adding equipment to accommodate large-scale video acquisition, accompanied by data on throughput capabilities, video fidelity, synchronicity of devices, and comparisons between Raspberry Pi 3B+ and 4B models.
Results: Using PiRATeMC default recording parameters resulted in minimal storage (~350 MB/h), high throughput (< ~120 seconds/Pi), high video fidelity, and synchronicity within ~0.02 seconds, affording the ability to simultaneously record 60 animals in individual self-administration chambers for various session lengths at a fraction of commercial costs. No consequential differences were found between Raspberry Pi models.
Comparison with existing method(s): This system allows acquisition of an order of magnitude more simultaneous video data than other recording systems, with lower storage needs and lower costs. Additionally, we report in-depth quantitative assessments of throughput, fidelity, and synchronicity, displaying real-time system capabilities.
Conclusions: The system presented can be fully installed in a month by a single technician and provides a scalable, low-cost, and quality-assured procedure with a high degree of customization and synchronicity between recording devices, capable of recording a large number of subjects and timeframes with high turnover in a variety of species and settings.
Citations: 0
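Using the figures reported in the abstract above (~350 MB/h per camera, 60 cameras), a quick back-of-envelope calculation gives a sense of the storage budget for a recording session; the 6-hour session length below is an arbitrary example, not a value from the paper.

```python
# Storage estimate from the abstract's reported figures.
MB_PER_HOUR_PER_CAMERA = 350   # ~350 MB/h per camera, as reported
N_CAMERAS = 60                 # 60 animals recorded simultaneously
SESSION_HOURS = 6              # assumed session length, for illustration only

per_camera_gb = MB_PER_HOUR_PER_CAMERA * SESSION_HOURS / 1024
total_gb = per_camera_gb * N_CAMERAS
print(f"per camera: {per_camera_gb:.1f} GB; whole rig: {total_gb:.1f} GB per {SESSION_HOURS} h session")
# -> per camera: 2.1 GB; whole rig: 123.0 GB per 6 h session
```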
Pushing the boundaries of brain-computer interfacing (BCI) and neuron-electronics
IF 2.7, CAS Zone 4 (Medicine)
Journal of Neuroscience Methods, Pub Date: 2024-08-30, DOI: 10.1016/j.jneumeth.2024.110274
Mohammed Seghir Guellil, Fatima Kies, Emad Kamil Hussein, Mohammad Shabaz, Robert E. Hampson
Citations: 0
Small animal brain surgery with neither a brain atlas nor a stereotaxic frame
IF 2.7, CAS Zone 4 (Medicine)
Journal of Neuroscience Methods, Pub Date: 2024-08-28, DOI: 10.1016/j.jneumeth.2024.110272
Shaked Ron, Hadar Beeri, Ori Shinover, Noam M. Tur, Jonathan Brokman, Ben Engelhard, Yoram Gutfreund
Background: Stereotaxic surgery is a cornerstone of brain research for the precise positioning of electrodes and probes, but its application is limited to species with available brain atlases and tailored stereotaxic frames. Addressing this limitation, we introduce an alternative technique for small animal brain surgery that requires neither an aligned brain atlas nor a stereotaxic frame.
New method: The new method requires an ex-vivo high-contrast MRI brain scan of one specimen and access to a micro-CT scanner. The process involves attaching miniature markers to the skull, followed by CT scanning of the head. The MRI and CT images are then co-registered using standard image-processing software, and the targets for brain recordings are marked in the MRI image. During surgery, the animal's head is stabilized in any convenient orientation, and the probe's 3D position and angle are tracked using a multi-camera system. We have developed software that uses the on-skull markers as fiducial points to align the CT/MRI 3D model with the surgical positioning system and, in turn, instructs the surgeon how to move the probe to reach the targets within the brain.
Results: Our technique allows the execution of insertion tracks connecting two points in the brain. We successfully applied this method for Neuropixels probe positioning in owls, quails, and mice, demonstrating its versatility.
Comparison with existing methods: We present an alternative to traditional stereotaxic brain surgery that does not require established stereotaxic tools. This method is therefore especially advantageous for research in non-standard and novel animal models.
Citations: 0
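The method above hinges on aligning coordinate frames (the CT/MRI model and the camera-based positioning system) through paired fiducial markers. The authors' software is not reproduced here; the sketch below only illustrates the standard least-squares (Kabsch/SVD) computation of a rigid transform from paired fiducial coordinates, with made-up marker positions.

```python
import numpy as np

def rigid_transform(src, dst):
    """Best-fit rotation R and translation t mapping src points onto dst
    (least-squares Kabsch/SVD method). src, dst: (N, 3) paired fiducials."""
    src_c, dst_c = src.mean(0), dst.mean(0)
    H = (src - src_c).T @ (dst - dst_c)                 # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Toy check: four fiducial markers, a known 30-degree rotation about z plus a shift.
rng = np.random.default_rng(0)
markers_ct = rng.uniform(-10, 10, size=(4, 3))          # marker coordinates in the CT frame
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
markers_cam = markers_ct @ R_true.T + np.array([5.0, -2.0, 3.0])  # same markers in the camera frame
R, t = rigid_transform(markers_ct, markers_cam)
fre = np.linalg.norm(markers_ct @ R.T + t - markers_cam, axis=1).mean()
print("fiducial registration error:", round(fre, 6))    # ~0 for noiseless points
```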
NeuroQuantify – An image analysis software for detection and quantification of neuron cells and neurite lengths using deep learning
IF 2.7, CAS Zone 4 (Medicine)
Journal of Neuroscience Methods, Pub Date: 2024-08-27, DOI: 10.1016/j.jneumeth.2024.110273 (open access)
Ka My Dang, Yi Jia Zhang, Tianchen Zhang, Chao Wang, Anton Sinner, Piero Coronica, Joyce K.S. Poon
Background: The segmentation of cells and neurites in microscopy images of neuronal networks provides valuable quantitative information about neuron growth and neuronal differentiation, including the number of cells, the number of neurites, neurite length, and neurite orientation. This information is essential for assessing the development of neuronal networks in response to extracellular stimuli, which is useful for studying neuronal structures, for example in research on neurodegenerative diseases and pharmaceuticals.
New method: We have developed NeuroQuantify, an open-source software that uses deep learning to efficiently and quickly segment cells and neurites in phase-contrast microscopy images.
Results: NeuroQuantify offers several key features: (i) automatic detection of cells and neurites; (ii) post-processing of the images for quantitative neurite length measurement based on segmentation of phase-contrast microscopy images; and (iii) identification of neurite orientations.
Comparison with existing methods: NeuroQuantify overcomes some of the limitations of existing methods in the automatic and accurate analysis of neuronal structures. It has been developed for phase-contrast images rather than fluorescence images. In addition to the typical functionality of cell counting, NeuroQuantify also detects and counts neurites, measures neurite lengths, and produces the neurite orientation distribution.
Conclusions: We offer a valuable tool to assess network development rapidly and effectively. The user-friendly NeuroQuantify software can be installed and freely downloaded from GitHub at https://github.com/StanleyZ0528/neural-image-segmentation.
Citations: 0
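NeuroQuantify's own pipeline is available at the GitHub link above. As a simplified, hypothetical illustration of the post-processing idea of turning a segmentation mask into a neurite length, the sketch below skeletonizes a binary mask with scikit-image and scales the skeleton pixel count by an assumed pixel size; the mask and the 0.65 µm/px value are invented for the example and are not taken from the paper.

```python
import numpy as np
from skimage.morphology import skeletonize

def neurite_length_um(mask, um_per_px):
    """Rough neurite length: skeletonize the binary mask and count skeleton pixels.
    Diagonal steps are counted as 1 px here, so the value slightly underestimates length."""
    skeleton = skeletonize(mask.astype(bool))
    return skeleton.sum() * um_per_px

# Synthetic example: a 200x200 mask containing one 3-px-wide, 150-px-long "neurite".
mask = np.zeros((200, 200), dtype=bool)
mask[100:103, 25:175] = True
print(f"estimated length: {neurite_length_um(mask, um_per_px=0.65):.1f} um")  # roughly 150 px * 0.65
```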
Direct dorsal root ganglia (DRG) injection in mice for analysis of adeno-associated viral (AAV) gene transfer to peripheral somatosensory neurons
IF 2.7, CAS Zone 4 (Medicine)
Journal of Neuroscience Methods, Pub Date: 2024-08-25, DOI: 10.1016/j.jneumeth.2024.110268
Michael O'Donnell, Arjun Fontaine, John Caldwell, Richard Weir
Background: Delivering optogenetic genes to the peripheral sensory nervous system provides an efficient approach to studying and treating neurological disorders and offers the potential to reintroduce sensory feedback to prosthesis users and those who have incurred other neuropathies. Adeno-associated viral (AAV) vectors are a common method of gene delivery owing to their efficiency of gene transfer and minimal toxicity. AAVs can be designed to target specific tissues, with transduction efficacy determined by the combination of serotype and genetic promoter selection, as well as the location of vector administration. The dorsal root ganglia (DRGs) are collections of cell bodies of sensory neurons that project from the periphery to the central nervous system (CNS). The anatomical make-up of DRGs makes them an ideal injection location for targeting the somatosensory neurons of the peripheral nervous system (PNS).
Comparison with existing methods: Previous studies have detailed methods of direct DRG injection in rats and dorsal horn injection in mice; however, owing to the size and anatomical differences between rats and strains of mice, there is only one other published method for AAV injection into murine DRGs for transduction of peripheral sensory neurons, and it uses a different methodology.
New method/Results: Here, we detail the materials and methods required to inject AAVs into the L3 and L4 DRGs of mice, as well as how to harvest the sciatic nerve and L3/L4 DRGs for analysis. This methodology results in optogenetic expression in both the L3/L4 DRGs and the sciatic nerve and can be adapted to inject any DRG.
Citations: 0
Effects of transcranial magnetic stimulation on axonal regeneration in the corticospinal tract of female rats with spinal cord injury
IF 2.7, CAS Zone 4 (Medicine)
Journal of Neuroscience Methods, Pub Date: 2024-08-25, DOI: 10.1016/j.jneumeth.2024.110267
Mengxuan Hu, Zewen Tang, Huijun Li, Qian Lei, Qingqin Xu, Junhong Su, Ying Huang, Shi Chen, Hemu Chen
Background: This study investigates the potential of transcranial magnetic stimulation (TMS) to enhance spinal cord axon regeneration by modulating corticospinal pathways and to improve motor nerve function recovery in rats with spinal cord injury (SCI).
New method: TMS is a non-invasive neuromodulation technique that generates a magnetic field to activate neurons in the brain, leading to depolarization and modulation of cortical activity. Initially used for brain physiology research, TMS has evolved into a diagnostic and prognostic tool in clinical settings, with increasing interest in its therapeutic applications. However, its potential for treating motor dysfunction in SCI has been underexplored.
Results: The TMS intervention group exhibited significant improvements compared to the control group across behavioral assessments, neurophysiological measurements, pathological analysis, and immunological markers.
Comparison with existing methods: Unlike most studies, which focus on localized spinal cord injury or muscle treatments, this study leverages the non-invasive, painless, and highly penetrating nature of TMS to target the corticospinal tracts, exploring its therapeutic potential for SCI.
Conclusions: TMS enhances motor function recovery in rats with SCI by restoring corticospinal pathway integrity and promoting axonal regeneration. These findings highlight TMS as a promising therapeutic option for SCI patients, who currently have limited treatment alternatives.
Citations: 0
Computer-controlled electrical stimulation of facial muscles by facial neuromuscular electrical stimulation (fNMES): Hardware and software solutions
IF 2.7, CAS Zone 4 (Medicine)
Journal of Neuroscience Methods, Pub Date: 2024-08-24, DOI: 10.1016/j.jneumeth.2024.110266 (open access)
J. Baker, T. Efthimiou, A. Gartus, A. Elsenaar, M. Mehu, S. Korb
Background: Computer-controlled electrical stimulation of facial muscles is a promising method for studying facial feedback effects, though little guidance is available for new adopters.
New method: Facial neuromuscular electrical stimulation (fNMES) offers a spatially and temporally precise means of manipulating facial muscles during experiments and can be combined with EEG to study the neurological basis of facial feedback effects. Precise delivery of stimulation requires hardware and software solutions that integrate the stimulators with a stimulus-presenting computer. We provide open-source hardware schematics and relevant computer code to achieve this integration and so facilitate the use of fNMES in the laboratory.
Results: Hardware schematics are provided for building a bespoke control module, which allows researchers to finely control stimulator output while participants complete computer tasks. In addition, we published code that new adopters of fNMES can use within their experiments to control the module and send event triggers to another computer. These hardware and software solutions were successfully used to investigate the effects of facial muscle activation on felt and perceived emotion. We summarise these findings and discuss the integration of fNMES with EEG and peripheral physiological measures.
Comparison with existing methods: Our inexpensive hardware solution allows fNMES parameters to be computer-controlled and thus allows facial muscles to be stimulated with high precision. This opens up new possibilities for investigating, for example, facial feedback effects.
Conclusions: We provide tools and guidance to build a control module for precisely delivering electrical stimulation to facial muscles from a stimulus computer (while recording EEG or other peripheral physiology).
Citations: 0
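The published hardware schematics and code define how the authors' control module is actually driven; none of that is reproduced here. The sketch below only shows the generic pattern of commanding such a module from the stimulus computer over a serial link with pyserial: the port name, baud rate, and command strings are all hypothetical, and running it requires a device attached to the port.

```python
import time
import serial  # pyserial

# Hypothetical serial protocol for illustration; substitute the commands defined by
# the module you actually build (see the paper's open-source schematics and code).
with serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=1) as port:
    port.write(b"SET_INTENSITY 4\n")   # hypothetical command: set stimulation level
    time.sleep(0.5)                     # wait for the stimulus-relevant event/frame
    port.write(b"PULSE 500\n")          # hypothetical command: deliver a 500 ms train
```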