Integrating manual preprocessing with automated feature extraction for improved rodent seizure classification

An Yu, Mannut Singh, Abhineet Pandey, Elizabeth Dybas, Aditya Agarwal, Yifan Kao, Guangliang Zhao, Tzu-Jen Kao, Xin Li, Damian S. Shin, Ming-Ching Chang

Epilepsy & Behavior, Volume 165, Article 110306. Published 2025-02-20. DOI: 10.1016/j.yebeh.2025.110306
https://www.sciencedirect.com/science/article/pii/S1525505025000459
Citations: 0
Abstract
Hypothesis/Objective
Rodent models of epilepsy can aid the search for more effective drug candidates and neuromodulatory therapies. However, preclinical screening of candidate anti-epileptic drugs (AEDs) using rodent models may require hours of video monitoring, and the subsequent data processing is time-consuming, subjective, and error-prone. This study aims to develop AI-enabled quantitative analysis of rodent behavior, including epilepsy stage classification.
Approach/Method
We leveraged deep learning and computer vision techniques to develop a semi-automatic pipeline for animal seizure detection and recognition. Our hybrid approach combines model-based and data-driven methods, but it depends on manually preprocessed and segmented video clips to enable the automatic classification of epilepsy stages.
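Skeleton-based classifiers in the PoseC3D family typically consume keypoints not as raw coordinates but as stacked Gaussian pseudo-heatmaps fed to a 3D CNN. The sketch below illustrates that input-preparation step only; the keypoint count, image size, and sigma are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def keypoints_to_heatmaps(keypoints, img_size=(64, 64), sigma=1.0):
    """Rasterize one frame of 2D keypoints, shape (K, 2) in pixel
    coordinates, into K Gaussian pseudo-heatmaps of shape (K, H, W)."""
    H, W = img_size
    K = keypoints.shape[0]
    ys, xs = np.mgrid[0:H, 0:W]          # pixel coordinate grids
    heatmaps = np.zeros((K, H, W), dtype=np.float32)
    for k, (x, y) in enumerate(keypoints):
        # Gaussian bump centered on the keypoint location.
        heatmaps[k] = np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2 * sigma ** 2))
    return heatmaps

def clip_to_volume(clip_keypoints, img_size=(64, 64)):
    """Stack per-frame heatmaps into a (T, K, H, W) volume,
    the kind of spatiotemporal input a 3D CNN classifies."""
    return np.stack([keypoints_to_heatmaps(f, img_size) for f in clip_keypoints])

# Example: a 16-frame clip with 8 rat skeleton keypoints (hypothetical layout).
rng = np.random.default_rng(0)
clip = rng.uniform(0, 64, size=(16, 8, 2))
volume = clip_to_volume(clip)
print(volume.shape)  # (16, 8, 64, 64)
```

In a full pipeline, each manually segmented video clip would be converted to such a volume and passed to the trained classifier to predict a seizure stage.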
Results
We collected two in-lab datasets comprising rat skeleton keypoints and seizure behavior videos. Applying PoseC3D to rat seizure stage classification on the collected data, the proposed method achieved accuracies between 64.7% and 90.3% when tested on four seizure phenotypes graded using the Racine classification scale.
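A per-phenotype accuracy range like 64.7–90.3% implies computing accuracy separately for each true class (here, each Racine-scale stage). A minimal sketch of that metric, with hypothetical stage labels and predictions not drawn from the paper:

```python
from collections import defaultdict

def per_class_accuracy(y_true, y_pred):
    """Per-class accuracy: for each true class label, the fraction of
    samples of that class the classifier labeled correctly."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for t, p in zip(y_true, y_pred):
        total[t] += 1
        correct[t] += int(t == p)
    return {c: correct[c] / total[c] for c in total}

# Hypothetical clip-level labels over four Racine-scale stages.
y_true = [1, 1, 2, 2, 3, 3, 5, 5]
y_pred = [1, 2, 2, 2, 3, 1, 5, 5]
print(per_class_accuracy(y_true, y_pred))  # {1: 0.5, 2: 1.0, 3: 0.5, 5: 1.0}
```

Reporting the minimum and maximum of these per-class values gives an accuracy range of the kind quoted above.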
Conclusions
This study demonstrates the feasibility of video-based seizure stage detection and classification for rodent models of temporal lobe seizures, using a semi-automatic pipeline that requires manual preprocessing of the data. Our method does not yet support fully automated seizure detection and has not been tested on unseen animals, which limits its generalizability and broader applicability. Despite these limitations, the approach underscores our ability to perform quantitative analysis of rodent behavior, which can also support other studies of animal behavior involving motor functions and, in the future, non-motor symptomatology such as mood disorders.
About the Journal
Epilepsy & Behavior is the fastest-growing international journal uniquely devoted to the rapid dissemination of the most current information available on the behavioral aspects of seizures and epilepsy.
Epilepsy & Behavior presents original peer-reviewed articles based on laboratory and clinical research. Topics are drawn from a variety of fields, including clinical neurology, neurosurgery, neuropsychiatry, neuropsychology, neurophysiology, neuropharmacology, and neuroimaging.
Since September 2012, Epilepsy & Behavior has no longer accepted Case Reports for publication. Authors submitting Case Reports will be offered a transfer to, or asked to resubmit to, its sister journal, Epilepsy & Behavior Case Reports.