SEEG4D: a tool for 4D visualization of stereoelectroencephalography data

James L. Evans, Matthew T. Bramlet, Connor Davey, Eliot Bethke, Aaron T. Anderson, Graham Huesmann, Yogatheesan Varatharajah, Andres Maldonado, Jennifer R. Amos, Bradley P. Sutton

Frontiers in Neuroinformatics · Journal Article · Published 2024-09-03
DOI: 10.3389/fninf.2024.1465231
Citations: 0
Abstract
Epilepsy is a prevalent and serious neurological condition that affects millions of people worldwide. Stereoelectroencephalography (sEEG) is used in cases of drug-resistant epilepsy to aid in surgical resection planning due to its high spatial resolution and its ability to visualize seizure onset zones. For accurate localization of the seizure focus, sEEG studies combine pre-implantation magnetic resonance imaging, post-implantation computed tomography to visualize the electrodes, and temporally recorded sEEG electrophysiological data. Many tools exist to assist in merging multimodal spatial information; however, few allow for an integrated spatiotemporal view of the electrical activity. In the current work, we present SEEG4D, an automated tool that merges spatial and temporal data into a complete, four-dimensional virtual reality (VR) object with temporal electrophysiology, enabling the simultaneous viewing of anatomy and seizure activity for seizure localization and presurgical planning. We developed an automated, containerized pipeline to segment tissues and electrode contacts. Contacts are aligned with the electrical activity and then animated based on relative power. SEEG4D generates models that can be loaded into VR platforms for viewing and planning with the surgical team. Automated contact segmentation locations are within 1 mm of those identified by trained raters, and the generated models show signal propagation along electrodes. Critically, the spatiotemporal information communicated through our models in a VR space has the potential to enhance sEEG presurgical planning.
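The abstract describes animating contacts based on relative power. As a minimal sketch only (the paper does not specify the computation; the windowing scheme and normalization here are assumptions), one way to derive a per-contact relative-power time series suitable for driving such an animation is to compute windowed signal power per contact and normalize each time step across contacts:

```python
import numpy as np

def relative_power(signals, fs, win_s=1.0):
    """Sliding-window power per contact, normalized across contacts.

    signals : array of shape (n_contacts, n_samples)
    fs      : sampling rate in Hz
    win_s   : window length in seconds (non-overlapping windows assumed)

    Returns an (n_contacts, n_windows) array in which each column sums
    to 1, so each contact's value is its share of total power at that
    time step -- a simple 'relative power' that could scale a contact's
    animated intensity.
    """
    n_contacts, n_samples = signals.shape
    win = int(win_s * fs)
    n_win = n_samples // win
    # Mean squared amplitude per non-overlapping window, per contact.
    power = (signals[:, :n_win * win]
             .reshape(n_contacts, n_win, win) ** 2).mean(axis=2)
    # Normalize each window across contacts so columns sum to 1.
    return power / power.sum(axis=0, keepdims=True)

# Illustrative use: two synthetic contacts, the second with a strong
# oscillatory burst in the final second; its relative power rises there.
rng = np.random.default_rng(0)
fs = 256
t = np.arange(4 * fs) / fs
quiet = rng.normal(0, 0.1, t.size)
burst = rng.normal(0, 0.1, t.size)
burst[3 * fs:] += 2.0 * np.sin(2 * np.pi * 10 * t[3 * fs:])
rel = relative_power(np.vstack([quiet, burst]), fs)
```

In a real pipeline the power would typically be band-limited (e.g., via a spectral estimate per band) before normalization, but the column-normalized form above is enough to map each contact to a 0–1 animation weight per frame.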