BRAND: a platform for closed-loop experiments with deep network models

IF 3.7 | CAS Tier 3 (Medicine) | JCR Q2 | Engineering, Biomedical
Yahia H Ali, Kevin Bodkin, Mattia Rigotti-Thompson, Kushant Patel, Nicholas S Card, Bareesh Bhaduri, Samuel R Nason-Tomaszewski, Domenick M Mifsud, Xianda Hou, Claire Nicolas, Shane Allcroft, Leigh R Hochberg, Nicholas Au Yong, Sergey D Stavisky, Lee E Miller, David M Brandman, Chethan Pandarinath
{"title":"BRAND: a platform for closed-loop experiments with deep network models","authors":"Yahia H Ali, Kevin Bodkin, Mattia Rigotti-Thompson, Kushant Patel, Nicholas S Card, Bareesh Bhaduri, Samuel R Nason-Tomaszewski, Domenick M Mifsud, Xianda Hou, Claire Nicolas, Shane Allcroft, Leigh R Hochberg, Nicholas Au Yong, Sergey D Stavisky, Lee E Miller, David M Brandman, Chethan Pandarinath","doi":"10.1088/1741-2552/ad3b3a","DOIUrl":null,"url":null,"abstract":"<italic toggle=\"yes\">Objective.</italic> Artificial neural networks (ANNs) are state-of-the-art tools for modeling and decoding neural activity, but deploying them in closed-loop experiments with tight timing constraints is challenging due to their limited support in existing real-time frameworks. Researchers need a platform that fully supports high-level languages for running ANNs (e.g. Python and Julia) while maintaining support for languages that are critical for low-latency data acquisition and processing (e.g. C and C++). <italic toggle=\"yes\">Approach.</italic> To address these needs, we introduce the Backend for Realtime Asynchronous Neural Decoding (BRAND). BRAND comprises Linux processes, termed <italic toggle=\"yes\">nodes</italic>, which communicate with each other in a <italic toggle=\"yes\">graph</italic> via streams of data. Its asynchronous design allows for acquisition, control, and analysis to be executed in parallel on streams of data that may operate at different timescales. BRAND uses Redis, an in-memory database, to send data between nodes, which enables fast inter-process communication and supports 54 different programming languages. Thus, developers can easily deploy existing ANN models in BRAND with minimal implementation changes. <italic toggle=\"yes\">Main results.</italic> In our tests, BRAND achieved &lt;600 microsecond latency between processes when sending large quantities of data (1024 channels of 30 kHz neural data in 1 ms chunks). BRAND runs a brain-computer interface with a recurrent neural network (RNN) decoder with less than 8 ms of latency from neural data input to decoder prediction. In a real-world demonstration of the system, participant T11 in the BrainGate2 clinical trial (ClinicalTrials.gov Identifier: NCT00912041) performed a standard cursor control task, in which 30 kHz signal processing, RNN decoding, task control, and graphics were all executed in BRAND. This system also supports real-time inference with complex latent variable models like Latent Factor Analysis via Dynamical Systems. <italic toggle=\"yes\">Significance.</italic> By providing a framework that is fast, modular, and language-agnostic, BRAND lowers the barriers to integrating the latest tools in neuroscience and machine learning into closed-loop experiments.","PeriodicalId":16753,"journal":{"name":"Journal of neural engineering","volume":"48 1","pages":""},"PeriodicalIF":3.7000,"publicationDate":"2024-04-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of neural engineering","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1088/1741-2552/ad3b3a","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, BIOMEDICAL","Score":null,"Total":0}
Citations: 0

Abstract

Objective. Artificial neural networks (ANNs) are state-of-the-art tools for modeling and decoding neural activity, but deploying them in closed-loop experiments with tight timing constraints is challenging due to their limited support in existing real-time frameworks. Researchers need a platform that fully supports high-level languages for running ANNs (e.g. Python and Julia) while maintaining support for languages that are critical for low-latency data acquisition and processing (e.g. C and C++). Approach. To address these needs, we introduce the Backend for Realtime Asynchronous Neural Decoding (BRAND). BRAND comprises Linux processes, termed nodes, which communicate with each other in a graph via streams of data. Its asynchronous design allows for acquisition, control, and analysis to be executed in parallel on streams of data that may operate at different timescales. BRAND uses Redis, an in-memory database, to send data between nodes, which enables fast inter-process communication and supports 54 different programming languages. Thus, developers can easily deploy existing ANN models in BRAND with minimal implementation changes. Main results. In our tests, BRAND achieved <600 microsecond latency between processes when sending large quantities of data (1024 channels of 30 kHz neural data in 1 ms chunks). BRAND runs a brain-computer interface with a recurrent neural network (RNN) decoder with less than 8 ms of latency from neural data input to decoder prediction. In a real-world demonstration of the system, participant T11 in the BrainGate2 clinical trial (ClinicalTrials.gov Identifier: NCT00912041) performed a standard cursor control task, in which 30 kHz signal processing, RNN decoding, task control, and graphics were all executed in BRAND. This system also supports real-time inference with complex latent variable models like Latent Factor Analysis via Dynamical Systems. Significance. By providing a framework that is fast, modular, and language-agnostic, BRAND lowers the barriers to integrating the latest tools in neuroscience and machine learning into closed-loop experiments.
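The abstract describes BRAND's core pattern: independent Linux processes ("nodes") that exchange data through Redis streams so acquisition, decoding, and task control can run in parallel at their own rates. As a rough illustration of that pattern only (not BRAND's actual API), the sketch below shows a hypothetical Python node that blocks on an input stream of neural data chunks and publishes a prediction to an output stream. The stream names, field names, int16 packing, and the placeholder "decoder" are illustrative assumptions.

```python
# Minimal sketch of a BRAND-style node: read neural data chunks from one
# Redis stream, run a (placeholder) decoder, write predictions to another.
# Stream/field names and data format are hypothetical, not BRAND's API.
import numpy as np
import redis

r = redis.Redis(host="localhost", port=6379)

last_id = "$"  # only consume entries added after this node starts
while True:
    # Block until a new chunk of neural data arrives on the input stream
    reply = r.xread({"neural_data": last_id}, count=1, block=0)
    if not reply:
        continue
    _, entries = reply[0]
    entry_id, fields = entries[-1]
    last_id = entry_id

    # Interpret the raw bytes as samples (assumed int16 packing)
    samples = np.frombuffer(fields[b"samples"], dtype=np.int16)

    # Placeholder "decoder": in BRAND this could be any ANN (e.g. an RNN)
    prediction = float(samples.astype(np.float32).mean())

    # Publish downstream; other nodes consume this stream asynchronously
    r.xadd("decoder_output", {"ts": entry_id, "prediction": prediction})
```

Because each node touches only the streams it consumes and produces, many such processes can be composed into a graph, which is the asynchronous, language-agnostic design the abstract highlights.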
Source journal

Journal of Neural Engineering (Engineering, Biomedical)
CiteScore: 7.80
Self-citation rate: 12.50%
Articles per year: 319
Review time: 4.2 months
About the journal: The goal of Journal of Neural Engineering (JNE) is to act as a forum for the interdisciplinary field of neural engineering where neuroscientists, neurobiologists and engineers can publish their work in one periodical that bridges the gap between neuroscience and engineering. The journal publishes articles in the field of neural engineering at the molecular, cellular and systems levels. The scope of the journal encompasses experimental, computational, theoretical, clinical and applied aspects of: Innovative neurotechnology; Brain-machine (computer) interface; Neural interfacing; Bioelectronic medicines; Neuromodulation; Neural prostheses; Neural control; Neuro-rehabilitation; Neurorobotics; Optical neural engineering; Neural circuits: artificial & biological; Neuromorphic engineering; Neural tissue regeneration; Neural signal processing; Theoretical and computational neuroscience; Systems neuroscience; Translational neuroscience; Neuroimaging.