Deep Bayesian active learning using in-memory computing hardware

IF 12.0 | Q1 | Computer Science, Interdisciplinary Applications
Yudeng Lin, Bin Gao, Jianshi Tang, Qingtian Zhang, He Qian, Huaqiang Wu
{"title":"使用内存计算硬件的深度贝叶斯主动学习。","authors":"Yudeng Lin, Bin Gao, Jianshi Tang, Qingtian Zhang, He Qian, Huaqiang Wu","doi":"10.1038/s43588-024-00744-y","DOIUrl":null,"url":null,"abstract":"Labeling data is a time-consuming, labor-intensive and costly procedure for many artificial intelligence tasks. Deep Bayesian active learning (DBAL) boosts labeling efficiency exponentially, substantially reducing costs. However, DBAL demands high-bandwidth data transfer and probabilistic computing, posing great challenges for conventional deterministic hardware. Here we propose a memristor stochastic gradient Langevin dynamics in situ learning method that uses the stochastic of memristor modulation to learn efficiency, enabling DBAL within the computation-in-memory (CIM) framework. To prove the feasibility and effectiveness of the proposed method, we implemented in-memory DBAL on a memristor-based stochastic CIM system and successfully demonstrated a robot’s skill learning task. The inherent stochastic characteristics of memristors allow a four-layer memristor Bayesian deep neural network to efficiently identify and learn from uncertain samples. Compared with cutting-edge conventional complementary metal-oxide-semiconductor-based hardware implementation, the stochastic CIM system achieves a remarkable 44% boost in speed and could conserve 153 times more energy. This study introduces an in-memory deep Bayesian active learning framework that uses the stochastic properties of memristors for in situ probabilistic computations. This framework can greatly improve the efficiency and speed of artificial intelligence learning tasks, as demonstrated with a robot skill-learning task.","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":"5 1","pages":"27-36"},"PeriodicalIF":12.0000,"publicationDate":"2024-12-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11774754/pdf/","citationCount":"0","resultStr":"{\"title\":\"Deep Bayesian active learning using in-memory computing hardware\",\"authors\":\"Yudeng Lin, Bin Gao, Jianshi Tang, Qingtian Zhang, He Qian, Huaqiang Wu\",\"doi\":\"10.1038/s43588-024-00744-y\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Labeling data is a time-consuming, labor-intensive and costly procedure for many artificial intelligence tasks. Deep Bayesian active learning (DBAL) boosts labeling efficiency exponentially, substantially reducing costs. However, DBAL demands high-bandwidth data transfer and probabilistic computing, posing great challenges for conventional deterministic hardware. Here we propose a memristor stochastic gradient Langevin dynamics in situ learning method that uses the stochastic of memristor modulation to learn efficiency, enabling DBAL within the computation-in-memory (CIM) framework. To prove the feasibility and effectiveness of the proposed method, we implemented in-memory DBAL on a memristor-based stochastic CIM system and successfully demonstrated a robot’s skill learning task. The inherent stochastic characteristics of memristors allow a four-layer memristor Bayesian deep neural network to efficiently identify and learn from uncertain samples. Compared with cutting-edge conventional complementary metal-oxide-semiconductor-based hardware implementation, the stochastic CIM system achieves a remarkable 44% boost in speed and could conserve 153 times more energy. 
This study introduces an in-memory deep Bayesian active learning framework that uses the stochastic properties of memristors for in situ probabilistic computations. This framework can greatly improve the efficiency and speed of artificial intelligence learning tasks, as demonstrated with a robot skill-learning task.\",\"PeriodicalId\":74246,\"journal\":{\"name\":\"Nature computational science\",\"volume\":\"5 1\",\"pages\":\"27-36\"},\"PeriodicalIF\":12.0000,\"publicationDate\":\"2024-12-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11774754/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Nature computational science\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.nature.com/articles/s43588-024-00744-y\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Nature computational science","FirstCategoryId":"1085","ListUrlMain":"https://www.nature.com/articles/s43588-024-00744-y","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0

Abstract


Labeling data is a time-consuming, labor-intensive and costly procedure for many artificial intelligence tasks. Deep Bayesian active learning (DBAL) boosts labeling efficiency exponentially, substantially reducing costs. However, DBAL demands high-bandwidth data transfer and probabilistic computing, posing great challenges for conventional deterministic hardware. Here we propose a memristor stochastic gradient Langevin dynamics in situ learning method that uses the stochasticity of memristor modulation to improve learning efficiency, enabling DBAL within the computation-in-memory (CIM) framework. To prove the feasibility and effectiveness of the proposed method, we implemented in-memory DBAL on a memristor-based stochastic CIM system and successfully demonstrated a robot skill-learning task. The inherent stochastic characteristics of memristors allow a four-layer memristor Bayesian deep neural network to efficiently identify and learn from uncertain samples. Compared with a cutting-edge conventional complementary metal-oxide-semiconductor-based hardware implementation, the stochastic CIM system achieves a remarkable 44% boost in speed and a 153-fold saving in energy. This study introduces an in-memory deep Bayesian active learning framework that uses the stochastic properties of memristors for in situ probabilistic computations. This framework can greatly improve the efficiency and speed of artificial intelligence learning tasks, as demonstrated with a robot skill-learning task.
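
The abstract combines two standard ingredients: stochastic gradient Langevin dynamics (SGLD), which turns gradient descent into posterior sampling by injecting noise into each weight update, and an active-learning loop that requests labels only for the pool samples whose posterior-averaged prediction is most uncertain. The Python sketch below is an illustrative reconstruction under that reading only, not the authors' implementation: all function names and the toy data are hypothetical, and the paper's hardware idea, supplying the injected noise physically through the stochastic conductance updates of the memristor array rather than from a software random-number generator, is only noted in a comment.

import numpy as np

# Minimal sketch of the two generic building blocks named in the abstract:
# (1) an SGLD weight update, (2) uncertainty-based sample selection for active learning.
# In the paper, the Gaussian noise added below is provided by the intrinsic
# stochasticity of memristor conductance updates inside the CIM array.

rng = np.random.default_rng(0)

def sgld_step(theta, grad_log_posterior, step_size):
    """One SGLD update: theta <- theta + (eps/2) * grad log p(theta|D) + N(0, eps)."""
    noise = rng.normal(0.0, np.sqrt(step_size), size=theta.shape)
    return theta + 0.5 * step_size * grad_log_posterior(theta) + noise

def predictive_entropy(prob_samples):
    """Entropy of the posterior-averaged prediction.

    prob_samples: array (n_mc, n_classes) of softmax outputs from different
    posterior weight samples (e.g. successive SGLD iterates).
    """
    p_mean = prob_samples.mean(axis=0)
    return -np.sum(p_mean * np.log(p_mean + 1e-12))

def select_most_uncertain(pool_prob_samples, k=1):
    """Return indices of the k pool points with the highest predictive entropy."""
    scores = np.array([predictive_entropy(s) for s in pool_prob_samples])
    return np.argsort(scores)[-k:]

# Toy usage: one SGLD step on a 2-parameter model with a standard-normal log posterior,
# then scoring three hypothetical pool points, each summarized by four MC forward passes.
theta = sgld_step(np.zeros(2), lambda t: -t, step_size=1e-2)
pool = rng.dirichlet(np.ones(3), size=(3, 4))   # shape (n_pool, n_mc, n_classes)
print(select_most_uncertain(pool, k=1))

In the CIM setting the abstract describes, both the matrix-vector products of the Bayesian network and the noisy weight updates would be carried out inside the memristor array, which is where the reported speed and energy advantages over a deterministic CMOS implementation would come from.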