Jonathan M. Goodwill, N. Prasad, B. Hoskins, M. Daniels, A. Madhavan, L. Wan, T. Santos, M. Tran, J. Katine, P. Braganca, M. Stiles, J. McClelland
{"title":"二值神经网络在无源磁隧道结阵列上的实现","authors":"Jonathan M. Goodwill, N. Prasad, B. Hoskins, M. Daniels, A. Madhavan, L. Wan, T. Santos, M. Tran, J. Katine, P. Braganca, M. Stiles, J. McClelland","doi":"10.1109/TMRC56419.2022.9918590","DOIUrl":null,"url":null,"abstract":"Magnetic tunnel junctions (MTJs) provide an attractive platform for implementing neural networks because of their simplicity, non-volatility, and scalability. However, in hardware realizations, device variations, write errors, and parasitic resistance degrade performance. To quantify such effects, we perform inference experiments on a 2-layer perceptron constructed from a 15 x 15 passive array of MTJs, examining classification accuracy and write fidelity. Despite imperfections, we achieve median accuracy of 95.3% with proper tuning of network parameters. The success of this tuning process shows that new metrics are needed to characterize and optimize networks reproduced in mixed signal hardware.","PeriodicalId":432413,"journal":{"name":"2022 IEEE 33rd Magnetic Recording Conference (TMRC)","volume":"48 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Implementation of a Binary Neural Network on a Passive Array of Magnetic Tunnel Junctions\",\"authors\":\"Jonathan M. Goodwill, N. Prasad, B. Hoskins, M. Daniels, A. Madhavan, L. Wan, T. Santos, M. Tran, J. Katine, P. Braganca, M. Stiles, J. McClelland\",\"doi\":\"10.1109/TMRC56419.2022.9918590\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Magnetic tunnel junctions (MTJs) provide an attractive platform for implementing neural networks because of their simplicity, non-volatility, and scalability. However, in hardware realizations, device variations, write errors, and parasitic resistance degrade performance. To quantify such effects, we perform inference experiments on a 2-layer perceptron constructed from a 15 x 15 passive array of MTJs, examining classification accuracy and write fidelity. Despite imperfections, we achieve median accuracy of 95.3% with proper tuning of network parameters. The success of this tuning process shows that new metrics are needed to characterize and optimize networks reproduced in mixed signal hardware.\",\"PeriodicalId\":432413,\"journal\":{\"name\":\"2022 IEEE 33rd Magnetic Recording Conference (TMRC)\",\"volume\":\"48 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-12-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 IEEE 33rd Magnetic Recording Conference (TMRC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/TMRC56419.2022.9918590\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE 33rd Magnetic Recording Conference (TMRC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TMRC56419.2022.9918590","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4
Implementation of a Binary Neural Network on a Passive Array of Magnetic Tunnel Junctions
Magnetic tunnel junctions (MTJs) provide an attractive platform for implementing neural networks because of their simplicity, non-volatility, and scalability. However, in hardware realizations, device variations, write errors, and parasitic resistance degrade performance. To quantify such effects, we perform inference experiments on a 2-layer perceptron constructed from a 15 x 15 passive array of MTJs, examining classification accuracy and write fidelity. Despite imperfections, we achieve median accuracy of 95.3% with proper tuning of network parameters. The success of this tuning process shows that new metrics are needed to characterize and optimize networks reproduced in mixed signal hardware.
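To make the setup described in the abstract concrete, below is a minimal, hypothetical sketch (not the authors' code) of inference with a 2-layer binary perceptron whose {-1, +1} weights are mapped onto pairs of MTJ conductances in a passive crossbar, with device-to-device variation and write errors modeled as simple random perturbations. Parasitic resistance is not modeled here, and all names and numbers (`G_LOW`, `G_HIGH`, `DEVICE_SIGMA`, `P_WRITE_ERROR`, the layer sizes) are illustrative assumptions rather than parameters from the paper.

```python
"""Sketch: binary 2-layer perceptron inference on a noisy MTJ crossbar model."""
import numpy as np

rng = np.random.default_rng(0)

# Assumed conductance levels (siemens) for the low/high-resistance MTJ states.
G_LOW, G_HIGH = 1 / 12e3, 1 / 6e3
DEVICE_SIGMA = 0.05    # relative device-to-device conductance variation (assumed)
P_WRITE_ERROR = 0.01   # probability a cell is written to the wrong state (assumed)


def program_crossbar(w_binary):
    """Map binary weights {-1, +1} to a differential pair of conductance arrays.

    w = +1 -> (G_HIGH, G_LOW); w = -1 -> (G_LOW, G_HIGH). Write errors flip the
    intended state; device variation scales each programmed conductance.
    """
    flips = rng.random(w_binary.shape) < P_WRITE_ERROR
    w_written = np.where(flips, -w_binary, w_binary)
    g_plus = np.where(w_written > 0, G_HIGH, G_LOW)
    g_minus = np.where(w_written > 0, G_LOW, G_HIGH)
    noisy = lambda g: g * rng.normal(1.0, DEVICE_SIGMA, g.shape)
    return noisy(g_plus), noisy(g_minus)


def crossbar_matvec(g_plus, g_minus, v_in):
    """Analog multiply-accumulate: column currents sum V*G; the differential
    pair of columns encodes the signed dot product."""
    return v_in @ g_plus - v_in @ g_minus


def binary_mlp_inference(x, w1, w2):
    """2-layer perceptron with sign activations, evaluated on the noisy crossbars."""
    g1p, g1m = program_crossbar(w1)
    g2p, g2m = program_crossbar(w2)
    h = np.sign(crossbar_matvec(g1p, g1m, x))
    return np.argmax(crossbar_matvec(g2p, g2m, h), axis=1)


if __name__ == "__main__":
    # Toy dimensions chosen to fit a 15 x 15 array; classes and inputs are made up.
    w1 = rng.choice([-1.0, 1.0], size=(15, 15))
    w2 = rng.choice([-1.0, 1.0], size=(15, 4))
    x = rng.choice([-1.0, 1.0], size=(8, 15))  # 8 random binary input vectors
    print(binary_mlp_inference(x, w1, w2))
```

Running the script repeatedly with different seeds illustrates the kind of variability the experiment quantifies: the same trained binary weights can yield different outputs once write errors and conductance spread are folded into the analog matrix-vector products.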