EstraNet: An Efficient Shift-Invariant Transformer Network for Side-Channel Analysis

Suvadeep Hajra, Siddhartha Chowdhury, Debdeep Mukhopadhyay
{"title":"EstraNet: An Efficient Shift-Invariant Transformer Network for Side-Channel Analysis","authors":"Suvadeep Hajra, Siddhartha Chowdhury, Debdeep Mukhopadhyay","doi":"10.46586/tches.v2024.i1.336-374","DOIUrl":null,"url":null,"abstract":"Deep Learning (DL) based Side-Channel Analysis (SCA) has been extremely popular recently. DL-based SCA can easily break implementations protected by masking countermeasures. DL-based SCA has also been highly successful against implementations protected by various trace desynchronization-based countermeasures like random delay, clock jitter and shuffling. Over the years, many DL models have been explored to perform SCA. Recently, Transformer Network (TN) based model has also been introduced for SCA. Though the previously introduced TN-based model is successful against implementations jointly protected by masking and random delay countermeasures, it is not scalable to long traces (having a length greater than a few thousand) due to its quadratic time and memory complexity. This work proposes a novel shift-invariant TN-based model with linear time and memory complexity. he contributions of the work are two-fold. First, we introduce a novel TN-based model called EstraNet for SCA. EstraNet has linear time and memory complexity in trace length, significantly improving over the previously proposed TN-based model’s quadratic time and memory cost. EstraNet is also shift-invariant, making it highly effective against countermeasures like random delay and clock jitter. Secondly, we evaluated EstraNet on three SCA datasets of masked implementations with random delay and clock jitter effect. Our experimental results show that EstraNet significantly outperforms several benchmark models, demonstrating up to an order of magnitude reduction in the number of attack traces required to reach guessing entropy 1.","PeriodicalId":321490,"journal":{"name":"IACR Transactions on Cryptographic Hardware and Embedded Systems","volume":"41 12","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2023-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IACR Transactions on Cryptographic Hardware and Embedded Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.46586/tches.v2024.i1.336-374","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Deep Learning (DL) based Side-Channel Analysis (SCA) has been extremely popular recently. DL-based SCA can easily break implementations protected by masking countermeasures. DL-based SCA has also been highly successful against implementations protected by various trace desynchronization-based countermeasures like random delay, clock jitter and shuffling. Over the years, many DL models have been explored to perform SCA. Recently, a Transformer Network (TN) based model has also been introduced for SCA. Though the previously introduced TN-based model is successful against implementations jointly protected by masking and random delay countermeasures, it is not scalable to long traces (longer than a few thousand samples) due to its quadratic time and memory complexity. This work proposes a novel shift-invariant TN-based model with linear time and memory complexity. The contributions of the work are two-fold. First, we introduce a novel TN-based model called EstraNet for SCA. EstraNet has linear time and memory complexity in trace length, significantly improving over the previously proposed TN-based model's quadratic time and memory cost. EstraNet is also shift-invariant, making it highly effective against countermeasures like random delay and clock jitter. Second, we evaluated EstraNet on three SCA datasets of masked implementations with random delay and clock jitter effects. Our experimental results show that EstraNet significantly outperforms several benchmark models, demonstrating up to an order of magnitude reduction in the number of attack traces required to reach guessing entropy 1.
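
The abstract's central complexity claim is that EstraNet's attention scales linearly in trace length, whereas the earlier TN-based model scales quadratically. The sketch below is a generic illustration of that distinction using a kernel-feature-map ("linear") attention; it is not EstraNet's actual attention layer, and the function names and the feature map `phi` are assumptions chosen purely for illustration.

```python
# Illustrative sketch only: contrasts O(n^2) softmax attention with O(n)
# kernel-feature-map attention. Not EstraNet's actual layer.
import numpy as np

def softmax_attention(Q, K, V):
    # Materializes an (n x n) score matrix: quadratic time and memory in n.
    S = Q @ K.T / np.sqrt(Q.shape[-1])            # (n, n)
    A = np.exp(S - S.max(axis=-1, keepdims=True))
    A /= A.sum(axis=-1, keepdims=True)
    return A @ V                                  # (n, d_v)

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0.0) + 1e-6):
    # With scores of the form phi(Q) phi(K)^T, the sums over keys factor out,
    # so the (n x n) matrix is never formed: linear time and memory in n.
    Qf, Kf = phi(Q), phi(K)                       # (n, d)
    KV = Kf.T @ V                                 # (d, d_v), size independent of n
    Z = Qf @ Kf.sum(axis=0)                       # (n,) normalizers
    return (Qf @ KV) / Z[:, None]

# Example: a "trace" of 10,000 samples is easily handled by the linear variant,
# while the softmax variant would have to allocate a 10,000 x 10,000 matrix.
n, d = 10_000, 32
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
print(linear_attention(Q, K, V).shape)            # (10000, 32)
```

On the evaluation criterion: guessing entropy 1 means that, averaged over attack experiments, the correct key is ranked first among all key hypotheses after accumulating the model's per-trace scores; the reported order-of-magnitude reduction refers to how many attack traces are needed before that average rank reaches 1.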