6Former: Transformer-Based IPv6 Address Generation

Li-Yu Daisy Liu, Xing Li
{"title":"6Former: Transformer-Based IPv6 Address Generation","authors":"Li-Yu Daisy Liu, Xing Li","doi":"10.1109/ISCC58397.2023.10218311","DOIUrl":null,"url":null,"abstract":"Active network scanning in IPv6 is hindered by the vast address space of IPv6. Researchers have proposed various target generation methods, which are proved effective for reducing scanning space, to solve this problem. However, the current landscape of address generation methods is characterized by either low hit rates or limited applicability. To overcome these limitations, we propose 6Former, a novel target generation system based on Transformer. 6Former integrates a discriminator and a generator to improve hit rates and overcome usage scenarios limitations. Our experimental findings demonstrate that 6Former improves hit rates by a minimum of 38.6% over state-of-the-art generation approaches, while reducing time consumption by 31.6% in comparison to other language model-based methods.","PeriodicalId":265337,"journal":{"name":"2023 IEEE Symposium on Computers and Communications (ISCC)","volume":"9 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-07-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE Symposium on Computers and Communications (ISCC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISCC58397.2023.10218311","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Active network scanning in IPv6 is hindered by the vast address space of IPv6. To address this problem, researchers have proposed various target generation methods, which have proved effective at reducing the scanning space. However, the current landscape of address generation methods is characterized by either low hit rates or limited applicability. To overcome these limitations, we propose 6Former, a novel target generation system based on Transformer. 6Former integrates a discriminator and a generator to improve hit rates and overcome usage-scenario limitations. Our experimental findings demonstrate that 6Former improves hit rates by at least 38.6% over state-of-the-art generation approaches, while reducing time consumption by 31.6% in comparison to other language model-based methods.
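The abstract frames IPv6 target generation as a sequence-modeling problem handled by a Transformer-based generator. The paper's own preprocessing is not reproduced here, so the sketch below only illustrates one common way such systems represent an address for a language model: expanding it to its 32 hexadecimal nibbles and mapping each nibble to a token ID. The vocabulary, the start-of-sequence marker, and the function names (VOCAB, SOS_ID, address_to_tokens, tokens_to_address) are illustrative assumptions, not 6Former's published implementation.

```python
# Illustrative sketch only: a nibble-level tokenization that a Transformer-style
# IPv6 address generator *could* use. This is an assumption for exposition, not
# the tokenization described in the 6Former paper.
import ipaddress

# Hypothetical vocabulary: the 16 hex nibbles plus a start-of-sequence token.
VOCAB = {ch: i for i, ch in enumerate("0123456789abcdef")}
SOS_ID = len(VOCAB)  # assumed start-of-sequence marker


def address_to_tokens(addr: str) -> list[int]:
    """Expand an IPv6 address to its 32 hex nibbles and map them to token IDs."""
    nibbles = ipaddress.IPv6Address(addr).exploded.replace(":", "")
    return [SOS_ID] + [VOCAB[n] for n in nibbles]


def tokens_to_address(tokens: list[int]) -> str:
    """Invert the mapping: 32 nibble tokens back to a canonical IPv6 address."""
    inv = {i: ch for ch, i in VOCAB.items()}
    nibbles = [inv[t] for t in tokens if t != SOS_ID]
    grouped = ":".join("".join(nibbles[i:i + 4]) for i in range(0, 32, 4))
    return str(ipaddress.IPv6Address(grouped))


if __name__ == "__main__":
    seed = "2001:db8::1"
    ids = address_to_tokens(seed)
    print(ids)                     # 33 token IDs: SOS followed by 32 nibbles
    print(tokens_to_address(ids))  # round-trips back to 2001:db8::1
```

Representing addresses as fixed-length sequences over a 16-symbol alphabet is a natural fit for sequence models, which is presumably why language model-based target generation methods, the family 6Former is compared against, adopt variants of this framing; the exact vocabulary and sequence layout used by 6Former may differ.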