{"title":"面向近似电路设计探索的领域特定生成预训练模型","authors":"Sipei Yi;Weichuan Zuo;Hongyi Wu;Ruicheng Dai;Weikang Qian;Jienan Chen","doi":"10.1109/JETCAS.2025.3568606","DOIUrl":null,"url":null,"abstract":"Automatically designing fast and low-cost digital circuits is challenging because of the discrete nature of circuits and the enormous design space, particularly in the exploration of approximate circuits. However, recent advances in generative artificial intelligence (GAI) have shed light to address these challenges. In this work, we present GPTAC, a domain-specific generative pre-trained (GPT) model customized for designing approximate circuits. By specifying the desired circuit accuracy or area, GPTAC can automatically generate an approximate circuit using its generative capabilities. We represent circuits using domain-specific language tokens, refined through a hardware description language keyword filter applied to gate-level code. This representation enables GPTAC to effectively learn approximate circuits from existing datasets by leveraging the GPT language model, as the training data can be directly derived from gate-level code. Additionally, by focusing on a domain-specific language, only a limited set of keywords is maintained, facilitating faster model convergence. To improve the success rate of the generated circuits, we introduce a circuit check rule that masks the GPTAC inference results when necessary. The experiment indicated that GPTAC is capable of producing approximate multipliers in under 15 seconds while utilizing merely 4GB of GPU memory, achieving a 10-40% reduction in area relative to the accuracy multiplier depending on various accuracy needs.","PeriodicalId":48827,"journal":{"name":"IEEE Journal on Emerging and Selected Topics in Circuits and Systems","volume":"15 2","pages":"349-360"},"PeriodicalIF":3.8000,"publicationDate":"2025-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"GPTAC: Domain-Specific Generative Pre-Trained Model for Approximate Circuit Design Exploration\",\"authors\":\"Sipei Yi;Weichuan Zuo;Hongyi Wu;Ruicheng Dai;Weikang Qian;Jienan Chen\",\"doi\":\"10.1109/JETCAS.2025.3568606\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Automatically designing fast and low-cost digital circuits is challenging because of the discrete nature of circuits and the enormous design space, particularly in the exploration of approximate circuits. However, recent advances in generative artificial intelligence (GAI) have shed light to address these challenges. In this work, we present GPTAC, a domain-specific generative pre-trained (GPT) model customized for designing approximate circuits. By specifying the desired circuit accuracy or area, GPTAC can automatically generate an approximate circuit using its generative capabilities. We represent circuits using domain-specific language tokens, refined through a hardware description language keyword filter applied to gate-level code. This representation enables GPTAC to effectively learn approximate circuits from existing datasets by leveraging the GPT language model, as the training data can be directly derived from gate-level code. Additionally, by focusing on a domain-specific language, only a limited set of keywords is maintained, facilitating faster model convergence. To improve the success rate of the generated circuits, we introduce a circuit check rule that masks the GPTAC inference results when necessary. 
The experiment indicated that GPTAC is capable of producing approximate multipliers in under 15 seconds while utilizing merely 4GB of GPU memory, achieving a 10-40% reduction in area relative to the accuracy multiplier depending on various accuracy needs.\",\"PeriodicalId\":48827,\"journal\":{\"name\":\"IEEE Journal on Emerging and Selected Topics in Circuits and Systems\",\"volume\":\"15 2\",\"pages\":\"349-360\"},\"PeriodicalIF\":3.8000,\"publicationDate\":\"2025-03-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Journal on Emerging and Selected Topics in Circuits and Systems\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10994814/\",\"RegionNum\":2,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Journal on Emerging and Selected Topics in Circuits and Systems","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10994814/","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Automatically designing fast and low-cost digital circuits is challenging because of the discrete nature of circuits and the enormous design space, particularly in the exploration of approximate circuits. However, recent advances in generative artificial intelligence (GAI) have shed light on addressing these challenges. In this work, we present GPTAC, a domain-specific generative pre-trained (GPT) model customized for designing approximate circuits. By specifying the desired circuit accuracy or area, GPTAC can automatically generate an approximate circuit using its generative capabilities. We represent circuits using domain-specific language tokens, refined through a hardware description language keyword filter applied to gate-level code. This representation enables GPTAC to effectively learn approximate circuits from existing datasets by leveraging the GPT language model, as the training data can be directly derived from gate-level code. Additionally, by focusing on a domain-specific language, only a limited set of keywords needs to be maintained, facilitating faster model convergence. To improve the success rate of the generated circuits, we introduce a circuit check rule that masks the GPTAC inference results when necessary. Experiments indicate that GPTAC can produce an approximate multiplier in under 15 seconds while using only 4 GB of GPU memory, achieving a 10-40% area reduction relative to the exact multiplier, depending on the required accuracy.
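The abstract describes two mechanisms that are easy to picture with a short sketch: a hardware description language keyword filter that reduces gate-level code to a small domain-specific vocabulary, and a circuit check rule that masks invalid tokens during generation. The Python sketch below is illustrative only; the keyword set, the net-renaming scheme, and the two validity checks are assumptions made for demonstration, not the authors' actual implementation.

import re

# Hypothetical keyword filter vocabulary: gate primitives, structural HDL
# keywords, and punctuation. Everything else is treated as a net name.
HDL_KEYWORDS = {"module", "endmodule", "input", "output", "wire",
                "and", "or", "nand", "nor", "xor", "xnor", "not",
                "(", ")", ",", ";"}

def tokenize_gate_level(verilog_src):
    """Tokenize gate-level Verilog, keeping HDL keywords and punctuation
    verbatim and renaming every other identifier to a canonical net token
    (n0, n1, ...), so the model only sees a small, closed vocabulary."""
    raw = re.findall(r"[A-Za-z_][A-Za-z0-9_]*|[();,]", verilog_src)
    net_ids, tokens = {}, []
    for t in raw:
        if t in HDL_KEYWORDS:
            tokens.append(t)
        else:
            net_ids.setdefault(t, "n{}".format(len(net_ids)))
            tokens.append(net_ids[t])
    return tokens

def mask_invalid_next_tokens(prefix, vocab):
    """A toy 'circuit check rule': return a boolean mask over the candidate
    vocabulary that forbids next tokens which would make the netlist
    structurally invalid (two illustrative checks only)."""
    open_parens = prefix.count("(") - prefix.count(")")
    mask = []
    for tok in vocab:
        ok = True
        if tok == ")" and open_parens == 0:         # nothing left to close
            ok = False
        if tok == "endmodule" and open_parens > 0:  # cannot end inside a gate
            ok = False
        mask.append(ok)
    return mask

if __name__ == "__main__":
    src = "module ha(a, b, s); input a, b; output s; xor g1(s, a, b); endmodule"
    toks = tokenize_gate_level(src)
    print(toks)
    print(mask_invalid_next_tokens(toks[:4], ["(", ")", "xor", "endmodule"]))

In this reading, the masked probabilities would simply be renormalized over the allowed tokens at each decoding step, which is one plausible way the reported circuit check rule could raise the success rate of generated netlists.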
Journal Introduction:
The IEEE Journal on Emerging and Selected Topics in Circuits and Systems is published quarterly and solicits, with particular emphasis on emerging areas, special issues on topics that cover the entire scope of the IEEE Circuits and Systems (CAS) Society, namely the theory, analysis, design, tools, and implementation of circuits and systems, spanning their theoretical foundations, applications, and architectures for signal and information processing.