Computationally Efficient Design of an LNA Input Matching Network Using Automatic Differentiation
Kiran A. Shila
IEEE Journal of Microwaves, vol. 5, no. 4, pp. 972-982, published 2025-06-03
DOI: 10.1109/JMW.2025.3568779
https://ieeexplore.ieee.org/document/11021605/
Citations: 0
Abstract
We present a method for the design of an LNA input matching network using automatic differentiation (AD), a technique made popular by machine learning. The input matching network consists of a non-uniform suspended stripline transformer, directly optimized with AD-provided gradients. Compared to the standard approach of finite differences, AD provides orders-of-magnitude faster optimization time for gradient-based solvers. This dramatic speedup reduces the iteration time during design and enables the exploration of more complex geometries. The LNA designed with this approach improves over a previous two-section uniform-line design, achieving an average noise temperature of (11.53 $\pm$ 0.42) K over the frequency range of 0.7 GHz to 2 GHz at room temperature. We optimized the geometry in under 5 s, $40\times$ faster than optimizing with finite differences.