{"title":"Looping LMS versus fast least squares algorithms: who gets there first?","authors":"M. Alberi, R. Casas, I. Fijalkow, C. R. Johnson","doi":"10.1109/SPAWC.1999.783077","DOIUrl":null,"url":null,"abstract":"This paper analytically compares, in terms of the convergence time, fast least squares estimation algorithms for channel identification and equalization to looping LMS (LLMS), a scheme which repeatedly applies the least mean squares algorithm to a block of received data. In this study, the convergence time is defined as the actual time (in seconds) taken by an algorithm to reach a desired performance. The old theme on LMS and fast least squares algorithms convergence is revisited from a novel perspective: the comparison is made from a complexity viewpoint, which not only takes into account the statistical properties of studied algorithms but also the number of floating point operations.","PeriodicalId":365086,"journal":{"name":"1999 2nd IEEE Workshop on Signal Processing Advances in Wireless Communications (Cat. No.99EX304)","volume":"28 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1999-05-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"1999 2nd IEEE Workshop on Signal Processing Advances in Wireless Communications (Cat. No.99EX304)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SPAWC.1999.783077","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4
Abstract
This paper analytically compares, in terms of convergence time, fast least squares estimation algorithms for channel identification and equalization with looping LMS (LLMS), a scheme that repeatedly applies the least mean squares algorithm to a block of received data. In this study, convergence time is defined as the actual time (in seconds) an algorithm takes to reach a desired level of performance. The long-standing question of LMS versus fast least squares convergence is revisited from a novel perspective: the comparison is made from a complexity viewpoint that accounts not only for the statistical properties of the studied algorithms but also for the number of floating point operations they require.
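To make the LLMS idea concrete, the sketch below shows one plausible reading of "repeatedly applying LMS to a block of received data": a standard LMS tap update is run several passes over the same block of samples. This is a minimal illustration only, not the authors' implementation; the function name, parameter choices, and regressor convention are assumptions.

```python
import numpy as np

def looping_lms(x, d, num_taps, mu, num_passes):
    """Looping LMS (LLMS) sketch: re-run LMS over the same data block.

    x          -- received data block (1-D array)
    d          -- desired/training sequence, same length as x
    num_taps   -- equalizer (filter) length
    mu         -- LMS step size
    num_passes -- number of passes ("loops") over the block

    All names and defaults are illustrative; the paper does not specify an API.
    """
    w = np.zeros(num_taps)                      # equalizer taps
    for _ in range(num_passes):                 # loop repeatedly over the same block
        for n in range(num_taps, len(x)):
            u = x[n - num_taps:n][::-1]         # regressor, most recent sample first
            e = d[n] - w @ u                    # a priori error
            w = w + mu * e * u                  # standard LMS update
    return w

# Illustrative use: equalize a toy FIR channel with a known training sequence.
rng = np.random.default_rng(0)
s = rng.choice([-1.0, 1.0], size=2000)          # BPSK-like training symbols
x = np.convolve(s, [1.0, 0.4, 0.2])[:len(s)]    # received block through a toy channel
w = looping_lms(x, s, num_taps=8, mu=0.01, num_passes=5)
```

Under the paper's complexity viewpoint, each extra pass of such a loop adds floating point operations, which is why the comparison against fast least squares algorithms is made in actual time rather than in iterations.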