{"title":"关于正交化EM与近端梯度下降的等价性的注记。","authors":"James Yang, Trevor Hastie","doi":"10.1080/00401706.2024.2430204","DOIUrl":null,"url":null,"abstract":"<p><p>Xiong et al. (2016) develop a method called orthogonalizing EM (OEM) to solve penalized regression problems for tall data. While OEM is developed in the context of the EM algorithm, we show that it is, in fact, an instance of proximal gradient descent, a popular first-order convex optimization algorithm.</p>","PeriodicalId":22208,"journal":{"name":"Technometrics","volume":"67 2","pages":"267-269"},"PeriodicalIF":2.3000,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12140180/pdf/","citationCount":"0","resultStr":"{\"title\":\"Note on the Equivalence of Orthogonalizing EM and Proximal Gradient Descent.\",\"authors\":\"James Yang, Trevor Hastie\",\"doi\":\"10.1080/00401706.2024.2430204\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Xiong et al. (2016) develop a method called orthogonalizing EM (OEM) to solve penalized regression problems for tall data. While OEM is developed in the context of the EM algorithm, we show that it is, in fact, an instance of proximal gradient descent, a popular first-order convex optimization algorithm.</p>\",\"PeriodicalId\":22208,\"journal\":{\"name\":\"Technometrics\",\"volume\":\"67 2\",\"pages\":\"267-269\"},\"PeriodicalIF\":2.3000,\"publicationDate\":\"2025-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12140180/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Technometrics\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1080/00401706.2024.2430204\",\"RegionNum\":3,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2024/12/23 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"Q1\",\"JCRName\":\"STATISTICS & PROBABILITY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Technometrics","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1080/00401706.2024.2430204","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/12/23 0:00:00","PubModel":"Epub","JCR":"Q1","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Note on the Equivalence of Orthogonalizing EM and Proximal Gradient Descent.
Xiong et al. (2016) develop a method called orthogonalizing EM (OEM) to solve penalized regression problems for tall data. While OEM is developed in the context of the EM algorithm, we show that it is, in fact, an instance of proximal gradient descent, a popular first-order convex optimization algorithm.
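To make the connection concrete, the sketch below shows a generic proximal gradient descent (ISTA) iteration for a lasso-penalized least-squares problem. It is a minimal illustration of the optimization scheme the note identifies OEM with, not the authors' OEM implementation; the function names, step-size choice, and iteration count are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (element-wise soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_proximal_gradient(X, y, lam, n_iter=500):
    """Proximal gradient descent (ISTA) for
       minimize_beta (1/(2n)) * ||y - X beta||^2 + lam * ||beta||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    # Fixed step size 1/L, where L is the largest eigenvalue of X^T X / n
    # (the Lipschitz constant of the gradient of the smooth term).
    L = np.linalg.eigvalsh(X.T @ X / n)[-1]
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n                   # gradient of the smooth loss
        beta = soft_threshold(beta - grad / L, lam / L)   # proximal step
    return beta

# Illustrative usage on simulated data (dimensions are arbitrary).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = X @ np.concatenate([np.ones(3), np.zeros(7)]) + 0.1 * rng.standard_normal(200)
print(lasso_proximal_gradient(X, y, lam=0.1))
```

Each iteration takes a gradient step on the smooth least-squares loss and then applies the proximal operator of the l1 penalty (soft-thresholding), with a constant step size governed by the largest eigenvalue of X^T X / n.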
Journal description:
Technometrics is a journal of statistics for the physical, chemical, and engineering sciences, published quarterly by the American Society for Quality and the American Statistical Association. Since its inception in 1959, the mission of Technometrics has been to contribute to the development and use of statistical methods in the physical, chemical, and engineering sciences.