{"title":"高维不完全数据的广义Nesterov加速二阶潜因子模型。","authors":"Weiling Li, Renfang Wang, Xin Luo","doi":"10.1109/TNNLS.2023.3321915","DOIUrl":null,"url":null,"abstract":"<p><p>High-dimensional and incomplete (HDI) data are frequently encountered in big date-related applications for describing restricted observed interactions among large node sets. How to perform accurate and efficient representation learning on such HDI data is a hot yet thorny issue. A latent factor (LF) model has proven to be efficient in addressing it. However, the objective function of an LF model is nonconvex. Commonly adopted first-order methods cannot approach its second-order stationary point, thereby resulting in accuracy loss. On the other hand, traditional second-order methods are impractical for LF models since they suffer from high computational costs due to the required operations on the objective's huge Hessian matrix. In order to address this issue, this study proposes a generalized Nesterov-accelerated second-order LF (GNSLF) model that integrates twofold conceptions: 1) acquiring proper second-order step efficiently by adopting a Hessian-vector algorithm and 2) embedding the second-order step into a generalized Nesterov's acceleration (GNA) method for speeding up its linear search process. The analysis focuses on the local convergence for GNSLF's nonconvex cost function instead of the global convergence has been taken; its local convergence properties have been provided with theoretical proofs. Experimental results on six HDI data cases demonstrate that GNSLF performs better than state-of-the-art LF models in accuracy for missing data estimation with high efficiency, i.e., a second-order model can be accelerated by incorporating GNA without accuracy loss.</p>","PeriodicalId":13303,"journal":{"name":"IEEE transactions on neural networks and learning systems","volume":"PP ","pages":""},"PeriodicalIF":10.2000,"publicationDate":"2023-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Generalized Nesterov-Accelerated Second-Order Latent Factor Model for High-Dimensional and Incomplete Data.\",\"authors\":\"Weiling Li, Renfang Wang, Xin Luo\",\"doi\":\"10.1109/TNNLS.2023.3321915\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>High-dimensional and incomplete (HDI) data are frequently encountered in big date-related applications for describing restricted observed interactions among large node sets. How to perform accurate and efficient representation learning on such HDI data is a hot yet thorny issue. A latent factor (LF) model has proven to be efficient in addressing it. However, the objective function of an LF model is nonconvex. Commonly adopted first-order methods cannot approach its second-order stationary point, thereby resulting in accuracy loss. On the other hand, traditional second-order methods are impractical for LF models since they suffer from high computational costs due to the required operations on the objective's huge Hessian matrix. In order to address this issue, this study proposes a generalized Nesterov-accelerated second-order LF (GNSLF) model that integrates twofold conceptions: 1) acquiring proper second-order step efficiently by adopting a Hessian-vector algorithm and 2) embedding the second-order step into a generalized Nesterov's acceleration (GNA) method for speeding up its linear search process. 
The analysis focuses on the local convergence for GNSLF's nonconvex cost function instead of the global convergence has been taken; its local convergence properties have been provided with theoretical proofs. Experimental results on six HDI data cases demonstrate that GNSLF performs better than state-of-the-art LF models in accuracy for missing data estimation with high efficiency, i.e., a second-order model can be accelerated by incorporating GNA without accuracy loss.</p>\",\"PeriodicalId\":13303,\"journal\":{\"name\":\"IEEE transactions on neural networks and learning systems\",\"volume\":\"PP \",\"pages\":\"\"},\"PeriodicalIF\":10.2000,\"publicationDate\":\"2023-10-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE transactions on neural networks and learning systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1109/TNNLS.2023.3321915\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on neural networks and learning systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1109/TNNLS.2023.3321915","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
A Generalized Nesterov-Accelerated Second-Order Latent Factor Model for High-Dimensional and Incomplete Data.
High-dimensional and incomplete (HDI) data are frequently encountered in big data-related applications for describing restricted observed interactions among large node sets. How to perform accurate and efficient representation learning on such HDI data is a hot yet thorny issue. A latent factor (LF) model has proven to be efficient in addressing it. However, the objective function of an LF model is nonconvex. Commonly adopted first-order methods cannot approach its second-order stationary point, thereby resulting in accuracy loss. On the other hand, traditional second-order methods are impractical for LF models since they suffer from high computational costs due to the required operations on the objective's huge Hessian matrix. To address this issue, this study proposes a generalized Nesterov-accelerated second-order LF (GNSLF) model that integrates two ideas: 1) acquiring a proper second-order step efficiently by adopting a Hessian-vector algorithm and 2) embedding the second-order step into a generalized Nesterov's acceleration (GNA) method to speed up its linear search process. The analysis focuses on the local convergence of GNSLF's nonconvex cost function rather than its global convergence, and its local convergence properties are established with theoretical proofs. Experimental results on six HDI data cases demonstrate that GNSLF achieves higher accuracy for missing data estimation than state-of-the-art LF models while maintaining high efficiency, i.e., a second-order model can be accelerated by incorporating GNA without accuracy loss.
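As a rough, non-authoritative illustration of the two ideas named in the abstract (obtaining a second-order step through Hessian-vector products without ever forming the full Hessian, and wrapping that step in a Nesterov-style extrapolation with a simple line search), the following minimal Python sketch trains a toy latent factor model on sparse observations. The quadratic loss, the finite-difference Hessian-vector product, the conjugate-gradient solver, the momentum schedule, and all names are assumptions chosen for illustration; this is not the authors' GNSLF implementation.

```python
# Sketch only: Hessian-free second-order step (Hessian-vector products + conjugate
# gradient) inside a Nesterov-style extrapolation, on a toy latent factor model.
import numpy as np

rng = np.random.default_rng(0)
m, n, d, lam = 30, 40, 4, 0.05                 # users, items, latent dim, L2 weight
# Toy HDI data: observe roughly 10% of a low-rank matrix.
P_true, Q_true = rng.normal(size=(m, d)), rng.normal(size=(n, d))
mask = rng.random((m, n)) < 0.10
R = (P_true @ Q_true.T) * mask

def unpack(x):
    return x[:m * d].reshape(m, d), x[m * d:].reshape(n, d)

def loss(x):
    P, Q = unpack(x)
    err = (R - P @ Q.T) * mask
    return 0.5 * np.sum(err ** 2) + 0.5 * lam * (np.sum(P ** 2) + np.sum(Q ** 2))

def grad(x):
    P, Q = unpack(x)
    err = (R - P @ Q.T) * mask
    return np.concatenate([(-err @ Q + lam * P).ravel(),
                           (-err.T @ P + lam * Q).ravel()])

def hess_vec(x, v, eps=1e-6):
    # Finite-difference Hessian-vector product: H v ~ (g(x + eps*v) - g(x)) / eps.
    return (grad(x + eps * v) - grad(x)) / eps

def cg_solve(x, g, iters=10):
    # Conjugate gradient on H s = -g, touching H only through hess_vec.
    s, r = np.zeros_like(g), -g.copy()
    p, rs = r.copy(), r @ r
    for _ in range(iters):
        Hp = hess_vec(x, p)
        pHp = p @ Hp
        if pHp <= 0:                           # negative curvature: stop early
            break
        alpha = rs / pHp
        s += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if rs_new < 1e-10:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return s

x = 0.1 * rng.normal(size=(m + n) * d)
x_prev = x.copy()
for t in range(30):
    beta = t / (t + 3.0)                       # simple Nesterov-style momentum schedule
    y = x + beta * (x - x_prev)                # extrapolate, then step from y
    step = cg_solve(y, grad(y))
    t_step, f_y = 1.0, loss(y)                 # crude backtracking line search
    while loss(y + t_step * step) > f_y and t_step > 1e-3:
        t_step *= 0.5
    x_prev, x = x, y + t_step * step
    if t % 5 == 0 or t == 29:
        print(f"iter {t:2d}  loss {loss(x):.4f}")
```

The point of the Hessian-vector formulation is that each CG iteration costs only a couple of gradient evaluations, so a second-order step stays affordable even when the full Hessian of a large LF model would be far too big to build or invert.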
About the journal:
The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.