{"title":"并行机器学习框架中的稀疏最小二乘方法","authors":"R. Natarajan, Vikas Sindhwani, S. Tatikonda","doi":"10.1109/ICDMW.2009.106","DOIUrl":null,"url":null,"abstract":"We describe parallel methods for solving large-scale, high-dimensional, sparse least-squares problems that arise in machine learning applications such as document classification. The basic idea is to solve a two-class response problem using a fast regression technique based on minimizing a loss function, which consists of an empirical squared-error term, and one or more regularization terms. We consider the use of Lenclos-based methods for solving these regularized least-squares problems, with the parallel implementation in the Parallel MachineLearning (PML) framework, and performance results on the IBM Blue Gene/P parallel computer.","PeriodicalId":351078,"journal":{"name":"2009 IEEE International Conference on Data Mining Workshops","volume":"99 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2009-12-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Sparse Least-Squares Methods in the Parallel Machine Learning (PML) Framework\",\"authors\":\"R. Natarajan, Vikas Sindhwani, S. Tatikonda\",\"doi\":\"10.1109/ICDMW.2009.106\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We describe parallel methods for solving large-scale, high-dimensional, sparse least-squares problems that arise in machine learning applications such as document classification. The basic idea is to solve a two-class response problem using a fast regression technique based on minimizing a loss function, which consists of an empirical squared-error term, and one or more regularization terms. We consider the use of Lenclos-based methods for solving these regularized least-squares problems, with the parallel implementation in the Parallel MachineLearning (PML) framework, and performance results on the IBM Blue Gene/P parallel computer.\",\"PeriodicalId\":351078,\"journal\":{\"name\":\"2009 IEEE International Conference on Data Mining Workshops\",\"volume\":\"99 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2009-12-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2009 IEEE International Conference on Data Mining Workshops\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICDMW.2009.106\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2009 IEEE International Conference on Data Mining Workshops","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICDMW.2009.106","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1
摘要
我们描述了用于解决机器学习应用(如文档分类)中出现的大规模,高维,稀疏最小二乘问题的并行方法。其基本思想是使用基于最小化损失函数的快速回归技术来解决两类响应问题,损失函数由经验平方误差项和一个或多个正则化项组成。我们考虑使用基于lenclos的方法来解决这些正则化最小二乘问题,并在并行机器学习(PML)框架中并行实现,以及在IBM Blue Gene/P并行计算机上的性能结果。
Sparse Least-Squares Methods in the Parallel Machine Learning (PML) Framework
We describe parallel methods for solving the large-scale, high-dimensional, sparse least-squares problems that arise in machine learning applications such as document classification. The basic idea is to solve a two-class response problem using a fast regression technique based on minimizing a loss function that consists of an empirical squared-error term and one or more regularization terms. We consider the use of Lanczos-based methods for solving these regularized least-squares problems, describe their parallel implementation in the Parallel Machine Learning (PML) framework, and report performance results on the IBM Blue Gene/P parallel computer.
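As a concrete illustration of the regularized least-squares formulation described in the abstract, the sketch below fits a two-class response by minimizing a squared-error loss with an L2 (ridge) penalty, using SciPy's LSQR solver, which is based on Lanczos bidiagonalization. This is a minimal single-node sketch with assumed data shapes and an assumed regularization value; it is not the authors' parallel PML implementation.

```python
# Minimal sketch: ridge-regularized least squares for a two-class response,
# solved with SciPy's LSQR (a Lanczos-bidiagonalization method).
# Problem sizes, density, and the regularization value are illustrative assumptions.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)

# Sparse, high-dimensional design matrix X (n_docs x n_features),
# e.g. a document-term matrix in a text-classification setting.
n_docs, n_features = 1000, 5000
X = sp.random(n_docs, n_features, density=0.01, random_state=0, format="csr")

# Two-class response encoded as +1 / -1.
y = rng.choice([-1.0, 1.0], size=n_docs)

# Solve  min_w ||X w - y||^2 + lam * ||w||^2.
# lsqr's `damp` argument adds damp^2 * ||w||^2 to the objective,
# so we pass sqrt(lam) as the damping factor.
lam = 1.0
w = lsqr(X, y, damp=np.sqrt(lam))[0]

# Classify new (or training) documents by the sign of the regression score.
predictions = np.sign(X @ w)
```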