Remark on Algorithm 1012: Computing projections with large data sets
Tyler H. Chang, Layne T. Watson, Sven Leyffer, Thomas C. H. Lux, Hussain M. J. Almohri
ACM Transactions on Mathematical Software, published 2024-04-22. DOI: 10.1145/3656581
In ACM TOMS Algorithm 1012, the DELAUNAYSPARSE software is given for performing Delaunay interpolation in medium to high dimensions. When extrapolating outside the convex hull of the training set, DELAUNAYSPARSE calls the nonnegative least squares solver DWNNLS to compute projections onto the convex hull. However, DWNNLS and many other available sum-of-squares optimization solvers were not intended for problems with many variables, which arise from the large training sets typical of machine learning applications. Thus, a new PROJECT subroutine is given, based on the highly customizable quadratic program solver BQPD. This solution is shown to be as robust as DELAUNAYSPARSE for projection onto both synthetic and real-world data sets, where other available solvers frequently fail. Although it is intended as an update for DELAUNAYSPARSE, due to the difficulty and prevalence of the problem, this solution is likely to be of external interest as well.
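The computational kernel described in the abstract is the projection of a query point onto the convex hull of a large training set, which can be posed as a quadratic program over convex-combination weights. The sketch below is only an illustration of that formulation, written in Python with SciPy's general-purpose SLSQP solver; it is not the BQPD-based PROJECT subroutine from the remark, the function name project_onto_hull is invented here for illustration, and a dense general-purpose solver of this kind will not scale to the many-variable problems that motivate the paper.

    # Illustrative sketch (not the authors' BQPD-based PROJECT routine):
    # project a query point z onto conv{x_1, ..., x_n}, i.e., minimize
    # ||sum_i w_i x_i - z||^2 subject to w >= 0 and sum_i w_i = 1.
    import numpy as np
    from scipy.optimize import minimize

    def project_onto_hull(points, z):
        """Project query z onto the convex hull of the rows of `points`.

        Solves the quadratic program over the convex-combination weights w
        and returns the projected point and the weights.
        """
        n = points.shape[0]

        def objective(w):
            r = points.T @ w - z          # residual of the convex combination
            return 0.5 * r @ r

        def gradient(w):
            return points @ (points.T @ w - z)

        constraints = [{"type": "eq", "fun": lambda w: w.sum() - 1.0}]
        bounds = [(0.0, None)] * n        # nonnegativity of the weights
        w0 = np.full(n, 1.0 / n)          # start from the centroid

        res = minimize(objective, w0, jac=gradient, bounds=bounds,
                       constraints=constraints, method="SLSQP")
        w = res.x
        return points.T @ w, w

    # Example: project a point lying outside the hull of a random 3-D point cloud.
    rng = np.random.default_rng(0)
    pts = rng.random((50, 3))             # 50 training points in 3 dimensions
    z = np.array([1.5, 1.5, 1.5])         # clearly outside the unit cube
    z_hat, w = project_onto_hull(pts, z)
    print("projection:", z_hat)

Because the number of weights equals the number of training points, the problem size grows with the data set, which is why the remark replaces DWNNLS with a PROJECT subroutine built on a solver (BQPD) that handles many-variable quadratic programs.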
Journal Introduction:
As a scientific journal, ACM Transactions on Mathematical Software (TOMS) documents the theoretical underpinnings of numeric, symbolic, algebraic, and geometric computing applications. It focuses on analysis and construction of algorithms and programs, and the interaction of programs and architecture. Algorithms documented in TOMS are available as the Collected Algorithms of the ACM at calgo.acm.org.