Least Quantile Regression via Modern Optimization

D. Bertsimas, R. Mazumder
{"title":"最小分位数回归通过现代优化","authors":"D. Bertsimas, R. Mazumder","doi":"10.1214/14-AOS1223","DOIUrl":null,"url":null,"abstract":"We address the Least Quantile of Squares (LQS) (and in particular the Least Median of Squares) regression problem using modern optimization methods. We propose a Mixed Integer Optimization (MIO) formulation of the LQS problem which allows us to find a provably global optimal solution for the LQS problem. Our MIO framework has the appealing characteristic that if we terminate the algorithm early, we obtain a solution with a guarantee on its sub-optimality. We also propose continuous optimization methods based on first-order subdifferential methods, sequential linear optimization and hybrid combinations of them to obtain near optimal solutions to the LQS problem. The MIO algorithm is found to benefit significantly from high quality solutions delivered by our continuous optimization based methods. We further show that the MIO approach leads to (a) an optimal solution for any dataset, where the data-points $(y_i,\\mathbf{x}_i)$'s are not necessarily in general position, (b) a simple proof of the breakdown point of the LQS objective value that holds for any dataset and (c) an extension to situations where there are polyhedral constraints on the regression coefficient vector. We report computational results with both synthetic and real-world datasets showing that the MIO algorithm with warm starts from the continuous optimization methods solve small ($n=100$) and medium ($n=500$) size problems to provable optimality in under two hours, and outperform all publicly available methods for large-scale ($n={}$10,000) LQS problems.","PeriodicalId":8446,"journal":{"name":"arXiv: Computation","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2013-10-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"65","resultStr":"{\"title\":\"Least quantile regression via modern optimization\",\"authors\":\"D. Bertsimas, R. Mazumder\",\"doi\":\"10.1214/14-AOS1223\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We address the Least Quantile of Squares (LQS) (and in particular the Least Median of Squares) regression problem using modern optimization methods. We propose a Mixed Integer Optimization (MIO) formulation of the LQS problem which allows us to find a provably global optimal solution for the LQS problem. Our MIO framework has the appealing characteristic that if we terminate the algorithm early, we obtain a solution with a guarantee on its sub-optimality. We also propose continuous optimization methods based on first-order subdifferential methods, sequential linear optimization and hybrid combinations of them to obtain near optimal solutions to the LQS problem. The MIO algorithm is found to benefit significantly from high quality solutions delivered by our continuous optimization based methods. We further show that the MIO approach leads to (a) an optimal solution for any dataset, where the data-points $(y_i,\\\\mathbf{x}_i)$'s are not necessarily in general position, (b) a simple proof of the breakdown point of the LQS objective value that holds for any dataset and (c) an extension to situations where there are polyhedral constraints on the regression coefficient vector. 
We report computational results with both synthetic and real-world datasets showing that the MIO algorithm with warm starts from the continuous optimization methods solve small ($n=100$) and medium ($n=500$) size problems to provable optimality in under two hours, and outperform all publicly available methods for large-scale ($n={}$10,000) LQS problems.\",\"PeriodicalId\":8446,\"journal\":{\"name\":\"arXiv: Computation\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2013-10-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"65\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv: Computation\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1214/14-AOS1223\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv: Computation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1214/14-AOS1223","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 65

Abstract

We address the Least Quantile of Squares (LQS) (and in particular the Least Median of Squares) regression problem using modern optimization methods. We propose a Mixed Integer Optimization (MIO) formulation of the LQS problem that allows us to find a provably globally optimal solution. Our MIO framework has the appealing characteristic that if we terminate the algorithm early, we obtain a solution with a guarantee on its sub-optimality. We also propose continuous optimization methods based on first-order subdifferential methods, sequential linear optimization, and hybrid combinations of them to obtain near-optimal solutions to the LQS problem. The MIO algorithm is found to benefit significantly from the high-quality solutions delivered by our continuous-optimization-based methods. We further show that the MIO approach leads to (a) an optimal solution for any dataset, where the data points $(y_i,\mathbf{x}_i)$ are not necessarily in general position, (b) a simple proof of the breakdown point of the LQS objective value that holds for any dataset, and (c) an extension to situations where there are polyhedral constraints on the regression coefficient vector. We report computational results with both synthetic and real-world datasets showing that the MIO algorithm with warm starts from the continuous optimization methods solves small ($n=100$) and medium ($n=500$) size problems to provable optimality in under two hours, and outperforms all publicly available methods for large-scale ($n=10{,}000$) LQS problems.
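
To make the criterion concrete, below is a minimal Python sketch (not taken from the paper) of the LQS objective for a fixed coefficient vector: the $q$-th smallest squared residual, which for $q \approx n/2$ recovers the Least Median of Squares criterion. The function name, the synthetic data, and the choice $q = \lfloor n/2 \rfloor$ are illustrative assumptions; no attempt is made to reproduce the authors' MIO formulation or warm-start heuristics.

```python
# Minimal sketch of the Least Quantile of Squares (LQS) objective.
# For a candidate coefficient vector beta, the loss is the q-th smallest
# squared residual; q = floor(n/2) corresponds to Least Median of Squares.
# Illustrative only -- this is not the MIO formulation proposed in the paper.
import numpy as np


def lqs_objective(beta, X, y, q):
    """Return the q-th order statistic (1-indexed) of the squared residuals."""
    residuals = y - X @ beta
    return np.sort(residuals ** 2)[q - 1]


# Synthetic example: a clean linear model contaminated with gross outliers,
# the setting in which quantile-based objectives are designed to stay robust.
rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=n)
y[:20] += 25.0  # 20% gross outliers

q = n // 2
print(lqs_objective(beta_true, X, y, q))  # small: the outliers are ignored
print(np.mean((y - X @ beta_true) ** 2))  # large: the squared-error mean is not robust
```

In the paper, this order-statistic objective is encoded with integer (binary) variables in an MIO model so that a solver can certify global optimality; the sketch above only evaluates the criterion for a given coefficient vector and could serve, for instance, as the inner loop of a simple search heuristic.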