Distributed Quantile Regression with Non-Convex Sparse Penalties
Reza Mirzaeifard, Vinay Chakravarthi Gogineni, Naveen K. D. Venkategowda, Stefan Werner
2023 IEEE Statistical Signal Processing Workshop (SSP), July 2, 2023. DOI: 10.1109/SSP53291.2023.10208080
The surge in data generated by IoT sensors has increased the need for scalable and efficient data-analysis methods, particularly robust algorithms like quantile regression, which can be tailored to a variety of situations, including nonlinear relationships, heavy-tailed distributions, and outliers. This paper presents a subgradient-based algorithm for distributed quantile regression with non-convex, non-smooth sparse penalties such as the minimax concave penalty (MCP) and the smoothly clipped absolute deviation (SCAD) penalty. These penalties selectively shrink inactive coefficients towards zero, addressing the limitations of traditional penalties such as the ℓ1 penalty in sparse models. Existing quantile regression algorithms with non-convex penalties are designed for the centralized case, whereas our proposed method can be applied to distributed quantile regression with non-convex penalties, thereby improving estimation accuracy. We provide a convergence proof for our proposed algorithm and demonstrate through numerical simulations that it outperforms state-of-the-art algorithms in sparse and moderately sparse scenarios.
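To make the setting concrete, the sketch below shows one way a decentralized subgradient scheme of this kind can be written in Python: each node takes a subgradient step on its local SCAD-penalized pinball (quantile) loss and then averages its estimate with its neighbors through a doubly stochastic mixing matrix. This is an illustrative reconstruction under standard assumptions, not the paper's exact algorithm; the function names (pinball_subgrad, scad_subgrad, distributed_quantile_scad), the mixing matrix W, and the diminishing step size are choices made here for the example.

```python
import numpy as np

def pinball_subgrad(X, y, beta, tau):
    """Subgradient of the average pinball loss (1/n) * sum_i rho_tau(y_i - x_i' beta)."""
    r = y - X @ beta
    g = np.where(r < 0, tau - 1.0, tau)   # at r = 0, tau is a valid subgradient
    return -(X.T @ g) / len(y)

def scad_subgrad(beta, lam, a=3.7):
    """Coordinate-wise subgradient of the SCAD penalty (a = 3.7 is customary)."""
    t = np.abs(beta)
    slope = np.where(t <= lam, lam,
                     np.where(t <= a * lam, (a * lam - t) / (a - 1.0), 0.0))
    return slope * np.sign(beta)

def distributed_quantile_scad(Xs, ys, W, tau=0.5, lam=0.1, iters=3000, step0=0.5):
    """Consensus-subgradient sketch: each node steps on its local SCAD-penalized
    pinball loss, then mixes estimates with neighbors via the matrix W."""
    n_nodes, p = len(Xs), Xs[0].shape[1]
    B = np.zeros((n_nodes, p))                 # one estimate per node
    for k in range(iters):
        alpha = step0 / np.sqrt(k + 1)         # diminishing step size
        G = np.stack([pinball_subgrad(Xs[i], ys[i], B[i], tau)
                      + scad_subgrad(B[i], lam) for i in range(n_nodes)])
        B = W @ (B - alpha * G)                # adapt locally, then combine
    return B.mean(axis=0)

# Toy example: 4 nodes in a ring, sparse ground truth, heavy-tailed noise.
rng = np.random.default_rng(0)
p, n_per = 20, 100
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
Xs = [rng.standard_normal((n_per, p)) for _ in range(4)]
ys = [X @ beta_true + rng.standard_t(df=2, size=n_per) for X in Xs]
W = np.array([[0.5, 0.25, 0.0, 0.25],         # doubly stochastic ring topology
              [0.25, 0.5, 0.25, 0.0],
              [0.0, 0.25, 0.5, 0.25],
              [0.25, 0.0, 0.25, 0.5]])
beta_hat = distributed_quantile_scad(Xs, ys, W, tau=0.5, lam=0.2)
print(np.round(beta_hat, 2))
```

The combine-after-adapt step here is one common choice for decentralized optimization; swapping SCAD for MCP only changes the penalty subgradient (λ - t/γ for t ≤ γλ, and 0 beyond), leaving the rest of the loop intact.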