Stochastic Dual Coordinate Ascent for Learning Sign Constrained Linear Predictors

Miya Nakajima, Rikuto Mochida, Yuya Takada, Tsuyoshi Kato
{"title":"Stochastic Dual Coordinate Ascent for Learning Sign Constrained Linear Predictors","authors":"Miya Nakajima, Rikuto Mochida, Yuya Takada, Tsuyoshi Kato","doi":"10.5121/csit.2023.131801","DOIUrl":null,"url":null,"abstract":"Sign constraints are a handy representation of domain-specific prior knowledge that can be incorporated to machine learning. Under the sign constraints, the signs of the weight coefficients for linear predictors cannot be flipped from the ones specified in advance according to the prior knowledge. This paper presents new stochastic dual coordinate ascent (SDCA) algorithms that find the minimizer of the empirical risk under the sign constraints. Generic surrogate loss functions can be plugged into the proposed algorithm with the strong convergence guarantee inherited from the vanilla SDCA. A technical contribution of this work is the finding of an efficient algorithm that performs the SDCA update with a cost linear to the number of input features which coincides with the SDCA update without the sign constraints. Eventually, the computational cost O(nd) is achieved to attain an ϵ-accuracy solution. Pattern recognition experiments were carried out using a classification task for microbiological water quality analysis. 
The experimental results demonstrate the powerful prediction performance of the sign constraints.","PeriodicalId":91205,"journal":{"name":"Artificial intelligence and applications (Commerce, Calif.)","volume":"33 4","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Artificial intelligence and applications (Commerce, Calif.)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5121/csit.2023.131801","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Sign constraints are a handy representation of domain-specific prior knowledge that can be incorporated into machine learning. Under sign constraints, the signs of the weight coefficients of a linear predictor cannot be flipped from those specified in advance according to the prior knowledge. This paper presents new stochastic dual coordinate ascent (SDCA) algorithms that find the minimizer of the empirical risk under sign constraints. Generic surrogate loss functions can be plugged into the proposed algorithm, which inherits the strong convergence guarantee of the vanilla SDCA. A technical contribution of this work is an efficient algorithm that performs the SDCA update at a cost linear in the number of input features, matching the cost of the SDCA update without sign constraints. As a result, a computational cost of O(nd) suffices to attain an ϵ-accurate solution. Pattern recognition experiments were carried out on a classification task for microbiological water quality analysis. The experimental results demonstrate the powerful prediction performance achieved with sign constraints.
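The abstract's core ideas — a per-example SDCA dual update combined with a projection that prevents constrained weight coefficients from flipping sign — can be illustrated with a toy sketch. The code below is not the paper's algorithm (the paper derives an exact O(d)-per-update scheme for generic surrogate losses); it pairs the standard closed-form hinge-loss SDCA step with a hypothetical elementwise projection of the primal iterate onto the sign-constrained cone. All function and parameter names here are illustrative.

```python
import numpy as np

def sdca_sign_constrained(X, y, sign, lam=0.1, epochs=50, seed=0):
    """Toy sign-constrained SDCA for the hinge-loss SVM.

    sign[j] = +1 forces w[j] >= 0, sign[j] = -1 forces w[j] <= 0,
    and sign[j] = 0 leaves w[j] unconstrained. The feasible predictor
    is obtained by clipping the unconstrained primal iterate v — a
    heuristic stand-in for the paper's exact constrained update.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)   # dual variables, one per training example
    v = np.zeros(d)       # v = (1/(lam*n)) * sum_i alpha_i * y_i * x_i

    def project(v):
        w = v.copy()
        w[(sign > 0) & (w < 0)] = 0.0   # enforce w_j >= 0
        w[(sign < 0) & (w > 0)] = 0.0   # enforce w_j <= 0
        return w

    for _ in range(epochs):
        for i in rng.permutation(n):
            w = project(v)
            # closed-form hinge-loss SDCA step on coordinate i
            grad = 1.0 - y[i] * (X[i] @ w)
            delta = grad / (X[i] @ X[i] / (lam * n) + 1e-12)
            new_alpha = np.clip(alpha[i] + delta, 0.0, 1.0)
            v += (new_alpha - alpha[i]) * y[i] * X[i] / (lam * n)
            alpha[i] = new_alpha
    return project(v)
```

Each inner step touches only the single example x_i, so one update costs O(d) and one epoch costs O(nd), which is the per-epoch cost the abstract attributes to the proposed method.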