Explainable fuzzy cluster-based regression algorithm with gradient descent learning

Javier Viaña, Stephan Ralescu, A. Ralescu, Kelly Cohen, V. Kreinovich
{"title":"基于梯度下降学习的可解释模糊聚类回归算法","authors":"Javier Viaña, Stephan Ralescu, A. Ralescu, Kelly Cohen, V. Kreinovich","doi":"10.20517/ces.2022.14","DOIUrl":null,"url":null,"abstract":"We propose an algorithm for n-dimensional regression problems with continuous variables. Its main property is explainability, which we identify as the ability to understand the algorithm’s decisions from a human perspective. This has been achieved thanks to the simplicity of the architecture, the lack of hidden layers (as opposed to deep neural networks used for this same task) and the linguistic nature of its fuzzy inference system. First, the algorithm divides the joint input-output space into clusters that are subsequently approximated using linear functions. Then, we fit a Cauchy membership function to each cluster, therefore identifying them as fuzzy sets. The prediction of each linear regression is merged using a Takagi-Sugeno-Kang approach to generate the prediction of the model. Finally, the parameters of the algorithm (those from the linear functions and Cauchy membership functions) are fine-tuned using Gradient Descent optimization. In order to validate this algorithm, we considered three different scenarios: The first two are simple one-input and two-input problems with artificial data, which allow visual inspection of the results. In the third scenario we use real data for the prediction of the power generated by a Combined Cycle Power Plant. The results obtained in this last problem (3.513 RMSE and 2.649 MAE) outperform the state of the art (3.787 RMSE and 2.818 MAE).","PeriodicalId":72652,"journal":{"name":"Complex engineering systems (Alhambra, Calif.)","volume":"1 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Explainable fuzzy cluster-based regression algorithm with gradient descent learning\",\"authors\":\"Javier Viaña, Stephan Ralescu, A. Ralescu, Kelly Cohen, V. Kreinovich\",\"doi\":\"10.20517/ces.2022.14\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We propose an algorithm for n-dimensional regression problems with continuous variables. Its main property is explainability, which we identify as the ability to understand the algorithm’s decisions from a human perspective. This has been achieved thanks to the simplicity of the architecture, the lack of hidden layers (as opposed to deep neural networks used for this same task) and the linguistic nature of its fuzzy inference system. First, the algorithm divides the joint input-output space into clusters that are subsequently approximated using linear functions. Then, we fit a Cauchy membership function to each cluster, therefore identifying them as fuzzy sets. The prediction of each linear regression is merged using a Takagi-Sugeno-Kang approach to generate the prediction of the model. Finally, the parameters of the algorithm (those from the linear functions and Cauchy membership functions) are fine-tuned using Gradient Descent optimization. In order to validate this algorithm, we considered three different scenarios: The first two are simple one-input and two-input problems with artificial data, which allow visual inspection of the results. In the third scenario we use real data for the prediction of the power generated by a Combined Cycle Power Plant. 
The results obtained in this last problem (3.513 RMSE and 2.649 MAE) outperform the state of the art (3.787 RMSE and 2.818 MAE).\",\"PeriodicalId\":72652,\"journal\":{\"name\":\"Complex engineering systems (Alhambra, Calif.)\",\"volume\":\"1 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Complex engineering systems (Alhambra, Calif.)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.20517/ces.2022.14\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Complex engineering systems (Alhambra, Calif.)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.20517/ces.2022.14","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

We propose an algorithm for n-dimensional regression problems with continuous variables. Its main property is explainability, which we identify as the ability to understand the algorithm's decisions from a human perspective. This is achieved thanks to the simplicity of the architecture, the absence of hidden layers (as opposed to the deep neural networks used for this same task), and the linguistic nature of its fuzzy inference system. First, the algorithm divides the joint input-output space into clusters, which are subsequently approximated using linear functions. Then, we fit a Cauchy membership function to each cluster, thereby identifying it as a fuzzy set. The predictions of the individual linear regressions are merged using a Takagi-Sugeno-Kang approach to generate the prediction of the model. Finally, the parameters of the algorithm (those of the linear functions and the Cauchy membership functions) are fine-tuned using gradient descent optimization. To validate the algorithm, we considered three scenarios: the first two are simple one-input and two-input problems with artificial data, which allow visual inspection of the results; in the third, we use real data to predict the power generated by a Combined Cycle Power Plant. The results obtained on this last problem (3.513 RMSE and 2.649 MAE) outperform the state of the art (3.787 RMSE and 2.818 MAE).
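
The pipeline summarized above (per-cluster linear models, Cauchy membership functions, Takagi-Sugeno-Kang aggregation, gradient-descent fine-tuning) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the class name CauchyTSK, the random initialization used in place of the paper's clustering step, and the numerical central-difference gradients are assumptions made only to keep the example short and self-contained.

```python
import numpy as np

class CauchyTSK:
    """Sketch of a fuzzy cluster-based regressor: one Cauchy membership
    function and one linear model per cluster, merged with a
    Takagi-Sugeno-Kang weighted average and fine-tuned by gradient
    descent on the mean squared error."""

    def __init__(self, n_clusters=3, lr=1e-2, epochs=500, seed=0):
        self.k, self.lr, self.epochs = n_clusters, lr, epochs
        self.rng = np.random.default_rng(seed)

    def _memberships(self, X):
        # Cauchy membership: mu_i(x) = 1 / (1 + ||(x - c_i) / gamma_i||^2)
        d2 = (((X[:, None, :] - self.c[None]) / self.gamma[None]) ** 2).sum(-1)
        return 1.0 / (1.0 + d2)                    # shape (n_samples, k)

    def predict(self, X):
        mu = self._memberships(X)
        w = mu / mu.sum(axis=1, keepdims=True)     # normalized firing strengths
        y_lin = X @ self.W.T + self.b              # per-cluster linear outputs
        return (w * y_lin).sum(axis=1)             # TSK weighted average

    def _mse(self, X, y):
        return float(np.mean((self.predict(X) - y) ** 2))

    def fit(self, X, y):
        n, d = X.shape
        # Crude random initialization stands in for the clustering step of the
        # paper; centres are drawn from the data, spreads start at one.
        idx = self.rng.choice(n, self.k, replace=False)
        self.c = X[idx].astype(float)
        self.gamma = np.ones((self.k, d))
        self.W = 0.01 * self.rng.standard_normal((self.k, d))
        self.b = np.full(self.k, float(y.mean()))
        eps = 1e-6
        for _ in range(self.epochs):
            # Central-difference gradients keep the sketch short; the paper
            # derives analytic gradients for the same parameters.
            for p in (self.c, self.gamma, self.W, self.b):
                grad = np.zeros_like(p)
                for i in np.ndindex(p.shape):
                    old = p[i]
                    p[i] = old + eps
                    up = self._mse(X, y)
                    p[i] = old - eps
                    down = self._mse(X, y)
                    p[i] = old
                    grad[i] = (up - down) / (2 * eps)
                p -= self.lr * grad
        return self

if __name__ == "__main__":
    # Toy one-input problem in the spirit of the paper's first scenario:
    # two linear regimes joined at the origin, plus a little noise.
    rng = np.random.default_rng(1)
    X = rng.uniform(-3.0, 3.0, size=(200, 1))
    y = np.where(X[:, 0] < 0, -X[:, 0], 2.0 * X[:, 0])
    y = y + 0.05 * rng.standard_normal(200)
    model = CauchyTSK(n_clusters=2, lr=5e-2, epochs=300).fit(X, y)
    rmse = np.sqrt(np.mean((model.predict(X) - y) ** 2))
    print(f"train RMSE: {rmse:.3f}")
```

Because each prediction is a normalized weighted blend of a handful of linear rules, one can read off which cluster dominates a given input and what that cluster's local linear model says, which is the sense of explainability claimed in the abstract.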