Explainable multi-criteria decision-making: A three-way decision perspective
Chengjun Shi, Yiyu Yao
International Journal of Approximate Reasoning, Vol. 187, Article 109528, published 2025-07-18. DOI: 10.1016/j.ijar.2025.109528

This paper proposes an Explainable Multi-Criteria Decision-Making (XMCDM) framework that constructs trilevel explanations for classic multi-criteria decision-making methods. The framework consists of explainable data preparation, explainable decision analysis, and explainable decision support, integrating ideas from three-way decision and symbols-meaning-value spaces. First, we briefly introduce the key concepts at each level and list potential issues to be resolved, including gathering multi-criteria data, interpreting the working principles of multi-criteria decision-making methods, and presenting outcomes effectively. We examine existing literature that addresses some of these questions and argue that rule-based explanations may be an applicable and efficient way to explain ranking/ordering results. Then, we discuss two methods that generate three-way rankings with respect to an individual criterion and integrate three-way rankings into a multi-criteria ranking. We modify the Iterative Dichotomiser 3 (ID3) algorithm to build rule-based explanations. Finally, after giving a small illustrative example, we design experiments on five real-life datasets, test the explainability of three classic multi-criteria decision-making methods, and tune the thresholds. The experimental results demonstrate that our proposed framework is feasible and adaptable to various data characteristics.
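The three-way rankings mentioned in the abstract follow the general pattern of three-way decision, in which objects are trisected by a pair of evaluation thresholds. The sketch below illustrates that general idea only; the function name, the threshold values (alpha, beta), and the use of normalized scores are illustrative assumptions, not the paper's actual method.

```python
# Illustrative sketch of a three-way trisection on a single criterion.
# Alternatives are split into accept / uncertain / reject regions by a
# threshold pair (alpha, beta) with alpha > beta, in the spirit of
# three-way decision. All names and values here are hypothetical.

def three_way_partition(scores, alpha=0.7, beta=0.3):
    """Trisect alternatives by their normalized score on one criterion.

    scores: dict mapping alternative -> score in [0, 1]
    Returns (accept, uncertain, reject):
      accept:    score >= alpha
      uncertain: beta <= score < alpha
      reject:    score < beta
    """
    accept = [a for a, s in scores.items() if s >= alpha]
    uncertain = [a for a, s in scores.items() if beta <= s < alpha]
    reject = [a for a, s in scores.items() if s < beta]
    return accept, uncertain, reject

scores = {"A": 0.85, "B": 0.55, "C": 0.2, "D": 0.72}
acc, unc, rej = three_way_partition(scores)
# acc == ["A", "D"], unc == ["B"], rej == ["C"]
```

Tuning (alpha, beta) shifts alternatives between the three regions, which corresponds to the threshold tuning the abstract describes in the experiments.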
Journal introduction:
The International Journal of Approximate Reasoning is intended to serve as a forum for the treatment of imprecision and uncertainty in Artificial and Computational Intelligence, covering both the foundations of uncertainty theories, and the design of intelligent systems for scientific and engineering applications. It publishes high-quality research papers describing theoretical developments or innovative applications, as well as review articles on topics of general interest.
Relevant topics include, but are not limited to, probabilistic reasoning and Bayesian networks, imprecise probabilities, random sets, belief functions (Dempster-Shafer theory), possibility theory, fuzzy sets, rough sets, decision theory, non-additive measures and integrals, qualitative reasoning about uncertainty, comparative probability orderings, game-theoretic probability, default reasoning, nonstandard logics, argumentation systems, inconsistency tolerant reasoning, elicitation techniques, philosophical foundations and psychological models of uncertain reasoning.
Domains of application for uncertain reasoning systems include risk analysis and assessment, information retrieval and database design, information fusion, machine learning, data and web mining, computer vision, image and signal processing, intelligent data analysis, statistics, multi-agent systems, etc.