Author: David M. Mahalak
DOI: 10.1002/mcda.70016
Journal: Journal of Multi-Criteria Decision Analysis, vol. 32, no. 2
Publication date: 2025-08-20 (Journal Article; JCR Q3, Management; Impact Factor 2.4)
A Simulation-Based Comparison of Deterministic and Stochastic Multicriteria Models: Analyzing Rank Divergence
Although Stochastic Multicriteria Acceptability Analysis (SMAA) has been widely applied to real-world decision problems, limited research has examined the structural conditions that lead to rank disagreement between deterministic and stochastic model outputs. This paper addresses that gap through a simulation-based analysis of 50 randomly generated decision problems. First, one-hot encoded vectors were developed to compare the deterministic top-ranked alternatives with their SMAA rank acceptability distributions and thereby evaluate rank divergence. Descriptive statistics showed that divergent cases had a substantially higher mean Jensen–Shannon Distance (JSD) (0.79) than non-divergent cases (0.43). Moreover, scatterplot analysis revealed that divergent cases typically have high JSD values (≥ 0.6), low rank-1 acceptability (≤ 0.2), and high rank expectation (≥ 4). Second, statistical techniques were used to compare differences in structural features, i.e., the numbers of criteria and alternatives and the numbers of minimization and maximization criteria. Furthermore, the Criteria Balance Score (CBS) was developed to quantify criteria-type imbalance, where a value of 0 indicates perfect balance and values close to 1 indicate disparity. Results showed that divergent cases involved decision problems with statistically significantly greater model complexity, i.e., number of criteria, and greater criteria-type min/max imbalance, which was an unexpected finding. Third, threshold-based analyses revealed that 62.5% of divergent cases involved decision structures with 10 or more criteria, and that 75% of divergent cases with CBS below 0.20 had a min/max criteria-type difference of 0 or 1. Finally, consistency in divergence patterns was independently explored within four multicriteria decision analysis models. Findings suggest that divergence is largely a function of decision-space characteristics rather than idiosyncrasies of individual models.
Together, these findings provide real-world decision makers, analysts, and researchers with practical, evidence-based thresholds for identifying instances in which deterministic results may not be robust. By recognizing these structural warning signs in advance, decision makers can increase stakeholder trust in, and the reliability of, the decision-making process.
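The divergence metrics described in the abstract can be sketched in a few lines of code. The sketch below is illustrative only: the SMAA rank acceptability distribution is a made-up example, and the CBS formula shown (absolute min/max count difference divided by total criteria) is an assumed form consistent with the stated 0-to-1 behavior, not necessarily the paper's exact definition.

```python
import numpy as np

def jensen_shannon_distance(p, q):
    """JSD between two discrete distributions (base-2 logs, so JSD lies in [0, 1])."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)

    def kl(a, b):
        mask = a > 0  # 0 * log(0/x) is taken as 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))

    return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

def criteria_balance_score(n_min, n_max):
    """Assumed CBS form: 0 = perfectly balanced criteria types, values near 1 = disparity."""
    return abs(n_min - n_max) / (n_min + n_max)

# Deterministic model's top-ranked alternative as a one-hot vector over 5 ranks.
deterministic = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
# Hypothetical SMAA rank acceptability distribution for the same alternative.
smaa = np.array([0.05, 0.05, 0.10, 0.30, 0.50])

jsd = jensen_shannon_distance(deterministic, smaa)   # high JSD (>= 0.6)
rank1_acceptability = smaa[0]                        # low rank-1 acceptability (<= 0.2)
rank_expectation = np.dot(np.arange(1, 6), smaa)     # high rank expectation (>= 4)
```

With these hypothetical numbers the alternative trips all three warning thresholds reported in the abstract, flagging it as a case where the deterministic ranking may not be robust.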
Journal Overview:
The Journal of Multi-Criteria Decision Analysis was launched in 1992, and from the outset has aimed to be the repository of choice for papers covering all aspects of MCDA/MCDM. The journal provides an international forum for the presentation and discussion of all aspects of research, application and evaluation of multi-criteria decision analysis, and publishes material from a variety of disciplines and all schools of thought. Papers addressing mathematical, theoretical, and behavioural aspects are welcome, as are case studies, applications and evaluation of techniques and methodologies.