Analysis of attribute importance in multinomial logit models using Shapley values-based methods

IF 2.8 · JCR Q1, Economics · CAS Tier 3, Economics
Patricio Salas , Rodrigo De la Fuente , Sebastian Astroza , Juan Antonio Carrasco
{"title":"基于Shapley值的多项逻辑模型属性重要性分析方法","authors":"Patricio Salas ,&nbsp;Rodrigo De la Fuente ,&nbsp;Sebastian Astroza ,&nbsp;Juan Antonio Carrasco","doi":"10.1016/j.jocm.2025.100538","DOIUrl":null,"url":null,"abstract":"<div><div>This paper investigates the use of Shapley values-based methods to determine the importance of attributes in discrete choice models, specifically within a Multinomial Logit (MNL) framework. We extend the Shapley decomposition Shorrocks (2013) method from linear models. Additionally, the SHAP method Lundberg and Lee (2017) idea is applied to assess the impact of attributes on individual-level choice probability predictions. A simulation study demonstrates the effectiveness of these approaches under various experimental conditions, including attributes in several ranges and interaction terms. Finally, an empirical application is conducted using well-known travel mode choice datasets. The simulation results show that Shapley values accurately capture the global importance of attributes on goodness-of-fit. The SHAP method provides transparency in MNL model predictions, clarifying how changes in attribute values influence choice probabilities for each decision-maker. These methods offer a complementary perspective to traditional metrics like elasticities and traditional relative importance analysis Orme(2006). In the empirical application, Shapley decomposition highlights the most relevant attributes, while SHAP values uncover individual-level impacts that might not be apparent through elasticities alone. Global and individual-level analysis offers a more comprehensive understanding of attribute importance. 
In summary, integrating Shapley values with traditional metrics ensures a robust analysis, aiding practitioners and policymakers in making informed decisions based on broad trends and specific impacts.</div></div>","PeriodicalId":46863,"journal":{"name":"Journal of Choice Modelling","volume":"54 ","pages":"Article 100538"},"PeriodicalIF":2.8000,"publicationDate":"2025-01-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Analysis of attribute importance in multinomial logit models using Shapley values-based methods\",\"authors\":\"Patricio Salas ,&nbsp;Rodrigo De la Fuente ,&nbsp;Sebastian Astroza ,&nbsp;Juan Antonio Carrasco\",\"doi\":\"10.1016/j.jocm.2025.100538\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>This paper investigates the use of Shapley values-based methods to determine the importance of attributes in discrete choice models, specifically within a Multinomial Logit (MNL) framework. We extend the Shapley decomposition Shorrocks (2013) method from linear models. Additionally, the SHAP method Lundberg and Lee (2017) idea is applied to assess the impact of attributes on individual-level choice probability predictions. A simulation study demonstrates the effectiveness of these approaches under various experimental conditions, including attributes in several ranges and interaction terms. Finally, an empirical application is conducted using well-known travel mode choice datasets. The simulation results show that Shapley values accurately capture the global importance of attributes on goodness-of-fit. The SHAP method provides transparency in MNL model predictions, clarifying how changes in attribute values influence choice probabilities for each decision-maker. These methods offer a complementary perspective to traditional metrics like elasticities and traditional relative importance analysis Orme(2006). 
In the empirical application, Shapley decomposition highlights the most relevant attributes, while SHAP values uncover individual-level impacts that might not be apparent through elasticities alone. Global and individual-level analysis offers a more comprehensive understanding of attribute importance. In summary, integrating Shapley values with traditional metrics ensures a robust analysis, aiding practitioners and policymakers in making informed decisions based on broad trends and specific impacts.</div></div>\",\"PeriodicalId\":46863,\"journal\":{\"name\":\"Journal of Choice Modelling\",\"volume\":\"54 \",\"pages\":\"Article 100538\"},\"PeriodicalIF\":2.8000,\"publicationDate\":\"2025-01-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Choice Modelling\",\"FirstCategoryId\":\"96\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1755534525000016\",\"RegionNum\":3,\"RegionCategory\":\"经济学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ECONOMICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Choice Modelling","FirstCategoryId":"96","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1755534525000016","RegionNum":3,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ECONOMICS","Score":null,"Total":0}
Citations: 0

Abstract

This paper investigates the use of Shapley values-based methods to determine the importance of attributes in discrete choice models, specifically within a Multinomial Logit (MNL) framework. We extend the Shapley decomposition method of Shorrocks (2013) from linear models to the MNL setting. Additionally, the SHAP approach of Lundberg and Lee (2017) is applied to assess the impact of attributes on individual-level choice probability predictions. A simulation study demonstrates the effectiveness of these approaches under various experimental conditions, including attributes spanning several ranges and interaction terms. Finally, an empirical application is conducted using well-known travel mode choice datasets. The simulation results show that Shapley values accurately capture the global importance of attributes for goodness-of-fit. The SHAP method makes MNL model predictions transparent, clarifying how changes in attribute values influence choice probabilities for each decision-maker. These methods offer a perspective complementary to traditional metrics such as elasticities and relative importance analysis (Orme, 2006). In the empirical application, Shapley decomposition highlights the most relevant attributes, while SHAP values uncover individual-level impacts that might not be apparent through elasticities alone. Combining global and individual-level analysis offers a more comprehensive understanding of attribute importance. In summary, integrating Shapley values with traditional metrics ensures a robust analysis, aiding practitioners and policymakers in making informed decisions based on both broad trends and specific impacts.
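The SHAP idea summarised in the abstract can be sketched for an MNL model as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the two-alternative setup, coefficients, and baseline attribute values are hypothetical, and the Shapley values are computed by exact enumeration over attribute coalitions, which is only feasible for a handful of attributes.

```python
import math
from itertools import combinations

def mnl_probs(x, betas):
    """MNL choice probabilities: softmax over linear utilities beta_j . x."""
    utils = [sum(b * xi for b, xi in zip(beta, x)) for beta in betas]
    m = max(utils)  # stabilise the exponentials
    exps = [math.exp(u - m) for u in utils]
    z = sum(exps)
    return [e / z for e in exps]

def shap_values(x, baseline, betas, alt=0):
    """Exact Shapley attribution of P(choose `alt`) across attributes.

    v(S) = choice probability when the attributes in coalition S take their
    observed values and the rest are held at the baseline. Enumerates every
    coalition, so it is practical only for a small number of attributes."""
    n = len(x)

    def v(S):
        mixed = [x[i] if i in S else baseline[i] for i in range(n)]
        return mnl_probs(mixed, betas)[alt]

    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(len(others) + 1):
            w = math.factorial(k) * math.factorial(n - k - 1) / math.factorial(n)
            for S in combinations(others, k):
                phi[i] += w * (v(set(S) | {i}) - v(set(S)))
    return phi

# Hypothetical two-alternative, two-attribute example (e.g. cost and comfort).
betas = [[-0.1, 0.5], [-0.3, 0.2]]       # made-up MNL coefficients
x, baseline = [3.0, 1.0], [5.0, 0.0]     # observed vs. reference attribute values
phi = shap_values(x, baseline, betas, alt=0)
# Efficiency property: phi sums to P(alt | x) - P(alt | baseline).
```

By the efficiency axiom, the attribute contributions sum exactly to the change in choice probability between the observed and baseline attribute vectors, which is what makes them readable as individual-level impacts.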
Source journal: Journal of Choice Modelling · CiteScore: 4.10 · Self-citation rate: 12.50% · Annual articles: 31