Federated learning and information sharing between competitors with different training effectiveness

Jiajun Meng, Jing Chen, Dongfang Zhao
{"title":"训练效果不同的竞争对手之间的联合学习和信息共享","authors":"Jiajun Meng ,&nbsp;Jing Chen ,&nbsp;Dongfang Zhao","doi":"10.1016/j.ject.2024.12.003","DOIUrl":null,"url":null,"abstract":"<div><div>Federated Learning (FL) is an innovative technique that allows multiple firms to collaborate in training machine learning models while preserving data privacy. This is especially important in industries where data is sensitive or subject to regulations like the General Data Protection Regulation (GDPR). Despite its substantial benefits, the adoption of FL in competitive markets faces significant challenges, particularly due to concerns about training effectiveness and price competition. In practice, data from different firms may not be independently and identically distributed (non-IID) and heterogenous, which can lead to differences in model training effectiveness when aggregated through FL. This paper explores how initial product quality, data volume, and training effectiveness affect the formation of FL. We develop a theoretical model to analyze firms’ decisions between adopting machine learning (ML) independently or collaborating through FL. Our results show that when the initial product quality is high, FL can never be formed. Moreover, when the initial product quality is low, and when data volume is low and firms’ training effectiveness differences are small, FL is more likely to form. This is because the competition intensification effect is dominated by the market expansion effect of FL. However, when there is a significant difference in training effectiveness, firms are less likely to adopt FL due to concerns about competitive disadvantage (i.e., the market expansion effect is dominated by the competition intensification effect). This paper contributes to the literature on FL by addressing the strategic decisions firms face in competitive markets and providing insights into how FL designers and policymakers can encourage the formation of FL.</div></div>","PeriodicalId":100776,"journal":{"name":"Journal of Economy and Technology","volume":"3 ","pages":"Pages 1-9"},"PeriodicalIF":0.0000,"publicationDate":"2025-01-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Federated learning and information sharing between competitors with different training effectiveness\",\"authors\":\"Jiajun Meng ,&nbsp;Jing Chen ,&nbsp;Dongfang Zhao\",\"doi\":\"10.1016/j.ject.2024.12.003\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Federated Learning (FL) is an innovative technique that allows multiple firms to collaborate in training machine learning models while preserving data privacy. This is especially important in industries where data is sensitive or subject to regulations like the General Data Protection Regulation (GDPR). Despite its substantial benefits, the adoption of FL in competitive markets faces significant challenges, particularly due to concerns about training effectiveness and price competition. In practice, data from different firms may not be independently and identically distributed (non-IID) and heterogenous, which can lead to differences in model training effectiveness when aggregated through FL. This paper explores how initial product quality, data volume, and training effectiveness affect the formation of FL. We develop a theoretical model to analyze firms’ decisions between adopting machine learning (ML) independently or collaborating through FL. 
Our results show that when the initial product quality is high, FL can never be formed. Moreover, when the initial product quality is low, and when data volume is low and firms’ training effectiveness differences are small, FL is more likely to form. This is because the competition intensification effect is dominated by the market expansion effect of FL. However, when there is a significant difference in training effectiveness, firms are less likely to adopt FL due to concerns about competitive disadvantage (i.e., the market expansion effect is dominated by the competition intensification effect). This paper contributes to the literature on FL by addressing the strategic decisions firms face in competitive markets and providing insights into how FL designers and policymakers can encourage the formation of FL.</div></div>\",\"PeriodicalId\":100776,\"journal\":{\"name\":\"Journal of Economy and Technology\",\"volume\":\"3 \",\"pages\":\"Pages 1-9\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2025-01-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Economy and Technology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2949948825000046\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Economy and Technology","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2949948825000046","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Federated Learning (FL) is an innovative technique that allows multiple firms to collaborate in training machine learning models while preserving data privacy. This is especially important in industries where data is sensitive or subject to regulations such as the General Data Protection Regulation (GDPR). Despite its substantial benefits, the adoption of FL in competitive markets faces significant challenges, particularly due to concerns about training effectiveness and price competition. In practice, data from different firms may be non-independently and identically distributed (non-IID) and heterogeneous, which can lead to differences in model training effectiveness when aggregated through FL. This paper explores how initial product quality, data volume, and training effectiveness affect the formation of FL. We develop a theoretical model to analyze firms' decisions between adopting machine learning (ML) independently and collaborating through FL. Our results show that when the initial product quality is high, FL is never formed. Moreover, when the initial product quality is low, data volume is low, and firms' differences in training effectiveness are small, FL is more likely to form, because the competition intensification effect is dominated by the market expansion effect of FL. However, when there is a significant difference in training effectiveness, firms are less likely to adopt FL due to concerns about competitive disadvantage (i.e., the market expansion effect is dominated by the competition intensification effect). This paper contributes to the literature on FL by addressing the strategic decisions firms face in competitive markets and providing insights into how FL designers and policymakers can encourage the formation of FL.
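For readers unfamiliar with how aggregation over firms' data works in practice, the sketch below shows a standard FedAvg-style weighted average, in which each participant's locally trained parameters are combined in proportion to its data volume, and non-IID local data appears as divergent local updates. This is a generic illustration under stated assumptions, not the paper's model: the `fedavg` helper, the two firms, and all parameter and data-volume values are hypothetical.

```python
# Minimal FedAvg-style aggregation sketch (illustrative only, not the paper's model).
# All firm names, data volumes, and parameter values below are hypothetical.
import numpy as np

def fedavg(local_models: list[np.ndarray], data_volumes: list[int]) -> np.ndarray:
    """Aggregate local model parameters, weighting each firm by its data volume."""
    weights = np.array(data_volumes, dtype=float)
    weights /= weights.sum()                   # normalize weights to sum to 1
    stacked = np.stack(local_models)           # shape: (n_firms, n_params)
    return (weights[:, None] * stacked).sum(axis=0)

# Two hypothetical firms with non-IID local data: their locally trained
# parameters differ (firm B's are shifted), and so do their data volumes.
rng = np.random.default_rng(0)
firm_a = rng.normal(loc=0.0, scale=1.0, size=5)   # firm A's local parameters
firm_b = rng.normal(loc=0.5, scale=1.0, size=5)   # firm B's local parameters (non-IID shift)

global_model = fedavg([firm_a, firm_b], data_volumes=[1_000, 4_000])
print(global_model)  # firm B's larger data volume pulls the aggregate toward its update
```

Weighting by data volume is the conventional FedAvg choice; under non-IID data and unequal volumes, the aggregate model sits closer to the larger firm's update, which is one way differences in training effectiveness can arise for the participants.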