{"title":"A fault-tolerant and scalable boosting method over vertically partitioned data","authors":"Hai Jiang, Songtao Shang, Peng Liu, Tong Yi","doi":"10.1049/cit2.12339","DOIUrl":null,"url":null,"abstract":"<p>Vertical federated learning (VFL) can learn a common machine learning model over vertically partitioned datasets. However, VFL are faced with these thorny problems: (1) both the training and prediction are very vulnerable to stragglers; (2) most VFL methods can only support a specific machine learning model. Suppose that VFL incorporates the features of centralised learning, then the above issues can be alleviated. With that in mind, this paper proposes a new VFL scheme, called FedBoost, which makes private parties upload the compressed partial order relations to the honest but curious server before training and prediction. The server can build a machine learning model and predict samples on the union of coded data. The theoretical analysis indicates that the absence of any private party will not affect the training and prediction as long as a round of communication is achieved. Our scheme can support canonical tree-based models such as Tree Boosting methods and Random Forests. The experimental results also demonstrate the availability of our scheme.</p>","PeriodicalId":46211,"journal":{"name":"CAAI Transactions on Intelligence Technology","volume":"9 5","pages":"1092-1100"},"PeriodicalIF":8.4000,"publicationDate":"2024-06-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1049/cit2.12339","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"CAAI Transactions on Intelligence Technology","FirstCategoryId":"94","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1049/cit2.12339","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Vertical federated learning (VFL) can learn a common machine learning model over vertically partitioned datasets. However, VFL faces two thorny problems: (1) both training and prediction are highly vulnerable to stragglers; (2) most VFL methods support only a specific machine learning model. If VFL incorporated the features of centralised learning, these issues could be alleviated. With that in mind, this paper proposes a new VFL scheme, called FedBoost, in which private parties upload compressed partial order relations to an honest-but-curious server before training and prediction. The server can then build a machine learning model and predict samples on the union of the coded data. The theoretical analysis indicates that the absence of any private party does not affect training or prediction as long as one round of communication has been completed. Our scheme supports canonical tree-based models such as tree boosting methods and random forests. The experimental results also demonstrate the feasibility of our scheme.
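The abstract's key observation is that tree-based models split on the ordering of feature values rather than the values themselves, so uploading only order information can suffice for training. Below is a minimal sketch of that idea, not the paper's exact FedBoost protocol: the partial order relations are assumed to be realised as simple per-feature rank codes, and the names `FeatureParty` and `rank_encode` are illustrative inventions, as is the choice of scikit-learn's gradient boosting as the server-side tree model.

```python
# Illustrative sketch only: each private party replaces its raw feature columns
# with per-feature rank codes (one way to read "compressed partial order
# relations"), and the server trains a tree-based model on the union of the
# coded columns. FeatureParty and rank_encode are hypothetical names.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier


def rank_encode(column: np.ndarray) -> np.ndarray:
    """Map raw values to their ranks, keeping only the order information."""
    order = np.argsort(column, kind="stable")
    ranks = np.empty_like(order)
    ranks[order] = np.arange(len(column))
    return ranks


class FeatureParty:
    """A private party holding a vertical slice (a subset of feature columns)."""

    def __init__(self, features: np.ndarray):
        self.features = features

    def upload_coded_data(self) -> np.ndarray:
        # Only the ordering of each column leaves the party, not raw values.
        return np.column_stack([rank_encode(col) for col in self.features.T])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 200
    # Two parties hold disjoint feature subsets of the same samples.
    party_a = FeatureParty(rng.normal(size=(n, 3)))
    party_b = FeatureParty(rng.normal(size=(n, 2)))
    y = (party_a.features[:, 0] + party_b.features[:, 1] > 0).astype(int)

    # The server concatenates the coded uploads and fits a tree-based model.
    coded = np.hstack([party_a.upload_coded_data(), party_b.upload_coded_data()])
    model = GradientBoostingClassifier(n_estimators=50).fit(coded, y)
    print("training accuracy on coded data:", model.score(coded, y))
```

Because tree splits compare a feature against a threshold, replacing values by their ranks leaves the set of achievable splits unchanged; this is the intuition, not a claim about FedBoost's actual encoding, fault-tolerance mechanism, or privacy guarantees.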
About the journal:
CAAI Transactions on Intelligence Technology is a leading venue for original research on the theoretical and experimental aspects of artificial intelligence technology. We are a fully open access journal co-published by the Institution of Engineering and Technology (IET) and the Chinese Association for Artificial Intelligence (CAAI), providing research that is openly accessible to read and share worldwide.