{"title":"用于无监督领域适应的具有判别转移特征和实例选择功能的核极端学习机","authors":"Shaofei Zang, Huimin Li, Nannan Lu, Chao Ma, Jiwei Gao, Jianwei Ma, Jinfeng Lv","doi":"10.1007/s11063-024-11677-y","DOIUrl":null,"url":null,"abstract":"<p>The goal of domain adaptation (DA) is to develop a robust decision model on the source domain effectively generalize to the target domain data. State-of-the-art domain adaptation methods typically focus on finding an optimal inter-domain invariant feature representation or helpful instances from the source domain. In this paper, we propose a kernel extreme learning machine with discriminative transfer features and instance selection (KELM-DTF-IS) for unsupervised domain adaptation tasks, which consists of two steps: discriminative transfer feature extraction and classification with instance selection. At the feature extraction stage, we extend cross domain mean approximation(CDMA) by incorporating a penalty term and develop discriminative cross domain mean approximation (d-CDMA) to optimize the category separability between instances. Subsequently, d-CDMA is integrated into kernel ELM-AutoEncoder(KELM-AE) for extracting inter-domain invariant features. During the classification process, our approach uses CDMA metrics to compute a weights to each source instances based on their impact in reducing distribution differences between domains. Instances with a greater effect receive higher weights and vice versa. These weights are then used to distinguish and select source domain instances before incorporating them into weight KELM for proposing an adaptive classifier. Finally, we apply our approach to conduct classification experiments on publicly available domain adaptation datasets, and the results demonstrate its superiority over KELM and numerous other domain adaptation approaches.\n</p>","PeriodicalId":51144,"journal":{"name":"Neural Processing Letters","volume":"15 1","pages":""},"PeriodicalIF":2.6000,"publicationDate":"2024-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Kernel Extreme Learning Machine with Discriminative Transfer Feature and Instance Selection for Unsupervised Domain Adaptation\",\"authors\":\"Shaofei Zang, Huimin Li, Nannan Lu, Chao Ma, Jiwei Gao, Jianwei Ma, Jinfeng Lv\",\"doi\":\"10.1007/s11063-024-11677-y\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>The goal of domain adaptation (DA) is to develop a robust decision model on the source domain effectively generalize to the target domain data. State-of-the-art domain adaptation methods typically focus on finding an optimal inter-domain invariant feature representation or helpful instances from the source domain. In this paper, we propose a kernel extreme learning machine with discriminative transfer features and instance selection (KELM-DTF-IS) for unsupervised domain adaptation tasks, which consists of two steps: discriminative transfer feature extraction and classification with instance selection. At the feature extraction stage, we extend cross domain mean approximation(CDMA) by incorporating a penalty term and develop discriminative cross domain mean approximation (d-CDMA) to optimize the category separability between instances. Subsequently, d-CDMA is integrated into kernel ELM-AutoEncoder(KELM-AE) for extracting inter-domain invariant features. 
During the classification process, our approach uses CDMA metrics to compute a weights to each source instances based on their impact in reducing distribution differences between domains. Instances with a greater effect receive higher weights and vice versa. These weights are then used to distinguish and select source domain instances before incorporating them into weight KELM for proposing an adaptive classifier. Finally, we apply our approach to conduct classification experiments on publicly available domain adaptation datasets, and the results demonstrate its superiority over KELM and numerous other domain adaptation approaches.\\n</p>\",\"PeriodicalId\":51144,\"journal\":{\"name\":\"Neural Processing Letters\",\"volume\":\"15 1\",\"pages\":\"\"},\"PeriodicalIF\":2.6000,\"publicationDate\":\"2024-08-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neural Processing Letters\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1007/s11063-024-11677-y\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Processing Letters","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s11063-024-11677-y","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Kernel Extreme Learning Machine with Discriminative Transfer Feature and Instance Selection for Unsupervised Domain Adaptation
The goal of domain adaptation (DA) is to learn a robust decision model on the source domain that generalizes effectively to the target domain data. State-of-the-art domain adaptation methods typically focus on finding an optimal domain-invariant feature representation or on selecting helpful instances from the source domain. In this paper, we propose a kernel extreme learning machine with discriminative transfer features and instance selection (KELM-DTF-IS) for unsupervised domain adaptation tasks, which consists of two steps: discriminative transfer feature extraction, and classification with instance selection. At the feature extraction stage, we extend cross domain mean approximation (CDMA) by incorporating a penalty term and develop discriminative cross domain mean approximation (d-CDMA) to improve the category separability between instances. Subsequently, d-CDMA is integrated into a kernel ELM-AutoEncoder (KELM-AE) for extracting domain-invariant features. During the classification process, our approach uses the CDMA metric to compute a weight for each source instance based on its contribution to reducing the distribution difference between domains. Instances with a greater effect receive higher weights, and vice versa. These weights are then used to distinguish and select source domain instances before incorporating them into a weighted KELM to build an adaptive classifier. Finally, we apply our approach to classification experiments on publicly available domain adaptation datasets, and the results demonstrate its superiority over KELM and numerous other domain adaptation approaches.
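To make the second step more concrete, below is a minimal Python/NumPy sketch of the general idea of instance weighting followed by a weighted kernel ELM classifier. The weighting rule (larger weight for source instances closer to the target-domain mean) and all function names are illustrative assumptions, not the authors' exact CDMA formulation; the weighted KELM solution shown is one common formulation, alpha = (I/C + W K)^(-1) W T, rather than the specific KELM-DTF-IS model.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """RBF (Gaussian) kernel matrix between the rows of A and the rows of B."""
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def instance_weights(X_src, X_tgt):
    """Illustrative weighting (assumption, not the paper's CDMA weights):
    source instances closer to the target-domain mean are treated as more
    helpful for reducing the distribution gap and get larger weights."""
    d = np.linalg.norm(X_src - X_tgt.mean(axis=0), axis=1)
    w = np.exp(-d / (d.mean() + 1e-12))   # smaller distance -> larger weight
    return w / w.max()                     # normalize to (0, 1]

def weighted_kelm_train(X_src, y_src, w, C=10.0, gamma=1.0):
    """Weighted kernel ELM: solve (I/C + W K) alpha = W T for the
    kernel-space coefficients, with W = diag(instance weights)."""
    n = X_src.shape[0]
    classes = np.unique(y_src)
    T = (y_src[:, None] == classes[None, :]).astype(float) * 2 - 1  # targets in {-1, +1}
    K = rbf_kernel(X_src, X_src, gamma)
    W = np.diag(w)
    alpha = np.linalg.solve(np.eye(n) / C + W @ K, W @ T)
    return alpha, classes

def weighted_kelm_predict(X_new, X_src, alpha, classes, gamma=1.0):
    scores = rbf_kernel(X_new, X_src, gamma) @ alpha
    return classes[np.argmax(scores, axis=1)]

# Toy usage: two Gaussian classes in the source domain, a shifted target domain.
rng = np.random.default_rng(0)
X_src = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y_src = np.array([0] * 50 + [1] * 50)
X_tgt = X_src + rng.normal(0.5, 0.2, X_src.shape)

w = instance_weights(X_src, X_tgt)
alpha, classes = weighted_kelm_train(X_src, y_src, w)
y_pred = weighted_kelm_predict(X_tgt, X_src, alpha, classes)
```

In this sketch the per-instance weights simply rescale each source sample's contribution to the regularized least-squares fit, so poorly transferable instances influence the decision boundary less; the paper additionally performs d-CDMA feature extraction with KELM-AE before this classification step.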
Journal Introduction:
Neural Processing Letters is an international journal publishing research results and innovative ideas on all aspects of artificial neural networks. Coverage includes theoretical developments, biological models, new formal modes, learning, applications, software and hardware developments, and prospective research.
The journal promotes fast exchange of information in the community of neural network researchers and users. The resurgence of interest in the field of artificial neural networks since the beginning of the 1980s has been coupled with tremendous research activity in specialized or multidisciplinary groups. Research, however, is not possible without good communication between people and the exchange of information, especially in a field covering such different areas; fast communication is also a key aspect, and this is the reason for Neural Processing Letters