{"title":"2 .无源通用域自适应的熵判别和能量优化","authors":"Meng Shen, A. J. Ma, PongChi Yuen","doi":"10.1109/ICME55011.2023.00460","DOIUrl":null,"url":null,"abstract":"Universal domain adaptation (UniDA) transfers knowledge under both distribution and category shifts. Most UniDA methods accessible to source-domain data during model adaptation may result in privacy policy violation and source-data transfer inefficiency. To address this issue, we propose a novel source-free UniDA method coupling confidence-guided entropy discrimination and likelihood-induced energy optimization. The entropy-based separation of target-known and unknown classes is too conservative for known-class prediction. Thus, we derive the confidence-guided entropy by scaling the normalized prediction score with the known-class confidence, that more known-class samples are correctly predicted. Due to difficult estimation of the marginal distribution without source-domain data, we constrain the target-domain marginal distribution by maximizing (minimizing) the known (unknown)-class likelihood, which equals free energy optimization. Theoretically, the overall optimization amounts to decreasing and increasing internal energy of known and unknown classes in physics, respectively. Extensive experiments demonstrate the superiority of the proposed method.","PeriodicalId":321830,"journal":{"name":"2023 IEEE International Conference on Multimedia and Expo (ICME)","volume":"10 2","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"E2: Entropy Discrimination and Energy Optimization for Source-free Universal Domain Adaptation\",\"authors\":\"Meng Shen, A. J. Ma, PongChi Yuen\",\"doi\":\"10.1109/ICME55011.2023.00460\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Universal domain adaptation (UniDA) transfers knowledge under both distribution and category shifts. Most UniDA methods accessible to source-domain data during model adaptation may result in privacy policy violation and source-data transfer inefficiency. To address this issue, we propose a novel source-free UniDA method coupling confidence-guided entropy discrimination and likelihood-induced energy optimization. The entropy-based separation of target-known and unknown classes is too conservative for known-class prediction. Thus, we derive the confidence-guided entropy by scaling the normalized prediction score with the known-class confidence, that more known-class samples are correctly predicted. Due to difficult estimation of the marginal distribution without source-domain data, we constrain the target-domain marginal distribution by maximizing (minimizing) the known (unknown)-class likelihood, which equals free energy optimization. Theoretically, the overall optimization amounts to decreasing and increasing internal energy of known and unknown classes in physics, respectively. 
Extensive experiments demonstrate the superiority of the proposed method.\",\"PeriodicalId\":321830,\"journal\":{\"name\":\"2023 IEEE International Conference on Multimedia and Expo (ICME)\",\"volume\":\"10 2\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 IEEE International Conference on Multimedia and Expo (ICME)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICME55011.2023.00460\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE International Conference on Multimedia and Expo (ICME)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICME55011.2023.00460","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Universal domain adaptation (UniDA) transfers knowledge under both distribution and category shifts. Most UniDA methods require access to source-domain data during model adaptation, which may violate privacy policies and incur inefficient source-data transfer. To address this issue, we propose a novel source-free UniDA method that couples confidence-guided entropy discrimination with likelihood-induced energy optimization. Entropy-based separation of target samples into known and unknown classes is too conservative for known-class prediction. We therefore derive a confidence-guided entropy by scaling the normalized prediction score with the known-class confidence, so that more known-class samples are correctly predicted. Because the marginal distribution is difficult to estimate without source-domain data, we constrain the target-domain marginal distribution by maximizing the known-class likelihood and minimizing the unknown-class likelihood, which is equivalent to free-energy optimization. Theoretically, the overall optimization amounts to decreasing the internal energy of known classes and increasing that of unknown classes, in the physical sense. Extensive experiments demonstrate the superiority of the proposed method.
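As a rough illustration of the ideas named in the abstract, the sketch below computes a confidence-guided entropy score (normalized entropy scaled by a known-class confidence proxy) and a free-energy-style score from classifier logits. The confidence proxy (maximum softmax probability), the threshold, and all function and variable names are assumptions for illustration only, not the authors' implementation.

```python
# Hypothetical sketch, NOT the paper's released code: confidence-guided entropy
# for known/unknown separation and an energy-style score from logits.
import numpy as np

def normalized_entropy(probs: np.ndarray) -> np.ndarray:
    """Entropy of each softmax row, normalized to [0, 1] by log(C)."""
    eps = 1e-12
    ent = -np.sum(probs * np.log(probs + eps), axis=1)
    return ent / np.log(probs.shape[1])

def confidence_guided_entropy(probs: np.ndarray) -> np.ndarray:
    """Scale the normalized entropy by a known-class confidence proxy
    (here, the maximum softmax probability) so that confident samples
    receive lower scores and are more readily kept as known classes."""
    confidence = probs.max(axis=1)  # assumed proxy for known-class confidence
    return (1.0 - confidence) * normalized_entropy(probs)

def split_known_unknown(probs: np.ndarray, threshold: float = 0.25):
    """Mark samples below the (assumed) threshold as known, the rest as unknown."""
    score = confidence_guided_entropy(probs)
    return score < threshold, score

def free_energy(logits: np.ndarray) -> np.ndarray:
    """Energy score E(x) = -log sum_c exp(logit_c), one assumed reading of the
    likelihood-induced energy objective: known-class samples are pushed toward
    lower energy, unknown-class samples toward higher energy."""
    m = logits.max(axis=1, keepdims=True)
    return -(m.squeeze(1) + np.log(np.exp(logits - m).sum(axis=1)))

# Toy usage: three target samples scored against 5 source classes.
logits = np.array([[4.0, 0.1, 0.1, 0.1, 0.1],   # confident, likely known
                   [1.0, 0.9, 0.8, 0.7, 0.6],   # ambiguous, likely unknown
                   [2.5, 0.2, 0.2, 0.2, 0.2]])
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
is_known, scores = split_known_unknown(probs)
print(is_known, scores.round(3), free_energy(logits).round(3))
```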