{"title":"Transitive Transfer Learning","authors":"Ben Tan, Yangqiu Song, Erheng Zhong, Qiang Yang","doi":"10.1145/2783258.2783295","DOIUrl":null,"url":null,"abstract":"Transfer learning, which leverages knowledge from source domains to enhance learning ability in a target domain, has been proven effective in various applications. One major limitation of transfer learning is that the source and target domains should be directly related. If there is little overlap between the two domains, performing knowledge transfer between these domains will not be effective. Inspired by human transitive inference and learning ability, whereby two seemingly unrelated concepts can be connected by a string of intermediate bridges using auxiliary concepts, in this paper we study a novel learning problem: Transitive Transfer Learning (abbreviated to TTL). TTL is aimed at breaking the large domain distances and transfer knowledge even when the source and target domains share few factors directly. For example, when the source and target domains are documents and images respectively, TTL could use some annotated images as the intermediate domain to bridge them. To solve the TTL problem, we propose a learning framework to mimic the human learning process. The framework is composed of an intermediate domain selection component and a knowledge transfer component. Extensive empirical evidence shows that the framework yields state-of-the-art classification accuracies on several classification data sets.","PeriodicalId":243428,"journal":{"name":"Proceedings of the 21th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining","volume":"55 3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-08-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"163","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 21th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2783258.2783295","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 163
Abstract
Transfer learning, which leverages knowledge from source domains to enhance learning ability in a target domain, has been proven effective in various applications. One major limitation of transfer learning is that the source and target domains should be directly related. If there is little overlap between the two domains, knowledge transfer between them will not be effective. Inspired by human transitive inference and learning ability, whereby two seemingly unrelated concepts can be connected through a string of intermediate bridges built from auxiliary concepts, in this paper we study a novel learning problem: Transitive Transfer Learning (abbreviated to TTL). TTL aims to bridge large domain distances and transfer knowledge even when the source and target domains share few factors directly. For example, when the source and target domains are documents and images respectively, TTL could use some annotated images as an intermediate domain to bridge them. To solve the TTL problem, we propose a learning framework that mimics the human learning process. The framework is composed of an intermediate-domain selection component and a knowledge-transfer component. Extensive empirical evidence shows that the framework yields state-of-the-art classification accuracies on several classification data sets.
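The abstract describes a two-component framework: first select an intermediate domain that is related to both a weakly related source and target, then transfer knowledge through that bridge in two hops. The sketch below is only a minimal illustration of that idea under simplifying assumptions (all domains already share a feature space, domain relatedness is measured with a proxy A-distance, and the two-hop transfer is plain pseudo-labelling); the helpers domain_distance, select_intermediate, and two_hop_transfer are hypothetical names and do not reproduce the authors' actual algorithm.

```python
# Hedged sketch of the TTL idea: (1) pick an intermediate domain close to both
# the source and the target, (2) transfer knowledge source -> bridge -> target.
# All function names and the toy data are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

def domain_distance(Xa, Xb):
    """Proxy A-distance: train a classifier to tell the two domains apart;
    high separability means the domains are far apart (lower is closer)."""
    X = np.vstack([Xa, Xb])
    y = np.r_[np.zeros(len(Xa)), np.ones(len(Xb))]
    clf = LogisticRegression(max_iter=1000).fit(X, y)
    err = 1.0 - clf.score(X, y)           # high error => domains overlap
    return 2.0 * (1.0 - 2.0 * err)

def select_intermediate(X_src, X_tgt, candidates):
    """Component 1 (sketch): choose the candidate closest to both ends."""
    scores = [domain_distance(X_src, Xc) + domain_distance(Xc, X_tgt)
              for Xc in candidates]
    return int(np.argmin(scores))

def two_hop_transfer(X_src, y_src, X_mid, X_tgt_unlabeled):
    """Component 2 (sketch): train on the source, pseudo-label the bridge
    domain, retrain on the bridge, then predict on the target."""
    hop1 = LogisticRegression(max_iter=1000).fit(X_src, y_src)
    y_mid = hop1.predict(X_mid)                      # labels for the bridge
    hop2 = LogisticRegression(max_iter=1000).fit(X_mid, y_mid)
    return hop2.predict(X_tgt_unlabeled)

# Toy usage with random data standing in for documents / annotated images / images.
rng = np.random.default_rng(0)
X_src, y_src = rng.normal(0, 1, (200, 20)), rng.integers(0, 2, 200)
X_tgt = rng.normal(3, 1, (100, 20))
candidates = [rng.normal(m, 1, (150, 20)) for m in (1.5, 5.0)]

mid = select_intermediate(X_src, X_tgt, candidates)
preds = two_hop_transfer(X_src, y_src, candidates[mid], X_tgt)
```

In the paper's document-to-image example, the candidate bridges would be sets of annotated images (image-text pairs) rather than same-space feature matrices; mapping heterogeneous domains into a shared representation is a substantial part of the actual problem that this toy sketch deliberately skips.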