Title: Learning to Instruct Learning
Authors: Dagui Chen, Feng Chen
Published in: Proceedings of the International Symposium on Big Data and Artificial Intelligence, 2018-12-29
DOI: 10.1145/3305275.3305288
Abstract: One reason deep neural networks require large amounts of data is that most current training methods are driven only by the task's goal information. We propose a novel instructor that guides networks toward learning useful abstractions. Because the instructor provides an additional learning signal, data efficiency is significantly improved. To obtain an appropriate instructor, we design a generative instructor mechanism that supports learning an instructor generator across multiple tasks. Using fast weights, the generator can produce a corresponding instructor for each task. Experimental results demonstrate the efficiency and robustness of the generated instructor, and the generator also exhibits properties related to continual learning.
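The abstract describes a generator that produces task-specific "fast weights" for an instructor module. The paper's architecture is not detailed here, so the following is only a minimal hypernetwork-style sketch of that idea; all names, shapes, and the linear generator form are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: a slow-weight generator maps a task embedding to the
# fast weights of a task-specific instructor layer. Shapes are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

EMB_DIM = 4              # size of the task embedding (assumed)
IN_DIM, OUT_DIM = 8, 3   # instructor input/output sizes (assumed)

# Slow weights of the generator: a linear map from a task embedding to the
# flattened parameters (fast weights) of the instructor.
G = rng.normal(scale=0.1, size=(IN_DIM * OUT_DIM, EMB_DIM))

def generate_instructor(task_embedding):
    """Produce fast weights for one task's instructor from its embedding."""
    return (G @ task_embedding).reshape(OUT_DIM, IN_DIM)

def instruct(fast_w, features):
    """The generated instructor maps network features to a guidance signal."""
    return np.tanh(fast_w @ features)

# Different task embeddings yield different instructors from the same generator.
task_a = rng.normal(size=EMB_DIM)
features = rng.normal(size=IN_DIM)
signal = instruct(generate_instructor(task_a), features)
print(signal.shape)
```

In a full system, the generator `G` would itself be trained across multiple tasks while the instructor's guidance signal shapes the main network's representations; here only the weight-generation step is shown.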