Exploring Meta Learning: Parameterizing the Learning-to-learn Process for Image Classification

Chaehan So
{"title":"探索元学习:参数化图像分类的学习到学习过程","authors":"Chaehan So","doi":"10.1109/ICAIIC51459.2021.9415205","DOIUrl":null,"url":null,"abstract":"Meta-learning has emerged as a new paradigm in AI to challenge the limitation of conventional deep learning to acquire only task-specific knowledge. Meta-learning transcends this limitation by extracting the general concepts when learning tasks to apply these concepts later when learning new tasks. One popular meta-learning approach is model-agnostic meta-learning (MAML) which learns tasks by optimizing parameters towards highest generalizability of future tasks. The present paper applied a practical implementation of MAML to conduct an image classification task. Results showed that performance on learning new tasks neared training performance without overfitting. Furthermore, optimal values for inner-loop and outer-loop learning rate were close to default parameter values. Smaller batch sizes with more epochs improved learning in earlier epochs compared to larger batch sizes with fewer epochs. These findings show that MAML is able to transfer the concepts extracted during training effectively on to new tasks which it had not been trained on, similarly to how humans transfer knowledge.","PeriodicalId":432977,"journal":{"name":"2021 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)","volume":"69 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-04-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Exploring Meta Learning: Parameterizing the Learning-to-learn Process for Image Classification\",\"authors\":\"Chaehan So\",\"doi\":\"10.1109/ICAIIC51459.2021.9415205\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Meta-learning has emerged as a new paradigm in AI to challenge the limitation of conventional deep learning to acquire only task-specific knowledge. Meta-learning transcends this limitation by extracting the general concepts when learning tasks to apply these concepts later when learning new tasks. One popular meta-learning approach is model-agnostic meta-learning (MAML) which learns tasks by optimizing parameters towards highest generalizability of future tasks. The present paper applied a practical implementation of MAML to conduct an image classification task. Results showed that performance on learning new tasks neared training performance without overfitting. Furthermore, optimal values for inner-loop and outer-loop learning rate were close to default parameter values. Smaller batch sizes with more epochs improved learning in earlier epochs compared to larger batch sizes with fewer epochs. 
These findings show that MAML is able to transfer the concepts extracted during training effectively on to new tasks which it had not been trained on, similarly to how humans transfer knowledge.\",\"PeriodicalId\":432977,\"journal\":{\"name\":\"2021 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)\",\"volume\":\"69 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-04-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICAIIC51459.2021.9415205\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICAIIC51459.2021.9415205","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 4

Abstract

Meta-learning has emerged as a new paradigm in AI that challenges the limitation of conventional deep learning to acquiring only task-specific knowledge. Meta-learning transcends this limitation by extracting general concepts while learning tasks and applying these concepts later when learning new tasks. One popular meta-learning approach is model-agnostic meta-learning (MAML), which learns tasks by optimizing parameters toward the highest generalizability to future tasks. The present paper applied a practical implementation of MAML to an image classification task. Results showed that performance on learning new tasks approached training performance without overfitting. Furthermore, the optimal values for the inner-loop and outer-loop learning rates were close to the default parameter values. Smaller batch sizes with more epochs improved learning in earlier epochs compared to larger batch sizes with fewer epochs. These findings show that MAML can effectively transfer the concepts extracted during training onto new tasks it had not been trained on, similar to how humans transfer knowledge.
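The inner-loop and outer-loop learning rates mentioned in the abstract correspond to MAML's nested optimization: an inner loop adapts a copy of the parameters to each sampled task, and an outer loop updates the shared initialization so that this adaptation generalizes. Below is a minimal PyTorch sketch of that structure, not the paper's implementation; the toy task sampler `make_task`, the meta-batch size of 4, and the learning-rate values (set to the common MAML defaults of 0.01 inner and 0.001 outer, which the paper reports its optima were close to) are illustrative assumptions.

```python
# Minimal MAML sketch (illustrative, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_task(n_sup=10, n_qry=10, in_dim=8, n_classes=2):
    # Hypothetical task sampler: each task is a random linear decision
    # rule; support and query sets are drawn from the same task.
    w = torch.randn(in_dim, n_classes)
    def sample(n):
        x = torch.randn(n, in_dim)
        return x, (x @ w).argmax(dim=1)
    return sample(n_sup), sample(n_qry)

model = nn.Linear(8, 2)                  # shared initialization (meta-parameters)
inner_lr, outer_lr = 0.01, 0.001         # inner-loop / outer-loop learning rates
meta_opt = torch.optim.Adam(model.parameters(), lr=outer_lr)

for meta_step in range(100):
    meta_opt.zero_grad()
    for _ in range(4):                   # tasks per meta-batch (assumed)
        (x_sup, y_sup), (x_qry, y_qry) = make_task()
        # Inner loop: one gradient step on the support set; create_graph=True
        # keeps the graph so the outer loop can differentiate through the step.
        params = list(model.parameters())
        sup_loss = F.cross_entropy(model(x_sup), y_sup)
        grads = torch.autograd.grad(sup_loss, params, create_graph=True)
        adapted = [p - inner_lr * g for p, g in zip(params, grads)]
        # Outer loop: the query loss of the adapted parameters drives the
        # update of the shared initialization.
        qry_logits = F.linear(x_qry, adapted[0], adapted[1])
        F.cross_entropy(qry_logits, y_qry).backward()
    meta_opt.step()
```

The key design choice is `create_graph=True`: it lets gradients of the query loss flow back through the inner-loop update itself, which is what distinguishes MAML's learning-to-learn objective from simply fine-tuning per task.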