From Learning to Meta-Learning: Reduced Training Overhead and Complexity for Communication Systems

O. Simeone, Sangwoo Park, Joonhyuk Kang
{"title":"From Learning to Meta-Learning: Reduced Training Overhead and Complexity for Communication Systems","authors":"O. Simeone, Sangwoo Park, Joonhyuk Kang","doi":"10.1109/6GSUMMIT49458.2020.9083856","DOIUrl":null,"url":null,"abstract":"Machine learning methods adapt the parameters of a model, constrained to lie in a given model class, by using a fixed learning procedure based on data or active observations. Adaptation is done on a per-task basis, and retraining is needed when the system configuration changes. The resulting inefficiency in terms of data and training time requirements can be mitigated, if domain knowledge is available, by selecting a suitable model class and learning procedure, collectively known as inductive bias. However, it is generally difficult to encode prior knowledge into an inductive bias, particularly with black-box model classes such as neural networks. Meta-learning provides a way to automatize the selection of an inductive bias. Meta-learning leverages data or active observations from tasks that are expected to be related to future, and a priori unknown, tasks of interest. With a meta-trained inductive bias, training of a machine learning model can be potentially carried out with reduced training data and/or time complexity. This paper provides a high-level introduction to meta-learning with applications to communication systems.","PeriodicalId":385212,"journal":{"name":"2020 2nd 6G Wireless Summit (6G SUMMIT)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-01-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"50","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 2nd 6G Wireless Summit (6G SUMMIT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/6GSUMMIT49458.2020.9083856","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 50

Abstract

Machine learning methods adapt the parameters of a model, constrained to lie in a given model class, by using a fixed learning procedure based on data or active observations. Adaptation is done on a per-task basis, and retraining is needed when the system configuration changes. The resulting inefficiency in terms of data and training time requirements can be mitigated, if domain knowledge is available, by selecting a suitable model class and learning procedure, collectively known as inductive bias. However, it is generally difficult to encode prior knowledge into an inductive bias, particularly with black-box model classes such as neural networks. Meta-learning provides a way to automatize the selection of an inductive bias. Meta-learning leverages data or active observations from tasks that are expected to be related to future, and a priori unknown, tasks of interest. With a meta-trained inductive bias, training of a machine learning model can be potentially carried out with reduced training data and/or time complexity. This paper provides a high-level introduction to meta-learning with applications to communication systems.
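To make the above concrete, the following is a minimal, self-contained sketch of the meta-learning idea applied to a toy communication task. It uses NumPy and first-order MAML-style updates to meta-learn the initialization of a one-parameter linear equalizer across a family of randomly drawn single-tap channels; the task model, pilot counts, and step sizes are illustrative assumptions and are not the specific algorithms or settings considered in the paper.

# Toy first-order MAML-style sketch (NumPy only): meta-learn an initialization
# that adapts to a new fading channel from a handful of pilots. Illustrative
# assumptions throughout; not the paper's exact setup.
import numpy as np

rng = np.random.default_rng(0)

def sample_task():
    """Draw a random channel gain h for a new task (toy fading model)."""
    return 1.0 + 0.5 * rng.standard_normal()

def sample_pilots(h, n, noise_std=0.1):
    """Send n BPSK pilots x over channel h and observe y = h*x + noise."""
    x = rng.choice([-1.0, 1.0], size=n)
    y = h * x + noise_std * rng.standard_normal(n)
    return x, y

def loss_and_grad(w, x, y):
    """Squared error of the linear predictor w*x and its gradient w.r.t. w."""
    err = y - w * x
    return np.mean(err ** 2), -2.0 * np.mean(x * err)

alpha, beta = 0.1, 0.05   # inner (per-task) and outer (meta) step sizes
w_meta = 0.0              # meta-learned initialization (the inductive bias)

# Meta-training: learn an initialization that adapts well after ONE inner step.
for it in range(2000):
    h = sample_task()
    x_tr, y_tr = sample_pilots(h, n=4)    # few-shot adaptation pilots
    x_te, y_te = sample_pilots(h, n=32)   # task-level validation data
    _, g_inner = loss_and_grad(w_meta, x_tr, y_tr)
    w_task = w_meta - alpha * g_inner     # inner adaptation step
    _, g_outer = loss_and_grad(w_task, x_te, y_te)
    w_meta -= beta * g_outer              # first-order outer (meta) update

# Deployment on a new, unseen channel: adapt from w_meta vs. from scratch.
h_new = sample_task()
x_tr, y_tr = sample_pilots(h_new, n=4)
x_te, y_te = sample_pilots(h_new, n=1000)
for w0, name in [(w_meta, "meta-learned init"), (0.0, "zero init")]:
    _, g = loss_and_grad(w0, x_tr, y_tr)
    w_adapted = w0 - alpha * g
    mse, _ = loss_and_grad(w_adapted, x_te, y_te)
    print(f"{name}: test MSE after one adaptation step = {mse:.4f}")

In this sketch the meta-learned initialization plays the role of the inductive bias: it roughly encodes the average channel observed during meta-training, so a single gradient step on only four pilots already yields a lower test error on a new channel than the same step taken from an uninformed (zero) initialization.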