Adapting learning parameter transition in the Generalized Learning Vector Quantization family of classifiers

S. Bischoff, M. Mendenhall, Andrew Rice, J. Vasquez
{"title":"广义学习向量量化分类器族中的自适应学习参数转移","authors":"S. Bischoff, M. Mendenhall, Andrew Rice, J. Vasquez","doi":"10.1109/WHISPERS.2010.5594950","DOIUrl":null,"url":null,"abstract":"Many methods of hyperspectral data classification require the adjustment of learning parameters for their success. To this end, one may fix the learning parameters, offer a functional-based parameter decay, or use a step-wise decrement of the learning parameters after a fixed number of training steps. Each of the three methods described rely on the expertise of user and do not necessarily lend themselves well to time-sensitive solutions. Classification methods based on the optimization of a cost function offer a clear advantage as this cost function can be used to adapt the learning schedule of the learning machine thus speeding convergence. We demonstrate this concept applied to variants of Sato & Yamada's Generalized Learning Vector Quantization and transition to the next set of learn rates at the appropriate time in the learning process. Experiments show that, by monitoring the stationarity of the cost function, one can automatically transition to the next learning parameter set significantly decreasing training times.","PeriodicalId":193944,"journal":{"name":"2010 2nd Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing","volume":"2 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2010-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Adapting learning parameter transition in the Generalized Learning Vector Quantization family of classifiers\",\"authors\":\"S. Bischoff, M. Mendenhall, Andrew Rice, J. Vasquez\",\"doi\":\"10.1109/WHISPERS.2010.5594950\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Many methods of hyperspectral data classification require the adjustment of learning parameters for their success. To this end, one may fix the learning parameters, offer a functional-based parameter decay, or use a step-wise decrement of the learning parameters after a fixed number of training steps. Each of the three methods described rely on the expertise of user and do not necessarily lend themselves well to time-sensitive solutions. Classification methods based on the optimization of a cost function offer a clear advantage as this cost function can be used to adapt the learning schedule of the learning machine thus speeding convergence. We demonstrate this concept applied to variants of Sato & Yamada's Generalized Learning Vector Quantization and transition to the next set of learn rates at the appropriate time in the learning process. 
Experiments show that, by monitoring the stationarity of the cost function, one can automatically transition to the next learning parameter set significantly decreasing training times.\",\"PeriodicalId\":193944,\"journal\":{\"name\":\"2010 2nd Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing\",\"volume\":\"2 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2010-06-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2010 2nd Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/WHISPERS.2010.5594950\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2010 2nd Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/WHISPERS.2010.5594950","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4

Abstract

Many methods of hyperspectral data classification require the adjustment of learning parameters for their success. To this end, one may fix the learning parameters, apply a functional parameter decay, or decrement the learning parameters step-wise after a fixed number of training steps. Each of these three methods relies on the expertise of the user and does not necessarily lend itself well to time-sensitive solutions. Classification methods based on the optimization of a cost function offer a clear advantage, as the cost function can be used to adapt the learning schedule of the learning machine and thus speed convergence. We demonstrate this concept applied to variants of Sato & Yamada's Generalized Learning Vector Quantization, transitioning to the next set of learning rates at the appropriate time in the learning process. Experiments show that, by monitoring the stationarity of the cost function, one can automatically transition to the next learning parameter set, significantly decreasing training times.
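To make the idea concrete, the sketch below trains a standard GLVQ classifier (Sato & Yamada's cost) and advances through a predefined schedule of learning rates once the cost appears stationary over a sliding window. This is a minimal illustration, not the paper's implementation: the squared Euclidean distance, sigmoid cost, function names, and the windowed relative-change stationarity test with threshold `tol` are all assumptions made here for demonstration.

```python
import numpy as np

def glvq_cost_and_update(x, y, protos, labels, lr):
    """One GLVQ update for sample (x, y); returns the per-sample cost."""
    d = np.sum((protos - x) ** 2, axis=1)          # squared Euclidean distances
    same, diff = labels == y, labels != y
    j = np.where(same)[0][np.argmin(d[same])]      # nearest correct-class prototype
    k = np.where(diff)[0][np.argmin(d[diff])]      # nearest wrong-class prototype
    dj, dk = d[j], d[k]
    mu = (dj - dk) / (dj + dk)                     # relative difference measure
    f = 1.0 / (1.0 + np.exp(-mu))                  # sigmoid cost of this sample
    g = f * (1.0 - f)                              # sigmoid derivative
    denom = (dj + dk) ** 2
    protos[j] += lr * g * (dk / denom) * (x - protos[j])   # attract correct prototype
    protos[k] -= lr * g * (dj / denom) * (x - protos[k])   # repel wrong prototype
    return f

def train(X, y, protos, labels, lr_schedule, window=5, tol=1e-4, max_epochs=200):
    """Advance through lr_schedule when the windowed mean cost stops changing.

    The stationarity test (relative change between two adjacent windows of
    epoch costs below `tol`) is a hypothetical stand-in for the paper's
    monitoring criterion.
    """
    stage, history = 0, []
    for _ in range(max_epochs):
        order = np.random.permutation(len(X))
        cost = np.mean([glvq_cost_and_update(X[i], y[i], protos, labels,
                                             lr_schedule[stage]) for i in order])
        history.append(cost)
        if len(history) >= 2 * window:
            prev = np.mean(history[-2 * window:-window])
            curr = np.mean(history[-window:])
            if abs(prev - curr) < tol * max(abs(prev), 1e-12):  # stationary?
                stage += 1                                      # next learning rate
                history.clear()
                if stage == len(lr_schedule):
                    break                                       # schedule exhausted
    return protos
```

The point of the structure is that the transition times are driven by the observed cost rather than by a user-fixed epoch count, which is the behavior the abstract credits with the reduced training times.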