An active learning method under very limited initial labeled data

Yue Zhao, Q. Ji
2010 IEEE International Conference on Automation and Logistics, September 23, 2010. DOI: 10.1109/ICAL.2010.5585339. Citations: 0

Abstract

Active learning methods seek to reduce the number of labeled instances needed to train an effective classifier. Most current methods assume the availability of a reasonable amount of initially labeled training data, so that the learners can be trained with sufficient quality. However, for many applications the amount of initial training data is limited, which degrades the quality of the initial learners and, in turn, the performance of the active learning methods. In this paper, we introduce a new non-parametric active learning strategy that performs well even under very limited initial training data. Our method selects the query instance that simultaneously maximizes its label uncertainty and the classification accuracy on the unlabeled test data. Our method hence avoids selecting outliers and does not require a good initial learner. Experimental results on benchmark datasets show that our method outperforms state-of-the-art methods, especially when the amount of initially labeled data is small or its quality is poor.
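The abstract's selection criterion combines label uncertainty with classification accuracy on the unlabeled pool. The uncertainty half of that criterion can be illustrated with generic entropy-based uncertainty sampling (a minimal sketch of that standard technique, not the paper's exact non-parametric criterion; the function names and toy probabilities below are illustrative):

```python
import numpy as np

def label_entropy(probs):
    """Shannon entropy of each row of class-probability estimates."""
    p = np.clip(probs, 1e-12, 1.0)  # avoid log(0)
    return -np.sum(p * np.log(p), axis=1)

def select_query(probs):
    """Index of the pool instance whose predicted label is most uncertain."""
    return int(np.argmax(label_entropy(probs)))

# Toy pool: class-probability estimates for four unlabeled instances.
pool_probs = np.array([
    [0.95, 0.05],  # confident prediction
    [0.50, 0.50],  # maximally uncertain -> queried
    [0.80, 0.20],
    [0.65, 0.35],
])
print(select_query(pool_probs))  # prints 1
```

Plain uncertainty sampling like this tends to query outliers, since an instance can be uncertain simply because it is atypical; the paper's second term (expected accuracy on the unlabeled data) is what counters that tendency.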