Simple tag-based subclass representations for visually-varied image classes

Xinchao Li, Peng Xu, Yue Shi, M. Larson, A. Hanjalic
DOI: 10.1109/CBMI.2016.7500265
Published in: 2016 14th International Workshop on Content-Based Multimedia Indexing (CBMI), 2016-06-30
Citations: 0

Abstract

In this paper, we present a subclass-representation approach that predicts the probability of a social image belonging to a particular class. We explore the co-occurrence of user-contributed tags to find subclasses with a strong connection to the top-level class. We then project each image onto the resulting subclass space, generating a subclass representation for the image. The advantage of our tag-based subclasses is that they have a chance of being more visually stable and easier to model than the top-level classes. Our contribution is to demonstrate that a simple and inexpensive method for generating subclass representations is able to improve classification results for tag classes that are visually highly heterogeneous. The approach is evaluated on a set of 1 million photos with 10 top-level classes, from the dataset released by the ACM Multimedia 2013 Yahoo! Large-scale Flickr-tag Image Classification Grand Challenge. Experiments show that the proposed system delivers sound performance for visually diverse classes compared with methods that directly model the top-level classes.
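The abstract's pipeline (find tags that co-occur with a top-level class, treat them as subclasses, then project each image's tags onto that subclass space) can be illustrated with a toy sketch. This is only a minimal interpretation under assumptions: the paper's actual co-occurrence weighting, subclass count, and projection are not specified in the abstract, so raw counts and a binary membership vector are used here purely for illustration.

```python
from collections import Counter

def find_subclasses(image_tags, top_class, k=3):
    """Return the k tags that most often co-occur with `top_class`.

    image_tags: list of per-image tag sets.
    Assumption: co-occurrence is scored by raw count; the paper's
    exact scoring is not given in the abstract.
    """
    cooc = Counter()
    for tags in image_tags:
        if top_class in tags:
            cooc.update(t for t in tags if t != top_class)
    return [t for t, _ in cooc.most_common(k)]

def subclass_representation(tags, subclasses):
    """Project one image's tag set onto the subclass space.

    Assumption: a simple binary membership vector stands in for
    whatever projection the paper actually uses.
    """
    return [1.0 if s in tags else 0.0 for s in subclasses]

# Toy data: "bird" is the top-level class; hypothetical tags.
images = [
    {"bird", "owl", "night"},
    {"bird", "owl"},
    {"bird", "penguin", "snow"},
    {"bird", "penguin"},
    {"car", "road"},
]
subs = find_subclasses(images, "bird", k=2)          # e.g. ["owl", "penguin"]
vec = subclass_representation({"bird", "owl"}, subs)  # image's subclass vector
```

The resulting per-image vectors could then feed any standard classifier for the top-level class, which is the sense in which the subclass space serves as an intermediate, more visually coherent representation.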