An improved superpixel-based saliency detection method

Xin Wang, Yunyan Zhou, Chen Ning
{"title":"An improved superpixel-based saliency detection method","authors":"Xin Wang, Yunyan Zhou, Chen Ning","doi":"10.1109/ICIVC.2017.7984648","DOIUrl":null,"url":null,"abstract":"In this paper, an improved saliency detection method based on superpixel is proposed. First, the original image is segmented into a number of superpixels by simple linear iterative clustering, each of which has the consistent color and texture characteristics. Second, two different methods, namely, the sparse representation-based method as well as a center-surrounding idea-based approach, are applied to these superpixels to compute the initial saliency map and a center-surrounding map, respectively. Then these two maps are integrated in an additive way to obtain a modified saliency map. Compared to the initial saliency map, the modified one is more precise. Third, for the segmented superpixels, a normalized cut-based clustering method is used to cluster them into several clustering areas, and then the salient values in the same clustering area are averaged. Consequently, we can get a much more uniform saliency map. Experimental results show that, compared with the classical algorithms, the proposed method achieves a better performance since it can highlight the salient objects evenly and restrain the background clutters effectively.","PeriodicalId":181522,"journal":{"name":"2017 2nd International Conference on Image, Vision and Computing (ICIVC)","volume":"39 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 2nd International Conference on Image, Vision and Computing (ICIVC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICIVC.2017.7984648","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

In this paper, an improved superpixel-based saliency detection method is proposed. First, the original image is segmented into a number of superpixels by simple linear iterative clustering (SLIC), each of which has consistent color and texture characteristics. Second, two different methods, namely a sparse representation-based method and an approach based on the center-surround idea, are applied to these superpixels to compute an initial saliency map and a center-surround map, respectively. These two maps are then combined additively to obtain a modified saliency map, which is more precise than the initial one. Third, a normalized-cut-based clustering method groups the segmented superpixels into several clusters, and the saliency values within each cluster are averaged. This yields a much more uniform saliency map. Experimental results show that, compared with classical algorithms, the proposed method achieves better performance: it highlights salient objects evenly and suppresses background clutter effectively.
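For readers who want to experiment with this kind of pipeline, the following Python sketch outlines the steps named in the abstract using off-the-shelf tools (scikit-image's SLIC and scikit-learn's spectral clustering). It is not the authors' implementation: the sparse-representation-based saliency term is replaced by a simple color-distinctiveness score, the center-surround map by a plain center prior, and normalized cuts by spectral clustering; the function names and parameter values are illustrative assumptions.

```python
import numpy as np
from skimage import color
from skimage.segmentation import slic
from sklearn.cluster import SpectralClustering


def normalize(x):
    """Scale a 1-D array to [0, 1]."""
    x = x - x.min()
    return x / (x.max() + 1e-12)


def superpixel_saliency(image_rgb, n_segments=300, n_clusters=8):
    """Rough re-creation of the pipeline in the abstract (see caveats above)."""
    h, w = image_rgb.shape[:2]
    lab = color.rgb2lab(image_rgb)

    # Step 1: SLIC superpixel segmentation.
    labels = slic(image_rgb, n_segments=n_segments, compactness=10, start_label=0)
    n_sp = labels.max() + 1

    # Per-superpixel mean Lab color and spatial centroid.
    feats = np.zeros((n_sp, 3))
    centers = np.zeros((n_sp, 2))
    for i in range(n_sp):
        mask = labels == i
        feats[i] = lab[mask].mean(axis=0)
        ys, xs = np.nonzero(mask)
        centers[i] = ys.mean(), xs.mean()

    # Step 2a: placeholder "initial" saliency -- global color distinctiveness
    # (stand-in for the paper's sparse-representation-based term).
    color_dist = np.linalg.norm(feats[:, None, :] - feats[None, :, :], axis=2)
    s_init = color_dist.mean(axis=1)

    # Step 2b: placeholder center-surround map -- a simple center prior.
    d_center = np.linalg.norm(centers - np.array([h / 2.0, w / 2.0]), axis=1)
    s_center = 1.0 - d_center / d_center.max()

    # Step 2c: additive fusion of the two maps, as described in the abstract.
    s_fused = normalize(s_init) + normalize(s_center)

    # Step 3: cluster superpixels (spectral clustering as a stand-in for
    # normalized cuts) and average saliency within each cluster so that
    # whole regions receive a uniform value.
    cluster_feats = np.hstack([feats / 100.0, centers / max(h, w)])
    sc = SpectralClustering(n_clusters=n_clusters, affinity="nearest_neighbors",
                            n_neighbors=10, random_state=0)
    cluster_ids = sc.fit_predict(cluster_feats)
    s_final = s_fused.copy()
    for c in range(n_clusters):
        idx = cluster_ids == c
        s_final[idx] = s_fused[idx].mean()

    # Map per-superpixel saliency back to a per-pixel map.
    return normalize(s_final)[labels]
```

The additive fusion and the cluster-wise averaging at the end reflect the two ideas the abstract emphasizes: fusion refines the initial map, while averaging saliency over clustered superpixels makes whole objects light up uniformly rather than in patches.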