SSNet: Learning Self-Similarity for Few-Shot Semantic Segmentation

Weisheng Lan, Yu Liu
{"title":"SSNet: Learning Self-Similarity for Few-Shot Semantic Segmentation","authors":"Weisheng Lan, Yu Liu","doi":"10.1109/ICARM58088.2023.10218769","DOIUrl":null,"url":null,"abstract":"Few-shot Segmentation(FSS) refers to the task of segmenting newly introduced classes using only a limited number of closely marked examples. In the past, prototype learning based on metric and segmentation based on visual correspondence mostly ignore the matching relationship of query image itself. In this article, we present a novel self-similarity based hyperrelation network (SSNet), which introduces self-similarity generation module (SGM) and self-similarity mask module (SMM) to capture self-similarity information of target classes in query images and help the network better understand the internal similarity of query images. We also replace the feature mask with the input mask to eliminate the interference of background information in the support image. Experiments on the PASCAL-Sishow that SSNet achieves new state-of-the-art (SOTA), with a mIoU score of 64.6% in the 1-shot scenario and 68.1% in the 5-shot scenario, which is 0.62% higher than the SOTA method in the 1-shot scenario. 
This demonstrates that SSNet can achieve efficient and accurate few-shot segmentation with only a small number of samples.","PeriodicalId":220013,"journal":{"name":"2023 International Conference on Advanced Robotics and Mechatronics (ICARM)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-07-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 International Conference on Advanced Robotics and Mechatronics (ICARM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICARM58088.2023.10218769","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Few-shot segmentation (FSS) refers to the task of segmenting newly introduced classes using only a limited number of densely annotated examples. Previous approaches, whether metric-based prototype learning or visual-correspondence-based segmentation, have mostly ignored the matching relationships within the query image itself. In this article, we present a novel self-similarity-based hyperrelation network (SSNet), which introduces a self-similarity generation module (SGM) and a self-similarity mask module (SMM) to capture self-similarity information of the target classes in query images and help the network better understand the internal similarity of the query image. We also replace the feature mask with an input mask to eliminate interference from background information in the support image. Experiments on PASCAL-5i show that SSNet achieves a new state of the art (SOTA), with a mIoU of 64.6% in the 1-shot setting and 68.1% in the 5-shot setting, surpassing the previous SOTA method by 0.62% in the 1-shot setting. This demonstrates that SSNet can achieve efficient and accurate few-shot segmentation with only a small number of samples.
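The abstract does not specify how the self-similarity of a query image is computed. A common formulation, which the SGM may resemble, is a pairwise cosine-similarity map between all spatial positions of a feature map. The sketch below is illustrative only; the function name, shapes, and use of NumPy are our own assumptions, not the paper's implementation:

```python
import numpy as np

def self_similarity(features):
    """Pairwise cosine-similarity map for one feature map.

    features: array of shape (C, H, W), channels first.
    Returns an (H*W, H*W) matrix whose entry (i, j) is the cosine
    similarity between spatial positions i and j.
    """
    c, h, w = features.shape
    flat = features.reshape(c, h * w).T               # (H*W, C): one row per position
    norms = np.linalg.norm(flat, axis=1, keepdims=True)
    unit = flat / np.clip(norms, 1e-8, None)          # unit-normalize each position
    return unit @ unit.T                              # (H*W, H*W) cosine similarities

# Example: a random 64-channel 8x8 feature map yields a 64x64 similarity matrix
sim = self_similarity(np.random.rand(64, 8, 8))
```

Each row of the resulting matrix tells the network which other locations in the query image look like a given location, which is the "internal similarity" the abstract refers to.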