An Instance Segmentation Model to Categorize Clothes from Wild Fashion Images

Rohan Indrajeet Jadhav, Paul Stynes, Pramod Pathak, Rejwanul Haque, Mohammed Hasanuzzaman
{"title":"An Instance Segmentation Model to Categorize Clothes from Wild Fashion Images","authors":"Rohan Indrajeet Jadhav, Paul Stynes, Pramod Pathak, Rejwanul Haque, Mohammed Hasanuzzaman","doi":"10.1145/3556677.3556690","DOIUrl":null,"url":null,"abstract":"Categorizing of clothes from wild fashion images involves identifying the type of clothes a person wears from non-studio images such as a shirt, trousers, and so on. Identifying the fashion clothes from wild images that are often grainy, unfocused, with people in different poses is a challenge. This research proposes a comparison between object detection and instance segmentation based models to categorise clothes from wild fashion images. The Object detection model is implemented using Faster Region-Based Convolutional Neural Network (RCNN). Mask RCNN is used to implement an instance segmentation model. We have trained the models on standard benchmark dataset namely deepfashion2. Results demonstrate that Instance Segmentation models such as Mask RCNN outperforms Object Detection models by 20%. Mask RCNN achieved 21.05% average precision, 73% recall across the different IoU (Intersection over Union). 
These results show promise for using Instance Segmentation models for faster image retrieval based e-commerce applications.","PeriodicalId":350340,"journal":{"name":"Proceedings of the 2022 6th International Conference on Deep Learning Technologies","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2022 6th International Conference on Deep Learning Technologies","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3556677.3556690","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Categorizing clothes from wild fashion images involves identifying the types of clothes a person wears, such as a shirt or trousers, from non-studio images. Identifying fashion clothes in wild images, which are often grainy, out of focus, and show people in varied poses, is a challenge. This research compares object detection and instance segmentation models for categorizing clothes from wild fashion images. The object detection model is implemented using the Faster Region-Based Convolutional Neural Network (Faster RCNN), and the instance segmentation model is implemented using Mask RCNN. We trained the models on the standard benchmark dataset DeepFashion2. Results demonstrate that instance segmentation models such as Mask RCNN outperform object detection models by 20%. Mask RCNN achieved 21.05% average precision and 73% recall across the different IoU (Intersection over Union) thresholds. These results show promise for using instance segmentation models in faster image-retrieval-based e-commerce applications.
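The precision and recall figures above are reported across IoU thresholds, the standard matching criterion for detection and segmentation evaluation. As a minimal sketch (not the paper's implementation), IoU for two axis-aligned boxes in (x1, y1, x2, y2) form can be computed as:

```python
def iou(box_a, box_b):
    """Intersection over Union of two boxes given as (x1, y1, x2, y2)."""
    # Coordinates of the intersection rectangle.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0
```

A predicted clothing region counts as a true positive only when its IoU with a ground-truth region exceeds the chosen threshold (benchmarks such as COCO average precision over thresholds from 0.5 to 0.95); for Mask RCNN the same idea applies to pixel masks rather than boxes.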