Yasushi Makihara, Takuya Tanoue, D. Muramatsu, Y. Yagi, Syunsuke Mori, Yuzuko Utsumi, M. Iwamura, K. Kise
Title: Individuality-preserving Silhouette Extraction for Gait Recognition
Journal: IPSJ Transactions on Computer Vision and Applications, Vol. 7, pp. 74-78, 2015
DOI: https://doi.org/10.2197/ipsjtcva.7.74
Citations: 16
Abstract
Most gait recognition approaches rely on silhouette-based representations due to high recognition accuracy and computational efficiency, and a key problem for those approaches is how to accurately extract individuality-preserved silhouettes from real scenes, where foreground colors may be similar to background colors and the background is cluttered. We therefore propose a method of individuality-preserving silhouette extraction for gait recognition using standard gait models (SGMs), composed of clean silhouette sequences of a variety of training subjects, as a shape prior. We first match the multiple SGMs to a background subtraction sequence of a test subject by dynamic programming and select the training subject whose SGM fits the test sequence the best. We then formulate our silhouette extraction problem in a well-established graph-cut segmentation framework while considering a balance between the observed test sequence and the matched SGM. More specifically, we define an energy function to be minimized by the following three terms: (1) a data term derived from the observed test sequence, (2) a smoothness term derived from spatio-temporally adjacent edges, and (3) a shape-prior term derived from the matched SGM. We demonstrate through experiments using 56 subjects that the proposed method successfully extracts individuality-preserved silhouettes and improves gait recognition accuracy.
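The SGM-matching step described above can be sketched as a dynamic-programming (DTW-style) alignment between the test silhouette sequence and each training subject's SGM, keeping the best-fitting subject. This is a minimal illustrative sketch under stated assumptions: silhouettes are stood in for by flat feature vectors, and the frame distance and SGM structure are placeholders, not the paper's exact formulation.

```python
# Sketch of matching multiple SGMs to a test sequence by dynamic programming
# and selecting the best-fitting training subject. Silhouette frames are
# represented here as simple feature vectors (an assumption for illustration).

def frame_dist(a, b):
    """Squared Euclidean distance between two silhouette feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def dp_align_cost(test_seq, sgm_seq):
    """DP alignment cost between a test sequence and one SGM sequence."""
    n, m = len(test_seq), len(sgm_seq)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = frame_dist(test_seq[i - 1], sgm_seq[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a test frame
                                 cost[i][j - 1],      # skip an SGM frame
                                 cost[i - 1][j - 1])  # match the two frames
    return cost[n][m]

def best_sgm(test_seq, sgms):
    """Return the id of the training subject whose SGM fits the test best."""
    return min(sgms, key=lambda sid: dp_align_cost(test_seq, sgms[sid]))
```

For example, a test sequence close to subject A's SGM and far from subject B's would be assigned to A; the selected SGM then supplies the shape-prior term in the graph-cut energy.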
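The three-term energy minimized in the graph-cut step can be written schematically as follows; the symbols ($l_p$, $\mathcal{N}$, $\lambda$, $\mu$, $D_p$, $V_{pq}$, $U_p$) are illustrative notation, not necessarily the paper's own:

```latex
E(\mathbf{l}) =
  \underbrace{\sum_{p} D_p(l_p)}_{\text{data term}}
  \;+\; \lambda \underbrace{\sum_{(p,q) \in \mathcal{N}} V_{pq}(l_p, l_q)}_{\text{smoothness term}}
  \;+\; \mu \underbrace{\sum_{p} U_p(l_p)}_{\text{shape-prior term}}
```

Here $l_p \in \{0, 1\}$ labels pixel $p$ as background or foreground, $\mathcal{N}$ is the set of spatio-temporally adjacent pixel pairs, $U_p$ is derived from the matched SGM, and $\lambda$, $\mu$ weight the balance between the observed test sequence and the shape prior.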