{"title":"利用颜色和深度信息对自遮挡进行鲁棒目标跟踪","authors":"Jun-ichi Imai, Yuhei Kashiwagi, Ryo Kitsuji","doi":"10.9746/SICETR.55.342","DOIUrl":null,"url":null,"abstract":"Visual object tracking techniques are widely required by many vision applications. The color-based particle filter is known as one of useful methods for robust object tracking. However, the conventional color-based particle filter has a problem that it is not robust against self-occlusion. Self-occlusion occurs when a part of a target object is hidden by itself from a camera. When the target object moves or rotates, a part of the target disappears because the self-occlusion occurs and other part appears because the self-occlusion is resolved. The conventional color-based particle filter often fails to follow such a change of the target’s appearance due to self-occlusion during the tracking process. In this paper, we propose a novel method for robust object tracking against the self-occlusion. The proposed method is based on the color-based particle filter, and it also uses depth information obtained by an RGB-D camera. When the self-occlusion occurs and the target’s appearance changes, the proposed method extracts a region for the target object in the input image by the graph cuts based on depth information. However, this process often includes unnecessary regions, especially when some objects are close to the target. Then, the proposed method distinguishes the region for the target from unnecessary ones by inves-tigating expanse of colors around the target. Therefore, the target model is correctly updated and the robust tracking is achieved. In order to verify the effectiveness of the proposed method, we carried out an experiment to compare the proposed method with the conventional one. Experimental results show that the proposed method works well.","PeriodicalId":416828,"journal":{"name":"Transactions of the Society of Instrument and Control Engineers","volume":"357 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Robust Object Tracking against Self-occlusion Using Color and Depth Information\",\"authors\":\"Jun-ichi Imai, Yuhei Kashiwagi, Ryo Kitsuji\",\"doi\":\"10.9746/SICETR.55.342\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Visual object tracking techniques are widely required by many vision applications. The color-based particle filter is known as one of useful methods for robust object tracking. However, the conventional color-based particle filter has a problem that it is not robust against self-occlusion. Self-occlusion occurs when a part of a target object is hidden by itself from a camera. When the target object moves or rotates, a part of the target disappears because the self-occlusion occurs and other part appears because the self-occlusion is resolved. The conventional color-based particle filter often fails to follow such a change of the target’s appearance due to self-occlusion during the tracking process. In this paper, we propose a novel method for robust object tracking against the self-occlusion. The proposed method is based on the color-based particle filter, and it also uses depth information obtained by an RGB-D camera. When the self-occlusion occurs and the target’s appearance changes, the proposed method extracts a region for the target object in the input image by the graph cuts based on depth information. 
However, this process often includes unnecessary regions, especially when some objects are close to the target. Then, the proposed method distinguishes the region for the target from unnecessary ones by inves-tigating expanse of colors around the target. Therefore, the target model is correctly updated and the robust tracking is achieved. In order to verify the effectiveness of the proposed method, we carried out an experiment to compare the proposed method with the conventional one. Experimental results show that the proposed method works well.\",\"PeriodicalId\":416828,\"journal\":{\"name\":\"Transactions of the Society of Instrument and Control Engineers\",\"volume\":\"357 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1900-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Transactions of the Society of Instrument and Control Engineers\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.9746/SICETR.55.342\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Transactions of the Society of Instrument and Control Engineers","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.9746/SICETR.55.342","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Robust Object Tracking against Self-occlusion Using Color and Depth Information
Visual object tracking techniques are widely required by many vision applications. The color-based particle filter is known as a useful method for robust object tracking. However, the conventional color-based particle filter is not robust against self-occlusion. Self-occlusion occurs when part of a target object is hidden from the camera by the object itself. When the target moves or rotates, one part of it disappears as self-occlusion occurs, while another part appears as the self-occlusion is resolved. The conventional color-based particle filter often fails to follow such changes in the target's appearance during tracking. In this paper, we propose a novel method for object tracking that is robust against self-occlusion. The proposed method is based on the color-based particle filter and additionally uses depth information obtained by an RGB-D camera. When self-occlusion occurs and the target's appearance changes, the proposed method extracts a region for the target object in the input image using graph cuts based on the depth information. This extracted region, however, often includes unnecessary parts, especially when other objects are close to the target. The proposed method therefore distinguishes the target region from the unnecessary ones by investigating the expanse of colors around the target. As a result, the target model is correctly updated and robust tracking is achieved. To verify the effectiveness of the proposed method, we carried out an experiment comparing it with the conventional method. The experimental results show that the proposed method works well.
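The abstract gives no implementation details, but the color-based particle filter it builds on can be sketched roughly as follows. This is a minimal illustration under assumptions, not the authors' implementation: the hue-histogram color model, patch size, random-walk motion model, and all names (`hue_histogram`, `bhattacharyya`, `track_step`, `PATCH`) are chosen for the sketch, and the paper's depth-based graph-cut re-segmentation and color-expanse check are not shown.

```python
import numpy as np

PATCH = 16         # half-size of a candidate patch, in pixels (assumed)
N_BINS = 16        # number of hue bins for the color model (assumed)
N_PARTICLES = 200  # number of particles (assumed)

def hue_histogram(hue_patch, n_bins=N_BINS):
    """Normalized hue histogram used as the color model of a region
    (hue assumed in the OpenCV-style range 0-179)."""
    hist, _ = np.histogram(hue_patch, bins=n_bins, range=(0, 180))
    return hist / max(hist.sum(), 1)

def bhattacharyya(p, q):
    """Similarity between two normalized histograms (1 = identical)."""
    return np.sum(np.sqrt(p * q))

def track_step(hue_image, particles, weights, target_hist, sigma_motion=5.0):
    """One predict-weight-resample cycle of a color-based particle filter."""
    h, w = hue_image.shape
    # Predict: propagate particles with a random-walk motion model.
    particles = particles + np.random.normal(0, sigma_motion, particles.shape)
    particles[:, 0] = np.clip(particles[:, 0], PATCH, w - PATCH - 1)
    particles[:, 1] = np.clip(particles[:, 1], PATCH, h - PATCH - 1)
    # Weight: color similarity between each candidate patch and the target model.
    for i, (x, y) in enumerate(particles.astype(int)):
        patch = hue_image[y - PATCH:y + PATCH, x - PATCH:x + PATCH]
        weights[i] = bhattacharyya(hue_histogram(patch), target_hist)
    weights = (weights + 1e-12) / (weights + 1e-12).sum()
    # Estimate: weighted mean of the particle positions.
    estimate = (particles * weights[:, None]).sum(axis=0)
    # Resample: multinomial resampling to avoid weight degeneracy.
    idx = np.random.choice(len(particles), len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles)), estimate

# Usage sketch (x0, y0 is the initial target center in the first hue image):
#   target_hist = hue_histogram(hue0[y0 - PATCH:y0 + PATCH, x0 - PATCH:x0 + PATCH])
#   particles = np.tile([x0, y0], (N_PARTICLES, 1)).astype(float)
#   weights = np.full(N_PARTICLES, 1.0 / N_PARTICLES)
#   particles, weights, est = track_step(hue1, particles, weights, target_hist)
```

The failure mode motivating the paper is visible in this sketch: `target_hist` is fixed, so when self-occlusion changes which colors of the object are visible, the Bhattacharyya weights drop and the tracker drifts. The proposed method addresses this by re-extracting the target region with depth-based graph cuts and filtering out nearby non-target regions via the color-expanse check before updating the color model.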