{"title":"基于粒子滤波的色彩空间状态估计视频色彩情绪抓取","authors":"N. Ikoma","doi":"10.1109/ICCAIS.2017.8217569","DOIUrl":null,"url":null,"abstract":"As one possible model for human perception of color cue in vision, state space modeling approach and its particle filter implementation that grasps color mood in video by estimating the state defined over a color space has been proposed. The state space model is formulated over a state vector consisting of color instance and location of a small patch over the image frame. System model represents random fluctuation on each color instance and the location. New generation of color instances copes with emergence of new colors in the scene. Observation model evaluates likeliness of the color instance with the colors contained in the patch region specified by the location factor of the state vector. Experiment over a real image demonstrates performance of the proposed method. The prototype system has been developed for the experiment that works almost real-time for video image captured by a camera installed in PC. Abstraction of the video image becomes possible based on the proposed method that leads to further extension of the human perception model in higher level of knowledge and understanding of real scene.","PeriodicalId":410094,"journal":{"name":"2017 International Conference on Control, Automation and Information Sciences (ICCAIS)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Color mood grasping in video by state estimation over color space with particle filter\",\"authors\":\"N. 
Ikoma\",\"doi\":\"10.1109/ICCAIS.2017.8217569\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"As one possible model for human perception of color cue in vision, state space modeling approach and its particle filter implementation that grasps color mood in video by estimating the state defined over a color space has been proposed. The state space model is formulated over a state vector consisting of color instance and location of a small patch over the image frame. System model represents random fluctuation on each color instance and the location. New generation of color instances copes with emergence of new colors in the scene. Observation model evaluates likeliness of the color instance with the colors contained in the patch region specified by the location factor of the state vector. Experiment over a real image demonstrates performance of the proposed method. The prototype system has been developed for the experiment that works almost real-time for video image captured by a camera installed in PC. 
Abstraction of the video image becomes possible based on the proposed method that leads to further extension of the human perception model in higher level of knowledge and understanding of real scene.\",\"PeriodicalId\":410094,\"journal\":{\"name\":\"2017 International Conference on Control, Automation and Information Sciences (ICCAIS)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 International Conference on Control, Automation and Information Sciences (ICCAIS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICCAIS.2017.8217569\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 International Conference on Control, Automation and Information Sciences (ICCAIS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCAIS.2017.8217569","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Color mood grasping in video by state estimation over color space with particle filter
As one possible model of human perception of color cues in vision, a state space modeling approach, together with a particle filter implementation, has been proposed that grasps the color mood of a video by estimating a state defined over a color space. The state vector consists of a color instance and the location of a small patch in the image frame. The system model represents random fluctuations of each color instance and its location, and new color instances are generated to cope with colors newly emerging in the scene. The observation model evaluates the likelihood of a color instance against the colors contained in the patch region specified by the location component of the state vector. An experiment on real images demonstrates the performance of the proposed method. A prototype system developed for the experiment runs in near real time on video captured by a camera attached to a PC. The proposed method makes abstraction of the video image possible, which leads to further extension of the human perception model toward higher-level knowledge and understanding of real scenes.
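The filtering loop described in the abstract can be sketched in a few steps: propagate each (color, location) particle with random-walk noise, weight it by how well its color instance matches the patch at its location, and resample. The sketch below is a minimal illustration under assumed parameters (patch size, noise scales, Gaussian likelihood on mean patch color — none of these are specified in the abstract), run on a synthetic two-color frame rather than camera video; the birth step for newly emerging colors is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

H, W = 64, 64      # frame size (synthetic stand-in for a camera frame)
PATCH = 8          # patch side length (assumed; not given in the abstract)
N = 200            # number of particles

def init_particles(n):
    # state vector = (r, g, b, x, y): a color instance plus a patch location
    colors = rng.uniform(0.0, 1.0, size=(n, 3))
    locs = rng.uniform(0.0, 1.0, size=(n, 2)) * [W - PATCH, H - PATCH]
    return np.hstack([colors, locs])

def predict(particles, sigma_c=0.05, sigma_l=2.0):
    # system model: random-walk fluctuation of each color instance and location
    p = particles.copy()
    p[:, :3] += rng.normal(0.0, sigma_c, size=(len(p), 3))
    p[:, 3:] += rng.normal(0.0, sigma_l, size=(len(p), 2))
    p[:, :3] = np.clip(p[:, :3], 0.0, 1.0)
    p[:, 3] = np.clip(p[:, 3], 0, W - PATCH)
    p[:, 4] = np.clip(p[:, 4], 0, H - PATCH)
    return p

def likelihood(particles, frame, sigma=0.15):
    # observation model: Gaussian likelihood of the color instance against
    # the colors in the patch region at the particle's location
    w = np.empty(len(particles))
    for i, p in enumerate(particles):
        x, y = int(p[3]), int(p[4])
        patch = frame[y:y + PATCH, x:x + PATCH].reshape(-1, 3)
        d2 = ((patch - p[:3]) ** 2).sum(axis=1).mean()
        w[i] = np.exp(-d2 / (2.0 * sigma ** 2))
    return w / w.sum()

def resample(particles, weights):
    # multinomial resampling: duplicate high-weight particles
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

# synthetic frame: left half red, right half blue
frame = np.zeros((H, W, 3))
frame[:, :W // 2] = [1.0, 0.0, 0.0]
frame[:, W // 2:] = [0.0, 0.0, 1.0]

particles = init_particles(N)
for _ in range(30):
    particles = predict(particles)
    weights = likelihood(particles, frame)
    particles = resample(particles, weights)

# the surviving color instances concentrate on the scene's dominant colors
mean_color = particles[:, :3].mean(axis=0)
```

On this synthetic frame the color instances drift toward red or blue depending on where their patch sits, so the particle cloud summarizes the "color mood" of the frame; on real video the same loop would be run per frame with the birth step re-seeding colors that appear mid-sequence.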