Tracking texture-less, shiny objects with descriptor fields
A. Crivellaro, Yannick Verdie, K. M. Yi, P. Fua, V. Lepetit
International Symposium on Mixed and Augmented Reality (ISMAR), 2014. DOI: 10.1109/ISMAR.2014.6948474
Our demo presents the method we published at CVPR this year for tracking specular and poorly textured objects, and lets visitors experiment with it and with their own patterns. Our approach requires only a standard monocular camera (no depth sensor is needed) and can easily be integrated into existing systems to improve their robustness and accuracy. Code is publicly available.
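A minimal sketch, assuming OpenCV and NumPy, of the descriptor-field idea behind the method: image gradients are split into their positive and negative parts and each part is smoothed with a Gaussian, producing several smooth channels on which a dense alignment such as Lucas-Kanade can be run instead of raw intensities. The function name descriptor_fields and the parameter sigma are illustrative choices and do not come from the publicly available code.

```python
import cv2
import numpy as np

def descriptor_fields(gray, sigma=3.0):
    """Return a stack of smoothed gradient channels for a grayscale image."""
    gray = gray.astype(np.float32)
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)  # horizontal gradient
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)  # vertical gradient

    channels = []
    for g in (gx, gy):
        # Separate positive and negative gradient parts, then smooth each one.
        for part in (np.maximum(g, 0), np.maximum(-g, 0)):
            channels.append(cv2.GaussianBlur(part, (0, 0), sigma))
    return np.stack(channels, axis=-1)  # H x W x 4 descriptor field

# Usage with a placeholder frame: compute the fields for a template and for the
# current frame, then align the channels (e.g. with inverse-compositional
# Lucas-Kanade) rather than the raw pixel intensities.
frame = (np.random.rand(240, 320) * 255).astype(np.uint8)
fields = descriptor_fields(frame)
print(fields.shape)
```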