{"title":"Vind(x):通过协作标注使用用户","authors":"L. Vuurpijl, Lambert Schomaker, E. Broek","doi":"10.1109/IWFHR.2002.1030913","DOIUrl":null,"url":null,"abstract":"In this paper, the image retrieval system Vind(x) is described. The architecture of the system and first user-experiences are reported. Using Vind(x), users on the Internet may cooperatively annotate objects in paintings by use of the pen or mouse. The collected data can be searched through query-by-drawing techniques, but can also serve as an (ever-growing) training and benchmark set for the development of automated image retrieval systems of the future. Several other examples of cooperative annotation are presented in order to underline the importance of this concept for the design of pattern recognition systems and the labeling of large quantities of scanned documents or online data.","PeriodicalId":114017,"journal":{"name":"Proceedings Eighth International Workshop on Frontiers in Handwriting Recognition","volume":"159 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2002-08-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"24","resultStr":"{\"title\":\"Vind(x): using the user through cooperative annotation\",\"authors\":\"L. Vuurpijl, Lambert Schomaker, E. Broek\",\"doi\":\"10.1109/IWFHR.2002.1030913\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, the image retrieval system Vind(x) is described. The architecture of the system and first user-experiences are reported. Using Vind(x), users on the Internet may cooperatively annotate objects in paintings by use of the pen or mouse. The collected data can be searched through query-by-drawing techniques, but can also serve as an (ever-growing) training and benchmark set for the development of automated image retrieval systems of the future. Several other examples of cooperative annotation are presented in order to underline the importance of this concept for the design of pattern recognition systems and the labeling of large quantities of scanned documents or online data.\",\"PeriodicalId\":114017,\"journal\":{\"name\":\"Proceedings Eighth International Workshop on Frontiers in Handwriting Recognition\",\"volume\":\"159 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2002-08-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"24\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings Eighth International Workshop on Frontiers in Handwriting Recognition\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IWFHR.2002.1030913\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings Eighth International Workshop on Frontiers in Handwriting Recognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IWFHR.2002.1030913","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Vind(x): using the user through cooperative annotation
In this paper, the image retrieval system Vind(x) is described. The architecture of the system and first user experiences are reported. Using Vind(x), users on the Internet can cooperatively annotate objects in paintings with a pen or mouse. The collected data can be searched through query-by-drawing techniques, but can also serve as an ever-growing training and benchmark set for the development of future automated image retrieval systems. Several other examples of cooperative annotation are presented to underline the importance of this concept for the design of pattern recognition systems and for the labeling of large quantities of scanned documents or online data.
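The abstract does not specify how query-by-drawing matching works. As a minimal, hypothetical sketch only (not the authors' algorithm), the Python code below illustrates one simple way such matching could be done: resample each pen- or mouse-drawn outline to a fixed number of points, normalize for position and scale, and rank the cooperatively collected annotations by mean point-to-point distance to the query outline. All function names and the distance measure are assumptions made for illustration.

    import math

    def resample(points, n=32):
        """Resample a polyline to n evenly spaced points along its arc length.
        Assumes the outline has at least two points."""
        dists = [0.0]
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
        total = dists[-1] or 1.0
        out, j = [], 0
        for i in range(n):
            target = total * i / (n - 1)
            while j < len(dists) - 2 and dists[j + 1] < target:
                j += 1
            span = (dists[j + 1] - dists[j]) or 1.0
            t = (target - dists[j]) / span
            x = points[j][0] + t * (points[j + 1][0] - points[j][0])
            y = points[j][1] + t * (points[j + 1][1] - points[j][1])
            out.append((x, y))
        return out

    def normalize(points):
        """Translate to the centroid and scale to unit RMS radius,
        so matching is invariant to where and how large the user drew."""
        cx = sum(x for x, _ in points) / len(points)
        cy = sum(y for _, y in points) / len(points)
        shifted = [(x - cx, y - cy) for x, y in points]
        scale = math.sqrt(sum(x * x + y * y for x, y in shifted) / len(shifted)) or 1.0
        return [(x / scale, y / scale) for x, y in shifted]

    def outline_distance(a, b, n=32):
        """Mean point-to-point distance between two normalized, resampled outlines."""
        a, b = normalize(resample(a, n)), normalize(resample(b, n))
        return sum(math.hypot(ax - bx, ay - by)
                   for (ax, ay), (bx, by) in zip(a, b)) / n

    def query_by_drawing(query_outline, annotations):
        """Rank annotated objects, given as (label, outline) pairs,
        by shape similarity to the drawn query."""
        return sorted(annotations,
                      key=lambda item: outline_distance(query_outline, item[1]))

In this sketch, annotations would be the ever-growing set of user-contributed (label, outline) pairs; a real system would likely use a more robust shape descriptor and handle rotation and drawing direction as well.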