{"title":"非语言查询驱动的交互式搜索系统:与语言无关的信息访问技术研究","authors":"Viktors Garkavijs, N. Kando","doi":"10.1145/2362724.2362788","DOIUrl":null,"url":null,"abstract":"This work is concentrated on analyzing possibilities for retrieving and presenting the information available on the web, without the explicit need to formulate precise queries. The research question of our work can be formulated as: \"Is it possible to retrieve information without clearly formulated verbal query?\"\n For this consortium we present our prototype of interactive image search system called \"Gaze Learning Access and Search Engine\" (GLASE) version 0.1, which can perform relevance calculation based on gaze data and within-session learning. The search user interface (UI) uses an eye-tracker as an input device and employs a relevance re-ranking algorithm based on the gaze length. The preliminary experimental results showed that using our gaze-driven system reduced the task completion time an average of 13.7% in a search session.","PeriodicalId":413481,"journal":{"name":"International Conference on Information Interaction in Context","volume":"40 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2012-08-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Nonverbal query driven interactive search systems: a study on language agnostic information access technologies\",\"authors\":\"Viktors Garkavijs, N. Kando\",\"doi\":\"10.1145/2362724.2362788\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This work is concentrated on analyzing possibilities for retrieving and presenting the information available on the web, without the explicit need to formulate precise queries. The research question of our work can be formulated as: \\\"Is it possible to retrieve information without clearly formulated verbal query?\\\"\\n For this consortium we present our prototype of interactive image search system called \\\"Gaze Learning Access and Search Engine\\\" (GLASE) version 0.1, which can perform relevance calculation based on gaze data and within-session learning. The search user interface (UI) uses an eye-tracker as an input device and employs a relevance re-ranking algorithm based on the gaze length. 
The preliminary experimental results showed that using our gaze-driven system reduced the task completion time an average of 13.7% in a search session.\",\"PeriodicalId\":413481,\"journal\":{\"name\":\"International Conference on Information Interaction in Context\",\"volume\":\"40 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2012-08-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Conference on Information Interaction in Context\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2362724.2362788\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Conference on Information Interaction in Context","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2362724.2362788","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Nonverbal query driven interactive search systems: a study on language agnostic information access technologies
This work focuses on analyzing the possibilities for retrieving and presenting information available on the web without the explicit need to formulate precise queries. The research question of our work can be formulated as: "Is it possible to retrieve information without a clearly formulated verbal query?"
For this consortium, we present our prototype of an interactive image search system called "Gaze Learning Access and Search Engine" (GLASE), version 0.1, which performs relevance calculation based on gaze data and within-session learning. The search user interface (UI) uses an eye tracker as an input device and employs a relevance re-ranking algorithm based on gaze length. Preliminary experimental results showed that using our gaze-driven system reduced task completion time by an average of 13.7% per search session.
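The abstract does not spell out the re-ranking formula, so the following is only an illustrative sketch of a gaze-length-based re-ranker, not the actual GLASE algorithm; the class, method names, weighting scheme, and parameters are all assumptions introduced here for explanation.

# Illustrative sketch only: a gaze-duration-based relevance re-ranker.
# NOT the GLASE implementation; names, weights, and the blending rule are assumptions.
from collections import defaultdict

class GazeReRanker:
    """Re-rank image results by blending baseline scores with accumulated gaze time."""

    def __init__(self, gaze_weight=0.5):
        self.gaze_weight = gaze_weight          # how strongly gaze evidence shifts the ranking
        self.gaze_time = defaultdict(float)     # image_id -> total fixation time in seconds

    def record_fixation(self, image_id, duration_s):
        """Accumulate a fixation duration reported by the eye tracker for one image."""
        self.gaze_time[image_id] += duration_s

    def rerank(self, results):
        """results: list of (image_id, baseline_score); returns the list sorted by blended score."""
        total_gaze = sum(self.gaze_time.values()) or 1.0
        def blended(item):
            image_id, score = item
            gaze_score = self.gaze_time[image_id] / total_gaze   # normalize gaze time to [0, 1]
            return (1 - self.gaze_weight) * score + self.gaze_weight * gaze_score
        return sorted(results, key=blended, reverse=True)

# Example of within-session use:
ranker = GazeReRanker(gaze_weight=0.5)
ranker.record_fixation("img_042", 1.8)   # user dwelt on this image for 1.8 s
ranker.record_fixation("img_007", 0.3)
print(ranker.rerank([("img_007", 0.9), ("img_042", 0.7), ("img_013", 0.6)]))

Normalizing accumulated gaze time within the session keeps the blended score bounded and is one plausible way to realize "within-session learning" from gaze data alone; the actual GLASE weighting may differ.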