Jiayu Li, Min Zhang, Weizhi Ma, Yiqun Liu, Shaoping Ma
{"title":"具有用户反馈的多层次交互式生活日志搜索引擎","authors":"Jiayu Li, Min Zhang, Weizhi Ma, Yiqun Liu, Shaoping Ma","doi":"10.1145/3379172.3391720","DOIUrl":null,"url":null,"abstract":"With the rise of portable wearable devices, it is easier for users to save their lifelog data. As lifelog is usually disorganized with multi-modal information (even noisy sometimes), an interactive search engine is crucial for users to review and explore their lifelog. Unlike traditional search engines, lifelog search includes multi-modality information of images, text and other data from sensors, which brings challenges to data arrangement and search. Accordingly, users' information need is also multi-level. Hence, a single interaction mechanism may not be able to satisfy users' requirements. As the data set is highly personalized, interaction and feedback from users should also be considered in the search engine. Therefore, in this paper we present an interactive multi-modality lifelog search engine to help users manage and find lifelog data. To this end, lifelog data is clustered and processed in multi-level processing. Then, we build an interactive search engine, includingtext as query, image as query, andtimeline view modules. Besides, the system is able to adopt user feedback mechanisms in multi-round queries. Our system shows promising experimental results on LSC'20 dataset and development topics. 
The text-based search module gives correct results on more than 60% of the development topics at LSC'20.","PeriodicalId":340585,"journal":{"name":"Proceedings of the Third Annual Workshop on Lifelog Search Challenge","volume":"777 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-06-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"12","resultStr":"{\"title\":\"A Multi-level Interactive Lifelog Search Engine with User Feedback\",\"authors\":\"Jiayu Li, Min Zhang, Weizhi Ma, Yiqun Liu, Shaoping Ma\",\"doi\":\"10.1145/3379172.3391720\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"With the rise of portable wearable devices, it is easier for users to save their lifelog data. As lifelog is usually disorganized with multi-modal information (even noisy sometimes), an interactive search engine is crucial for users to review and explore their lifelog. Unlike traditional search engines, lifelog search includes multi-modality information of images, text and other data from sensors, which brings challenges to data arrangement and search. Accordingly, users' information need is also multi-level. Hence, a single interaction mechanism may not be able to satisfy users' requirements. As the data set is highly personalized, interaction and feedback from users should also be considered in the search engine. Therefore, in this paper we present an interactive multi-modality lifelog search engine to help users manage and find lifelog data. To this end, lifelog data is clustered and processed in multi-level processing. Then, we build an interactive search engine, includingtext as query, image as query, andtimeline view modules. Besides, the system is able to adopt user feedback mechanisms in multi-round queries. Our system shows promising experimental results on LSC'20 dataset and development topics. 
The text-based search module gives correct results on more than 60% of the development topics at LSC'20.\",\"PeriodicalId\":340585,\"journal\":{\"name\":\"Proceedings of the Third Annual Workshop on Lifelog Search Challenge\",\"volume\":\"777 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-06-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"12\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the Third Annual Workshop on Lifelog Search Challenge\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3379172.3391720\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the Third Annual Workshop on Lifelog Search Challenge","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3379172.3391720","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A Multi-level Interactive Lifelog Search Engine with User Feedback
With the rise of portable wearable devices, it has become easier for users to save their lifelog data. Because lifelog data is usually disorganized, multi-modal, and sometimes noisy, an interactive search engine is crucial for users to review and explore it. Unlike traditional search engines, lifelog search must handle multi-modal information (images, text, and other sensor data), which makes data organization and retrieval challenging. Accordingly, users' information needs are also multi-level, so a single interaction mechanism may not satisfy all of them. Since the data set is highly personalized, the search engine should also account for user interaction and feedback. In this paper, we therefore present an interactive multi-modal lifelog search engine that helps users manage and find their lifelog data. To this end, lifelog data is clustered and processed at multiple levels. We then build an interactive search engine with text-as-query, image-as-query, and timeline-view modules. In addition, the system supports user feedback mechanisms in multi-round queries. Our system shows promising experimental results on the LSC'20 dataset and development topics: the text-based search module returns correct results on more than 60% of the LSC'20 development topics.
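The text-as-query and multi-round feedback ideas described in the abstract can be illustrated with a minimal sketch. All data, function names, and the scoring scheme below are illustrative assumptions, not the authors' actual implementation: lifelog images are represented by concept annotations, a query is ranked by concept overlap, and a per-image boost stands in for feedback gathered across query rounds.

```python
def text_query_search(query, image_concepts, feedback_boost=None):
    """Rank lifelog images by concept overlap with the query terms.

    image_concepts: dict mapping image id -> set of concept labels
                    (hypothetical annotation format).
    feedback_boost: optional dict image id -> multiplicative weight,
                    a stand-in for multi-round user feedback.
    """
    terms = set(query.lower().split())
    scores = {}
    for img, concepts in image_concepts.items():
        overlap = len(terms & concepts)
        if overlap == 0:
            continue  # image shares no concept with the query
        weight = (feedback_boost or {}).get(img, 1.0)
        scores[img] = overlap * weight
    # Highest-scoring images first (sorted is stable for ties)
    return sorted(scores, key=scores.get, reverse=True)


images = {
    "img_001": {"kitchen", "coffee", "morning"},
    "img_002": {"office", "computer", "coffee"},
    "img_003": {"park", "running", "morning"},
}

# Round 1: plain query
first = text_query_search("coffee in the morning", images)
# Round 2: the user marked img_002 as relevant, so boost it
second = text_query_search("coffee in the morning", images,
                           feedback_boost={"img_002": 3.0})
```

In this toy setting, `img_001` wins the first round (two matching concepts), while the feedback boost promotes `img_002` to the top of the second round, mirroring how relevance feedback can reorder results across query rounds.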