A Multi-Dimensional Semantic Pseudo-Relevance Feedback Information Retrieval Model
Min Pan, Yu Liu, Quanli Pei, Huixian Mao, Aoqun Jin, Sheng Huang, Yinhan Yang
2022 IEEE/WIC/ACM International Joint Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT), November 2022
DOI: 10.1109/WI-IAT55865.2022.00141
Abstract
Recently, neural information retrieval systems have spurred many successful applications. In a typical pipeline, a retrieval model obtains a candidate document collection in a first retrieval stage, and BERT is then used to re-rank the candidate documents. Generally, the sentence or paragraph scores produced by BERT are integrated into the document score to obtain the final ranking. Semantic similarity is less often used to select query expansion terms and to integrate semantic information into pseudo-relevance feedback. In this paper we propose a new strategy that uses the BERT model to select query expansion terms carrying semantic information. Incorporating semantic-information weights into traditional pseudo-relevance feedback better alleviates problems such as polysemy and synonymy, improving the performance of the retrieval system and returning more relevant documents. The experimental results demonstrate that query expansion terms selected with semantic information help return more accurate results and improve the accuracy of the retrieval system, and the MAP and P@10 results confirm the validity and feasibility of the proposed model.
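To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of the general technique the abstract describes: candidate expansion terms produced by traditional pseudo-relevance feedback are re-scored by interpolating their feedback weight with their embedding similarity to the query. The encoder name, the interpolation weight `alpha`, and the example feedback terms are illustrative assumptions, and a sentence-embedding model is used here as a stand-in for the paper's BERT setup.

```python
# Sketch: combine PRF term weights with query-term semantic similarity.
from sentence_transformers import SentenceTransformer
from sklearn.metrics.pairwise import cosine_similarity


def select_expansion_terms(query, prf_terms, alpha=0.6, top_k=10,
                           model_name="all-MiniLM-L6-v2"):
    """Re-score candidate expansion terms by interpolating their PRF weight
    with their embedding similarity to the query, then keep the top_k."""
    model = SentenceTransformer(model_name)
    terms, prf_weights = zip(*prf_terms.items())

    # Encode the query and every candidate term with the same encoder.
    query_vec = model.encode([query])
    term_vecs = model.encode(list(terms))

    # Cosine similarity between the query and each candidate term.
    sims = cosine_similarity(query_vec, term_vecs)[0]

    # Normalize PRF weights to [0, 1] so the two signals are comparable,
    # then linearly interpolate (alpha is an assumed hyper-parameter).
    max_w = max(prf_weights)
    combined = {
        t: alpha * (w / max_w) + (1 - alpha) * s
        for t, w, s in zip(terms, prf_weights, sims)
    }
    return sorted(combined.items(), key=lambda kv: kv[1], reverse=True)[:top_k]


if __name__ == "__main__":
    # Hypothetical feedback terms with weights from a first-pass retrieval.
    feedback_terms = {"neural": 0.12, "bank": 0.10, "ranking": 0.08, "river": 0.07}
    print(select_expansion_terms("neural ranking models for document retrieval",
                                 feedback_terms, top_k=3))
```

In this kind of scheme, the semantic-similarity component is what penalizes terms that are frequent in the feedback documents but unrelated to the query's intended sense (e.g., the wrong sense of a polysemous word), which is the effect on polysemy and synonymy that the abstract claims.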