{"title":"发现逻辑知识,进行深度问答","authors":"Zhao Liu, Xipeng Qiu, L. Cao, Xuanjing Huang","doi":"10.1145/2396761.2398544","DOIUrl":null,"url":null,"abstract":"Most open-domain question answering systems achieve better performances with large corpora, such as Web, by taking advantage of information redundancy. However, explicit answers are not always mentioned in the corpus, many answers are implicitly contained and can only be deducted by inference. In this paper, we propose an approach to discover logical knowledge for deep question answering, which automatically extracts knowledge in an unsupervised, domain-independent manner from background texts and reasons out implicit answers for the questions. Firstly, we use semantic role labeling to transform natural language expressions to predicates in first-order logic. Then we use association analysis to uncover the implicit relations among these predicates and build propositions for inference. Since our knowledge is drawn from different sources, we use Markov logic to merge multiple knowledge bases without resolving their inconsistencies. Our experiments show that these propositions can improve the performance of question answering significantly.","PeriodicalId":313414,"journal":{"name":"Proceedings of the 21st ACM international conference on Information and knowledge management","volume":"52 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2012-10-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Discovering logical knowledge for deep question answering\",\"authors\":\"Zhao Liu, Xipeng Qiu, L. Cao, Xuanjing Huang\",\"doi\":\"10.1145/2396761.2398544\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Most open-domain question answering systems achieve better performances with large corpora, such as Web, by taking advantage of information redundancy. However, explicit answers are not always mentioned in the corpus, many answers are implicitly contained and can only be deducted by inference. In this paper, we propose an approach to discover logical knowledge for deep question answering, which automatically extracts knowledge in an unsupervised, domain-independent manner from background texts and reasons out implicit answers for the questions. Firstly, we use semantic role labeling to transform natural language expressions to predicates in first-order logic. Then we use association analysis to uncover the implicit relations among these predicates and build propositions for inference. Since our knowledge is drawn from different sources, we use Markov logic to merge multiple knowledge bases without resolving their inconsistencies. 
Our experiments show that these propositions can improve the performance of question answering significantly.\",\"PeriodicalId\":313414,\"journal\":{\"name\":\"Proceedings of the 21st ACM international conference on Information and knowledge management\",\"volume\":\"52 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2012-10-29\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 21st ACM international conference on Information and knowledge management\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2396761.2398544\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 21st ACM international conference on Information and knowledge management","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2396761.2398544","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Discovering logical knowledge for deep question answering
Most open-domain question answering systems achieve better performance on large corpora, such as the Web, by taking advantage of information redundancy. However, explicit answers are not always stated in the corpus; many answers are only implicitly contained and can be obtained only by inference. In this paper, we propose an approach that discovers logical knowledge for deep question answering: it automatically extracts knowledge from background texts in an unsupervised, domain-independent manner and reasons out implicit answers to the questions. First, we use semantic role labeling to transform natural language expressions into predicates in first-order logic. Then we use association analysis to uncover the implicit relations among these predicates and build propositions for inference. Since our knowledge is drawn from different sources, we use Markov logic to merge multiple knowledge bases without resolving their inconsistencies. Our experiments show that these propositions improve question answering performance significantly.
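As a rough illustration of the pipeline the abstract describes, the sketch below converts a toy semantic-role-labeling frame into a first-order-logic-style predicate and mines confidence-weighted implications between predicates that co-occur in the same document. The frame format, the thresholds, and the use of rule confidence as a stand-in for learned Markov-logic weights are all assumptions made for illustration; this is not the authors' implementation.

```python
# Minimal sketch of the abstract's pipeline: SRL frames -> FOL-style predicates,
# association analysis -> weighted implication rules. Hypothetical data formats.

from collections import Counter
from itertools import combinations


def srl_to_predicate(frame):
    """Turn a toy SRL frame into a first-order-logic-style predicate string.

    Example: {"verb": "invent", "A0": "Edison", "A1": "the light bulb"}
             -> "invent(Edison, the light bulb)"
    """
    args = [frame[key] for key in sorted(frame) if key != "verb"]
    return f"{frame['verb']}({', '.join(args)})"


def mine_implications(documents, min_support=2, min_confidence=0.6):
    """Association analysis over predicates co-occurring in the same document.

    Returns rules (body, head, weight), where the weight is the rule's
    confidence; in a Markov-logic setting such soft weights let possibly
    inconsistent rules from different sources coexist as weighted constraints.
    """
    pred_count = Counter()
    pair_count = Counter()
    for preds in documents:
        unique = set(preds)
        pred_count.update(unique)
        pair_count.update(combinations(sorted(unique), 2))

    rules = []
    for (p, q), joint in pair_count.items():
        if joint < min_support:
            continue
        for body, head in ((p, q), (q, p)):
            confidence = joint / pred_count[body]
            if confidence >= min_confidence:
                rules.append((body, head, confidence))
    return rules


if __name__ == "__main__":
    frame = {"verb": "invent", "A0": "Edison", "A1": "the light bulb"}
    print(srl_to_predicate(frame))

    # Toy "background texts" already reduced to predicate sets per document.
    docs = [
        ["invent(X, Y)", "own_patent(X, Y)"],
        ["invent(X, Y)", "own_patent(X, Y)"],
        ["invent(X, Y)"],
    ]
    for body, head, weight in mine_implications(docs):
        print(f"{weight:.2f}  {body} => {head}")
```

Running the demo prints the two directed rules between invent and own_patent with their confidences, which is the kind of weighted proposition the paper feeds into inference.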