Title: Text Entailment Generation with Attention-based Sequence-to-sequence Model
Authors: Xiaomei Zhao, H. Yanagimoto
DOI: 10.1109/IIAI-AAI50415.2020.00029 (https://doi.org/10.1109/IIAI-AAI50415.2020.00029)
Published in: 2020 9th International Congress on Advanced Applied Informatics (IIAI-AAI), September 2020
Citations: 0
Abstract
Textual entailment requires judging the semantic similarity between two sentences and is therefore a good task for measuring text understanding. If entailment generation can be realized, it can be applied to summarization that preserves the semantics of the original text in the generated text. Recently, neural networks have been employed to build modules that encode an original text and generate a summary. In natural language processing, sequence-to-sequence models, which perform sequential learning, are used to develop machine translation, and the attention mechanism was proposed to improve translation by modeling word alignment between the source and target languages. In this paper, we applied an attention-based sequence-to-sequence model to an entailment generation task and confirmed that the system realizes entailment generation. The proposed method captures important words in the input text and generates a fluent sentence that is grammatically correct and semantically appropriate. These results suggest that the proposed system understands a text semantically.
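The abstract's core idea, attention that lets the decoder focus on important source words, can be illustrated with a minimal sketch. This is not the paper's model; it is a generic dot-product attention step (the function name and dimensions are illustrative assumptions) showing how a decoder state is compared against encoder states to produce alignment weights and a context vector:

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def dot_attention(decoder_state, encoder_states):
    # scores: similarity of the current decoder state to each source position
    scores = encoder_states @ decoder_state          # shape: (src_len,)
    weights = softmax(scores)                        # alignment weights, sum to 1
    # context: attention-weighted summary of the source representations
    context = weights @ encoder_states               # shape: (hidden,)
    return weights, context

# toy example: 5 source positions, hidden size 8 (illustrative sizes)
rng = np.random.default_rng(0)
enc = rng.normal(size=(5, 8))
dec = rng.normal(size=8)
w, c = dot_attention(dec, enc)
print(w.sum())
```

In a full sequence-to-sequence model, the context vector would be combined with the decoder state at every generation step, so words with high alignment weight dominate the prediction; this is how attention captures "important words in the input text."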