{"title":"Neural Data-to-Text Generation Guided by Predicted Plan","authors":"Hanning Gao, Zhihua Wei","doi":"10.1109/icicse55337.2022.9828913","DOIUrl":null,"url":null,"abstract":"Data-to-text generation task aims to generate natural language text from structured data and has made great progress in recent years with the help of end-to-end neural network models. However, these end-to-end approaches often ignore the structure of the output text and convey the information in the input data in a random order. When faced with the data-to-text generation task, a person tends to make a plan for the complex input before writing the final text, which is inconsistent with end-to-end approaches. In this paper, we propose a novel plan-guided data-to-text generation framework consisting of a plan generator GT5 and a text generator Share-T5. The plan generator GT5 first predicts a plan based on the input data and then the text generator Share-T5 generates the target text based on the input data and the predicted plan. Empirical comparisons with strong baselines on two benchmark datasets show that our proposed plan-guided data-to-text generation framework can significantly improve the performance of plan prediction and text generation.","PeriodicalId":177985,"journal":{"name":"2022 IEEE 2nd International Conference on Information Communication and Software Engineering (ICICSE)","volume":"89 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE 2nd International Conference on Information Communication and Software Engineering (ICICSE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/icicse55337.2022.9828913","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The data-to-text generation task aims to produce natural language text from structured data and has made great progress in recent years with the help of end-to-end neural network models. However, these end-to-end approaches often ignore the structure of the output text and convey the information in the input data in an arbitrary order. When faced with a data-to-text generation task, a person tends to plan out the complex input before writing the final text, in contrast to end-to-end approaches. In this paper, we propose a novel plan-guided data-to-text generation framework consisting of a plan generator, GT5, and a text generator, Share-T5. The plan generator GT5 first predicts a plan from the input data; the text generator Share-T5 then generates the target text from the input data and the predicted plan. Empirical comparisons with strong baselines on two benchmark datasets show that the proposed plan-guided framework significantly improves the performance of both plan prediction and text generation.
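The abstract describes a two-stage pipeline: a plan generator first predicts a content plan from the input records, and a text generator then conditions on both the records and the predicted plan. The sketch below illustrates that control flow only; it is a minimal approximation that assumes both stages can be stood in for by plain pretrained T5 checkpoints from Hugging Face, and the triple linearization, task prefixes, and "Plan:" separator are illustrative assumptions rather than the paper's actual GT5 or Share-T5 designs.

```python
# Minimal sketch of a two-stage, plan-guided data-to-text pipeline.
# Assumptions (not from the paper): both stages use a vanilla "t5-small"
# checkpoint, and the prompt formats below are purely illustrative.
from transformers import T5ForConditionalGeneration, T5TokenizerFast


def linearize(triples):
    """Flatten (subject, predicate, object) triples into a single string."""
    return " | ".join(f"{s} : {p} : {o}" for s, p, o in triples)


def generate(model, tokenizer, prompt, max_new_tokens=64):
    """Run beam-search generation on a single prompt and decode the output."""
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


def plan_guided_generation(triples, plan_model, text_model, tokenizer):
    data = linearize(triples)
    # Stage 1: the plan generator predicts a content plan from the input data.
    plan = generate(plan_model, tokenizer, f"predict plan: {data}")
    # Stage 2: the text generator conditions on both the data and the plan.
    text = generate(text_model, tokenizer, f"generate text: {data} Plan: {plan}")
    return plan, text


if __name__ == "__main__":
    # In practice the two stages would be fine-tuned separately on
    # (data, plan) and (data + plan, text) pairs; here a single untuned
    # checkpoint plays both roles purely to keep the sketch runnable.
    tokenizer = T5TokenizerFast.from_pretrained("t5-small")
    model = T5ForConditionalGeneration.from_pretrained("t5-small")
    triples = [("Alan Turing", "birthPlace", "London"),
               ("Alan Turing", "field", "Computer Science")]
    plan, text = plan_guided_generation(triples, model, model, tokenizer)
    print("Predicted plan:", plan)
    print("Generated text:", text)
```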