Improving a Graph-to-Tree Model for Solving Math Word Problems
Hyunju Kim, Junwon Hwang, Taewoo Yoo, Yun-Gyung Cheong
2022 16th International Conference on Ubiquitous Information Management and Communication (IMCOM), January 3, 2022
DOI: 10.1109/imcom53663.2022.9721720
Citations: 3
Abstract
In the area of Math Word Problem (MWP) solving, various methods based on deep learning have been actively researched. Graph-to-Tree (Graph2Tree) is one such method, which uses a graph-based encoder and a tree-based decoder to understand the word problem and generate a valid equation. It has been shown to perform well, achieving state-of-the-art results on several benchmarks. However, on the SVAMP benchmark, recent methods including Sequence-to-Sequence (Seq2Seq), the Goal-driven Tree-Structured MWP Solver (GTS), and Graph2Tree perform poorly, unable to cope with several variation types that require natural language comprehension. In this paper, we propose an improved version of Graph2Tree that considers the characteristics of natural language to understand word problems. On top of the original Graph2Tree model, we additionally build a Dependency Graph and enhance the Quantity Cell Graph into a Softly Expanded Quantity Cell Graph, which helps the graph-based encoder capture the relationships among words. We also introduce a question embedding so that the tree-based decoder generates the equation based on the question given as input. We conduct experiments to evaluate our model against the original Graph2Tree model on three available datasets: MAWPS, ASDiv-A, and SVAMP. We also present case studies to qualitatively examine the effectiveness of the methods, showing that they improve on the original Graph2Tree model.
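To make the dependency-graph idea concrete, the sketch below shows one way a token-level dependency adjacency matrix, of the kind a graph-based encoder could consume, might be built from a word problem. This is an illustrative assumption for exposition only, not the authors' implementation: the function name, the use of spaCy, and the example problem are all hypothetical.

    # Illustrative sketch (not the paper's code): build a symmetric dependency
    # adjacency matrix over the tokens of a math word problem, suitable as one
    # input graph for a GNN-style encoder. Assumes spaCy and the
    # "en_core_web_sm" model are installed.
    import numpy as np
    import spacy

    nlp = spacy.load("en_core_web_sm")

    def dependency_adjacency(problem_text: str) -> np.ndarray:
        """Return a token-by-token adjacency matrix with an edge between
        each token and its syntactic head in the dependency parse."""
        doc = nlp(problem_text)
        n = len(doc)
        adj = np.eye(n, dtype=np.float32)  # self-loops, common for GNN inputs
        for token in doc:
            if token.i != token.head.i:    # the root points to itself; skip it
                adj[token.i, token.head.i] = 1.0
                adj[token.head.i, token.i] = 1.0
        return adj

    if __name__ == "__main__":
        text = "Tom had 3 apples and bought 5 more. How many apples does Tom have now?"
        print(dependency_adjacency(text).shape)

In a full model, such a matrix would be combined with the quantity cell graph and fed to the graph encoder, while the question sentence would additionally be embedded and passed to the tree decoder; those components are outside the scope of this sketch.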