{"title":"基于生成模型的线性方程数学问题求解器的实现","authors":"Gayoung Kim, Seonho Kim, Junseong Bang","doi":"10.1109/PlatCon53246.2021.9680762","DOIUrl":null,"url":null,"abstract":"Solving math word problems automatically with a computer is an interesting topic. Instead of statistical methods and semantic parsing methods, recently, deep learning model based methods are used to solve MWPs. We experimented with different deep learning generative model that directly translates a math word problem into a linear equation. In this paper, four MWP solvers using the Sequence-to-Sequence (Seq2Seq) model with a attention mechanism were implemented, i.e., Seq2Seq, BiLSTM Seq2Seq, convolutional Seq2Seq, and transformer models. Then, performance analysis for the 4 MWP solvers has performed on MaWPS (English) and Math23K (Chinese) MWP datasets. Experiment shows that both the Seq2Seq model and the transformer model showed similar performance in translating into simple linear equations, but the transformer model showed the best performance in translating into more complex linear equations.","PeriodicalId":344742,"journal":{"name":"2021 International Conference on Platform Technology and Service (PlatCon)","volume":"384 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-08-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Implementation of Generative Model Based Solver for Mathematical Word Problem With Linear Equations\",\"authors\":\"Gayoung Kim, Seonho Kim, Junseong Bang\",\"doi\":\"10.1109/PlatCon53246.2021.9680762\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Solving math word problems automatically with a computer is an interesting topic. Instead of statistical methods and semantic parsing methods, recently, deep learning model based methods are used to solve MWPs. We experimented with different deep learning generative model that directly translates a math word problem into a linear equation. In this paper, four MWP solvers using the Sequence-to-Sequence (Seq2Seq) model with a attention mechanism were implemented, i.e., Seq2Seq, BiLSTM Seq2Seq, convolutional Seq2Seq, and transformer models. Then, performance analysis for the 4 MWP solvers has performed on MaWPS (English) and Math23K (Chinese) MWP datasets. 
Experiment shows that both the Seq2Seq model and the transformer model showed similar performance in translating into simple linear equations, but the transformer model showed the best performance in translating into more complex linear equations.\",\"PeriodicalId\":344742,\"journal\":{\"name\":\"2021 International Conference on Platform Technology and Service (PlatCon)\",\"volume\":\"384 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-08-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 International Conference on Platform Technology and Service (PlatCon)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/PlatCon53246.2021.9680762\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 International Conference on Platform Technology and Service (PlatCon)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/PlatCon53246.2021.9680762","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Implementation of Generative Model Based Solver for Mathematical Word Problem With Linear Equations
Solving math word problems (MWPs) automatically with a computer is an interesting topic. Rather than statistical methods and semantic parsing methods, deep-learning-based methods have recently been used to solve MWPs. We experimented with different deep learning generative models that directly translate a math word problem into a linear equation. In this paper, four MWP solvers using the Sequence-to-Sequence (Seq2Seq) framework with an attention mechanism were implemented: a vanilla Seq2Seq model, a BiLSTM Seq2Seq model, a convolutional Seq2Seq model, and a transformer model. A performance analysis of the four solvers was then carried out on the MAWPS (English) and Math23K (Chinese) MWP datasets. Experiments show that the Seq2Seq and transformer models perform similarly when translating into simple linear equations, but the transformer model performs best when translating into more complex linear equations.
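
To make the "translation" framing concrete, here is a minimal sketch, in PyTorch, of a Seq2Seq model with dot-product attention that maps tokenized word-problem text to equation tokens. This is not the authors' code: the `AttnSeq2Seq` class, the vocabulary sizes, and the hidden dimension `d` are illustrative assumptions, since the paper's exact architectures and hyperparameters are not given here.

```python
# Minimal sketch (illustrative, not the paper's implementation) of an
# attention-based Seq2Seq MWP solver: encode the problem tokens with a
# BiLSTM, then decode equation tokens while attending over encoder states.
import torch
import torch.nn as nn

class AttnSeq2Seq(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, d=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, d)
        self.tgt_emb = nn.Embedding(tgt_vocab, d)
        # BiLSTM encoder reads the problem text in both directions;
        # the two d//2 directions concatenate back to dimension d.
        self.encoder = nn.LSTM(d, d // 2, batch_first=True, bidirectional=True)
        self.decoder = nn.LSTMCell(d, d)
        self.out = nn.Linear(2 * d, tgt_vocab)  # [decoder state; context] -> logits

    def forward(self, src, tgt):
        enc_out, _ = self.encoder(self.src_emb(src))        # (B, S, d)
        h = enc_out.new_zeros(src.size(0), enc_out.size(-1))
        c = torch.zeros_like(h)
        logits = []
        for t in range(tgt.size(1)):                        # teacher forcing
            h, c = self.decoder(self.tgt_emb(tgt[:, t]), (h, c))
            # Dot-product attention over all encoder states.
            scores = torch.bmm(enc_out, h.unsqueeze(-1)).squeeze(-1)   # (B, S)
            attn = torch.softmax(scores, dim=-1)
            ctx = torch.bmm(attn.unsqueeze(1), enc_out).squeeze(1)     # (B, d)
            logits.append(self.out(torch.cat([h, ctx], dim=-1)))
        return torch.stack(logits, dim=1)                   # (B, T, tgt_vocab)

# Usage with dummy data: the target sequence would be equation tokens such as
# "x = 3 + 5"; inputs are shifted one step so each position predicts the next.
model = AttnSeq2Seq(src_vocab=5000, tgt_vocab=40)
src = torch.randint(0, 5000, (2, 12))   # two tokenized word problems
tgt = torch.randint(0, 40, (2, 8))      # two gold equation token sequences
logits = model(src, tgt[:, :-1])
loss = nn.CrossEntropyLoss()(logits.flatten(0, 1), tgt[:, 1:].flatten())
```

The same encoder-decoder interface applies to the convolutional Seq2Seq and transformer variants compared in the paper; only the encoder and decoder internals change, which is what makes a head-to-head comparison of the four solvers straightforward.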