Deji Zhao, Donghong Han, Jia Wu, Zhongjiang He, Bo Ning, Ye Yuan, Yongxiang Li, Chao Wang, Shuangyong Song
Journal: Knowledge-Based Systems, Volume 325, Article 113905
DOI: 10.1016/j.knosys.2025.113905
Published: 2025-06-14 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S0950705125009517
Enhancing math reasoning ability of large language models via computation logic graphs
The reasoning capabilities of large language models (LLMs) are essential for a wide range of tasks, particularly in the domain of mathematical reasoning. Common chain-of-thought methods perform well on simple reasoning problems, but for complex problems a single-dimensional chain of thought is inadequate for capturing multi-layered logical relationships. To tackle this challenge, this paper introduces the concept of a Computation Logic Graph (CLG), designed to enhance the logical reasoning abilities of LLMs when solving complex mathematical problems. The CLG decomposes a complex mathematical problem into multiple simple intermediate computational units, and the final answer is obtained through multiple iterations over these units. On the one hand, the CLG improves the model's ability to decompose and solve complex mathematical problems step by step from a global perspective. On the other hand, the local inference process within the CLG helps improve the model's accuracy in single-step calculations. To develop models that can construct Computation Logic Graphs automatically, we create a dataset of computational logic graphs for complex mathematical problems, called the Computation-intensive Math Logic Graph (CMLG) dataset. We fine-tune several open-source LLMs on the CMLG dataset. Experimental results demonstrate that the proposed CLG method significantly enhances the performance of LLMs on complex mathematical reasoning tasks, outperforming baselines on both the CMLG dataset and six other publicly available datasets from diverse domains.
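The paper does not publish its CLG representation, so the following is only an illustrative sketch of the underlying idea: decompose a word problem into simple computational units linked by dependencies, then resolve the units iteratively until the final answer is produced. All names (`Unit`, `evaluate_clg`, the example problem) are hypothetical stand-ins, not the authors' implementation.

```python
# Hypothetical sketch of a computation logic graph: each node is a simple
# single-step unit; evaluation iterates until every unit is resolved.
from dataclasses import dataclass, field


@dataclass
class Unit:
    name: str
    op: callable                       # one simple computation
    deps: list = field(default_factory=list)


def evaluate_clg(units):
    """Resolve each unit once all of its dependencies have values."""
    values, pending = {}, list(units)
    while pending:
        progressed = False
        for u in list(pending):
            if all(d in values for d in u.deps):
                values[u.name] = u.op(*(values[d] for d in u.deps))
                pending.remove(u)
                progressed = True
        if not progressed:
            raise ValueError("cycle or missing dependency in graph")
    return values


# "A store sells pens in packs of 4. Alice has 3 pens and buys 2 packs.
#  How many pens does she have?" — decomposed into three simple units:
graph = [
    Unit("pens_bought", lambda: 2 * 4),
    Unit("initial_pens", lambda: 3),
    Unit("total", lambda a, b: a + b, deps=["initial_pens", "pens_bought"]),
]
print(evaluate_clg(graph)["total"])  # 11
```

The fixed-point loop mirrors the abstract's "multiple iterations" framing: each pass resolves every unit whose inputs are known, so the global decomposition and the local single-step computations are kept separate.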
Journal introduction:
Knowledge-Based Systems, an international and interdisciplinary journal in artificial intelligence, publishes original, innovative, and creative research results in the field. It focuses on systems based on knowledge-based and other artificial intelligence techniques. The journal aims to support human prediction and decision-making through data science and computation techniques, provide balanced coverage of theory and practical study, and encourage the development and implementation of knowledge-based intelligence models, methods, systems, and software tools. Applications in business, government, education, engineering, and healthcare are emphasized.