{"title":"Monotonicity Reasoning in the Age of Neural Foundation Models","authors":"Zeming Chen, Qiyue Gao","doi":"10.1007/s10849-023-09411-3","DOIUrl":null,"url":null,"abstract":"<p>The recent advance of large language models (LLMs) demonstrates that these large-scale foundation models achieve remarkable capabilities across a wide range of language tasks and domains. The success of the statistical learning approach challenges our understanding of traditional symbolic and logical reasoning. The first part of this paper summarizes several works concerning the progress of monotonicity reasoning through neural networks and deep learning. We demonstrate different methods for solving the monotonicity reasoning task using neural and symbolic approaches and also discuss their advantages and limitations. The second part of this paper focuses on analyzing the capability of large-scale general-purpose language models to reason with monotonicity.</p>","PeriodicalId":48732,"journal":{"name":"Journal of Logic Language and Information","volume":"1 1","pages":""},"PeriodicalIF":0.7000,"publicationDate":"2023-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Logic Language and Information","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s10849-023-09411-3","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
Recent advances in large language models (LLMs) demonstrate that these large-scale foundation models achieve remarkable capabilities across a wide range of language tasks and domains. The success of the statistical learning approach challenges our understanding of traditional symbolic and logical reasoning. The first part of this paper summarizes several works on the progress of monotonicity reasoning with neural networks and deep learning. We present different methods for solving the monotonicity reasoning task using neural and symbolic approaches, and discuss their advantages and limitations. The second part of the paper analyzes the capability of large-scale, general-purpose language models to reason with monotonicity.
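To make the task concrete: monotonicity reasoning derives entailments by replacing a phrase with a more general one in an upward-entailing position (e.g. "Some poodle barks" entails "Some dog barks") or a more specific one in a downward-entailing position (e.g. "Every dog barks" entails "Every poodle barks"). The Python sketch below is a minimal illustration of this substitution rule, not a method from the paper; the toy hypernym lexicon and function names are invented for the example.

# Minimal illustrative sketch (not from the paper): monotonicity reasoning
# licenses substituting a phrase with a more general phrase in an
# upward-entailing position, or a more specific phrase in a
# downward-entailing position, while preserving entailment.

# Hypothetical toy lexicon of hypernym relations ("poodle" is a kind of "dog").
HYPERNYMS = {"poodle": "dog", "dog": "animal"}

def more_specific(a: str, b: str) -> bool:
    """True if a is at least as specific as b in the hypernym order."""
    while a != b:
        if a not in HYPERNYMS:
            return False
        a = HYPERNYMS[a]
    return True

def substitution_entails(old: str, new: str, polarity: str) -> bool:
    """Check whether replacing `old` by `new` preserves truth, given the
    monotonicity polarity ('up' or 'down') of the position."""
    if polarity == "up":      # upward-entailing: may generalize
        return more_specific(old, new)
    if polarity == "down":    # downward-entailing: may specialize
        return more_specific(new, old)
    return False              # non-monotone position: no substitution licensed

# "Every dog barks" -> "Every poodle barks": the first argument of "every"
# is downward-entailing, so specializing "dog" to "poodle" is valid.
assert substitution_entails("dog", "poodle", "down")
# "Some poodle barks" -> "Some dog barks": "some" is upward-entailing
# in both arguments, so generalizing "poodle" to "dog" is valid.
assert substitution_entails("poodle", "dog", "up")

The neural approaches surveyed in the paper must, in effect, learn such polarity-sensitive substitution behavior from data rather than apply it as an explicit rule.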
Journal Introduction:
The scope of the journal is the logical and computational foundations of natural, formal, and programming languages, as well as the different forms of human and mechanized inference. It covers the logical, linguistic, and information-theoretic parts of the cognitive sciences.
Examples of main subareas are Intensional Logics, including Dynamic Logic; Nonmonotonic Logic and Belief Revision; Constructive Logics; Complexity Issues in Logic and Linguistics; Theoretical Problems of Logic Programming and Resolution; Categorial Grammar and Type Theory; Generalized Quantification; Information-Oriented Theories of Semantic Structure, such as Situation Semantics, Discourse Representation Theory, and Dynamic Semantics; and Connectionist Models of Logical and Linguistic Structures. The emphasis is on the theoretical aspects of these areas.