Yige Zhang, Xiaoyan Zhang, Jian Sun, Ying Li, Jiaquan Gao
{"title":"HTS-LB:超图树搜索学习分支","authors":"Yige Zhang , Xiaoyan Zhang , Jian Sun , Ying Li , Jiaquan Gao","doi":"10.1016/j.neunet.2025.107784","DOIUrl":null,"url":null,"abstract":"<div><div>Mixed integer linear programming (MILP) is a fundamental combinatorial optimization problem with wide-ranging applications in resource-constrained scenarios. Recent studies have focused on using machine learning to imitate the decision-making process in MILP solving, often representing MILPs as bipartite graphs for learning branching policies. We analyze these studies and identify three key issues that need to be addressed for solving MILPs, namely scalability, richness of information, and branching accuracy. In this study, we propose a hypergraph tree search framework for learning branch (HTS-LB) to address the above issues. In HTS-LB, MILPs are first represented by hypergraphs to make them available for large-scale scenarios. Second, a hypergraph attention network (HAN) for branching policy encoding is constructed to map the hypergraph representation to the probability distributions of branching variables. In HAN, a dual multi-head attention mechanism is used to obtain more accurate information when nodes update their representations. Finally, we design a tree search gating mechanism to capture rich dynamic information for subsequent updates of the variable representation. Extensive experiments on NP-hard MILP problems and practical scenarios demonstrate that our model is effective and outperforms popular machine learning algorithms in terms of branching accuracy, branch and bound nodes, and the dual–primal gap. Additionally, the integration of HTS-LB into the SCIP solver shows its strong generalization performance in large-scale MILPs.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"191 ","pages":"Article 107784"},"PeriodicalIF":6.3000,"publicationDate":"2025-07-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"HTS-LB: Hypergraph tree search for learning branch\",\"authors\":\"Yige Zhang , Xiaoyan Zhang , Jian Sun , Ying Li , Jiaquan Gao\",\"doi\":\"10.1016/j.neunet.2025.107784\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Mixed integer linear programming (MILP) is a fundamental combinatorial optimization problem with wide-ranging applications in resource-constrained scenarios. Recent studies have focused on using machine learning to imitate the decision-making process in MILP solving, often representing MILPs as bipartite graphs for learning branching policies. We analyze these studies and identify three key issues that need to be addressed for solving MILPs, namely scalability, richness of information, and branching accuracy. In this study, we propose a hypergraph tree search framework for learning branch (HTS-LB) to address the above issues. In HTS-LB, MILPs are first represented by hypergraphs to make them available for large-scale scenarios. Second, a hypergraph attention network (HAN) for branching policy encoding is constructed to map the hypergraph representation to the probability distributions of branching variables. In HAN, a dual multi-head attention mechanism is used to obtain more accurate information when nodes update their representations. Finally, we design a tree search gating mechanism to capture rich dynamic information for subsequent updates of the variable representation. 
Extensive experiments on NP-hard MILP problems and practical scenarios demonstrate that our model is effective and outperforms popular machine learning algorithms in terms of branching accuracy, branch and bound nodes, and the dual–primal gap. Additionally, the integration of HTS-LB into the SCIP solver shows its strong generalization performance in large-scale MILPs.</div></div>\",\"PeriodicalId\":49763,\"journal\":{\"name\":\"Neural Networks\",\"volume\":\"191 \",\"pages\":\"Article 107784\"},\"PeriodicalIF\":6.3000,\"publicationDate\":\"2025-07-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neural Networks\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0893608025006641\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608025006641","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
HTS-LB: Hypergraph tree search for learning branch
Mixed integer linear programming (MILP) is a fundamental combinatorial optimization problem with wide-ranging applications in resource-constrained scenarios. Recent studies have focused on using machine learning to imitate the decision-making process in MILP solving, often representing MILPs as bipartite graphs for learning branching policies. We analyze these studies and identify three key issues that need to be addressed when solving MILPs: scalability, richness of information, and branching accuracy. In this study, we propose a hypergraph tree search framework for learning branch (HTS-LB) to address these issues. In HTS-LB, MILPs are first represented as hypergraphs so that the framework scales to large problem instances. Second, a hypergraph attention network (HAN) for branching-policy encoding is constructed to map the hypergraph representation to probability distributions over branching variables. In HAN, a dual multi-head attention mechanism is used to obtain more accurate information when nodes update their representations. Finally, we design a tree search gating mechanism that captures rich dynamic information for subsequent updates of the variable representations. Extensive experiments on NP-hard MILP problems and practical scenarios demonstrate that our model is effective and outperforms popular machine learning algorithms in terms of branching accuracy, branch-and-bound node count, and the dual–primal gap. Additionally, integrating HTS-LB into the SCIP solver demonstrates strong generalization performance on large-scale MILPs.
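To make the hypergraph representation concrete, the sketch below encodes a toy MILP as a hypergraph in which each decision variable is a node and each constraint is a hyperedge over the variables it involves. This is only one plausible encoding assumed for illustration, not the authors' implementation; the names HypergraphMILP, add_constraint, and incidence_matrix are hypothetical.

# Minimal sketch (assumed encoding, not the paper's code): variables become
# hypergraph nodes and each constraint becomes a hyperedge covering the
# variables it touches, which yields a constraint-by-variable incidence matrix.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class HypergraphMILP:
    variables: List[str] = field(default_factory=list)                 # nodes
    hyperedges: List[Dict[str, float]] = field(default_factory=list)   # one per constraint

    def add_variable(self, name: str) -> None:
        self.variables.append(name)

    def add_constraint(self, coeffs: Dict[str, float]) -> None:
        # A constraint a^T x <= b is stored by the variables it covers,
        # keyed by coefficient; that variable set is the hyperedge.
        self.hyperedges.append(coeffs)

    def incidence_matrix(self) -> List[List[float]]:
        # |constraints| x |variables| matrix H, where H[e][v] holds the
        # coefficient of variable v in constraint e (0 if absent).
        index = {v: j for j, v in enumerate(self.variables)}
        H = [[0.0] * len(self.variables) for _ in self.hyperedges]
        for e, coeffs in enumerate(self.hyperedges):
            for v, a in coeffs.items():
                H[e][index[v]] = a
        return H

# Tiny knapsack-style example: three binary variables, two constraints.
hg = HypergraphMILP()
for v in ("x1", "x2", "x3"):
    hg.add_variable(v)
hg.add_constraint({"x1": 3.0, "x2": 5.0})   # 3*x1 + 5*x2 <= b1
hg.add_constraint({"x2": 2.0, "x3": 4.0})   # 2*x2 + 4*x3 <= b2
print(hg.incidence_matrix())                # [[3.0, 5.0, 0.0], [0.0, 2.0, 4.0]]

In a learned-branching setting such as the one described above, an attention network would consume a representation like this incidence structure and output a score per variable node, which a softmax turns into the probability distribution over branching candidates.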
Journal introduction:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.