Yaoxin Wu, Zhiguang Cao, Wen Song, Yingqian Zhang
Journal: Neural Networks, Volume 188, Article 107446
DOI: 10.1016/j.neunet.2025.107446
Published: 2025-04-10 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S0893608025003259
Solving two-stage stochastic integer programs via representation learning
Solving stochastic integer programs (SIPs) is extremely intractable due to their high computational complexity. To solve two-stage SIPs efficiently, we propose a conditional variational autoencoder (CVAE) for scenario representation learning. A graph convolutional network (GCN)-based VAE embeds scenarios into a low-dimensional latent space, conditioned on the deterministic context of each instance. Given the latent representations of stochastic scenarios, we perform two auxiliary tasks, objective prediction and scenario contrast, which predict scenario objective values and the similarities between scenarios, respectively. These tasks further integrate objective information into the representations through gradient backpropagation. Experiments show that the learned scenario representations help reduce the number of scenarios in SIPs, yielding high-quality solutions in a short computational time. This advantage generalizes well to instances with larger sizes, more scenarios, and different distributions.
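The pipeline described in the abstract (a GCN encoder conditioned on the instance's deterministic context, a reparameterized latent code, and two auxiliary heads for objective prediction and scenario contrast) can be sketched roughly as follows. This is a minimal illustrative forward pass, not the paper's actual implementation: all layer sizes, weight names, and the cosine-similarity choice for scenario contrast are assumptions.

```python
# Hypothetical sketch of CVAE-style scenario representation learning:
# GCN encoder -> pooled, context-conditioned latent -> auxiliary heads.
import numpy as np

rng = np.random.default_rng(0)

def gcn_layer(A, X, W):
    """One GCN propagation step: ReLU(D^-1/2 (A+I) D^-1/2 X W)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ X @ W, 0.0)

def encode_scenario(A, X, context, params):
    """Embed one stochastic scenario, conditioned on deterministic context."""
    H = gcn_layer(A, X, params["W_gcn"])          # node embeddings
    g = np.concatenate([H.mean(axis=0), context]) # mean-pool, then condition
    mu = g @ params["W_mu"]
    logvar = g @ params["W_logvar"]
    # Reparameterization trick: z = mu + sigma * eps
    return mu + np.exp(0.5 * logvar) * rng.standard_normal(mu.shape)

def predict_objective(z, w_obj):
    """Auxiliary task 1: regress a scenario's objective value from z."""
    return float(z @ w_obj)

def scenario_contrast(z1, z2):
    """Auxiliary task 2: similarity between two scenario codes (cosine)."""
    return float(z1 @ z2 / (np.linalg.norm(z1) * np.linalg.norm(z2)))

# Toy instance: 4-node scenario graph, 3 node features, 2-dim context,
# 4-dim latent space. All dimensions are arbitrary for illustration.
n, f, h, c, k = 4, 3, 8, 2, 4
params = {
    "W_gcn": rng.standard_normal((f, h)),
    "W_mu": rng.standard_normal((h + c, k)),
    "W_logvar": rng.standard_normal((h + c, k)),
}
A = (rng.random((n, n)) < 0.5).astype(float)
A = np.triu(A, 1); A = A + A.T                # symmetric, no self-loops
X1 = rng.standard_normal((n, f))              # scenario 1 node features
X2 = rng.standard_normal((n, f))              # scenario 2 node features
ctx = rng.standard_normal(c)                  # deterministic instance context

z1 = encode_scenario(A, X1, ctx, params)
z2 = encode_scenario(A, X2, ctx, params)
obj = predict_objective(z1, rng.standard_normal(k))
sim = scenario_contrast(z1, z2)
```

In the paper, gradients from both auxiliary losses flow back into the encoder, so latent codes that are close should correspond to scenarios with similar objective values; clustering in this latent space is what enables scenario reduction.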
Journal introduction:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. The journal invites submissions covering all aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussion between biology and technology, it aims to encourage the development of biologically inspired artificial intelligence.