Extended Deep Submodular Functions

Seyed Mohammad Hosseini, Arash Jamshid, Seyed Mahdi Noormousavi, Mahdi Jafari Siavoshani, Naeimeh Omidvar
{"title":"扩展深次模态函数","authors":"Seyed Mohammad Hosseini, Arash Jamshid, Seyed Mahdi Noormousavi, Mahdi Jafari Siavoshani, Naeimeh Omidvar","doi":"arxiv-2409.12053","DOIUrl":null,"url":null,"abstract":"We introduce a novel category of set functions called Extended Deep\nSubmodular functions (EDSFs), which are neural network-representable. EDSFs\nserve as an extension of Deep Submodular Functions (DSFs), inheriting crucial\nproperties from DSFs while addressing innate limitations. It is known that DSFs\ncan represent a limiting subset of submodular functions. In contrast, through\nan analysis of polymatroid properties, we establish that EDSFs possess the\ncapability to represent all monotone submodular functions, a notable\nenhancement compared to DSFs. Furthermore, our findings demonstrate that EDSFs\ncan represent any monotone set function, indicating the family of EDSFs is\nequivalent to the family of all monotone set functions. Additionally, we prove\nthat EDSFs maintain the concavity inherent in DSFs when the components of the\ninput vector are non-negative real numbers-an essential feature in certain\ncombinatorial optimization problems. Through extensive experiments, we\nillustrate that EDSFs exhibit significantly lower empirical generalization\nerror than DSFs in the learning of coverage functions. 
This suggests that EDSFs\npresent a promising advancement in the representation and learning of set\nfunctions with improved generalization capabilities.","PeriodicalId":501301,"journal":{"name":"arXiv - CS - Machine Learning","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Extended Deep Submodular Functions\",\"authors\":\"Seyed Mohammad Hosseini, Arash Jamshid, Seyed Mahdi Noormousavi, Mahdi Jafari Siavoshani, Naeimeh Omidvar\",\"doi\":\"arxiv-2409.12053\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We introduce a novel category of set functions called Extended Deep\\nSubmodular functions (EDSFs), which are neural network-representable. EDSFs\\nserve as an extension of Deep Submodular Functions (DSFs), inheriting crucial\\nproperties from DSFs while addressing innate limitations. It is known that DSFs\\ncan represent a limiting subset of submodular functions. In contrast, through\\nan analysis of polymatroid properties, we establish that EDSFs possess the\\ncapability to represent all monotone submodular functions, a notable\\nenhancement compared to DSFs. Furthermore, our findings demonstrate that EDSFs\\ncan represent any monotone set function, indicating the family of EDSFs is\\nequivalent to the family of all monotone set functions. Additionally, we prove\\nthat EDSFs maintain the concavity inherent in DSFs when the components of the\\ninput vector are non-negative real numbers-an essential feature in certain\\ncombinatorial optimization problems. Through extensive experiments, we\\nillustrate that EDSFs exhibit significantly lower empirical generalization\\nerror than DSFs in the learning of coverage functions. 
This suggests that EDSFs\\npresent a promising advancement in the representation and learning of set\\nfunctions with improved generalization capabilities.\",\"PeriodicalId\":501301,\"journal\":{\"name\":\"arXiv - CS - Machine Learning\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - CS - Machine Learning\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.12053\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Machine Learning","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.12053","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

We introduce a novel category of set functions called Extended Deep Submodular functions (EDSFs), which are neural network-representable. EDSFs serve as an extension of Deep Submodular Functions (DSFs), inheriting crucial properties from DSFs while addressing innate limitations. It is known that DSFs can represent a limiting subset of submodular functions. In contrast, through an analysis of polymatroid properties, we establish that EDSFs possess the capability to represent all monotone submodular functions, a notable enhancement compared to DSFs. Furthermore, our findings demonstrate that EDSFs can represent any monotone set function, indicating the family of EDSFs is equivalent to the family of all monotone set functions. Additionally, we prove that EDSFs maintain the concavity inherent in DSFs when the components of the input vector are non-negative real numbers, an essential feature in certain combinatorial optimization problems. Through extensive experiments, we illustrate that EDSFs exhibit significantly lower empirical generalization error than DSFs in the learning of coverage functions. This suggests that EDSFs present a promising advancement in the representation and learning of set functions with improved generalization capabilities.
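The abstract's experiments target coverage functions, a standard family of monotone submodular functions. As context for readers unfamiliar with these properties, the sketch below defines a toy coverage function and checks monotonicity and the diminishing-returns (submodularity) inequality exhaustively on a small ground set. The specific sets and element names are illustrative assumptions, not data from the paper.

```python
from itertools import chain, combinations

# Toy instance: each ground-set element covers a subset of a small universe.
# Coverage functions f(S) = |union of covered items| are monotone submodular.
sets = {
    "a": {1, 2, 3},
    "b": {3, 4},
    "c": {4, 5, 6},
}

def coverage(S):
    """f(S) = number of universe items covered by the chosen elements."""
    return len(set().union(*(sets[i] for i in S))) if S else 0

def powerset(items):
    return chain.from_iterable(combinations(items, r) for r in range(len(items) + 1))

elems = list(sets)

# Monotonicity: S ⊆ T implies f(S) <= f(T).
monotone = all(
    coverage(S) <= coverage(T)
    for S in powerset(elems)
    for T in powerset(elems)
    if set(S) <= set(T)
)

# Submodularity (diminishing returns): for S ⊆ T and e ∉ T,
# f(S ∪ {e}) - f(S) >= f(T ∪ {e}) - f(T).
submodular = all(
    coverage(set(S) | {e}) - coverage(S) >= coverage(set(T) | {e}) - coverage(T)
    for S in powerset(elems)
    for T in powerset(elems)
    if set(S) <= set(T)
    for e in elems
    if e not in T
)

print(monotone, submodular)
```

Both checks pass for any coverage function; DSFs and EDSFs are neural parameterizations designed so that learned set functions retain (or, for EDSFs, extend beyond) exactly these properties.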