{"title":"扩展深次模态函数","authors":"Seyed Mohammad Hosseini, Arash Jamshid, Seyed Mahdi Noormousavi, Mahdi Jafari Siavoshani, Naeimeh Omidvar","doi":"arxiv-2409.12053","DOIUrl":null,"url":null,"abstract":"We introduce a novel category of set functions called Extended Deep\nSubmodular functions (EDSFs), which are neural network-representable. EDSFs\nserve as an extension of Deep Submodular Functions (DSFs), inheriting crucial\nproperties from DSFs while addressing innate limitations. It is known that DSFs\ncan represent a limiting subset of submodular functions. In contrast, through\nan analysis of polymatroid properties, we establish that EDSFs possess the\ncapability to represent all monotone submodular functions, a notable\nenhancement compared to DSFs. Furthermore, our findings demonstrate that EDSFs\ncan represent any monotone set function, indicating the family of EDSFs is\nequivalent to the family of all monotone set functions. Additionally, we prove\nthat EDSFs maintain the concavity inherent in DSFs when the components of the\ninput vector are non-negative real numbers-an essential feature in certain\ncombinatorial optimization problems. Through extensive experiments, we\nillustrate that EDSFs exhibit significantly lower empirical generalization\nerror than DSFs in the learning of coverage functions. This suggests that EDSFs\npresent a promising advancement in the representation and learning of set\nfunctions with improved generalization capabilities.","PeriodicalId":501301,"journal":{"name":"arXiv - CS - Machine Learning","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Extended Deep Submodular Functions\",\"authors\":\"Seyed Mohammad Hosseini, Arash Jamshid, Seyed Mahdi Noormousavi, Mahdi Jafari Siavoshani, Naeimeh Omidvar\",\"doi\":\"arxiv-2409.12053\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We introduce a novel category of set functions called Extended Deep\\nSubmodular functions (EDSFs), which are neural network-representable. EDSFs\\nserve as an extension of Deep Submodular Functions (DSFs), inheriting crucial\\nproperties from DSFs while addressing innate limitations. It is known that DSFs\\ncan represent a limiting subset of submodular functions. In contrast, through\\nan analysis of polymatroid properties, we establish that EDSFs possess the\\ncapability to represent all monotone submodular functions, a notable\\nenhancement compared to DSFs. Furthermore, our findings demonstrate that EDSFs\\ncan represent any monotone set function, indicating the family of EDSFs is\\nequivalent to the family of all monotone set functions. Additionally, we prove\\nthat EDSFs maintain the concavity inherent in DSFs when the components of the\\ninput vector are non-negative real numbers-an essential feature in certain\\ncombinatorial optimization problems. Through extensive experiments, we\\nillustrate that EDSFs exhibit significantly lower empirical generalization\\nerror than DSFs in the learning of coverage functions. 
This suggests that EDSFs\\npresent a promising advancement in the representation and learning of set\\nfunctions with improved generalization capabilities.\",\"PeriodicalId\":501301,\"journal\":{\"name\":\"arXiv - CS - Machine Learning\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - CS - Machine Learning\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.12053\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Machine Learning","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.12053","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
We introduce a novel class of set functions called Extended Deep Submodular Functions (EDSFs), which are neural-network-representable. EDSFs extend Deep Submodular Functions (DSFs), inheriting crucial properties from DSFs while addressing their innate limitations. It is known that DSFs can represent only a limited subset of submodular functions. In contrast, through an analysis of polymatroid properties, we establish that EDSFs can represent all monotone submodular functions, a notable enhancement over DSFs. Furthermore, we show that EDSFs can represent any monotone set function, indicating that the family of EDSFs is equivalent to the family of all monotone set functions. Additionally, we prove that EDSFs maintain the concavity inherent in DSFs when the components of the input vector are non-negative real numbers, an essential feature in certain combinatorial optimization problems.
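For context, the abstract leans on two standard notions it does not restate. The following LaTeX block gives the textbook definitions of monotonicity and submodularity, together with the simplest (one-layer) DSF form; this is standard background from the submodularity literature (DSFs are due to Bilmes and Bai), not a statement of this paper's exact constructions.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Standard background definitions, not specific to this paper.
A set function $f : 2^V \to \mathbb{R}$ is \emph{monotone} if
$f(A) \le f(B)$ whenever $A \subseteq B \subseteq V$, and
\emph{submodular} if for all $A \subseteq B \subseteq V$ and
$v \in V \setminus B$,
\begin{equation}
  f(A \cup \{v\}) - f(A) \;\ge\; f(B \cup \{v\}) - f(B),
\end{equation}
i.e.\ marginal gains diminish as the context set grows. The simplest
one-layer deep submodular function composes a monotone concave
$\phi : \mathbb{R}_{\ge 0} \to \mathbb{R}$ with a non-negative modular
function,
\begin{equation}
  f(S) = \phi\Bigl(\sum_{v \in S} w_v\Bigr), \qquad w_v \ge 0,
\end{equation}
and deeper DSFs nest such concave compositions layer by layer.

\end{document}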
Through extensive experiments, we show that EDSFs exhibit significantly lower empirical generalization error than DSFs when learning coverage functions. This suggests that EDSFs are a promising advance in the representation and learning of set functions, with improved generalization capabilities.
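The abstract does not describe the experimental setup, so the following is only a minimal sketch of the kind of learning target involved: a randomly generated weighted coverage function, which is monotone submodular and hence a natural benchmark. All instance parameters here (universe size, coverage probability, weights) are assumed for illustration.

import random

def make_coverage_fn(universe_size, n_items, seed=0):
    """Build a weighted coverage function: f(S) is the total weight of
    universe elements covered by at least one item in S."""
    rng = random.Random(seed)
    weights = [rng.random() for _ in range(universe_size)]
    # Each ground-set item covers a random subset of the universe.
    covers = [
        {u for u in range(universe_size) if rng.random() < 0.3}
        for _ in range(n_items)
    ]

    def f(S):
        covered = set().union(*(covers[i] for i in S)) if S else set()
        return sum(weights[u] for u in covered)

    return f

f = make_coverage_fn(universe_size=20, n_items=6)

# Sanity check: coverage functions are monotone submodular, so the
# marginal gain of adding item v must shrink as the base set grows.
for a_size in range(3):
    A = set(range(a_size))   # A is a subset of B by construction
    B = A | {3, 4}
    v = 5                    # v is outside both A and B
    gain_A = f(A | {v}) - f(A)
    gain_B = f(B | {v}) - f(B)
    assert gain_A >= gain_B - 1e-12, "diminishing returns violated"
print("coverage function passes diminishing-returns spot checks")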