Title: Distribution Metric Based $V$-Matrix Support Vector Machine
Authors: Yiwei Song; Yuanhai Shao; Chunna Li
Journal: IEEE Signal Processing Letters, vol. 32, pp. 1031-1035
DOI: 10.1109/LSP.2025.3543266
Publication date: 2025-02-18
URL: https://ieeexplore.ieee.org/document/10891902/
Citation count: 0
Abstract
The $V$-matrix Support Vector Machine (VSVM) is a machine learning method recently proposed by Vapnik and Izmailov that integrates the positional relationships among training samples into model learning and yields decisions via conditional probability. However, it overlooks the distribution information hidden in the data, which plays a pivotal role in training, and it does not exploit the testing samples. To fully exploit the distribution information of the data, this paper proposes a novel Distribution Metric Based $V$-matrix Support Vector Machine (DVSVM) built upon VSVM. DVSVM incorporates the distributional information implicit in the data by measuring the distances between samples with the Wasserstein distance. Compared to VSVM, it additionally accounts for the positional relationships of testing samples. It is further proved theoretically that DVSVM degenerates to VSVM under certain conditions. Experimental results on several synthetic datasets and real-world disease datasets demonstrate the superiority of DVSVM.
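The abstract states that DVSVM measures distances between samples using the Wasserstein distance, though it does not spell out the computation. As a minimal, hedged illustration (not the paper's actual DVSVM formulation): for two equal-size one-dimensional empirical distributions with uniform weights, the Wasserstein-1 (earth mover's) distance reduces to the mean absolute difference of the sorted samples.

```python
def wasserstein_1d(xs, ys):
    """Wasserstein-1 distance between two equal-size 1-D empirical
    distributions with uniform weights: the mean absolute difference
    of the order statistics (a standard closed form, not taken from
    the paper itself)."""
    if len(xs) != len(ys):
        raise ValueError("this closed form assumes equal sample sizes")
    xs, ys = sorted(xs), sorted(ys)
    return sum(abs(a - b) for a, b in zip(xs, ys)) / len(xs)

# Two samples identical up to a shift of 1.0, so the distance is 1.0.
print(wasserstein_1d([0.0, 1.0, 2.0], [1.0, 2.0, 3.0]))  # -> 1.0
```

For multi-dimensional samples or unequal weights, the general Wasserstein distance requires solving an optimal-transport problem; how the paper computes it for DVSVM is not specified in the abstract.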
About the journal
The IEEE Signal Processing Letters is a monthly, archival publication designed to provide rapid dissemination of original, cutting-edge ideas and timely, significant contributions in signal, image, speech, language, and audio processing. Papers published in the Letters can be presented within one year of their appearance at signal processing conferences such as ICASSP, GlobalSIP, and ICIP, as well as at several workshops organized by the Signal Processing Society.