{"title":"安全,加速筛选框架支持张量机","authors":"Xiao Li , Hongmei Wang , Yitian Xu","doi":"10.1016/j.neunet.2025.107458","DOIUrl":null,"url":null,"abstract":"<div><div>Support Tensor Machines (STMs) constitute an effective supervised learning method for classifying high-dimensional tensor data. However, traditional iterative solving methods are often time-consuming. To effectively address the issue of lengthy training times, inspired by the safe screening strategies employed in support vector machines, we generalize the safe screening rule to the tensor domain and propose a novel safe screening rule for STM, which includes the dual static screening rule (DSSR), the dynamic screening rule (DGSR), and a subsequent checking verification. The screening rule initially employs variational inequalities to screen out a portion of redundant samples before training, reducing the problem scale. During the training process, the rule further accelerates training by iteratively screening redundant samples using the duality gap. We also design a subsequent checking technique based on optimality conditions to guarantee the safety of the screening rule. Building on this, we also develop a flexible safe screening framework, referred to as DS-DGSR, which incorporates the DSSR and the DGSR. It not only tackles the challenges of combining various tensor decomposition methods and the diverse scenarios of the decomposed coefficient parameter and decomposed samples in STMs, but also offers flexible adaptation and application according to the characteristics of different STMs. Numerical experiments on multiple real-world high-dimensional tensor datasets confirm the effectiveness and feasibility of DS-DGSR.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"188 ","pages":"Article 107458"},"PeriodicalIF":6.0000,"publicationDate":"2025-04-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Safe and accelerated screening framework for support tensor machines\",\"authors\":\"Xiao Li , Hongmei Wang , Yitian Xu\",\"doi\":\"10.1016/j.neunet.2025.107458\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Support Tensor Machines (STMs) constitute an effective supervised learning method for classifying high-dimensional tensor data. However, traditional iterative solving methods are often time-consuming. To effectively address the issue of lengthy training times, inspired by the safe screening strategies employed in support vector machines, we generalize the safe screening rule to the tensor domain and propose a novel safe screening rule for STM, which includes the dual static screening rule (DSSR), the dynamic screening rule (DGSR), and a subsequent checking verification. The screening rule initially employs variational inequalities to screen out a portion of redundant samples before training, reducing the problem scale. During the training process, the rule further accelerates training by iteratively screening redundant samples using the duality gap. We also design a subsequent checking technique based on optimality conditions to guarantee the safety of the screening rule. Building on this, we also develop a flexible safe screening framework, referred to as DS-DGSR, which incorporates the DSSR and the DGSR. 
It not only tackles the challenges of combining various tensor decomposition methods and the diverse scenarios of the decomposed coefficient parameter and decomposed samples in STMs, but also offers flexible adaptation and application according to the characteristics of different STMs. Numerical experiments on multiple real-world high-dimensional tensor datasets confirm the effectiveness and feasibility of DS-DGSR.</div></div>\",\"PeriodicalId\":49763,\"journal\":{\"name\":\"Neural Networks\",\"volume\":\"188 \",\"pages\":\"Article 107458\"},\"PeriodicalIF\":6.0000,\"publicationDate\":\"2025-04-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neural Networks\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0893608025003375\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608025003375","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Safe and accelerated screening framework for support tensor machines
Support Tensor Machines (STMs) constitute an effective supervised learning method for classifying high-dimensional tensor data. However, traditional iterative solving methods are often time-consuming. To address the issue of lengthy training times, and inspired by the safe screening strategies employed in support vector machines, we generalize safe screening to the tensor domain and propose a novel safe screening rule for STMs, comprising a dual static screening rule (DSSR), a dynamic screening rule (DGSR), and a subsequent checking step for verification. Before training, the rule employs variational inequalities to screen out a portion of redundant samples, reducing the problem scale. During training, it further accelerates the process by iteratively screening redundant samples using the duality gap. We also design a subsequent checking technique based on optimality conditions to guarantee the safety of the screening rule. Building on this, we develop a flexible safe screening framework, referred to as DS-DGSR, which incorporates the DSSR and the DGSR. It not only tackles the challenges of combining various tensor decomposition methods and handling the diverse scenarios of decomposed coefficient parameters and decomposed samples in STMs, but also adapts flexibly to the characteristics of different STMs. Numerical experiments on multiple real-world high-dimensional tensor datasets confirm the effectiveness and feasibility of DS-DGSR.
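To make the duality-gap screening idea concrete, the sketch below shows a gap-safe sample screening check for an ordinary linear hinge-loss SVM, the vector-space setting from which such rules originate. This is a minimal illustration under assumed conditions, not the paper's DSSR/DGSR (which operate on decomposed tensor data); the function name and the linear-SVM formulation are chosen here purely for exposition.

```python
import numpy as np

def gap_safe_sample_screening(X, y, alpha, C):
    """Illustrative duality-gap (gap-safe) sample screening for a linear hinge-loss SVM.

    Primal:  P(w) = 0.5*||w||^2 + C*sum_i max(0, 1 - y_i*<w, x_i>)
    Dual:    D(a) = sum_i a_i - 0.5*||sum_i a_i*y_i*x_i||^2,  with 0 <= a_i <= C
    Because P is 1-strongly convex in w, ||w - w*||^2 <= 2*(P(w) - D(a)).
    """
    w = X.T @ (alpha * y)                  # primal point induced by the current dual iterate
    margins = y * (X @ w)                  # y_i * <w, x_i>
    primal = 0.5 * (w @ w) + C * np.maximum(0.0, 1.0 - margins).sum()
    dual = alpha.sum() - 0.5 * (w @ w)
    gap = max(primal - dual, 0.0)          # duality gap (nonnegative for feasible alpha)
    radius = np.sqrt(2.0 * gap)            # safe bound on ||w - w*||
    norms = np.linalg.norm(X, axis=1)

    lower = margins - norms * radius       # certified lower bound on y_i * <w*, x_i>
    upper = margins + norms * radius       # certified upper bound on y_i * <w*, x_i>

    inactive = lower > 1.0   # hinge loss surely zero at the optimum -> alpha_i* = 0, removable
    at_bound = upper < 1.0   # margin surely violated at the optimum -> alpha_i* = C, fixable
    keep = ~(inactive | at_bound)          # only these samples stay in the reduced problem
    return keep, inactive, at_bound
```

In a dynamic screening loop, a check of this kind would be re-applied every few solver iterations (e.g., `keep, _, _ = gap_safe_sample_screening(X, y, alpha, C)` followed by re-solving on `X[keep]`): as the duality gap shrinks, the safe region around the optimum tightens and more samples are eliminated, mirroring the static-then-dynamic structure described in the abstract, with a final optimality-condition (KKT) check confirming that no support vector was removed.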
Journal introduction:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.