IEEE Transactions on Neural Networks and Learning Systems — Latest Articles

TIENet: A Tri-Interaction Enhancement Network for Multimodal Person Reidentification
IF 10.4 · Q1 · Computer Science
IEEE Transactions on Neural Networks and Learning Systems · Pub Date: 2025-03-19 · DOI: 10.1109/tnnls.2025.3544679
Xi Yang, Wenjiao Dong, De Cheng, Nannan Wang, Xinbo Gao
Citations: 0
Noise-Robust Federated Learning via Interclient Co-Distillation
IF 10.4 · Q1 · Computer Science
IEEE Transactions on Neural Networks and Learning Systems · Pub Date: 2025-03-19 · DOI: 10.1109/tnnls.2025.3546903
Liang Gao, Li Li, Yingwen Chen, Shaojing Fu, Dongsheng Wang, Siwei Wang, Cheng-Zhong Xu, Ming Xu
Citations: 0
Multiview Representation Learning via Information-Theoretic Optimization
IF 10.4 · Q1 · Computer Science
IEEE Transactions on Neural Networks and Learning Systems · Pub Date: 2025-03-19 · DOI: 10.1109/tnnls.2025.3546660
Weiqing Yan, Shuochen Yao, Chang Tang, Wujie Zhou
Citations: 0
Globality Meets Locality: An Anchor Graph Collaborative Learning Framework for Fast Multiview Subspace Clustering
IF 10.2 · Q1 · Computer Science
IEEE Transactions on Neural Networks and Learning Systems · Pub Date: 2025-03-18 · DOI: 10.1109/TNNLS.2025.3545435
Jipeng Guo, Yanfeng Sun, Xin Ma, Junbin Gao, Yongli Hu, Youqing Wang, Baocai Yin
Abstract: Multiview subspace clustering (MSC) maximizes the utilization of the complementary description information provided by multiview data and achieves impressive clustering performance. However, most MSC methods are inefficient or even infeasible in large-scale scenarios due to their expensive computational complexity. Recently, the anchor strategy has been developed to address this: a few representative samples are selected as anchor points for representation learning and anchor graph construction. However, most anchor-based methods explore only a single cross-view correlation, i.e., cross-view consistency from the global aspect or cross-view complementarity from the local aspect, which provides insufficient understanding of the semantic correlations in complex multiview data. To address this issue effectively, this study proposes fast multiview subspace clustering (FMSC) with local-global anchor representation collaborative learning. FMSC integrates discriminative anchor point learning and the construction of an anchor graph with optimal structure into a joint framework. Furthermore, local (view-specific) and global (view-shared) anchor representations are learned collaboratively under two interaction strategies at different levels, providing beneficial guidance from global learning to local learning. Thus, FMSC can maximize the exploration of complementarity and consistency among multiview data and capture a more comprehensive semantic correlation. More importantly, an effective algorithm with linear complexity is designed to solve the corresponding optimization problem, making FMSC practical for large-scale clustering tasks. Extensive experimental results confirm the superiority of FMSC in both clustering performance and computational efficiency.
Citations: 0
Local-Global Structure-Aware Geometric Equivariant Graph Representation Learning for Predicting Protein-Ligand Binding Affinity
IF 10.2 · Q1 · Computer Science
IEEE Transactions on Neural Networks and Learning Systems · Pub Date: 2025-03-18 · DOI: 10.1109/TNNLS.2025.3547300
Shihong Chen, Haicheng Yi, Zhuhong You, Xuequn Shang, Yu-An Huang, Lei Wang, Zhen Wang
Abstract: Predicting protein-ligand binding affinities is a critical problem in drug discovery and design. A majority of existing methods fail to accurately characterize and exploit the geometrically invariant structures of protein-ligand complexes when predicting binding affinities. In this study, we propose Geo-PLA, a geometric equivariant graph representation learning framework with local-global structure awareness that predicts binding affinity by capturing the geometric information of protein-ligand complexes. Specifically, the local structural information of 3-D protein-ligand complexes is extracted by an equivariant graph neural network (EGNN), which iteratively updates node representations while preserving equivariance under coordinate transformations. Meanwhile, a graph transformer captures long-range interactions among atoms, offering a global view that adaptively focuses on complex regions with a significant impact on binding affinity. Furthermore, the multiscale information from the two channels is integrated to enhance the predictive capability of the model. Extensive experimental studies on two benchmark datasets confirm the superior performance of Geo-PLA. Moreover, visual interpretation of the learned protein-ligand complexes further indicates that the model offers valuable biological insights for virtual screening and drug repositioning.
Citations: 0
One-Shot Secure Federated K-Means Clustering Based on Density Cores
IF 10.2 · Q1 · Computer Science
IEEE Transactions on Neural Networks and Learning Systems · Pub Date: 2025-03-18 · DOI: 10.1109/TNNLS.2025.3547362
Yizhang Wang, Wei Pang, Di Wang, Witold Pedrycz
Abstract: Federated clustering (FC) performs well in independent and identically distributed (IID) scenarios but poorly in non-IID scenarios. In addition, existing methods lack proof of strict privacy protection. To address these issues, we propose a new secure federated k-means clustering framework that achieves better clustering results under privacy requirements. Specifically, each client uses the cluster centers (representative points) generated by k-means to represent its clusters. These representative points effectively preserve the structure of the local data and are encrypted by differential privacy. For the server, we propose two methods to reprocess the uploaded encrypted representative points into better final cluster centers: one applies k-means, and the other takes the improved density peaks (density cores) as the final centers; the centers are then sent back to the clients. Finally, each client assigns its local data to the nearest centers. Experimental results show that, in most cases, the proposed methods perform better than several centralized (nonfederated) classical clustering algorithms (k-means, density-based spatial clustering of applications with noise (DBSCAN), and density peak clustering (DPC)) and state-of-the-art (SOTA) centralized clustering algorithms. In particular, the proposed algorithms outperform the SOTA FC frameworks k-FED (ICML 2021) and MUFC (ICLR 2023).
Citations: 0
Mean-Square Synchronization of Additive Time-Varying Delayed Markovian Jumping Neural Networks Under Multiple Stochastic Sampling
IF 10.2 · Q1 · Computer Science
IEEE Transactions on Neural Networks and Learning Systems · Pub Date: 2025-03-18 · DOI: 10.1109/TNNLS.2024.3478395
Pratap Anbalagan, Zhiguang Feng, Tingwen Huang, Yukang Cui
Abstract: This study solves the mean-square asymptotic synchronization problem of additive time-varying delayed Markovian jumping neural networks (ATVMJNNs) under the framework of multiple stochastic sampling, with a direct application to secure image encryption (SIE). First, we assume the existence of multiple sampled-data periods that satisfy a Bernoulli distribution and introduce random variables to represent the positions of the input delays and sampling periods. Then, based on these assumptions, we develop a mode-dependent discontinuous Lyapunov-Krasovskii functional (DLKF) to reduce the conservatism of the model. Next, we introduce a new auxiliary slack-matrix-based integral inequality (ASMBII) to approximate the integral quadratic terms arising from the derivative of the DLKF. Furthermore, we develop a multiple stochastic sampling framework to achieve asymptotic synchronization between the primary and secondary systems, and less conservative criteria for asymptotic stability of the error model in the mean-square sense are derived by solving a set of linear matrix inequalities (LMIs). Finally, we present numerical validations and corresponding experimental results for a pragmatic image-processing application to demonstrate the benefits of the proposed algorithms and techniques. Both the numerical and practical results show that the proposed algorithms and techniques yield superior performance compared with existing studies.
Citations: 0
Model-Free and Pseudoinverse-Free Zhang Neurodynamics Scheme for Robotic Arms' Path Tracking Control
IF 10.2 · Q1 · Computer Science
IEEE Transactions on Neural Networks and Learning Systems · Pub Date: 2025-03-18 · DOI: 10.1109/TNNLS.2025.3540589
Jielong Chen, Yan Pan, Yunong Zhang
Abstract: Path tracking control of robotic arms is a fundamental problem in robotics. However, obtaining an accurate model of a robotic arm in practical engineering poses significant challenges, so model-free schemes have become a focus of investigation. In contrast to traditional model-free schemes that estimate the Jacobian matrix of the robotic arm, this work proposes, for the first time, a novel estimator based on Zhang neurodynamics (ZN) that directly estimates the pseudoinverse (PI) of the Jacobian matrix. In addition, a novel model-free and PI-free ZN (MFPIFZN) scheme for path tracking control of robotic arms is proposed. The MFPIFZN scheme not only significantly reduces operational complexity by eliminating the need to compute the PI of the Jacobian matrix but also improves accuracy by eliminating the potential errors that such a computation may introduce. Theoretical analyses guarantee the convergence and stability of the MFPIFZN scheme. Finally, experimental results on planar four-link and Kinova Jaco2 robotic arms vividly illustrate the excellent performance of the MFPIFZN scheme. Comparison experiments with four other model-free schemes further confirm its superiority.
Citations: 0
A Novel Fusion and Feature Selection Framework for Multisource Time-Series Data Based on Information Entropy
IF 10.2 · Q1 · Computer Science
IEEE Transactions on Neural Networks and Learning Systems · Pub Date: 2025-03-18 · DOI: 10.1109/TNNLS.2025.3548165
Xiuwei Chen, Li Lai, Maokang Luo
Abstract: The growth of information technology brings vast amounts of time-series data. Despite this richness, challenges such as redundancy emphasize the need for research on time-series data fusion. Rough set theory, a valuable tool for dealing with uncertainty, can identify features and reduce dimensionality, enhancing time-series data fusion. The contribution of this study lies in establishing a fusion and feature selection framework for multisource time-series data. The framework selects optimal information sources by minimizing entropy. In addition, the fusion process integrates a feature selection algorithm that eliminates redundant features, preventing a sequential increase in entropy. Extensive experiments on abundant datasets demonstrate that the proposed approach outperforms several state-of-the-art algorithms in enhancing the accuracy of common classifiers. This research significantly advances time-series data fusion within rough set theory, offering improved accuracy and efficiency in data processing and analysis.
Citations: 0
Convergence of Adaptive Stochastic Mirror Descent
IF 10.2 · Q1 · Computer Science
IEEE Transactions on Neural Networks and Learning Systems · Pub Date: 2025-03-18 · DOI: 10.1109/TNNLS.2025.3545420
Ting Hu, Xiaotong Liu, Kai Ji, Yunwen Lei
Abstract: In this article, we present a family of adaptive stochastic optimization methods associated with mirror maps, which are widely used to capture the geometric properties of optimization problems during the iteration process. The well-known adaptive moment estimation (Adam)-type algorithm falls into this family when the mirror maps take the form of temporal adaptation. In the context of convex objective functions, we show that, with proper step sizes and hyperparameters, the average regret can achieve the convergence rate after T iterations under some standard assumptions. We further improve it when the objective functions are strongly convex. In the context of smooth objective functions (not necessarily convex), based on properties of the strongly convex differentiable mirror map, our algorithms achieve convergence rates of order up to a logarithmic term, requiring large or increasing hyperparameters, which is consistent with the practical usage of Adam-type algorithms. Our work thus explains the selection of hyperparameters in implementations of Adam-type algorithms.
Citations: 0