2007 International Joint Conference on Neural Networks: Latest Publications

Estimation of Propagating Phase Transients in EEG Data - Application of Dynamic Logic Neural Modeling Approach
2007 International Joint Conference on Neural Networks | Pub Date: 2007-10-29 | DOI: 10.1109/IJCNN.2007.4371324
R. Kozma, R. Deming, L. Perlovsky
Abstract: The dynamic logic (DL) approach establishes a unified framework for the statistical description of mixtures using model-based neural networks. In the present work, we extend previous results to dynamic processes in which the mixture parameters, including the partial and total energies of the components, are time-dependent. Equations are derived and solved for the estimation of parameters that vary in time. The results provide an optimal approximation for a broad class of pattern recognition and process identification problems with variable and noisy data. The methodology is demonstrated on the identification of propagating phase gradients generated by intermittent fluctuations in non-equilibrium neural media.
Citations: 7
Emergence of Scale-free Graphs in Dynamical Spiking Neural Networks
2007 International Joint Conference on Neural Networks | Pub Date: 2007-10-29 | DOI: 10.1109/IJCNN.2007.4371052
Filip Piekniewski
Abstract: In this paper we discuss the presence of a scale-free property in spiking neural networks. Although, as argued by Amaral et al. (2000) and Koch and Laurent (1999), some biological neural networks do not reveal a scale-free nature at the level of single neurons, we believe, based on previous research (Piekniewski and Schreiber, 2007) and the numerical simulations presented in this article, that such structures should emerge at the level of neuronal groups as a consequence of their rich dynamics and memory properties. The network we analyze is built upon the spiking model introduced by Eugene Izhikevich (2003; 2006). It is formed as a set of randomly constructed neuronal groups (each group to some extent resembles the original model), connected with Gaussian weights. Such a system exhibits rich dynamics, with chattering, bursting and other forms of neuronal activity, as well as global synchronization episodes. We analyze similarities between the spike trains of neurons coming from different groups, and build a weighted graph that approximates the similarity of activity (synchronization) of pairs of units. The output graph reveals a scale-free structure, supporting our claim.
Citations: 7
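The graph construction described in the abstract above (pairwise spike-train synchronization turned into a weighted graph, which is then inspected for a heavy-tailed degree distribution) can be illustrated in miniature. This is a toy sketch, not the paper's implementation: the coincidence-based `similarity` measure, the threshold value, and the independent Bernoulli spike trains are all assumptions.

```python
import random

def similarity(train_a, train_b):
    """Fraction of time bins in which both units spike (toy synchrony measure)."""
    both = sum(1 for a, b in zip(train_a, train_b) if a and b)
    return both / len(train_a)

def similarity_graph(trains, threshold=0.05):
    """Weighted graph over units: edge (i, j) kept if similarity exceeds threshold."""
    edges = {}
    for i in range(len(trains)):
        for j in range(i + 1, len(trains)):
            w = similarity(trains[i], trains[j])
            if w > threshold:
                edges[(i, j)] = w
    return edges

def degree_sequence(n, edges):
    """Node degrees; a scale-free graph would show a heavy-tailed distribution."""
    deg = [0] * n
    for (i, j) in edges:
        deg[i] += 1
        deg[j] += 1
    return deg

# Toy usage: independent random spike trains (no group structure, so no
# scale-free tail is expected here -- this only shows the pipeline).
random.seed(0)
trains = [[random.random() < 0.2 for _ in range(500)] for _ in range(30)]
edges = similarity_graph(trains)
degrees = degree_sequence(len(trains), edges)
```

With the grouped, synchronizing dynamics of the actual model, the claim is that such a degree sequence develops a power-law tail.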
Significance measure of Local Cluster Neural Networks
2007 International Joint Conference on Neural Networks | Pub Date: 2007-10-29 | DOI: 10.1109/IJCNN.2007.4370950
R. Eickhoff, J. Sitte
Abstract: Artificial neural networks are intended to be used as information processing systems in emerging technologies because their biological counterparts seem to be tolerant to internal failures of computational elements. In this paper, we introduce a measure that can identify significant neurons of the local cluster neural network and can be used to increase the fault tolerance of this network architecture. Furthermore, we show that this technique can control the network's complexity. Moreover, this measure allows different network parameter sets and training techniques to be judged with respect to their fault-tolerance properties.
Citations: 0
Distributing SOM Ensemble Training using Grid Middleware
2007 International Joint Conference on Neural Networks | Pub Date: 2007-10-29 | DOI: 10.1109/IJCNN.2007.4371387
B. Vrusias, L. Vomvoridis, Lee Gillam
Abstract: In this paper we explore distributing the training of self-organised maps (SOMs) over grid middleware. We propose a two-level architecture and discuss an experimental methodology comprising ensembles of SOMs distributed over a grid with periodic averaging of weights. The purpose of the experiments is to begin to systematically assess the potential for reducing the overall training time through a distributed training regime, against the impact on precision. Several issues are considered: (i) the optimum number of ensembles; (ii) the impact of different types of training data; and (iii) the appropriate averaging period. The proposed architecture has been evaluated in a grid environment, with clock-time performance recorded.
Citations: 5
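The periodic weight-averaging regime described above can be sketched in miniature. This is an illustrative toy (a 1-D SOM on scalar data, with hypothetical learning rate, neighbourhood radius, and averaging period), not the paper's grid implementation:

```python
import random

def train_step(weights, x, lr=0.1, radius=1):
    """One SOM update: move the winning unit and its grid neighbours toward x."""
    winner = min(range(len(weights)), key=lambda i: abs(weights[i] - x))
    for i in range(len(weights)):
        if abs(i - winner) <= radius:
            weights[i] += lr * (x - weights[i])

def ensemble_train(shards, n_units=8, epochs=5, period=50):
    """Train one SOM replica per data shard, averaging weights every `period` steps.

    Assumes all shards have equal length; replicas step in lock-step here,
    standing in for the synchronisation a grid scheduler would provide."""
    soms = [[random.random() for _ in range(n_units)] for _ in shards]
    step = 0
    for _ in range(epochs):
        for t in range(len(shards[0])):
            for som, shard in zip(soms, shards):
                train_step(som, shard[t])
            step += 1
            if step % period == 0:
                # Periodic averaging: every replica adopts the mean weights.
                avg = [sum(s[i] for s in soms) / len(soms) for i in range(n_units)]
                for som in soms:
                    som[:] = avg
    return soms

# Toy usage: three shards of uniform data in [0, 1).
random.seed(7)
shards = [[random.random() for _ in range(200)] for _ in range(3)]
soms = ensemble_train(shards)
```

The experimental question in the paper is precisely how large `period` can be made (fewer synchronisations, less communication) before precision degrades.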
The Systematic Trajectory Search Algorithm for Feedforward Neural Network Training
2007 International Joint Conference on Neural Networks | Pub Date: 2007-10-29 | DOI: 10.1109/IJCNN.2007.4371124
L. Tseng, Wen-Ching Chen
Abstract: In this work, the systematic trajectory search algorithm (STSA) is proposed for training the connection weights of feedforward neural networks. The STSA uses an orthogonal array (OA) to uniformly generate the initial population in order to explore the solution space globally, and then applies a novel trajectory search method that thoroughly exploits promising areas. The performance of the proposed STSA is evaluated by applying it to train a class of feedforward neural networks on the n-bit parity problem and on classification problems from two medical datasets in the UCI machine learning repository. Compared with previous studies, the experimental results reveal that neural networks trained by the STSA have very good classification ability.
Citations: 4
Research on Diagnosing Heart Disease Using Adaptive Network-based Fuzzy Interferences System
2007 International Joint Conference on Neural Networks | Pub Date: 2007-10-29 | DOI: 10.1109/IJCNN.2007.4371036
Li Shi, Hui Li, Zhifu Sun, W. Liu
Abstract: The shape of the ST segment of the electrocardiogram (ECG) is of great importance in diagnosing heart disease. Based on feature points of ST segments extracted from ECG data with the wavelet transform (WT), a five-input, single-output adaptive network-based fuzzy inference system (ANFIS) is designed to classify the shapes of ST segments. The system uses Takagi-Sugeno if-then rules, and a combination of gradient descent and the least-squares method is adopted for training. The effectiveness is demonstrated on ECG data from the MIT-BIH database and on clinical ECG data.
Citations: 9
Gaussian Versus Cauchy Membership Functions in Fuzzy PSO
2007 International Joint Conference on Neural Networks | Pub Date: 2007-10-29 | DOI: 10.1109/IJCNN.2007.4371421
A. M. Abdelbar, S. Abdelshahid, D. Wunsch
Abstract: In standard particle swarm optimization (PSO), the best particle in each neighborhood exerts its influence over the other particles in the neighborhood. Fuzzy PSO is a generalization that differs from standard PSO in the following respect: charisma (influence over others) is defined to be a fuzzy variable, more than one particle in each neighborhood can have a non-zero degree of charisma, and each such particle is consequently allowed to influence others to a degree that depends on its charisma. In this paper, we compare the Gaussian and Cauchy membership functions (MFs) as the MF of the charisma fuzzy variable. We evaluate the performance of the two MFs on the weighted MAX-SAT problem.
Citations: 9
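The two membership functions being compared differ mainly in tail weight, which governs how far from the neighborhood best a particle can be and still retain charisma. A minimal sketch (the distance argument `d` and the width parameters are assumptions here; the paper defines the charisma variable precisely):

```python
import math

def gaussian_mf(d, sigma=1.0):
    """Gaussian membership: light tails, so charisma falls off quickly with distance d."""
    return math.exp(-(d * d) / (2.0 * sigma * sigma))

def cauchy_mf(d, gamma=1.0):
    """Cauchy membership: heavy tails, so distant particles retain noticeable charisma."""
    return 1.0 / (1.0 + (d / gamma) ** 2)
```

Both functions equal 1 at d = 0, but at d = 3 the Cauchy value is 0.1 while the Gaussian value is about 0.011, so under the Cauchy MF far more particles exert non-negligible influence.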
Recursive Feature Extraction for Ordinal Regression
2007 International Joint Conference on Neural Networks | Pub Date: 2007-10-29 | DOI: 10.1109/IJCNN.2007.4370934
Fen Xia, Qing Tao, Jue Wang, Wensheng Zhang
Abstract: Most existing algorithms for ordinal regression seek an orientation along which the projected samples are well separated, together with ordered intervals on that orientation to represent the ranks. However, these algorithms make use of only one dimension of the sample space, which inevitably loses useful information in its complementary subspace. As a remedy, we propose an algorithmic framework for ordinal regression that consists of two phases: recursively extracting features from a shrinking subspace, and learning a ranking rule from the examples represented by the new features. In this framework, any algorithm that projects samples onto a line can be used as a feature extractor, and features with decreasing ranking ability are extracted one by one to make the best use of the information contained in the training samples. Experiments on synthetic and benchmark datasets verify the usefulness of our framework.
Citations: 4
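The two-phase scheme above (extract a one-dimensional feature, then recurse on the complementary subspace) can be sketched as follows. The deflation step is the generic linear-algebra construction; the `directions` argument stands in for whatever base ranking algorithm supplies each orientation, which is an assumption of this sketch:

```python
def project(x, w):
    """Scalar feature: projection of sample x onto direction w."""
    return sum(xi * wi for xi, wi in zip(x, w))

def deflate(samples, w):
    """Remove each sample's component along w, leaving the complementary subspace."""
    norm2 = sum(wi * wi for wi in w)
    out = []
    for x in samples:
        c = project(x, w) / norm2
        out.append([xi - c * wi for xi, wi in zip(x, w)])
    return out

def extract_features(samples, directions):
    """Recursively extract one feature per direction, deflating after each step."""
    feats = [[] for _ in samples]
    cur = samples
    for w in directions:
        for f, x in zip(feats, cur):
            f.append(project(x, w))
        cur = deflate(cur, w)
    return feats
```

A ranking rule is then learned on the extracted feature vectors rather than on the single projection used by one-dimensional methods.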
Hybrid Solution for the Feature Selection in Personal Identification Problems through Keystroke Dynamics
2007 International Joint Conference on Neural Networks | Pub Date: 2007-10-29 | DOI: 10.1109/IJCNN.2007.4371256
Gabriel L. F. B. G. Azevedo, George D. C. Cavalcanti, E. C. B. C. Filho
Abstract: Techniques based on biometrics have been successfully applied to personal identification systems. One rather promising technique uses the keystroke dynamics of each user in order to recognize him/her. In this work, we present a hybrid system based on support vector machines and stochastic optimization techniques. The main objective is the analysis of these optimization algorithms for feature selection. We evaluate two optimization techniques for this task: genetic algorithms (GA) and particle swarm optimization (PSO). In the present study, PSO outperformed GA with regard to classification error and processing time, but was inferior with regard to the feature reduction rate.
Citations: 26
Backward Variable Selection of Support Vector Regressors by Block Deletion
2007 International Joint Conference on Neural Networks | Pub Date: 2007-10-29 | DOI: 10.1109/IJCNN.2007.4371285
T. Nagatani, S. Abe
Abstract: In function approximation, datasets with many redundant input variables cause problems such as deterioration of the generalization ability and increased computational cost. One method of solving these problems is variable selection. In pattern recognition, the effectiveness of backward variable selection by block deletion has been shown. In this paper, we extend this method to function approximation. To prevent deterioration of the generalization ability, we use the approximation error on a validation set as the selection criterion, and to reduce computational cost, during variable selection we optimize only the margin parameter by cross-validation. If block deletion fails, we backtrack and start a binary search for efficient variable selection. Computer experiments on several datasets show that our method has performance comparable with that of the conventional method and can greatly reduce computational cost. We also show that a set of input variables selected by LS-SVRs can be used for SVRs without deteriorating the generalization ability.
Citations: 16
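The selection loop described in the abstract can be sketched generically. Here `error(subset)` is assumed to return the validation error of a regressor trained on `subset`; the candidate ranking and stopping details of the actual paper are not reproduced:

```python
def block_deletion(features, error, tol=0.0):
    """Backward variable selection by block deletion (sketch).

    Deletes, in one step, the whole block of variables whose individual
    removal does not raise validation error; on failure it backtracks and
    binary-searches the block."""
    current = list(features)
    base = error(current)
    while len(current) > 1:
        # Candidate block: variables individually removable without harm.
        block = [f for f in current
                 if error([g for g in current if g != f]) <= base + tol]
        if not block:
            break
        remaining = [f for f in current if f not in block]
        if remaining and error(remaining) <= base + tol:
            current = remaining  # block deletion succeeded
        else:
            # Backtrack: binary search, retrying with half the block.
            while len(block) > 1 and error(
                    [f for f in current if f not in block]) > base + tol:
                block = block[: len(block) // 2]
            remaining = [f for f in current if f not in block]
            if not remaining or error(remaining) > base + tol:
                break
            current = remaining
        base = error(current)
    return current
```

Each iteration retrains the model only a handful of times, which is where the computational saving over one-variable-at-a-time backward selection comes from.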