Latest publications: 2010 IEEE International Workshop on Machine Learning for Signal Processing

Dirichlet mixtures of graph diffusions for semi supervised learning
Pub Date: 2010-10-07 | DOI: 10.1109/MLSP.2010.5588854
Christian J. Walder
{"title":"Dirichlet mixtures of graph diffusions for semi supervised learning","authors":"Christian J. Walder","doi":"10.1109/MLSP.2010.5588854","DOIUrl":"https://doi.org/10.1109/MLSP.2010.5588854","url":null,"abstract":"Graph representations of data have emerged as powerful tools in the classification of partially labeled data. We give a new algorithm for graph based semi supervised learning which is based on a probabilistic model of the process which assigns labels to vertices. The main novelty is a non parametric mixture of graph diffusions, which we combine with a Markov random field potential. Markov chain Monte Carlo is used for the inference, which we demonstrate to be significantly better in terms of predictive power than the maximum a posteriori estimate. Experiments on bench-mark data demonstrate that while computationally expensive our approach can provide significantly improved predictions in comparison with previous approaches.","PeriodicalId":319181,"journal":{"name":"2010 IEEE International Workshop on Machine Learning for Signal Processing","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121248511","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
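The paper's full model, a non-parametric Dirichlet mixture of diffusions with an MRF potential inferred by MCMC, is not reproduced here; the sketch below only illustrates the basic graph-diffusion ingredient it builds on, propagating label mass through powers of a row-normalized transition matrix. The function name and the toy graph are illustrative assumptions.

```python
import numpy as np

# Minimal illustration of a single graph diffusion: labels spread over the
# graph via t steps of a random walk (powers of the transition matrix P).
# This is only the building block; the paper mixes many such diffusions.

def diffuse_labels(adjacency, labels, t=5):
    """labels: (n, k) one-hot rows for labeled vertices, zeros elsewhere."""
    deg = adjacency.sum(axis=1, keepdims=True)
    P = adjacency / np.maximum(deg, 1e-12)   # row-stochastic transition matrix
    F = labels.astype(float)
    for _ in range(t):                       # F <- P F, repeated t times
        F = P @ F
    return F                                 # soft label scores per vertex

# Toy 4-vertex chain graph with one labeled vertex per class at the ends.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
Y = np.array([[1, 0], [0, 0], [0, 0], [0, 1]], dtype=float)
print(diffuse_labels(A, Y, t=3).round(3))
```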
Removal of ballistocardiogram artifacts exploiting second order cyclostationarity
Pub Date: 2010-10-07 | DOI: 10.1109/MLSP.2010.5589220
Foad Ghaderi, K. Nazarpour, J. McWhirter, S. Sanei
{"title":"Removal of ballistocardiogram artifacts exploiting second order cyclostationarity","authors":"Foad Ghaderi, K. Nazarpour, J. McWhirter, S. Sanei","doi":"10.1109/MLSP.2010.5589220","DOIUrl":"https://doi.org/10.1109/MLSP.2010.5589220","url":null,"abstract":"Simultaneous recording of electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) is increasingly used to monitor the brain activity. The interactions between the scanner magnetic field, the patient's body, and the EEG electrodes generate a pulsation artifact called ballistocardiogram (BCG) which is synchronized with the patient's heart beat. The BCG artifact is considered here as the sum of a number of independent cyclostationary components having the same cycle frequency. Cyclostationary source extraction (CSE) is used here to remove BCG artifact. The results are compared with the results of benchmark BCG removal techniques. It is shown that visual evoked potentials (VEPs) recorded inside the scanner and processed using the proposed method are more correlated with the VEPs recorded outside the scanner. Moreover, the presence of electrocardiogram (ECG) data is not necessary in this method as the cycle frequency of the BCG is directly computed from the contaminated EEG signals.","PeriodicalId":319181,"journal":{"name":"2010 IEEE International Workshop on Machine Learning for Signal Processing","volume":"48 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114449610","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 1
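The CSE algorithm itself is not reproduced here; the sketch below only illustrates the property the abstract highlights, that the cycle frequency of a periodic pulsation can be located from the contaminated recording alone by scanning the strength of second-order cyclostationarity. The signal model, grid, and function names are illustrative assumptions.

```python
import numpy as np

# Strength of second-order cyclostationarity of x at a candidate cycle
# frequency alpha (Hz): |(1/N) sum_n x[n]^2 exp(-j 2 pi alpha n / fs)|.
# A pulsation repeating at the heart rate makes the second moment of the
# recording periodic, so scanning candidate frequencies locates the cycle
# frequency directly from the contaminated signal, without an ECG.

def cyclic_strength(x, alpha, fs):
    n = np.arange(len(x))
    return np.abs(np.mean(x ** 2 * np.exp(-2j * np.pi * alpha * n / fs)))

def estimate_cycle_frequency(x, fs, candidates):
    scores = [cyclic_strength(x, a, fs) for a in candidates]
    return candidates[int(np.argmax(scores))]

# Synthetic example: a 1.2 Hz pulse train (BCG-like) buried in noise.
fs = 250.0
t = np.arange(0, 60, 1 / fs)
pulses = 2.0 * ((t % (1 / 1.2)) < 0.08)
x = pulses + np.random.default_rng(0).standard_normal(t.size)
grid = np.arange(0.8, 2.0, 0.01)               # plausible heart-rate range (Hz)
print(estimate_cycle_frequency(x, fs, grid))   # expected near 1.2
```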
Convergence-guaranteed multiplicative algorithms for nonnegative matrix factorization with β-divergence
Pub Date: 2010-10-07 | DOI: 10.1109/MLSP.2010.5589233
M. Nakano, H. Kameoka, J. Le Roux, Yu Kitano, Nobutaka Ono, S. Sagayama
{"title":"Convergence-guaranteed multiplicative algorithms for nonnegative matrix factorization with β-divergence","authors":"M. Nakano, H. Kameoka, J. Le Roux, Yu Kitano, Nobutaka Ono, S. Sagayama","doi":"10.1109/MLSP.2010.5589233","DOIUrl":"https://doi.org/10.1109/MLSP.2010.5589233","url":null,"abstract":"This paper presents a new multiplicative algorithm for nonnegative matrix factorization with β-divergence. The derived update rules have a similar form to those of the conventional multiplicative algorithm, only differing through the presence of an exponent term depending on β. The convergence is theoretically proven for any real-valued β based on the auxiliary function method. The convergence speed is experimentally investigated in comparison with previous works.","PeriodicalId":319181,"journal":{"name":"2010 IEEE International Workshop on Machine Learning for Signal Processing","volume":"47 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129871684","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 117
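As a generic illustration of the class of updates the abstract describes, the sketch below implements β-divergence multiplicative NMF with a β-dependent exponent. The exponent form shown is the one commonly used in the β-NMF literature to obtain monotonicity guarantees and may differ in detail from the paper's; function and variable names are illustrative.

```python
import numpy as np

# Multiplicative NMF updates for the beta-divergence. Each factor is
# multiplied by a ratio of nonnegative terms raised to a beta-dependent
# exponent gamma(beta); gamma = 1 recovers the classical Lee-Seung style
# rule (beta = 1: KL divergence, beta = 2: Euclidean distance).

def gamma(beta):
    if beta < 1:
        return 1.0 / (2.0 - beta)
    if beta > 2:
        return 1.0 / (beta - 1.0)
    return 1.0

def beta_nmf(V, rank, beta=1.0, n_iter=200, eps=1e-12):
    rng = np.random.default_rng(0)
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    g = gamma(beta)
    for _ in range(n_iter):
        WH = W @ H + eps
        H *= ((W.T @ (WH ** (beta - 2) * V)) / (W.T @ WH ** (beta - 1))) ** g
        WH = W @ H + eps
        W *= (((WH ** (beta - 2) * V) @ H.T) / (WH ** (beta - 1) @ H.T)) ** g
    return W, H

V = np.random.default_rng(1).random((20, 30))
W, H = beta_nmf(V, rank=4, beta=0.5)
```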
Statistically linearized recursive least squares
Pub Date: 2010-10-07 | DOI: 10.1109/MLSP.2010.5589236
M. Geist, O. Pietquin
{"title":"Statistically linearized recursive least squares","authors":"M. Geist, O. Pietquin","doi":"10.1109/MLSP.2010.5589236","DOIUrl":"https://doi.org/10.1109/MLSP.2010.5589236","url":null,"abstract":"This article proposes a new interpretation of the sigma-point kalman filter (SPKF) for parameter estimation as being a statistically linearized recursive least-squares algorithm. This gives new insight on the SPKF for parameter estimation and particularly this provides an alternative proof for a result of Van der Merwe. On the other hand, it legitimates the use of statistical linearization and suggests many ways to use it for parameter estimation, not necessarily in a least-squares sens.","PeriodicalId":319181,"journal":{"name":"2010 IEEE International Workshop on Machine Learning for Signal Processing","volume":"73 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124645820","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 15
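The sketch below is not the paper's derivation; it is a generic unscented-style sketch of the two ingredients the paper connects, statistical linearization of a nonlinear observation model via sigma points and a recursive least-squares / Kalman-style parameter update. The toy model and names are assumptions.

```python
import numpy as np

# Sigma-point (unscented) update for a parameter vector theta observed
# through a nonlinear scalar model y ~ h(theta) + noise.

def sigma_points(mean, cov, kappa=1.0):
    d = mean.size
    S = np.linalg.cholesky((d + kappa) * cov)
    pts = [mean] + [mean + S[:, i] for i in range(d)] + [mean - S[:, i] for i in range(d)]
    w = np.full(2 * d + 1, 1.0 / (2 * (d + kappa)))
    w[0] = kappa / (d + kappa)
    return np.array(pts), w

def spkf_parameter_update(theta, P, y, h, obs_var):
    pts, w = sigma_points(theta, P)
    Y = np.array([h(p) for p in pts])             # propagated sigma points
    y_mean = w @ Y
    P_ty = ((pts - theta).T * w) @ (Y - y_mean)   # cross-covariance
    P_yy = w @ (Y - y_mean) ** 2 + obs_var        # innovation variance
    K = P_ty / P_yy                               # Kalman / RLS-style gain
    theta_new = theta + K * (y - y_mean)
    P_new = P - np.outer(K, K) * P_yy
    return theta_new, P_new

# Toy example: estimate theta in y = sin(theta[0] * x) + theta[1].
x_obs = 0.7
h = lambda th: np.sin(th[0] * x_obs) + th[1]
theta, P = np.zeros(2), np.eye(2)
theta, P = spkf_parameter_update(theta, P, y=0.9, h=h, obs_var=0.1)
```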
Kalman filtering and smoothing solutions to temporal Gaussian process regression models
Pub Date: 2010-10-07 | DOI: 10.1109/MLSP.2010.5589113
Jouni Hartikainen, S. Sarkka
{"title":"Kalman filtering and smoothing solutions to temporal Gaussian process regression models","authors":"Jouni Hartikainen, S. Sarkka","doi":"10.1109/MLSP.2010.5589113","DOIUrl":"https://doi.org/10.1109/MLSP.2010.5589113","url":null,"abstract":"In this paper, we show how temporal (i.e., time-series) Gaussian process regression models in machine learning can be reformulated as linear-Gaussian state space models, which can be solved exactly with classical Kalman filtering theory. The result is an efficient non-parametric learning algorithm, whose computational complexity grows linearly with respect to number of observations. We show how the reformulation can be done for Matérn family of covariance functions analytically and for squared exponential covariance function by applying spectral Taylor series approximation. Advantages of the proposed approach are illustrated with two numerical experiments.","PeriodicalId":319181,"journal":{"name":"2010 IEEE International Workshop on Machine Learning for Signal Processing","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130063890","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 242
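The sketch below shows the idea for the Matérn(3/2) covariance: the GP prior is equivalent to a two-dimensional linear-Gaussian state-space model, so the posterior mean can be filtered in O(n) time with a Kalman filter. This is a minimal sketch under the standard formulation; hyperparameters, the smoother pass, and variable names are illustrative choices, not the paper's code.

```python
import numpy as np
from scipy.linalg import expm

# Matern(3/2) GP as a 2-dimensional SDE: state = [f(t), f'(t)].

def matern32_ssm(lengthscale, variance):
    lam = np.sqrt(3.0) / lengthscale
    F = np.array([[0.0, 1.0], [-lam**2, -2.0 * lam]])   # drift matrix
    Pinf = np.diag([variance, variance * lam**2])        # stationary covariance
    H = np.array([1.0, 0.0])                             # observe f(t) only
    return F, Pinf, H

def kalman_gp_filter(t, y, lengthscale=1.0, variance=1.0, noise=0.1):
    F, Pinf, H = matern32_ssm(lengthscale, variance)
    m, P = np.zeros(2), Pinf.copy()
    means = []
    for k in range(len(t)):
        if k > 0:
            A = expm(F * (t[k] - t[k - 1]))              # exact discretization
            m = A @ m
            P = A @ P @ A.T + Pinf - A @ Pinf @ A.T      # stationary process noise
        S = H @ P @ H + noise                            # innovation variance
        K = P @ H / S
        m = m + K * (y[k] - H @ m)
        P = P - np.outer(K, H @ P)
        means.append(H @ m)
    return np.array(means)

t = np.linspace(0, 10, 200)
y = np.sin(t) + 0.1 * np.random.default_rng(0).standard_normal(t.size)
f_filtered = kalman_gp_filter(t, y, lengthscale=1.5, variance=1.0, noise=0.01)
```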
MLSP Competition, 2010: Description of second place method
Pub Date: 2010-10-07 | DOI: 10.1109/MLSP.2010.5589246
Z. Iscan
{"title":"MLSP Competition, 2010: Description of second place method","authors":"Z. Iscan","doi":"10.1109/MLSP.2010.5589246","DOIUrl":"https://doi.org/10.1109/MLSP.2010.5589246","url":null,"abstract":"In this paper, the classification method which generated the second highest AUC (the area under the ROC curve) in the MLSP 2010 Competition is presented. After application of some pre-processing steps to the dataset, by using statistical information, proper weights are found which maximize the separability between the P300 and the non-P300 responses. The classification method is simple and very suitable for online brain-computer interface (BCI) applications due to its fast algorithm.","PeriodicalId":319181,"journal":{"name":"2010 IEEE International Workshop on Machine Learning for Signal Processing","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133887916","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 6
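The abstract describes statistically derived weights that maximize P300 / non-P300 separability. One standard weighting of that flavor is a per-feature Fisher-ratio weight, sketched below as an illustrative stand-in; it is an assumption, not the competition entry's exact formula, and the data are synthetic.

```python
import numpy as np

# Per-feature "separability" weights: signed between-class difference divided
# by the pooled within-class variance, then a trial is scored by a weighted
# sum of its features (higher score = more P300-like).

def fisher_weights(X_target, X_nontarget):
    mu1, mu0 = X_target.mean(axis=0), X_nontarget.mean(axis=0)
    v1, v0 = X_target.var(axis=0), X_nontarget.var(axis=0)
    return (mu1 - mu0) / (v1 + v0 + 1e-12)

def score(trials, w):
    return trials @ w

rng = np.random.default_rng(0)
p300 = rng.normal(1.0, 1.0, size=(100, 64))      # synthetic feature vectors
nonp300 = rng.normal(0.0, 1.0, size=(400, 64))
w = fisher_weights(p300, nonp300)
print(score(p300, w).mean() > score(nonp300, w).mean())   # True
```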
A conditional independence perspective of variable selection
Pub Date: 2010-10-07 | DOI: 10.1109/MLSP.2010.5588682
S. Seth, J. Príncipe
{"title":"A conditional independence perspective of variable selection","authors":"S. Seth, J. Príncipe","doi":"10.1109/MLSP.2010.5588682","DOIUrl":"https://doi.org/10.1109/MLSP.2010.5588682","url":null,"abstract":"Variable selection is a necessary preprocessing stage in many applications, such as regression and classification, to reduce computational cost, to avoid curse of dimensionality and to improve generalization. A filter type approach to variable selection employs statistical criteria such as dependence to quantify the importance of a variable. In this paper we discuss the use of conditional independence as a criteria for variable selection, and describe a forward selection and a backward elimination based approach using this notion. We introduce two measures of conditional independence, describe their respective estimators and apply them in the variable selection task. We also provide a brief overview of the available variable selection methods and compare the proposed methods with these methods.","PeriodicalId":319181,"journal":{"name":"2010 IEEE International Workshop on Machine Learning for Signal Processing","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114719188","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
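The sketch below illustrates the forward-selection loop the abstract mentions: at each step, add the candidate most dependent on the target given the variables already selected. Partial correlation is used here as a simple stand-in for the paper's conditional-independence measures; the data and names are illustrative.

```python
import numpy as np

# Forward selection driven by a conditional-dependence score.

def partial_corr(x, y, Z):
    """|corr(x, y | Z)|, computed by regressing Z out of both x and y."""
    if Z.shape[1] > 0:
        A = np.column_stack([Z, np.ones(len(x))])
        x = x - A @ np.linalg.lstsq(A, x, rcond=None)[0]
        y = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return np.abs(np.corrcoef(x, y)[0, 1])

def forward_select(X, y, k):
    selected = []
    for _ in range(k):
        Z = X[:, selected]
        remaining = [j for j in range(X.shape[1]) if j not in selected]
        scores = [partial_corr(X[:, j], y, Z) for j in remaining]
        selected.append(remaining[int(np.argmax(scores))])
    return selected

rng = np.random.default_rng(0)
X = rng.standard_normal((300, 10))
y = 2 * X[:, 3] - X[:, 7] + 0.1 * rng.standard_normal(300)
print(forward_select(X, y, k=2))   # expected to pick columns 3 and 7
```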
PASS-GP: Predictive active set selection for Gaussian processes
Pub Date: 2010-10-07 | DOI: 10.1109/MLSP.2010.5589264
Ricardo Henao, O. Winther
{"title":"PASS-GP: Predictive active set selection for Gaussian processes","authors":"Ricardo Henao, O. Winther","doi":"10.1109/MLSP.2010.5589264","DOIUrl":"https://doi.org/10.1109/MLSP.2010.5589264","url":null,"abstract":"We propose a new approximation method for Gaussian process (GP) learning for large data sets that combines inline active set selection with hyperparameter optimization. The predictive probability of the label is used for ranking the data points. We use the leave-one-out predictive probability available in GPs to make a common ranking for both active and inactive points, allowing points to be removed again from the active set. This is important for keeping the complexity down and at the same time focusing on points close to the decision boundary. We lend both theoretical and empirical support to the active set selection strategy and marginal likelihood optimization on the active set. We make extensive tests on the USPS and MNIST digit classification databases with and without incorporating invariances, demonstrating that we can get state-of-the-art results (e.g.0.86% error on MNIST) with reasonable time complexity.","PeriodicalId":319181,"journal":{"name":"2010 IEEE International Workshop on Machine Learning for Signal Processing","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123454416","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 11
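To illustrate the ranking quantity the abstract relies on, the sketch below computes closed-form leave-one-out predictive probabilities for a GP regression surrogate and ranks points by them. The paper works with GP classification (where the LOO quantities come from cavity-style approximations), so this is only an illustration of the ranking idea; the kernel, hyperparameters, and names are assumptions.

```python
import numpy as np

# Closed-form LOO predictive probabilities in GP regression:
# with alpha = K_y^{-1} y,  mu_i = y_i - alpha_i / [K_y^{-1}]_ii  and
# sigma_i^2 = 1 / [K_y^{-1}]_ii. Low LOO probability = hard/informative point.

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def loo_log_predictive(X, y, noise=0.1):
    K = rbf_kernel(X) + noise * np.eye(len(X))
    Kinv = np.linalg.inv(K)
    alpha = Kinv @ y
    diag = np.diag(Kinv)
    mu = y - alpha / diag
    var = 1.0 / diag
    return -0.5 * np.log(2 * np.pi * var) - 0.5 * (y - mu) ** 2 / var

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
ranking = np.argsort(loo_log_predictive(X, y))   # hardest points first
```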
υ-structured support vector machines
Pub Date: 2010-10-07 | DOI: 10.1109/MLSP.2010.5588703
Sungwoong Kim, Jongmin Kim, Sungrack Yun, Chang D. Yoo
{"title":"υ-structured support vector machines","authors":"Sungwoong Kim, Jongmin Kim, Sungrack Yun, Chang D. Yoo","doi":"10.1109/MLSP.2010.5588703","DOIUrl":"https://doi.org/10.1109/MLSP.2010.5588703","url":null,"abstract":"This paper considers a υ-structured support vector machine (υ-SSVM) which is a structured support vector machine (SSVM) incorporating an intuitive balance parameter υ. In the absence of the parameter υ, cumbersome validation would be required in choosing the balance parameter. We theoretically prove that the parameter υ asymptotically converges to both the empirical risk of margin errors and the empirical risk of support vectors. The stochastic subgradient descent is used to solve the optimization problem of the υ-SSVM in the primal domain, since it is simple, memory efficient, and fast to converge. We verify the properties of the υ-SSVM experimentally in the task of sequential labeling handwritten characters.","PeriodicalId":319181,"journal":{"name":"2010 IEEE International Workshop on Machine Learning for Signal Processing","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127098337","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
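The sketch below shows the primal stochastic subgradient machinery a structured SVM relies on, using the simplest structured output space (multiclass labels) as a stand-in: joint features place the input in the block of its class, loss-augmented inference picks the most violating label, and the weights take a regularized subgradient step. The υ reparameterization of the balance parameter introduced by the paper is not reproduced; all names and the toy data are assumptions.

```python
import numpy as np

# Plain structured-SVM stochastic subgradient step (Pegasos-style schedule).

def phi(x, y, n_classes):
    out = np.zeros(n_classes * x.size)
    out[y * x.size:(y + 1) * x.size] = x     # joint feature map Phi(x, y)
    return out

def ssvm_sgd_step(w, x, y, n_classes, lam, lr):
    scores = [w @ phi(x, c, n_classes) + (c != y) for c in range(n_classes)]
    y_hat = int(np.argmax(scores))           # loss-augmented inference (0/1 loss)
    g = lam * w                              # subgradient of the regularizer
    if y_hat != y:
        g = g - (phi(x, y, n_classes) - phi(x, y_hat, n_classes))
    return w - lr * g

rng = np.random.default_rng(0)
n_classes, dim = 3, 5
w = np.zeros(n_classes * dim)
for t in range(1, 1001):
    c = rng.integers(n_classes)
    x = rng.standard_normal(dim) + 2 * np.eye(n_classes, dim)[c]
    w = ssvm_sgd_step(w, x, c, n_classes, lam=1e-2, lr=1.0 / (1e-2 * t))
```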
Fast online anomaly detection using scan statistics
Pub Date: 2010-10-07 | DOI: 10.1109/MLSP.2010.5589151
Ryan D. Turner, Zoubin Ghahramani, S. Bottone
{"title":"Fast online anomaly detection using scan statistics","authors":"Ryan D. Turner, Zoubin Ghahramani, S. Bottone","doi":"10.1109/MLSP.2010.5589151","DOIUrl":"https://doi.org/10.1109/MLSP.2010.5589151","url":null,"abstract":"We present methods to do fast online anomaly detection using scan statistics. Scan statistics have long been used to detect statistically significant bursts of events. We extend the scan statistics framework to handle many practical issues that occur in application: dealing with an unknown background rate of events, allowing for slow natural changes in background frequency, the inverse problem of finding an unusual lack of events, and setting the test parameters to maximize power. We demonstrate its use on real and synthetic data sets with comparison to other methods.","PeriodicalId":319181,"journal":{"name":"2010 IEEE International Workshop on Machine Learning for Signal Processing","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126040235","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 11
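As a minimal illustration of the basic scan-statistic idea (not the paper's online extensions), the sketch below slides a window over event counts, estimates an unknown background Poisson rate from the data, and flags windows whose counts are improbably large. Window length, the rate estimator, and the threshold are illustrative assumptions.

```python
import numpy as np
from scipy.stats import poisson

# Sliding-window scan statistic over event counts with a data-driven
# background rate; small tail probabilities flag statistically significant
# bursts of events.

def scan_anomalies(event_counts, window=10, alpha=1e-3):
    counts = np.asarray(event_counts, dtype=float)
    background_rate = np.median(counts)              # robust background estimate
    flags = []
    for start in range(len(counts) - window + 1):
        c = counts[start:start + window].sum()
        pval = poisson.sf(c - 1, background_rate * window)   # P(count >= c)
        if pval < alpha:
            flags.append((start, start + window, pval))
    return flags

rng = np.random.default_rng(0)
x = rng.poisson(2.0, size=200)
x[120:130] += rng.poisson(6.0, size=10)              # injected burst
print(scan_anomalies(x, window=10))
```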