IEEE International Workshop on Machine Learning for Signal Processing: [proceedings]. Latest publications

DATA-DRIVEN LEARNING OF GEOMETRIC SCATTERING MODULES FOR GNNS.
Alexander Tong, Frederick Wenkel, Kincaid Macdonald, Smita Krishnaswamy, Guy Wolf
{"title":"DATA-DRIVEN LEARNING OF GEOMETRIC SCATTERING MODULES FOR GNNS.","authors":"Alexander Tong, Frederick Wenkel, Kincaid Macdonald, Smita Krishnaswamy, Guy Wolf","doi":"10.1109/mlsp52302.2021.9596169","DOIUrl":"10.1109/mlsp52302.2021.9596169","url":null,"abstract":"<p><p>We propose a new graph neural network (GNN) module, based on relaxations of recently proposed geometric scattering transforms, which consist of a cascade of graph wavelet filters. Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations. The incorporation of our LEGS-module in GNNs enables the learning of longer-range graph relations compared to many popular GNNs, which often rely on encoding graph structure via smoothness or similarity between neighbors. Further, its wavelet priors result in simplified architectures with significantly fewer learned parameters compared to competing GNNs. We demonstrate the predictive performance of LEGS-based networks on graph classification benchmarks, as well as the descriptive quality of their learned features in biochemical graph data exploration tasks.</p>","PeriodicalId":73290,"journal":{"name":"IEEE International Workshop on Machine Learning for Signal Processing : [proceedings]. IEEE International Workshop on Machine Learning for Signal Processing","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2021-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10026018/pdf/nihms-1829559.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9192207","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
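The LEGS abstract above builds on fixed geometric scattering transforms, i.e. cascades of band-pass graph wavelet filters derived from a lazy random-walk diffusion operator. The sketch below is a minimal NumPy illustration of that fixed (non-learned) construction, with wavelets Psi_j = P^(2^(j-1)) - P^(2^j) and moment-based readouts; it is not the authors' LEGS module, which additionally makes the wavelet scales and their combinations trainable, and all function names here are illustrative.

```python
import numpy as np

def lazy_random_walk(A):
    """Lazy random walk diffusion operator P = (I + A D^{-1}) / 2."""
    n = A.shape[0]
    d = np.maximum(A.sum(axis=0), 1e-12)              # node degrees, guarded against zeros
    return 0.5 * (np.eye(n) + A / d)                  # column-normalised adjacency, made lazy

def scattering_features(A, x, J=4, moments=(1, 2, 3, 4)):
    """First-order geometric scattering sketch: band-pass wavelets
    Psi_1 = P - P^2, Psi_j = P^{2^(j-1)} - P^{2^j}, each followed by a
    modulus nonlinearity and graph-wide statistical moments."""
    P = lazy_random_walk(A)
    powers = [np.linalg.matrix_power(P, 2 ** j) for j in range(J + 1)]   # P, P^2, P^4, ...
    feats = []
    for j in range(J):
        psi_x = (powers[j] - powers[j + 1]) @ x       # band-pass wavelet response
        for q in moments:
            feats.append(np.sum(np.abs(psi_x) ** q))  # permutation-invariant readout
    return np.array(feats)

# Toy usage: a 6-node ring graph with a one-hot node signal.
A = np.zeros((6, 6))
for i in range(6):
    A[i, (i + 1) % 6] = A[(i + 1) % 6, i] = 1.0
x = np.zeros(6)
x[0] = 1.0
print(scattering_features(A, x).round(4))
```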
LEARNING GENERAL TRANSFORMATIONS OF DATA FOR OUT-OF-SAMPLE EXTENSIONS.
Matthew Amodio, David van Dijk, Guy Wolf, Smita Krishnaswamy
{"title":"LEARNING GENERAL TRANSFORMATIONS OF DATA FOR OUT-OF-SAMPLE EXTENSIONS.","authors":"Matthew Amodio,&nbsp;David van Dijk,&nbsp;Guy Wolf,&nbsp;Smita Krishnaswamy","doi":"10.1109/mlsp49062.2020.9231660","DOIUrl":"https://doi.org/10.1109/mlsp49062.2020.9231660","url":null,"abstract":"<p><p>While generative models such as GANs have been successful at mapping from noise to specific distributions of data, or more generally from one distribution of data to another, they cannot isolate the transformation that is occurring and apply it to a new distribution not seen in training. Thus, they memorize the domain of the transformation, and cannot generalize the transformation <i>out of sample</i>. To address this, we propose a new neural network called a <i>Neuron Transformation Network</i> (NTNet) that isolates the signal representing the transformation itself from the other signals representing internal distribution variation. This signal can then be removed from a new dataset distributed differently from the original one trained on. We demonstrate the effectiveness of our NTNet on more than a dozen synthetic and biomedical single-cell RNA sequencing datasets, where the NTNet is able to learn the data transformation performed by genetic and drug perturbations on one sample of cells and successfully apply it to another sample of cells to predict treatment outcome.</p>","PeriodicalId":73290,"journal":{"name":"IEEE International Workshop on Machine Learning for Signal Processing : [proceedings]. IEEE International Workshop on Machine Learning for Signal Processing","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2020-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/mlsp49062.2020.9231660","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39446576","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
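The NTNet abstract above is about learning a perturbation's effect on one sample of cells and transferring it to a differently distributed sample. Purely to make that out-of-sample task concrete, here is a deliberately naive baseline, not the NTNet architecture: the transformation is approximated as the difference of means between perturbed and control cells in a source sample and then added to an unseen target sample. All data and names are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "single-cell" data: two samples with different baseline states,
# both subjected to the same additive perturbation (a shift in gene space).
true_shift = np.array([2.0, -1.0, 0.5])
source_ctrl = rng.normal(loc=[0.0, 0.0, 0.0], scale=1.0, size=(500, 3))
source_pert = source_ctrl + true_shift + rng.normal(scale=0.1, size=(500, 3))
target_ctrl = rng.normal(loc=[5.0, 5.0, 5.0], scale=1.0, size=(500, 3))   # unseen domain

# Naive baseline: take the transformation to be the difference of means
# learned on the source sample, then apply it out of sample to the target.
learned_shift = source_pert.mean(axis=0) - source_ctrl.mean(axis=0)
target_pred = target_ctrl + learned_shift

print("true shift:           ", true_shift)
print("learned shift:        ", learned_shift.round(3))
print("predicted target mean:", target_pred.mean(axis=0).round(3))
```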
CONVOLUTIONAL RECURRENT NEURAL NETWORK BASED DIRECTION OF ARRIVAL ESTIMATION METHOD USING TWO MICROPHONES FOR HEARING STUDIES.
Abdullah Küçük, Issa M S Panahi
{"title":"CONVOLUTIONAL RECURRENT NEURAL NETWORK BASED DIRECTION OF ARRIVAL ESTIMATION METHOD USING TWO MICROPHONES FOR HEARING STUDIES.","authors":"Abdullah Küçük,&nbsp;Issa M S Panahi","doi":"10.1109/mlsp49062.2020.9231693","DOIUrl":"https://doi.org/10.1109/mlsp49062.2020.9231693","url":null,"abstract":"<p><p>This work proposes a convolutional recurrent neural network (CRNN) based direction of arrival (DOA) angle estimation method, implemented on the Android smartphone for hearing aid applications. The proposed app provides a 'visual' indication of the direction of a talker on the screen of Android smartphones for improving the hearing of people with hearing disorders. We use real and imaginary parts of short-time Fourier transform (STFT) as a feature set for the proposed CRNN architecture for DOA angle estimation. Real smartphone recordings are utilized for assessing performance of the proposed method. The accuracy of the proposed method reaches 87.33% for unseen (untrained) environments. This work also presents real-time inference of the proposed method, which is done on an Android smartphone using only its two built-in microphones and no additional component or external hardware. The real-time implementation also proves the generalization and robustness of the proposed CRNN based model.</p>","PeriodicalId":73290,"journal":{"name":"IEEE International Workshop on Machine Learning for Signal Processing : [proceedings]. IEEE International Workshop on Machine Learning for Signal Processing","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2020-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/mlsp49062.2020.9231693","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"38969232","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
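The abstract above states that the real and imaginary parts of the STFT of the two microphone channels are used as the CRNN input features. Below is a hedged sketch of that feature-extraction step only (the CRNN itself is omitted); the sampling rate, window length, and the simulated inter-microphone delay are assumptions, not values from the paper.

```python
import numpy as np
from scipy.signal import stft

def stft_real_imag_features(stereo, fs=16000, nperseg=256, noverlap=128):
    """Stack the real and imaginary STFT parts of both microphone channels,
    giving a (4, n_freq, n_frames) tensor that a CRNN could take as input."""
    feats = []
    for ch in range(stereo.shape[0]):                          # two microphones
        _, _, Z = stft(stereo[ch], fs=fs, nperseg=nperseg, noverlap=noverlap)
        feats.append(np.real(Z))
        feats.append(np.imag(Z))
    return np.stack(feats, axis=0)

# Toy usage: half a second of a two-channel recording with a small
# inter-channel delay, mimicking a talker off to one side of the phone.
rng = np.random.default_rng(0)
fs = 16000
t = np.arange(int(0.5 * fs)) / fs
src = np.sin(2 * np.pi * 440 * t) + 0.05 * rng.normal(size=t.size)
delay = 4                                                       # samples of inter-mic delay
mics = np.stack([src, np.roll(src, delay)])
print(stft_real_imag_features(mics, fs=fs).shape)               # (4, 129, n_frames) here
```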
Nonlinear and non-Gaussian signal processing
Max A. Little
{"title":"Nonlinear and non-Gaussian signal processing","authors":"Max A. Little","doi":"10.1093/oso/9780198714934.003.0009","DOIUrl":"https://doi.org/10.1093/oso/9780198714934.003.0009","url":null,"abstract":"Linear, time-invariant (LTI) Gaussian DSP, has substantial mathematical conveniences that make it valuable in practical DSP applications and machine learning. When the signal really is generated by such an LTI-Gaussian model then this kind of processing is optimal from a statistical point of view. However, there are substantial limitations to the use of these techniques when we cannot guarantee that the assumptions of linearity, time-invariance and Gaussianity hold. In particular, signals that exhibit jumps or significant non-Gaussian outliers cause substantial adverse effects such as Gibb's phenomena in LTI filter outputs, and nonstationary signals cannot be compactly represented in the Fourier domain. In practice, many real signals show such phenomena to a greater or lesser degree, so it is important to have a `toolkit' of DSP methods that are effective in many situations. This chapter is dedicated to exploring the use of the statistical machine learning concepts in DSP.","PeriodicalId":73290,"journal":{"name":"IEEE International Workshop on Machine Learning for Signal Processing : [proceedings]. IEEE International Workshop on Machine Learning for Signal Processing","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"78242413","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
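The chapter abstract above argues that jumps and non-Gaussian outliers break the assumptions behind LTI filtering. As a small, generic illustration of that point (not necessarily a method from the chapter), the sketch below compares a linear moving-average smoother with a nonlinear median filter on a step signal contaminated with impulsive outliers.

```python
import numpy as np
from scipy.signal import medfilt

rng = np.random.default_rng(1)

# Piecewise-constant signal with a jump, plus mild Gaussian noise and a few
# large impulsive (non-Gaussian) outliers.
clean = np.concatenate([np.zeros(100), np.ones(100)])
x = clean + 0.05 * rng.normal(size=200)
x[[30, 90, 150]] += 5.0

k = 9
moving_avg = np.convolve(x, np.ones(k) / k, mode="same")  # linear (LTI) smoother
median = medfilt(x, kernel_size=k)                        # nonlinear smoother

# The linear filter smears each outlier across k samples and blurs the jump;
# the median filter rejects the outliers and keeps the edge sharp.
print("max error, moving average:", np.abs(moving_avg - clean).max().round(3))
print("max error, median filter: ", np.abs(median - clean).max().round(3))
```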
Statistical modelling and inference
Max A. Little
{"title":"Statistical modelling and inference","authors":"Max A. Little","doi":"10.1093/oso/9780198714934.003.0004","DOIUrl":"https://doi.org/10.1093/oso/9780198714934.003.0004","url":null,"abstract":"The modern view of statistical machine learning and signal processing is that the central task is one of finding good probabilistic models for the joint distribution over all the variables in the problem. We can then make `queries' of this model, also known as inferences, to determine optimal parameter values or signals. Hence, the importance of statistical methods to this book cannot be overstated. This chapter is an in-depth exploration of what this probabilistic modeling entails, the origins of the concepts involved, how to perform inferences and how to test the quality of a model produced this way.","PeriodicalId":73290,"journal":{"name":"IEEE International Workshop on Machine Learning for Signal Processing : [proceedings]. IEEE International Workshop on Machine Learning for Signal Processing","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"72739115","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
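The abstract above describes inference as making 'queries' of a probabilistic model. The toy sketch below, not taken from the chapter, shows two such queries on an assumed Gaussian model of some data: a maximum-likelihood point estimate of the parameters and a conjugate Bayesian posterior over the mean (with the noise variance treated as known for simplicity).

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(loc=1.5, scale=0.5, size=50)    # observations, modelled as i.i.d. Gaussian

# Maximum-likelihood "query": point estimates of the model parameters.
mu_ml, sigma_ml = data.mean(), data.std()

# Bayesian "query": posterior over the mean under a conjugate N(0, 1) prior,
# treating the noise variance as known (the ML estimate is plugged in here).
prior_mean, prior_var = 0.0, 1.0
noise_var = sigma_ml ** 2
post_var = 1.0 / (1.0 / prior_var + len(data) / noise_var)
post_mean = post_var * (prior_mean / prior_var + data.sum() / noise_var)

print(f"ML estimate:       mu = {mu_ml:.3f}, sigma = {sigma_ml:.3f}")
print(f"Posterior over mu: mean = {post_mean:.3f}, sd = {np.sqrt(post_var):.3f}")
```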
Probabilistic graphical models
Max A. Little
{"title":"Probabilistic graphical models","authors":"Max A. Little","doi":"10.1093/oso/9780198714934.003.0005","DOIUrl":"https://doi.org/10.1093/oso/9780198714934.003.0005","url":null,"abstract":"Statistical machine learning and statistical DSP are built on the foundations of probability theory and random variables. Different techniques encode different dependency structure between these variables. This structure leads to specific algorithms for inference and estimation. Many common dependency structures emerge naturally in this way, as a result, there are many common patterns of inference and estimation that suggest general algorithms for this purpose. So, it becomes important to formalize these algorithms; this is the purpose of this chapter. These general algorithms can often lead to substantial computational savings over more brute-force approaches, another benefit that comes from studying the structure of these models in the abstract.","PeriodicalId":73290,"journal":{"name":"IEEE International Workshop on Machine Learning for Signal Processing : [proceedings]. IEEE International Workshop on Machine Learning for Signal Processing","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74696737","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
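The abstract above notes that exploiting dependency structure yields inference algorithms with substantial computational savings over brute force. The sketch below illustrates this on the simplest case, a discrete Markov chain: the marginal of the last variable computed by enumerating all K^N joint configurations agrees with the sum-product (message-passing) recursion, which needs only N-1 matrix-vector products. The chain model and its parameters are arbitrary toy choices.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(3)
K, N = 3, 8                                      # K states per variable, chain of N variables
p0 = rng.dirichlet(np.ones(K))                   # P(x_1)
T = rng.dirichlet(np.ones(K), size=K)            # T[i, j] = P(x_{t+1} = j | x_t = i)

# Brute force: sum the joint probability over all K**N configurations.
marg_brute = np.zeros(K)
for config in product(range(K), repeat=N):
    p = p0[config[0]]
    for t in range(N - 1):
        p *= T[config[t], config[t + 1]]
    marg_brute[config[-1]] += p

# Sum-product on the chain: pass a forward message through N-1 matrix-vector products.
msg = p0.copy()
for _ in range(N - 1):
    msg = msg @ T                                # msg_j = sum_i msg_i * T[i, j]

print(np.allclose(marg_brute, msg))              # True: same marginal, ~K*K*N vs K**N work
```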
Nonparametric Bayesian machine learning and signal processing
Max A. Little
{"title":"Nonparametric Bayesian machine learning and signal processing","authors":"Max A. Little","doi":"10.1093/oso/9780198714934.003.0010","DOIUrl":"https://doi.org/10.1093/oso/9780198714934.003.0010","url":null,"abstract":"We have seen that stochastic processes play an important foundational role in a wide range of methods in DSP. For example, we treat a discrete-time signal as a Gaussian process, and thereby obtain many mathematically simplified algorithms, particularly based on the power spectral density. At the same time, in machine learning, it has generally been observed that nonparametric methods outperform parametric methods in terms of predictive accuracy since they can adapt to data with arbitrary complexity. However, these techniques are not Bayesian so we are unable to do important inferential procedures such as draw samples from the underlying probabilistic model or compute posterior confidence intervals. But, Bayesian models are often only mathematically tractable if parametric, with the corresponding loss of predictive accuracy. An alternative, discussed in this section, is to extend the mathematical tractability of stochastic processes to Bayesian methods. This leads to so-called Bayesian nonparametrics exemplified by techniques such as Gaussian process regression and Dirichlet process mixture modelling that have been shown to be extremely useful in practical DSP and machine learning applications.","PeriodicalId":73290,"journal":{"name":"IEEE International Workshop on Machine Learning for Signal Processing : [proceedings]. IEEE International Workshop on Machine Learning for Signal Processing","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"80427209","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
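The abstract above names Gaussian process regression as a flagship Bayesian nonparametric technique. Below is a minimal NumPy sketch of the standard closed-form GP posterior with a squared-exponential kernel; the kernel hyperparameters and noise level are arbitrary assumptions, and a practical implementation would use a Cholesky factorisation and learn the hyperparameters.

```python
import numpy as np

def rbf(a, b, lengthscale=0.5, variance=1.0):
    """Squared-exponential kernel k(a, b) between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

# Noisy observations of a smooth function.
rng = np.random.default_rng(4)
X = rng.uniform(-3, 3, size=12)
y = np.sin(X) + 0.1 * rng.normal(size=X.size)
noise_var = 0.1 ** 2

# GP posterior at test inputs: closed-form mean and covariance.
Xs = np.linspace(-3, 3, 5)
K = rbf(X, X) + noise_var * np.eye(X.size)
Ks = rbf(Xs, X)
mean = Ks @ np.linalg.solve(K, y)
cov = rbf(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)

for x_star, m, v in zip(Xs, mean, np.diag(cov)):
    print(f"f({x_star:+.1f}) ~ {m:+.3f} +/- {2 * np.sqrt(v):.3f}")
```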
Mathematical foundations
Max A. Little
{"title":"Mathematical foundations","authors":"Max A. Little","doi":"10.1093/oso/9780198714934.003.0001","DOIUrl":"https://doi.org/10.1093/oso/9780198714934.003.0001","url":null,"abstract":"Statistical machine learning and signal processing are topics in applied mathematics, which are based upon many abstract mathematical concepts. Defining these concepts clearly is the most important first step in this book. The purpose of this chapter is to introduce these foundational mathematical concepts. It also justifies the statement that much of the art of statistical machine learning as applied to signal processing, lies in the choice of convenient mathematical models that happen to be useful in practice. Convenient in this context means that the algebraic consequences of the choice of mathematical modeling assumptions are in some sense manageable. The seeds of this manageability are the elementary mathematical concepts upon which the subject is built.","PeriodicalId":73290,"journal":{"name":"IEEE International Workshop on Machine Learning for Signal Processing : [proceedings]. IEEE International Workshop on Machine Learning for Signal Processing","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87510695","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Linear-Gaussian systems and signal processing
Max A. Little
{"title":"Linear-Gaussian systems and signal processing","authors":"Max A. Little","doi":"10.1093/oso/9780198714934.003.0007","DOIUrl":"https://doi.org/10.1093/oso/9780198714934.003.0007","url":null,"abstract":"Linear systems theory, based on the mathematics of vector spaces, is the backbone of all “classical” DSP and a large part of statistical machine learning. The basic idea -- that linear algebra applied to a signal can of substantial practical value -- has counterparts in many areas of science and technology. In other areas of science and engineering, linear algebra is often justified by the fact that it is often an excellent model for real-world systems. For example, in acoustics the theory of (linear) wave propagation emerges from the concept of linearization of small pressure disturbances about the equilibrium pressure in classical fluid dynamics. Similarly, the theory of electromagnetic waves is also linear. Except when a signal emerges from a justifiably linear system, in DSP and machine learning we do not have any particular correspondence to reality to back up the choice of linearity. However, the mathematics of vector spaces, particularly when applied to systems which are time-invariant and jointly Gaussian, is highly tractable, elegant and immensely useful.","PeriodicalId":73290,"journal":{"name":"IEEE International Workshop on Machine Learning for Signal Processing : [proceedings]. IEEE International Workshop on Machine Learning for Signal Processing","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82901056","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
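The abstract above rests on the two defining properties of LTI systems, linearity and time invariance. The sketch below is a generic numerical check of both properties for a small FIR filter (not an example from the chapter), using scipy.signal.lfilter with zero initial conditions.

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(5)
b, a = np.array([0.25, 0.5, 0.25]), np.array([1.0])   # simple FIR smoother
x1, x2 = rng.normal(size=64), rng.normal(size=64)

# Linearity: H(c1*x1 + c2*x2) == c1*H(x1) + c2*H(x2)
lhs = lfilter(b, a, 2.0 * x1 - 3.0 * x2)
rhs = 2.0 * lfilter(b, a, x1) - 3.0 * lfilter(b, a, x2)
print("linear:        ", np.allclose(lhs, rhs))

# Time invariance: delaying the input delays the output by the same amount.
k = 5
x_delayed = np.concatenate([np.zeros(k), x1])
y = lfilter(b, a, x1)
y_delayed = lfilter(b, a, x_delayed)
print("time-invariant:", np.allclose(y_delayed[k:], y))
```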
Discrete signals: sampling, quantization and coding
Max A. Little
{"title":"Discrete signals: sampling, quantization and coding","authors":"Max A. Little","doi":"10.1093/oso/9780198714934.003.0008","DOIUrl":"https://doi.org/10.1093/oso/9780198714934.003.0008","url":null,"abstract":"Digital signal processing and machine learning require digital data which can be processed by algorithms on computer. However, most of the real-world signals that we observe are real numbers, occurring at real time values. This means that it is impossible in practice to store these signals on a computer and we must find some approximate signal representation which is amenable to finite, digital storage. This chapter describes the main methods which are used in practice to solve this representation problem.","PeriodicalId":73290,"journal":{"name":"IEEE International Workshop on Machine Learning for Signal Processing : [proceedings]. IEEE International Workshop on Machine Learning for Signal Processing","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"78781535","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
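The abstract above frames sampling and quantization as the route from real-valued, continuous-time signals to finite digital representations. The sketch below, a generic illustration rather than the chapter's own treatment, samples a sinusoid and applies a uniform mid-rise quantizer, then reports the resulting signal-to-quantization-noise ratio, which follows the familiar rule of thumb of roughly 6 dB per bit.

```python
import numpy as np

def uniform_quantize(x, bits, x_max=1.0):
    """Uniform mid-rise quantizer: map x in [-x_max, x_max] onto 2**bits levels."""
    levels = 2 ** bits
    step = 2 * x_max / levels
    q = np.floor(x / step) * step + step / 2          # mid-rise reconstruction points
    return np.clip(q, -x_max + step / 2, x_max - step / 2)

# "Continuous" 50 Hz sinusoid, sampled at fs = 1 kHz and quantized to 8 bits.
fs, f0, bits = 1000, 50, 8
t = np.arange(fs) / fs                                 # one second of samples
x = 0.9 * np.sin(2 * np.pi * f0 * t)
xq = uniform_quantize(x, bits)

err = x - xq
snr_db = 10 * np.log10(np.mean(x ** 2) / np.mean(err ** 2))
print(f"{bits}-bit quantization SNR: {snr_db:.1f} dB (about 6 dB per bit)")
```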