Learning Ambiguities Using Bayesian Mixture of Experts

Atul Kanaujia, Dimitris N. Metaxas
{"title":"Learning Ambiguities Using Bayesian Mixture of Experts","authors":"Atul Kanaujia, Dimitris N. Metaxas","doi":"10.1109/ICTAI.2006.73","DOIUrl":null,"url":null,"abstract":"Mixture of Experts (ME) is an ensemble of function approximators that fit the clustered data set locally rather than globally. ME provides a useful tool to learn multi-valued mappings (ambiguities) in the data set. Mixture of Experts training involve learning a multi-category classifier for the gates distribution and fitting a regressor within each of the clusters. The learning of ME is based on divide and conquer which is known to increase the error due to variance. In order to avoid overfitting several researchers have proposed using linear experts. However in the absence of any knowledge of non-linearities existing in the data set, it is not clear how many linear experts could accurately model the data. In this work we propose a Bayesian learning framework for learning Mixture of Experts. Bayesian learning intrinsically embodies regularization and model selection using Occam's razor. In the past Bayesian learning methods have been applied to classification and regression in order to avoid scale sensitivity and orthodox model selection procedure of cross validation. Although true Bayesian learning is computationally intractable, approximations do result in sparser and more compact models","PeriodicalId":169424,"journal":{"name":"2006 18th IEEE International Conference on Tools with Artificial Intelligence (ICTAI'06)","volume":"80 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2006-11-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2006 18th IEEE International Conference on Tools with Artificial Intelligence (ICTAI'06)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICTAI.2006.73","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 10

Abstract

Mixture of Experts (ME) is an ensemble of function approximators that fit the clustered data set locally rather than globally. ME provides a useful tool to learn multi-valued mappings (ambiguities) in the data set. Mixture of Experts training involves learning a multi-category classifier for the gate distribution and fitting a regressor within each of the clusters. The learning of ME is based on divide-and-conquer, which is known to increase the error due to variance. In order to avoid overfitting, several researchers have proposed using linear experts. However, in the absence of any knowledge of the non-linearities present in the data set, it is not clear how many linear experts are needed to accurately model the data. In this work we propose a Bayesian learning framework for learning Mixture of Experts. Bayesian learning intrinsically embodies regularization and model selection using Occam's razor. In the past, Bayesian learning methods have been applied to classification and regression in order to avoid scale sensitivity and the orthodox model-selection procedure of cross-validation. Although true Bayesian learning is computationally intractable, approximations do result in sparser and more compact models.
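ME prediction is a gate-weighted combination of expert outputs, p(y|x) = Σ_k g_k(x) p_k(y|x), with the gate probabilities g_k(x) given by a softmax over the gate classifier's scores. Below is a minimal NumPy sketch of this standard prediction step (the classical gated-mixture formulation, not the paper's Bayesian training procedure); the function name `me_predict` and the toy weights are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def me_predict(x, gate_W, expert_W):
    """Prediction step of a Mixture of Experts with linear experts.

    x        : (d,) input vector, assumed to include a bias feature.
    gate_W   : (K, d) weights of the softmax gate (the multi-category
               classifier over clusters).
    expert_W : (K, d) weights of K linear experts (one regressor per
               cluster).

    Returns the gate probabilities g_k(x) and the per-expert means
    w_k^T x. An ambiguity (multi-valued mapping) shows up as several
    experts having high gate probability at the same x while
    predicting different outputs.
    """
    g = softmax(gate_W @ x)   # g_k(x): responsibility of expert k at x
    means = expert_W @ x      # mu_k(x): each expert's local regression
    return g, means

# Toy usage with hypothetical (unlearned) weights, K = 2 experts, d = 2.
gate_W = np.array([[ 2.0, 0.0],
                   [-2.0, 0.0]])
expert_W = np.array([[ 1.0, 0.5],
                     [-1.0, 0.5]])
x = np.array([0.1, 1.0])      # second feature acts as the bias term
g, means = me_predict(x, gate_W, expert_W)
print(g, means)               # two candidate outputs, gate-weighted
```

Keeping all expert means whose gate probability is high, rather than collapsing them into a single averaged value, is what allows ME to represent a multi-valued mapping at an ambiguous input.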