Finite-memory least squares universal prediction of individual continuous sequences

R. Dar, M. Feder
DOI: 10.1109/ISIT.2011.6033961
Published in: 2011 IEEE International Symposium on Information Theory Proceedings (2011-10-03)
Citations: 1

Abstract

In this paper we consider the problem of universal prediction of individual continuous sequences with square-error loss, using a deterministic finite-state machine (FSM). The goal is to attain universally the performance of the best constant predictor tuned to the sequence, which predicts the empirical mean and incurs the empirical variance as the loss. The paper analyzes the tradeoff between the number of states of the universal FSM and the excess loss (regret). We first present a machine, termed the Exponential Decaying Memory (EDM) machine, used in the past for predicting binary sequences, and show bounds on its performance. Then we consider a new class of machines, Degenerated Tracking Memory (DTM) machines, find the optimal DTM machine, and show that it outperforms the EDM machine for a small number of states. Incidentally, we prove a lower bound indicating that even with a large number of states the regret of the DTM machine does not vanish. Finally, we show a lower bound on the achievable regret of any FSM, and suggest a new machine, the Enhanced Exponential Decaying Memory, which attains the bound and outperforms the EDM for any number of states.
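The setting in the abstract can be illustrated numerically. Below is a minimal sketch (not the paper's construction) of the comparison it describes: the best constant predictor in hindsight predicts the empirical mean and incurs the empirical variance as its per-symbol loss, while a finite-state predictor in the spirit of the EDM machine tracks a decaying average quantized to finitely many levels. The state count `K`, decay rate `eps`, and initial state are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=10_000)  # an individual sequence in [0, 1]

# Best constant predictor in hindsight: the empirical mean.
# Its average square-error loss equals the empirical variance.
best_const_loss = np.var(x)

# Decaying-memory predictor sketch: the state follows
# s <- (1 - eps) * s + eps * x_t, quantized to K levels so that
# only finitely many states are reachable (a finite-state machine).
K, eps = 64, 0.02              # number of states and decay rate (assumed)
levels = np.linspace(0.0, 1.0, K)
s = 0.5                        # initial state (assumed)
total = 0.0
for xt in x:
    total += (s - xt) ** 2                  # predict the current state
    s_raw = (1.0 - eps) * s + eps * xt      # decaying-memory update
    s = levels[np.argmin(np.abs(levels - s_raw))]  # quantize to K levels
edm_loss = total / len(x)

# The regret is the excess average loss over the best constant predictor.
regret = edm_loss - best_const_loss
print(f"best constant loss (variance): {best_const_loss:.4f}")
print(f"finite-state loss: {edm_loss:.4f}, regret: {regret:.4f}")
```

The regret here comes from two finite-memory effects the paper trades off against the number of states: the tracking noise of the decaying average and the quantization error of the state grid.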