{"title":"鲁棒神经网络","authors":"R. Martin","doi":"10.1109/CIFER.1995.495226","DOIUrl":null,"url":null,"abstract":"Neural networks are being increasingly used for modeling, analysis and prediction of financial data, particularly financial time series. Whatever the network architecture, the method for fitting a regression or prediction type network is almost always the method of least squares (LS), i.e., the minimization of the sum of squared errors (or prediction residuals). Unfortunately, the LS method is not robust: the estimated model can be highly effected by outliers of various kinds. In the financial time series context, the outliers might occur in isolation or in short patches. In the time series context, level shifts also cause havoc with LS fitting of neural networks. Contrary to some popular impressions, use of a neural network is not a cure-all for dealing with outliers and level shifts. We provide an introduction to statistical notions of robustness, and demonstrate the non-robustness of LS fitting of neural networks with some concrete examples where the neural network fitting is exceedingly bad due to the presence of outliers or level shifts. Then we discuss how to robustify the fitting of neural networks in both regression and time series prediction contexts. The robust methods are illustrated with several examples where the robust approach yields considerable improvement over LS fitting of neural networks.","PeriodicalId":374172,"journal":{"name":"Proceedings of 1995 Conference on Computational Intelligence for Financial Engineering (CIFEr)","volume":"82 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1995-04-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Robust neural networks\",\"authors\":\"R. Martin\",\"doi\":\"10.1109/CIFER.1995.495226\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Neural networks are being increasingly used for modeling, analysis and prediction of financial data, particularly financial time series. Whatever the network architecture, the method for fitting a regression or prediction type network is almost always the method of least squares (LS), i.e., the minimization of the sum of squared errors (or prediction residuals). Unfortunately, the LS method is not robust: the estimated model can be highly effected by outliers of various kinds. In the financial time series context, the outliers might occur in isolation or in short patches. In the time series context, level shifts also cause havoc with LS fitting of neural networks. Contrary to some popular impressions, use of a neural network is not a cure-all for dealing with outliers and level shifts. We provide an introduction to statistical notions of robustness, and demonstrate the non-robustness of LS fitting of neural networks with some concrete examples where the neural network fitting is exceedingly bad due to the presence of outliers or level shifts. Then we discuss how to robustify the fitting of neural networks in both regression and time series prediction contexts. 
The robust methods are illustrated with several examples where the robust approach yields considerable improvement over LS fitting of neural networks.\",\"PeriodicalId\":374172,\"journal\":{\"name\":\"Proceedings of 1995 Conference on Computational Intelligence for Financial Engineering (CIFEr)\",\"volume\":\"82 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1995-04-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of 1995 Conference on Computational Intelligence for Financial Engineering (CIFEr)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CIFER.1995.495226\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of 1995 Conference on Computational Intelligence for Financial Engineering (CIFEr)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CIFER.1995.495226","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Neural networks are increasingly being used for the modeling, analysis, and prediction of financial data, particularly financial time series. Whatever the network architecture, a regression or prediction network is almost always fitted by the method of least squares (LS), i.e., by minimizing the sum of squared errors (or prediction residuals). Unfortunately, the LS method is not robust: the estimated model can be heavily affected by outliers of various kinds. In financial time series, outliers may occur in isolation or in short patches, and level shifts also wreak havoc with LS fitting of neural networks. Contrary to some popular impressions, using a neural network is not a cure-all for dealing with outliers and level shifts. We provide an introduction to statistical notions of robustness and demonstrate the non-robustness of LS fitting of neural networks with concrete examples in which the fit is exceedingly poor because of outliers or level shifts. We then discuss how to robustify the fitting of neural networks in both regression and time series prediction contexts. The robust methods are illustrated with several examples in which the robust approach yields considerable improvement over LS fitting of neural networks.
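
The abstract contrasts LS fitting with robust alternatives but gives no formulas or code. The following is a rough sketch, not the paper's procedure: it fits a small one-hidden-layer network by gradient descent twice, once minimizing the usual sum of squared residuals and once with a Huber-type loss whose influence on the gradient is bounded, so a single gross outlier no longer dominates the fit. All names and settings (hidden size, step size, the Huber cutoff delta) are illustrative assumptions.

    # Minimal illustration (assumed setup, not the paper's method): LS vs. a
    # Huber-type robust loss when one gross outlier contaminates the data.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy regression data: y = sin(x) + noise, with one isolated outlier injected.
    x = np.linspace(-3, 3, 80).reshape(-1, 1)
    y = np.sin(x) + 0.1 * rng.standard_normal(x.shape)
    y[40] += 8.0  # the kind of isolated outlier the abstract describes

    def init_params(hidden=10):
        return {
            "W1": 0.5 * rng.standard_normal((1, hidden)),
            "b1": np.zeros(hidden),
            "W2": 0.5 * rng.standard_normal((hidden, 1)),
            "b2": np.zeros(1),
        }

    def forward(p, x):
        h = np.tanh(x @ p["W1"] + p["b1"])
        return h @ p["W2"] + p["b2"], h

    def loss_grad(p, x, y, robust=False, delta=1.0):
        yhat, h = forward(p, x)
        r = yhat - y                      # residuals
        if robust:
            # Huber psi: derivative of the loss is clipped at +/- delta,
            # so large residuals have bounded influence on the gradient.
            dr = np.clip(r, -delta, delta)
        else:
            dr = r                        # LS: influence grows without bound
        n = x.shape[0]
        dW2 = h.T @ dr / n
        db2 = dr.mean(axis=0)
        dh = dr @ p["W2"].T * (1 - h ** 2)
        dW1 = x.T @ dh / n
        db1 = dh.mean(axis=0)
        return {"W1": dW1, "b1": db1, "W2": dW2, "b2": db2}

    def fit(robust, steps=5000, lr=0.1):
        p = init_params()
        for _ in range(steps):
            g = loss_grad(p, x, y, robust=robust)
            for k in p:
                p[k] -= lr * g[k]
        return p

    for robust in (False, True):
        p = fit(robust)
        yhat, _ = forward(p, x)
        clean = np.delete(np.arange(len(x)), 40)   # judge the fit away from the outlier
        rmse = np.sqrt(np.mean((yhat[clean] - np.sin(x[clean])) ** 2))
        print(("Huber" if robust else "LS   "), "fit RMSE on clean points:", round(rmse, 3))

The cutoff delta controls the trade-off: a very large delta recovers ordinary LS behavior, while a smaller delta downweights large residuals more aggressively; the value 1.0 used above is an arbitrary illustrative choice.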