{"title":"Relationship between fault tolerance, generalization and the Vapnik-Chervonenkis (VC) dimension of feedforward ANNs","authors":"D. Phatak","doi":"10.1109/IJCNN.1999.831587","DOIUrl":null,"url":null,"abstract":"It is demonstrated that fault tolerance, generalization and the Vapnik-Chertonenkis (VC) dimension are inter-related attributes. It is well known that the generalization error if plotted as a function of the VC dimension h, exhibits a well defined minimum corresponding to an optimal value of h, say h/sub opt/. We show that if the VC dimension h of an ANN satisfies h/spl les/h/sub opt/ (i.e., there is no excess capacity or redundancy), then fault tolerance and generalization are mutually conflicting attributes. On the other hand, if h>h/sub opt/ (i.e., there is excess capacity or redundancy), then fault tolerance and generalization are mutually synergistic attributes. In other words, training methods geared towards improving the fault tolerance can also lead to better generalization and vice versa, only when there is excess capacity or redundancy. This is consistent with our previous results indicating that complete fault tolerance in ANNs requires a significant amount of redundancy.","PeriodicalId":157719,"journal":{"name":"IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1999-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"18","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.1999.831587","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 18
Abstract
It is demonstrated that fault tolerance, generalization and the Vapnik-Chervonenkis (VC) dimension are inter-related attributes. It is well known that the generalization error, when plotted as a function of the VC dimension h, exhibits a well-defined minimum corresponding to an optimal value of h, say h_opt. We show that if the VC dimension h of an ANN satisfies h ≤ h_opt (i.e., there is no excess capacity or redundancy), then fault tolerance and generalization are mutually conflicting attributes. On the other hand, if h > h_opt (i.e., there is excess capacity or redundancy), then fault tolerance and generalization are mutually synergistic attributes. In other words, training methods geared toward improving fault tolerance can also lead to better generalization, and vice versa, but only when there is excess capacity or redundancy. This is consistent with our previous results indicating that complete fault tolerance in ANNs requires a significant amount of redundancy.
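For context, the U-shaped generalization curve referenced above is the one implied by standard VC-type bounds. A classical illustrative form (Vapnik's bound for a classifier with VC dimension h trained on N samples, holding with probability at least 1 − η) is sketched below; this is the textbook expression, not necessarily the exact bound used in the paper, and the labels E_gen and E_train are generic names introduced here rather than the paper's notation:

% Illustrative VC-type generalization bound (classical Vapnik form).
% E_gen / E_train are generic labels, not notation from the paper.
\[
  E_{\mathrm{gen}}(h) \;\le\; E_{\mathrm{train}}(h)
  \;+\; \sqrt{\frac{h\left(\ln\tfrac{2N}{h} + 1\right) - \ln\tfrac{\eta}{4}}{N}}
\]

The training-error term typically decreases with h while the square-root confidence term grows with h, so their sum attains a minimum at some h_opt; capacity beyond h_opt is the "excess" regime in which the paper finds fault tolerance and generalization to be mutually synergistic.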