{"title":"一种针对减少非线性的前馈神经网络模型的训练方法","authors":"C. Koutsougeras, G. Papadourakis","doi":"10.1109/TAI.1991.167095","DOIUrl":null,"url":null,"abstract":"In the analysis presented for feedforward neural networks, the causes of problems in the adaptation of current models are examined. A new method for training a feedforward neural net model is introduced. The method encompasses elements of both supervised and unsupervised learning. The development of internal representations is no more an issue tangential to the curve fitting objectives of the other known supervised learning methods. Curve fitting remains as a primary objective but unsupervised learning techniques are also used in order to aid the development of internal representations. The net structure is incrementally formed, thus allowing the formation of a structure of reduced nonlinearity.<<ETX>>","PeriodicalId":371778,"journal":{"name":"[Proceedings] Third International Conference on Tools for Artificial Intelligence - TAI 91","volume":"45 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1991-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"A method for training a feed-forward neural net model while targeting reduced nonlinearity\",\"authors\":\"C. Koutsougeras, G. Papadourakis\",\"doi\":\"10.1109/TAI.1991.167095\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In the analysis presented for feedforward neural networks, the causes of problems in the adaptation of current models are examined. A new method for training a feedforward neural net model is introduced. The method encompasses elements of both supervised and unsupervised learning. The development of internal representations is no more an issue tangential to the curve fitting objectives of the other known supervised learning methods. Curve fitting remains as a primary objective but unsupervised learning techniques are also used in order to aid the development of internal representations. The net structure is incrementally formed, thus allowing the formation of a structure of reduced nonlinearity.<<ETX>>\",\"PeriodicalId\":371778,\"journal\":{\"name\":\"[Proceedings] Third International Conference on Tools for Artificial Intelligence - TAI 91\",\"volume\":\"45 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1991-11-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"[Proceedings] Third International Conference on Tools for Artificial Intelligence - TAI 91\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/TAI.1991.167095\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"[Proceedings] Third International Conference on Tools for Artificial Intelligence - TAI 91","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TAI.1991.167095","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A method for training a feed-forward neural net model while targeting reduced nonlinearity
The analysis presented for feedforward neural networks examines the causes of problems in the adaptation of current models. A new method for training a feedforward neural net model is introduced. The method encompasses elements of both supervised and unsupervised learning. The development of internal representations is no longer an issue tangential to the curve-fitting objectives of other known supervised learning methods. Curve fitting remains a primary objective, but unsupervised learning techniques are also used to aid the development of internal representations. The net structure is formed incrementally, allowing a structure of reduced nonlinearity to emerge.
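The abstract does not give algorithmic details, so the following is only a minimal illustrative sketch, not the authors' method. It shows one generic way to combine the ingredients the abstract names: an unsupervised step (here, plain k-means clustering) that shapes internal representations, a supervised step (linear least squares) that handles curve fitting, and incremental growth of the net until the fit is good enough. The RBF-style hidden units, the growth criterion, and all parameter choices are assumptions made for the example.

```python
# Illustrative sketch only: the paper's exact algorithm is not reproduced here.
# Hidden units are placed by k-means clustering (unsupervised), the output
# layer is fit by linear least squares (supervised), and hidden units are
# added incrementally until the training error falls below a tolerance.
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Unsupervised step: place k hidden-unit prototypes by plain k-means."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return centers

def hidden_activations(X, centers, width=1.0):
    """Localized (RBF-like) hidden responses, keeping each unit's nonlinearity local."""
    d2 = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_outputs(H, y):
    """Supervised step: least-squares fit of linear output weights (with bias)."""
    Hb = np.hstack([H, np.ones((len(H), 1))])
    w, *_ = np.linalg.lstsq(Hb, y, rcond=None)
    return w

def train_incrementally(X, y, max_hidden=16, tol=1e-3):
    """Grow the hidden layer one unit at a time until the supervised fit suffices."""
    for k in range(1, max_hidden + 1):
        centers = kmeans(X, k)
        H = hidden_activations(X, centers)
        w = fit_outputs(H, y)
        pred = np.hstack([H, np.ones((len(H), 1))]) @ w
        mse = float(np.mean((pred - y) ** 2))
        if mse < tol:
            break
    return centers, w, mse

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-1, 1, size=(200, 1))
    y = np.sin(3 * X[:, 0])          # simple curve-fitting target for the demo
    centers, w, mse = train_incrementally(X, y)
    print(f"hidden units: {len(centers)}, training MSE: {mse:.4f}")
```

The point of the sketch is the division of labor: the hidden layer's structure is determined without labels, the only trained nonlinearity is local to each added unit, and the labels drive a purely linear (and therefore cheap) supervised fit, which is one way an incrementally grown structure can keep overall nonlinearity low.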