Title: Application of the recurrent neural network to the problem of language acquisition
Author: R. Kamimura
Venue: conference on Analysis of Neural Network Applications
Published: 1991-05-29
DOI: 10.1145/106965.105261 (https://doi.org/10.1145/106965.105261)
Citations: 6
Abstract
The purpose of this paper is to explore the possibility of language acquisition by using the recurrent neural network. The knowledge of language that native speakers have is supposed to be reflected in the so-called "grammatical competence." Thus, the problem is to examine whether the recurrent neural network can acquire grammatical competence. To simplify the experiments, grammatical competence here means the ability to infer the well-formedness of sentences. The training sentences are generated from a limited number of sentence formulae, and the network must make judgments about the well-formedness of new sentences. The experimental results can be summarized as follows. First, recurrent back-propagation needs only a few forward and backward propagations to obtain the necessary approximate values. Second, the recurrent network can infer quite well the well-formedness of new sentences, whether they follow the sentence formulae of the training sentences or new sentence formulae. Third, the generalization performance of the network is not necessarily related to the number of hidden units. In some cases, the best performance is obtained with no hidden units.
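The paper's setup — a recurrent network trained on example sentences and then asked to judge the well-formedness of new ones — can be sketched with a minimal Elman-style recurrent network. This is an illustrative toy, not the paper's actual architecture or data: the two-token grammar (a sentence "N V" is well-formed, all other orders are not), the network sizes, and the training schedule are all assumptions made for the sketch.

```python
import numpy as np

# Toy sketch (not the paper's setup): an Elman-style recurrent network
# judging the well-formedness of token sequences.
# Hypothetical grammar: "N V" is well-formed; "V N", "N N", "V V" are not.

rng = np.random.default_rng(0)
VOCAB = {"N": 0, "V": 1}
H = 4  # number of hidden units (assumed, not from the paper)

Wxh = rng.normal(0, 0.5, (H, 2))   # input -> hidden weights
Whh = rng.normal(0, 0.5, (H, H))   # hidden -> hidden (recurrent) weights
Who = rng.normal(0, 0.5, H)        # hidden -> output weights

def one_hot(tok):
    v = np.zeros(2)
    v[VOCAB[tok]] = 1.0
    return v

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(sent):
    """Run the recurrent network over a sentence; return the
    well-formedness score plus cached activations for training."""
    h = np.zeros(H)
    xs, hs = [], []
    for tok in sent:
        x = one_hot(tok)
        h = np.tanh(Wxh @ x + Whh @ h)
        xs.append(x)
        hs.append(h)
    y = sigmoid(Who @ h)  # judgment from the final hidden state
    return y, xs, hs

# Training sentences with well-formedness targets (1 = grammatical).
data = [(["N", "V"], 1.0), (["V", "N"], 0.0),
        (["N", "N"], 0.0), (["V", "V"], 0.0)]

lr = 0.5
for epoch in range(2000):
    for sent, t in data:
        y, xs, hs = forward(sent)
        dy = y - t                    # cross-entropy gradient at the output
        dWho = dy * hs[-1]
        dh = dy * Who
        dWxh = np.zeros_like(Wxh)
        dWhh = np.zeros_like(Whh)
        # Back-propagation through time over the (short) sequence.
        for s in range(len(sent) - 1, -1, -1):
            dz = dh * (1 - hs[s] ** 2)          # derivative of tanh
            dWxh += np.outer(dz, xs[s])
            hprev = hs[s - 1] if s > 0 else np.zeros(H)
            dWhh += np.outer(dz, hprev)
            dh = Whh.T @ dz
        Who -= lr * dWho
        Wxh -= lr * dWxh
        Whh -= lr * dWhh

# After training, the network should score "N V" high and the others low.
print(forward(["N", "V"])[0], forward(["V", "N"])[0])
```

The score comes from the final hidden state only, so word order is what the recurrent weights must encode; on this separable toy task the network learns the distinction after a few thousand per-sentence updates.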