{"title":"统计神经网络模型选择的调整网络信息准则","authors":"C. Udomboso, A. Chukwu, I. Dontwi","doi":"10.22237/JMASM/1478003040","DOIUrl":null,"url":null,"abstract":"In this paper, we derived and investigated the Adjusted Network Information Criterion (ANIC) criterion, based on Kullback's symmetric divergence, which has been designed to be an asymptotically unbiased estimator of the expected Kullback-Leibler information of a fitted model. The ANIC improves model selection in more sample sizes than does the NIC.","PeriodicalId":225385,"journal":{"name":"World Academy of Science, Engineering and Technology, International Journal of Mathematical and Computational Sciences","volume":"76 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"An Adjusted Network Information Criterion for Model Selection in Statistical Neural Network Models\",\"authors\":\"C. Udomboso, A. Chukwu, I. Dontwi\",\"doi\":\"10.22237/JMASM/1478003040\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, we derived and investigated the Adjusted Network Information Criterion (ANIC) criterion, based on Kullback's symmetric divergence, which has been designed to be an asymptotically unbiased estimator of the expected Kullback-Leibler information of a fitted model. The ANIC improves model selection in more sample sizes than does the NIC.\",\"PeriodicalId\":225385,\"journal\":{\"name\":\"World Academy of Science, Engineering and Technology, International Journal of Mathematical and Computational Sciences\",\"volume\":\"76 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-06-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"World Academy of Science, Engineering and Technology, International Journal of Mathematical and Computational Sciences\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.22237/JMASM/1478003040\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"World Academy of Science, Engineering and Technology, International Journal of Mathematical and Computational Sciences","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.22237/JMASM/1478003040","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
In this paper, we derive and investigate the Adjusted Network Information Criterion (ANIC), which is based on Kullback's symmetric divergence and is designed to be an asymptotically unbiased estimator of the expected Kullback-Leibler information of a fitted model. The ANIC improves model selection over the Network Information Criterion (NIC) across a wider range of sample sizes.
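For context, a minimal LaTeX sketch of the standard quantities the abstract refers to: Kullback's symmetric divergence (J-divergence) between a true density f and a fitted density g, and the generic penalized log-likelihood form shared by information criteria such as the NIC. The specific ANIC penalty term derived in the paper is not reproduced here; the penalty below is only an illustrative placeholder.

% Kullback's symmetric divergence (J-divergence): the sum of the two
% directed Kullback-Leibler divergences between densities f and g.
\[
J(f, g) = D_{\mathrm{KL}}(f \,\|\, g) + D_{\mathrm{KL}}(g \,\|\, f),
\qquad
D_{\mathrm{KL}}(f \,\|\, g) = \int f(x) \log \frac{f(x)}{g(x)} \, dx .
\]
% Generic form of an information criterion: the candidate model minimizing
% the criterion is selected; the penalty grows with model complexity.
% (Illustrative form only, not the paper's ANIC penalty.)
\[
\mathrm{IC}(\hat{\theta}) = -2 \log L(\hat{\theta}) + \mathrm{penalty}(k, n),
\]
% where L is the likelihood of the fitted model, k counts the free
% parameters (e.g., network weights), and n is the sample size.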