Exploring the Properties and Evolution of Neural Network Eigenspaces during Training
Mats L. Richter, Leila Malihi, Anne-Kathrin Patricia Windler, U. Krumnack
2022 International Conference on Machine Vision and Image Processing (MVIP). DOI: 10.1109/MVIP53647.2022.9738741
We investigate the properties and evolution of the emergent inference process inside neural networks using layer saturation [1] and logistic regression probes [2]. We demonstrate that the difficulty of a problem, defined by the number of classes and the complexity of the visual domain, and the number of parameters in the network's layers affect predictive performance in an antagonistic manner, and we show that this relationship can be measured using saturation. This opens the possibility of detecting over- and under-parameterization of neural networks. We further show that the observed effects are independent of previously reported pathological patterns, such as the "tail pattern" described in [1]. Finally, we study the emergence of saturation patterns during training and show that they appear early, which allows for early analysis and potentially shorter experiment cycles.
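The abstract names two measurement tools: layer saturation [1] and logistic regression probes [2]. Below is a minimal sketch of both, assuming NumPy and scikit-learn; the 99% variance threshold and all helper names are illustrative assumptions, not the authors' reference implementation.

```python
# Sketch of the two probes named in the abstract. delta=0.99 and the
# function names are illustrative choices, not the paper's reference code.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def layer_saturation(activations: np.ndarray, delta: float = 0.99) -> float:
    """Fraction of eigendirections of the activation covariance needed to
    explain a delta-share of the variance, relative to the layer width.

    activations: (n_samples, n_features) array of one layer's outputs.
    """
    cov = np.cov(activations, rowvar=False)        # feature covariance matrix
    eigvals = np.linalg.eigvalsh(cov)[::-1]        # eigenvalues, descending
    ratios = np.cumsum(eigvals) / eigvals.sum()    # cumulative variance share
    k = int(np.searchsorted(ratios, delta)) + 1    # smallest sufficient dim
    return k / activations.shape[1]

def probe_accuracy(activations: np.ndarray, labels: np.ndarray) -> float:
    """Logistic-regression probe: held-out accuracy of a linear classifier
    on the layer's features, i.e. how linearly separable the classes are."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        activations, labels, test_size=0.2, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    return clf.score(X_te, y_te)
```

Under this reading, saturation near 1 means the activations occupy almost the layer's full eigenspace (a candidate sign of under-parameterization), while low saturation leaves most directions unused; probe accuracy tracks how linearly separable the classes already are at that depth.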