A Double Input Layered Neural Network - Using Input Weights For Better Understanding Of Decision Reasoning: A Medical Application

Authors: Z. Shen, M. Clarke, R. Jones
Published in: Workshop on Neural Network Applications and Tools, 1993-09-13
DOI: 10.1109/NNAT.1993.586050 (https://doi.org/10.1109/NNAT.1993.586050)
Citations: 2
Abstract
A unique structure of multi-layered perceptron (MLP) with double input layers is proposed. By adding a second input layer to a traditional MLP, a set of input weights between the two singly connected input layers is obtained. We aim to use these weights to determine the contribution and significance of each input to the decision making. We also found that the learning process is accelerated when the additional layer is used. In this paper, we report our results and compare them with those of the traditional MLP. The significance of the weight analysis is twofold: the contribution of each input helps explain the decision made by the network, addressing what has been regarded as one of the major disadvantages of neural networks; and the weights can be used to select subsets of inputs and reduce the input dimension.
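The architecture described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the second input layer is modelled as one scalar weight per input (one-to-one connections between the two input layers), prepended to an ordinary one-hidden-layer MLP. After training, the magnitude of each such weight is read as a rough measure of that input's contribution. All variable names, the toy data, and the training setup are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the label depends strongly on x0, weakly on x1, not at all on x2.
X = rng.normal(size=(200, 3))
y = (2.0 * X[:, 0] + 0.2 * X[:, 1] > 0).astype(float).reshape(-1, 1)

n_in, n_hid = 3, 8
s = np.ones(n_in)                          # second input layer: one weight per input
W1 = rng.normal(scale=0.5, size=(n_in, n_hid))
b1 = np.zeros(n_hid)
W2 = rng.normal(scale=0.5, size=(n_hid, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.1
for _ in range(2000):
    Xs = X * s                             # one-to-one input weighting (second input layer)
    H = np.tanh(Xs @ W1 + b1)              # hidden layer
    P = sigmoid(H @ W2 + b2)               # output layer
    # Backpropagation of the cross-entropy loss.
    dZ2 = (P - y) / len(X)
    dW2 = H.T @ dZ2; db2 = dZ2.sum(0)
    dZ1 = (dZ2 @ W2.T) * (1.0 - H**2)
    dW1 = Xs.T @ dZ1; db1 = dZ1.sum(0)
    ds = (X * (dZ1 @ W1.T)).sum(0)         # gradient w.r.t. the input weights
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1
    s  -= lr * ds

# Larger |s[i]| is taken as a larger contribution of input i to the decision.
print("input weight magnitudes:", np.abs(s))
```

In this sketch the input weights can also drive input selection: inputs whose weights stay near zero are candidates for removal, reducing the input dimension as the abstract suggests.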