{"title":"A genetic algorithm for constrained via minimization","authors":"M. Tang, K. Eshraghian, H. Cheung","doi":"10.1109/ICONIP.1999.845634","DOIUrl":"https://doi.org/10.1109/ICONIP.1999.845634","url":null,"abstract":"Constrained via minimization is a typical optimization problem in very large scale integrated circuit (VLSI) routing. It is used to minimize the number of vias introduced in a VLSI routing. The first genetic algorithm for the constrained via minimization problem is proposed. Experimental results show that the developed genetic algorithm can consistently produce the same or better results than the best deterministic constrained via minimization algorithms.","PeriodicalId":237855,"journal":{"name":"ICONIP'99. ANZIIS'99 & ANNES'99 & ACNN'99. 6th International Conference on Neural Information Processing. Proceedings (Cat. No.99EX378)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127134351","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A data-driven rule-based neural network model for classification","authors":"K. Smith","doi":"10.1109/ICONIP.1999.844649","DOIUrl":"https://doi.org/10.1109/ICONIP.1999.844649","url":null,"abstract":"A novel approach for generating rules from neural networks is proposed. Rather than extracting rules from a trained general neural network, we use a neural network structure which permits rules to be more readily interpreted. This network incorporates logic neurons, with a combination of both fixed and adaptive weights. The backpropagation learning rules is adapted to reflect the new architecture. The proposed model also provides an opportunity for encoding expert rules and combining these rules with data driven decisions.","PeriodicalId":237855,"journal":{"name":"ICONIP'99. ANZIIS'99 & ANNES'99 & ACNN'99. 6th International Conference on Neural Information Processing. Proceedings (Cat. No.99EX378)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127167725","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Functional MR image registration using a genetic algorithm","authors":"J. Rajapakse, B. Guojun","doi":"10.1109/ICONIP.1999.844660","DOIUrl":"https://doi.org/10.1109/ICONIP.1999.844660","url":null,"abstract":"Image registration is formulated as a problem of finding optimal linear intensity and spatial transformations. A genetic algorithm is proposed to find optimal parameters of the transformations. The new approach is used to register functional MR time series images of the human brain to compensate for subject head movement.","PeriodicalId":237855,"journal":{"name":"ICONIP'99. ANZIIS'99 & ANNES'99 & ACNN'99. 6th International Conference on Neural Information Processing. Proceedings (Cat. No.99EX378)","volume":"51 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127508588","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On the effects of initialising a neural network with prior knowledge","authors":"R. Andrews, S. Geva","doi":"10.1109/ICONIP.1999.843995","DOIUrl":"https://doi.org/10.1109/ICONIP.1999.843995","url":null,"abstract":"This paper quantitatively examines the effects of initialising a Rapid Backprop Network (REP) with prior domain knowledge expressed in the form of propositional rules. The paper first describes the RBP network and then introduces the RULEIN algorithm which encodes propositional rules as the weights of the nodes of the REP network. A selection of datasets is used to compare networks that began learning from tabula rasa with those that were initialised with varying amounts of domain knowledge prior to the commencement of the learning phase. Network performance is compared in terms of time to converge, accuracy at convergence, and network size at convergence.","PeriodicalId":237855,"journal":{"name":"ICONIP'99. ANZIIS'99 & ANNES'99 & ACNN'99. 6th International Conference on Neural Information Processing. Proceedings (Cat. No.99EX378)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125943694","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Knowledge extraction from trained neural networks: a position paper","authors":"A.S. d'Avila Garcez, K. Broda, D. Gabbay, Alberto F. de Souza","doi":"10.1109/ICONIP.1999.845678","DOIUrl":"https://doi.org/10.1109/ICONIP.1999.845678","url":null,"abstract":"It is commonly accepted that one of the main drawbacks of neural networks, the lack of explanation, may be ameliorated by the so called rule extraction methods. We argue that neural networks encode nonmonotonicity, i.e., they jump to conclusions that might be withdrawn when new information is available. The authors present an extraction method that complies with the above perspective. We define a partial ordering on the network's input vector set, and use it to confine the search space for the extraction of rules by querying the network. We then define a number of simplification metarules, show that the extraction is sound and present the results of applying the extraction algorithm to the Monks' Problems (S.B. Thrun et al., 1991).","PeriodicalId":237855,"journal":{"name":"ICONIP'99. ANZIIS'99 & ANNES'99 & ACNN'99. 6th International Conference on Neural Information Processing. Proceedings (Cat. No.99EX378)","volume":"65 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123416276","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Fuzzy knowledge representation, learning and optimization with Bayesian analysis in fuzzy semantic networks","authors":"Mohamed Nazih Omri","doi":"10.1109/ICONIP.1999.844024","DOIUrl":"https://doi.org/10.1109/ICONIP.1999.844024","url":null,"abstract":"The paper presents an optimization method, based on both Bayesian analysis technique and Gallois lattice of a fuzzy semantic network. The technical system we use learns by interpreting an unknown word using the links created between this new word and known words. The main link is provided by the context of the query. When a novice's query is confused with an unknown verb (goal) applied to a known noun denoting either an object in the ideal user's network or an object in the user's network, the system infers that this new verb corresponds to one of the unknown goals. With the learning of new words for natural language interpretation, which is produced in agreement with the user, the system improves its representation scheme at each experiment with a new user and in addition, takes advantage of previous discussions with users. The semantic net of user objects thus obtained by these kinds of learning is not always optimal because some relationships between a couple of user objects can be generalized and others suppressed according to values of forces that characterize them. Indeed, to simplify the obtained net, we propose to proceed to an inductive Bayesian analysis on the net obtained from Gallois lattice. The objective of this analysis can be seen as an operation of filtering of the obtained descriptive graph.","PeriodicalId":237855,"journal":{"name":"ICONIP'99. ANZIIS'99 & ANNES'99 & ACNN'99. 6th International Conference on Neural Information Processing. Proceedings (Cat. No.99EX378)","volume":"489 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123559903","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An improved learning algorithm for laterally interconnected synergetically self-organizing map","authors":"Bai-ling Zhang, Tom Gedeon","doi":"10.1109/ICONIP.1999.843996","DOIUrl":"https://doi.org/10.1109/ICONIP.1999.843996","url":null,"abstract":"LISSOM (Laterally Interconnected Synergetically Self-Organizing Map) is a biologically motivated self-organizing neural network for the simultaneous development of topographic maps and lateral interactions in the visual cortex. However, the simple Hebbian mechanism for afferent connections requires a redundant dimension to be added to the input, and normalization is necessary. Another shortcoming of LISSOM is that several parameters must be chosen before it can be used as a model of topographic map formation. To solve these problems, we propose to apply the least mean-square error reconstruction (LMSER) learning rule as an alternative to the simple Hebbian rule for the afferent connections. Experiments demonstrate the essential topographic map properties from the improved LISSOM model.","PeriodicalId":237855,"journal":{"name":"ICONIP'99. ANZIIS'99 & ANNES'99 & ACNN'99. 6th International Conference on Neural Information Processing. Proceedings (Cat. No.99EX378)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122673176","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Structure analysis for fMRI brain data by using mutual information and interaction","authors":"K. Niki, J. Hatou, I. Tahara","doi":"10.1109/ICONIP.1999.844661","DOIUrl":"https://doi.org/10.1109/ICONIP.1999.844661","url":null,"abstract":"The authors propose a novel structure analysis method for fMRI data by using mutual information and interaction, based on Shannon's information theory. First, we introduce a structure analysis that assumes one directional information flow schema: stimulus variate/spl rarr/state variate/spl rarr/response variate. Next, we present alternative structure analysis methods that focus on the common information in variates. These methods are useful in the case where the direction of information flow is not obvious, just like in higher brain areas. We apply these analysis methods to artificially generated data, and show some kinds of classification error. However, intensive analysis that uses many kinds of information measurements can make information structure clear. Finally we apply these methods to fMRI data and show our methods are useful.","PeriodicalId":237855,"journal":{"name":"ICONIP'99. ANZIIS'99 & ANNES'99 & ACNN'99. 6th International Conference on Neural Information Processing. Proceedings (Cat. No.99EX378)","volume":"62 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129516869","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Latent classification via self-organizing maps: a study on hereditary hypertension","authors":"Hwann-Tzong Chen, M. Liou, W. Pan","doi":"10.1109/ICONIP.1999.845669","DOIUrl":"https://doi.org/10.1109/ICONIP.1999.845669","url":null,"abstract":"The latent classification technique (LCT) is a statistical tool for subdividing subjects into homogeneous groups according to important features. This study used the LCT to classify 698 subjects according to 11 risk factors associated with hypertension (HP) (e.g., blood cholesterol, urinary sodium) and identified subgroups whose odds of having parental HP were significantly high. Results showed that obese groups had higher odds as compared with other groups. In order to further establish the connection between risk factors and parental HP, this study classified subjects on a self-organizing map (SOM) and identified subgroups whose profiles on the risk factors were most similar, and whose odds of having parental HP were also high. The subgroups organized on the map closely matched those from the LCT.","PeriodicalId":237855,"journal":{"name":"ICONIP'99. ANZIIS'99 & ANNES'99 & ACNN'99. 6th International Conference on Neural Information Processing. Proceedings (Cat. No.99EX378)","volume":"65 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129015242","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Determination of the number of hidden units from a statistical viewpoint","authors":"T. Hayasaka, K. Hagiwara, N. Toda, S. Usui","doi":"10.1109/ICONIP.1999.843993","DOIUrl":"https://doi.org/10.1109/ICONIP.1999.843993","url":null,"abstract":"One of the important problems for 3-layered neural networks (3-LNN) is to determine the optimal network structure with high generalization ability. Although this can be formulated in terms of a statistical model selection, there remains a problem in applying traditional criteria for 3-LNN. We suggest the type of effective criteria for the model selection problem of 3-LNN by analyzing the statistical properties of some simplified nonlinear models. Results of numerical experiments are also presented.","PeriodicalId":237855,"journal":{"name":"ICONIP'99. ANZIIS'99 & ANNES'99 & ACNN'99. 6th International Conference on Neural Information Processing. Proceedings (Cat. No.99EX378)","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126980787","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}