Neural Network World: Latest Articles

Quantum informatics and soft systems modeling
IF 0.8 | CAS Q4, Computer Science
Neural Network World, Pub Date: 2020-01-01, DOI: 10.14311/nnw.2020.30.010
M. Svítek
Abstract: This paper elaborates on physical-information analogies and introduces new features such as the distance between wave probabilistic functions, along with a set of new information quantities: strength, strength moment, strength potential energy, and generalized charge. These new parameters are used to define the rules for a quantum node. The knowledge cycle, which is equivalent to the Otto thermodynamic cycle, is adopted for modeling soft systems together with their static and dynamic information stability. For the closed knowledge cycle, an evolutionary field equivalent to a magnetic field is then derived.
Pages: 133-144. Citations: 2

Representation learning of knowledge graphs using convolutional neural networks
IF 0.8 | CAS Q4, Computer Science
Neural Network World, Pub Date: 2020-01-01, DOI: 10.14311/nnw.2020.30.011
Wang Gao, Y. Fang, F. Zhang, Z. Yang
Abstract: Knowledge graphs play an important role in many Artificial Intelligence (AI) applications such as entity linking and question answering. However, most previous studies have focused on symbolic representations of knowledge graphs built from structural information, which cannot deal well with new entities or rare entities for which little relevant knowledge exists. In this paper, we propose a new deep knowledge representation architecture that jointly encodes both structural and textual information. We first propose a novel neural model that encodes the text descriptions of entities using Convolutional Neural Networks (CNN). Secondly, an attention mechanism is applied to capture the valuable information in these descriptions. We then introduce position vectors as supplementary information. Finally, a gate mechanism is designed to integrate the structural and textual representations into a joint representation. Experimental results on two datasets show that our models obtain state-of-the-art results on link prediction and triplet classification tasks, and achieve the best performance on the relation classification task.
Pages: 145-160. Citations: 8

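The encode-attend-gate pipeline described in the abstract can be sketched in a few lines of numpy. This is an illustrative reconstruction, not the authors' code: the dimensions, the dot-product attention, and the use of the structure embedding as the attention query are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative)

def conv1d_features(tokens, filters, width=3):
    """Per-position width-3 convolution over embedded description tokens."""
    feats = [np.tanh(filters @ tokens[i:i + width].reshape(-1))
             for i in range(len(tokens) - width + 1)]
    return np.stack(feats)                      # (positions, d)

def attention_pool(feats, query):
    """Dot-product attention: pool positions by relevance to the query."""
    scores = feats @ query
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w @ feats                            # (d,)

def gated_join(struct_emb, text_emb, W_g):
    """Gate mechanism: per-dimension sigmoid mix of structural and textual signal."""
    g = 1.0 / (1.0 + np.exp(-(W_g @ np.concatenate([struct_emb, text_emb]))))
    return g * struct_emb + (1.0 - g) * text_emb

tokens = rng.normal(size=(10, d))               # an entity description, embedded
struct = rng.normal(size=d)                     # structure-based entity vector
filters = 0.1 * rng.normal(size=(d, 3 * d))
W_g = 0.1 * rng.normal(size=(d, 2 * d))

text = attention_pool(conv1d_features(tokens, filters), query=struct)
joint = gated_join(struct, text, W_g)
```

In a full model the gate weights and filters would be learned jointly with the link-prediction objective; here they are random placeholders to show the data flow.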
Iterated dilated convolutional neural networks for word segmentation
IF 0.8 | CAS Q4, Computer Science
Neural Network World, Pub Date: 2020-01-01, DOI: 10.14311/NNW.2020.30.022
H. He, X. Yang, L. Wu, G. Wang
Abstract: The latest developments in neural word segmentation are dominated by bi-directional Long Short-Term Memory networks (Bi-LSTMs), which use Recurrent Neural Networks (RNNs) as standard sequence-tagging models and achieve expressive, accurate performance on large-scale datasets. However, RNNs cannot fully exploit the parallelism of Graphics Processing Units (GPUs), which limits their computational efficiency in both training and inference. This paper proposes a novel approach that adopts Iterated Dilated Convolutional Neural Networks (ID-CNNs) in place of Bi-LSTMs for faster computation while retaining accuracy. Our implementation achieves state-of-the-art results on the SIGHAN Bakeoff 2005 datasets. Extensive experiments show that the ID-CNN approach yields 3× training-time speedups with no accuracy loss, achieving better accuracy than the prevailing Bi-LSTMs. Source code and corpora have been made publicly available on GitHub.
Pages: 333-346. Citations: 3

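The speedup comes from replacing the sequential recurrence with convolutions whose dilation grows at each layer, so the receptive field grows quickly while every position is computed in parallel. A minimal numpy sketch follows; the dilation schedule (1, 2, 4) and the dimensions are assumptions, not the paper's exact architecture:

```python
import numpy as np

def dilated_conv(seq, kernel, dilation):
    """Width-3 dilated convolution with 'same' padding over a (T, d) sequence.
    Each output position sees inputs at t - dilation, t, t + dilation."""
    T, d = seq.shape
    padded = np.pad(seq, ((dilation, dilation), (0, 0)))
    out = np.empty((T, kernel.shape[0]))
    for t in range(T):
        window = np.concatenate([padded[t], padded[t + dilation],
                                 padded[t + 2 * dilation]])
        out[t] = np.maximum(kernel @ window, 0.0)       # ReLU
    return out

def receptive_field(dilations, width=3):
    """Tokens visible to one output after a stack of dilated convolutions."""
    return 1 + (width - 1) * sum(dilations)

rng = np.random.default_rng(0)
d = 16
seq = rng.normal(size=(30, d))                          # embedded characters
kernels = [0.1 * rng.normal(size=(d, 3 * d)) for _ in range(3)]

# One "iterated" block: dilations 1, 2, 4; ID-CNNs reapply the same block.
for kernel, dilation in zip(kernels, (1, 2, 4)):
    seq = dilated_conv(seq, kernel, dilation)
```

Three layers with dilations 1, 2, 4 already give each output a 15-token window; iterating the block doubles that reach again without any sequential dependency between positions.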
Research on predictive model based on classification with parameters of optimization
IF 0.8 | CAS Q4, Computer Science
Neural Network World, Pub Date: 2020-01-01, DOI: 10.14311/NNW.2020.30.020
Turken Gulzat, Naizabayeva Lyazat, V. Siládi, Sembina Gulbakyt, Satymbekov Maksatbek
Abstract: This paper applies data mining and optimization methods to build a classifier based on the decision-tree algorithm and then optimizes it by grid search and cross-validation, which improves the prediction accuracy of the decision-tree model for PC sales in practical applications and addresses insufficient training data, high computational cost, and low prediction accuracy. The main goal is to predict PC sales, driven by operating-system factors, using machine learning tools. The proposed combined predictive model exploits the benefits of optimization and neural networks and achieves a very accurate fit and good forecasting accuracy; the neural network also provides strong dynamic analysis capabilities, since new observations can be added to the model as they arrive, giving it high adaptability. The model is implemented in the data-science platform RapidMiner: a decision-tree model is executed, its prediction capacity is evaluated and tested, and a grid-search optimizer combined with cross-validation automatically builds the final model using the best parameters found for training the classifier. Among the five algorithms evaluated, the Neural Network algorithm achieves the highest accuracy in predicting PC sales.
Pages: 295-308. Citations: 10

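The grid-search-plus-cross-validation loop that the paper runs in RapidMiner can be sketched generically: for every parameter combination, estimate accuracy by k-fold cross-validation and keep the best setting. The one-feature threshold "stump" classifier, the parameter grid, and the synthetic data below are placeholders, not the paper's model or data:

```python
import numpy as np
from itertools import product

def k_fold_indices(n, k, seed=0):
    """Shuffle indices and split them into k folds for cross-validation."""
    idx = np.random.default_rng(seed).permutation(n)
    return np.array_split(idx, k)

def grid_search_cv(X, y, param_grid, fit, score, k=5):
    """Exhaustive grid search: score each parameter combination by
    k-fold cross-validated accuracy and return the best setting."""
    folds = k_fold_indices(len(y), k)
    best = (None, -np.inf)
    for combo in product(*param_grid.values()):
        params = dict(zip(param_grid.keys(), combo))
        scores = []
        for i in range(k):
            test_idx = folds[i]
            train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
            model = fit(X[train_idx], y[train_idx], **params)
            scores.append(score(model, X[test_idx], y[test_idx]))
        mean = float(np.mean(scores))
        if mean > best[1]:
            best = (params, mean)
    return best

# Stand-in classifier: a single threshold split (the simplest decision tree).
def fit_stump(X, y, threshold):
    return threshold                      # the "model" is the split point

def score_stump(threshold, X, y):
    pred = (X[:, 0] > threshold).astype(int)
    return float((pred == y).mean())

X = np.linspace(0, 1, 100).reshape(-1, 1)
y = (X[:, 0] > 0.6).astype(int)
params, acc = grid_search_cv(X, y, {"threshold": [0.2, 0.4, 0.6, 0.8]},
                             fit_stump, score_stump)
```

Swapping `fit_stump`/`score_stump` for a real decision-tree learner reproduces the paper's pipeline; the cross-validation shell stays unchanged.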
Comparison of software packages for performing Bayesian inference
IF 0.8 | CAS Q4, Computer Science
Neural Network World, Pub Date: 2020-01-01, DOI: 10.14311/NNW.2020.30.019
M. Koprivica
Abstract: In this paper, we compare three state-of-the-art Python packages for Bayesian inference: JAGS [14], Stan [5], and PyMC3 [18]. These packages are in focus because they are the most mature, and Python is among the most widely used programming languages for teaching mathematics and statistics in colleges [13]. The experiment is based on real-world data collected to investigate the therapeutic touch nursing technique [17]. The data are analyzed with a hierarchical model with a beta prior and a binomial likelihood. The tools are compared by execution time and sample quality.
Pages: 283-294. Citations: 1

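A beta prior with a binomial likelihood is conjugate, so for a single group the posterior has a closed form against which any sampler can be checked; that is what makes this model a good benchmark for MCMC tools. The sketch below contrasts the exact update with a hand-rolled random-walk Metropolis sampler, the kind of machinery JAGS, Stan, and PyMC3 automate. The toy counts (7 successes in 10 trials) are made up for illustration; they are not the paper's data:

```python
import numpy as np

def posterior_params(a, b, successes, n):
    """Conjugate update: Beta(a, b) prior + Binomial(n) likelihood
    gives a Beta(a + successes, b + n - successes) posterior."""
    return a + successes, b + n - successes

def metropolis_beta_binomial(a, b, successes, n, steps=20000, seed=1):
    """Random-walk Metropolis sampler targeting the same posterior."""
    rng = np.random.default_rng(seed)

    def log_post(t):
        if not (0.0 < t < 1.0):
            return -np.inf
        return ((a + successes - 1) * np.log(t)
                + (b + n - successes - 1) * np.log(1.0 - t))

    theta, chain = 0.5, []
    for _ in range(steps):
        prop = theta + rng.normal(scale=0.1)
        if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
            theta = prop                       # accept the proposal
        chain.append(theta)
    return np.array(chain[steps // 2:])        # discard burn-in half

a_post, b_post = posterior_params(1, 1, successes=7, n=10)
exact_mean = a_post / (a_post + b_post)        # mean of Beta(8, 4)
samples = metropolis_beta_binomial(1, 1, 7, 10)
```

In the non-conjugate hierarchical case the closed form disappears, and the relative speed and sample quality of the three packages' samplers is exactly what the paper measures.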
Review and analysis of hidden neuron number effect of shallow backpropagation neural networks
IF 0.8 | CAS Q4, Computer Science
Neural Network World, Pub Date: 2020-01-01, DOI: 10.14311/nnw.2020.30.008
B. Şekeroğlu, Kamil Dimililer
Abstract: Shallow neural network implementations are still popular for real-life classification problems that require rapid results with limited data. Parameter selection, such as the hidden neuron number, learning rate, and momentum factor, is the main challenge that causes time loss during these implementations. Among these parameters, determining the number of hidden neurons is the main difficulty, as it affects both the training and generalization phases of any neural system in terms of learning efficiency and system accuracy. In this study, several experiments are performed to observe the effect of the hidden neuron number of a 3-layered backpropagation neural network on the generalization rate of classification problems, using both numerical datasets and image databases. The experiments consider an increasing number of total processing elements, and various numbers of hidden neurons are used during training. The results for each hidden neuron number are analyzed according to accuracy rates and the number of iterations to convergence. The results show that the effect of the hidden neuron number mainly depends on the number of training patterns, and suggest intervals of hidden neuron numbers for different numbers of total processing elements and training patterns.
Pages: 97-112. Citations: 3

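The kind of sweep the study performs can be reproduced in miniature: train the same 3-layered backpropagation network with different hidden-layer sizes and compare the final error. The XOR data, learning rate, and epoch count below are illustrative choices, not the study's setup:

```python
import numpy as np

def train_mlp(hidden, X, y, epochs=5000, lr=0.5, seed=0):
    """One-hidden-layer backprop network; returns final mean-squared error."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)                     # hidden activations
        out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output
        d_out = (out - y) * out * (1.0 - out)        # backprop through sigmoid
        d_h = (d_out @ W2.T) * (1.0 - h ** 2)        # backprop through tanh
        W2 -= lr * h.T @ d_out / len(X)
        b2 -= lr * d_out.mean(axis=0)
        W1 -= lr * X.T @ d_h / len(X)
        b1 -= lr * d_h.mean(axis=0)
    return float(((out - y) ** 2).mean())

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)            # XOR targets
errors = {h: train_mlp(h, X, y) for h in (1, 2, 4, 8)}
```

With one hidden neuron the network cannot represent XOR at all, so the sweep shows a floor on the error that disappears once the hidden layer is large enough relative to the training patterns, which is the relationship the study quantifies on real datasets.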
Versatile function in GPA
IF 0.8 | CAS Q4, Computer Science
Neural Network World, Pub Date: 2020-01-01, DOI: 10.14311/NNW.2020.30.025
T. Brandejsky
Abstract: This paper, devoted to the application of a continuous versatile function in the Genetic Programming Algorithm (GPA), begins with a discussion of the similarities between a GPA with a versatile function and a neural network. The influence of the function set on GPA efficiency is then discussed. The next part describes a hybrid evolutionary algorithm that combines a GPA for structure development with an Evolutionary Strategy (ES) for optimizing parameters and constants, which is much more significant here than in the standard GPA; the setting of this hybrid algorithm's parameters under different function sets is also discussed. The original idea of the versatile function, which has its origins in fuzzy control systems, is formulated and explained, and four different implementations are discussed. Based on experiments with the hybrid evolutionary algorithm performing symbolic regression of precomputed Lorenz attractor data representing its dynamic behaviour, three variants of the versatile function are compared. The paper also presents ways to set the hybrid algorithm's parameters, such as population sizes and maximum generation limits, for both the GPA used for structural development and the nested ES used for parameter optimization. The versatile function concept is applicable, but it requires the use of the hybrid evolutionary algorithm, as explained in the paper.
Pages: 379-392. Citations: 0

Robust and fragile watermarking for medical images using redundant residue number system and chaos
IF 0.8 | CAS Q4, Computer Science
Neural Network World, Pub Date: 2020-01-01, DOI: 10.14311/nnw.2020.30.013
M. T. Naseem, I. Qureshi, Atta-ur-Rahman, M. Z. Muzaffar
Abstract: This research presents a novel watermarking scheme using a redundant residue number system and chaos. Its salient feature is that the image remains fragile while the watermark information is made robust. Image pixels are converted into residues so that the unaided eye cannot see the image contents; to enhance secrecy, only the region of interest (ROI) of the image is passed through the residue number system. When converting the ROI into residues, some residues exceed eight bits, so these are reduced to exactly eight bits by an intelligent mechanism. To achieve robustness of the watermark, redundant residues of the watermark are formed first and the result is then encoded with error-correcting codes. To achieve fragility of the image, a hashing technique is used: the hash of the entire image (with the residued ROI) is combined with the encoded, redundant-residued watermark, and the resulting watermark is embedded in the region of non-interest (RONI) of the native image based on a chaotic key, to enhance the security of the watermark. If there is no tampering, the fragile watermark and the original image can be recovered exactly; if the image is attacked, the fragile watermark is destroyed while the robust watermark is still extracted with better readability.
Pages: 177-192. Citations: 4

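The redundancy that makes the watermark robust can be illustrated with a small redundant residue number system: the information moduli define the legitimate dynamic range, and extra redundant moduli let a corrupted residue be identified and outvoted. The moduli and test value below are arbitrary illustrative choices, not the scheme's actual parameters:

```python
from math import prod
from itertools import combinations

def to_residues(x, moduli):
    """Encode an integer as its residues modulo pairwise-coprime moduli."""
    return [x % m for m in moduli]

def crt(residues, moduli):
    """Chinese Remainder Theorem reconstruction from residues."""
    M = prod(moduli)
    return sum(r * (M // m) * pow(M // m, -1, m)
               for r, m in zip(residues, moduli)) % M

def correct_single_error(residues, moduli, legit_range):
    """Drop each residue in turn; reconstructions from the remaining
    residues that fall inside the legitimate dynamic range vote for
    the true value, so one corrupted residue is outvoted."""
    votes = {}
    n = len(moduli)
    for keep in combinations(range(n), n - 1):
        x = crt([residues[i] for i in keep], [moduli[i] for i in keep])
        if x < legit_range:
            votes[x] = votes.get(x, 0) + 1
    return max(votes, key=votes.get)

info_moduli = [5, 7, 9]            # legitimate range: 5 * 7 * 9 = 315
moduli = info_moduli + [11, 13]    # two redundant moduli
x = 200
residues = to_residues(x, moduli)
assert crt(residues, moduli) == x            # lossless round trip

residues[1] = (residues[1] + 3) % 7          # corrupt one residue
recovered = correct_single_error(residues, moduli, legit_range=315)
```

This is the core property the watermark relies on: values inside the legitimate range survive the loss (or corruption) of a redundant residue, which is what the paper's error-correcting encoding of the redundant-residued watermark exploits at scale.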
Parametric sensitivity in decision making process
IF 0.8 | CAS Q4, Computer Science
Neural Network World, Pub Date: 2020-01-01, DOI: 10.14311/nnw.2020.30.003
P. Moos, M. Novák, Z. Votruba
Abstract: This paper introduces a possible application of parametric sensitivity to decision-making processes in systems represented by general production functions of the Tinbergen type, which depend on information content, information flow, and the qualification of human resources. Parametric sensitivity treats the information content I as an ordering parameter, dependent on the information flow φ, applied to the production function. The theory of production functions describes the relation between the physical outputs of a production process and its physical inputs, i.e., the factors of production; for this work, J. Tinbergen and R. Frisch received in 1969 the "Nobel Prize of the Swedish National Bank". Finally, the influence of the knowledge in the information content I, leading to a correct decision, is demonstrated through the parametric sensitivity concept. The theory of production functions, surprisingly, also provides a tool for explaining the behaviour of living bodies.
Pages: 45-53. Citations: 2

Wave composition rules in quantum system theory
IF 0.8 | CAS Q4, Computer Science
Neural Network World, Pub Date: 2020-01-01, DOI: 10.14311/nnw.2020.30.004
M. Svítek
Abstract: This paper presents a new approach to wave composition rules for advanced modeling of soft systems in quantum system theory. First, an interpretation of the phase parameters is given; these are essential for specifying the mathematical operations assigned to different relations among subsystems, e.g., co-operation, connection, co-existence, and competition. Using wave composition rules, more complex and sophisticated quantum circuits can be created. The methodology is applied to three illustrative examples.
Pages: 55-64. Citations: 2
