{"title":"Deep reasoning and thinking beyond deep learning by cognitive robots and brain-inspired systems","authors":"Yingxu Wang","doi":"10.1109/ICCI-CC.2016.7862095","DOIUrl":"https://doi.org/10.1109/ICCI-CC.2016.7862095","url":null,"abstract":"Recent basic studies reveal that AI problems are deeply rooted in both the understanding of the natural intelligence and the adoption of suitable mathematical means for rigorously modeling the brain in machine understandable forms. Learning is a cognitive process of knowledge and behavior acquisition. Learning can be classified into five categories known as object identification, cluster classification, functional regression, behavior generation, and knowledge acquisition. A fundamental challenge to knowledge learning different from the deep and recurring neural network technologies has led to the emergence of the field of cognitive machine learning on the basis of recent breakthroughs in denotational mathematics and mathematical engineering. This keynote lecture presents latest advances in formal brain studies and cognitive systems for deep reasoning and deep learning. It is recognized that key technologies enabling cognitive robots mimicking the brain rely not only on deep learning, but also on deep reasoning and thinking towards machinable thoughts and cognitive knowledge bases built by a cognitive systems. 
A fundamental theory and novel technology for implementing deep thinking robots are demonstrated based on concept algebra, semantics algebra, and inference algebra.","PeriodicalId":135701,"journal":{"name":"2016 IEEE 15th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC)","volume":"99 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134441742","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A novel method for document summarization using Word2Vec","authors":"Zhibo Wang, Long Ma, Yanqing Zhang","doi":"10.1109/ICCI-CC.2016.7862087","DOIUrl":"https://doi.org/10.1109/ICCI-CC.2016.7862087","url":null,"abstract":"Texting mining is a process to extract useful patterns and information from large volume of unstructured text data. Unlike other quantitative data, unstructured text data cannot be directly utilized in machine learning models. Hence, data pre-processing is an essential step to remove vague or redundant data such as punctuations, stop-words, low-frequency words in the corpus, and re-organize the data in a format that computers can understand. Though existing approaches are able to eliminate some symbols and stop-words during the pre-processing step, a portion of words are not used to describe the documents' topics. These irrelevant words not only waste the storage that lessen the efficiency of computing, but also lead to confounding results. In this paper, we propose an optimization method to further remove these irrelevant words which are not highly correlated to the documents' topics. 
Experimental results indicate that our proposed method significantly compresses the documents while the resulting documents retain high discriminative power in classification tasks; in addition, storage requirements are greatly reduced under various criteria.","PeriodicalId":135701,"journal":{"name":"2016 IEEE 15th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC)","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134540901","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
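The pruning idea in this abstract, dropping words weakly related to a document's topic, can be illustrated with a cosine-similarity filter against the centroid of the document's word vectors. This is a minimal sketch with toy 2-D embeddings, not the paper's actual optimization method; the function name and threshold are hypothetical.

```python
import numpy as np

def filter_irrelevant_words(doc_words, embeddings, threshold=0.3):
    """Keep only words whose vector is close (cosine similarity >= threshold)
    to the centroid of the document's word vectors."""
    vecs = np.array([embeddings[w] for w in doc_words if w in embeddings], dtype=float)
    centroid = vecs.mean(axis=0)
    kept = []
    for w in doc_words:
        if w not in embeddings:
            continue  # out-of-vocabulary words are dropped, like low-frequency terms
        v = np.asarray(embeddings[w], dtype=float)
        sim = v @ centroid / (np.linalg.norm(v) * np.linalg.norm(centroid))
        if sim >= threshold:
            kept.append(w)
    return kept

# Toy "embeddings": "cat" and "dog" cluster together; "tax" points away.
emb = {"cat": [1.0, 0.1], "dog": [0.9, 0.2], "tax": [-1.0, 0.0]}
print(filter_irrelevant_words(["cat", "dog", "tax"], emb))  # ['cat', 'dog']
```

In practice the embeddings would come from a trained Word2Vec model rather than a hand-built dictionary.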
{"title":"Algorithms for determining semantic relations of formal concepts by cognitive machine learning based on concept algebra","authors":"M. Valipour, Yingxu Wang, Omar A. Zatarain, M. Gavrilova","doi":"10.1109/ICCI-CC.2016.7862021","DOIUrl":"https://doi.org/10.1109/ICCI-CC.2016.7862021","url":null,"abstract":"It is recognized that the semantic space of knowledge is a hierarchical concept network. This paper presents theories and algorithms of hierarchical concept classification by quantitative semantic relations via machine learning based on concept algebra. The equivalence between formal concepts are analyzed by an Algorithm of Concept Equivalence Analysis (ACEA), which quantitatively determines the semantic similarity of an arbitrary pair of formal concepts. This leads to the development of the Algorithm of Relational Semantic Classification (ARSC) for hierarchically classify any given concept in the semantic space of knowledge. Experiments applying Algorithms ACEA and ARSC on 20 formal concepts are successfully conducted, which encouragingly demonstrate the deep machine understanding of semantic relations and their quantitative weights beyond human perspectives on knowledge learning and natural language processing.","PeriodicalId":135701,"journal":{"name":"2016 IEEE 15th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115643302","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Extracting time-oriented relationships of nutrients to losing body fat mass using inductive logic programming","authors":"Sho Ushikubo, K. Kanamori, H. Ohwada","doi":"10.1109/ICCI-CC.2016.7862039","DOIUrl":"https://doi.org/10.1109/ICCI-CC.2016.7862039","url":null,"abstract":"This study was performed to extract rules for reducing body fat mass so as to prevent lifestyle-related diseases. Lifestyle-related diseases have been increasing in Japan, even among younger people. Body fat mass is related to lifestyle-related diseases. Hence, finding rules for reducing body fat mass is very meaningful. We obtained lifestyle time-series data on five male subjects who are in their 20s and not obese. The data includes the amount of body fat mass of each subject and a variety of features such as sleep, exercise, and nutrient intake. We used Inductive Logic Programming (ILP) to apply this data because ILP can more flexibly learn rules than other machine-learning methods. As a result of applying the data to ILP, our ILP system successfully extracted rules of time-oriented relationships of nutrients to decrease body fat mass based on limited data. Intake of various nutrients one day and two days prior was effective in reducing body fat mass. 
Moreover, we determined that nutrients related to losing body fat mass include vitamin B2, pantothenic acid, fat, vitamin B1, and biotin.","PeriodicalId":135701,"journal":{"name":"2016 IEEE 15th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC)","volume":"67 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122770623","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Modeling chunking effects on learning and performance using the Computational-Unified Learning Model (C-ULM): A multiagent cognitive process model","authors":"D. Shell, Leen-Kiat Soh, Vlad Chiriacescu","doi":"10.1109/ICCI-CC.2016.7862098","DOIUrl":"https://doi.org/10.1109/ICCI-CC.2016.7862098","url":null,"abstract":"Chunking has emerged as a basic property of human cognition. Computationally, chunking has been proposed as a process for compressing information also has been identified in neural processes in the brain and used in models of these processes. Our purpose in this paper is to expand understanding of how chunking impacts both learning and performance using the Computational-Unified Learning Model (C-ULM) a multi-agent computational model. Chunks in C-ULM long-term memory result from the updating of concept connection weights via statistical learning. Concept connection weight values move toward the accurate weight value needed for a task and a confusion interval reflecting certainty in the weight value is shortened each time a concept is attended in working memory and each time a task is solved, and the confusion interval is lengthened when a chunk is not retrieved over a number of cycles and each time a task solution attempt fails. The dynamic tension between these updating mechanisms allows chunks to come to represent the history of relative frequency of co-occurrence for the concept connections present in the environment; thereby encoding the statistical regularities in the environment in the long-term memory chunk network. In this paper, the computational formulation of chunking in the C-ULM is described, followed by results of simulation studies examining impacts of chunking versus no chunking on agent learning and agent effectiveness. 
Then, conclusions and implications of the work both for understanding human learning and for applications within cognitive informatics, artificial intelligence, and cognitive computing are discussed.","PeriodicalId":135701,"journal":{"name":"2016 IEEE 15th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125691249","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Hebbian learning and the LMS algorithm","authors":"B. Widrow","doi":"10.1109/ICCI-CC.2016.7862094","DOIUrl":"https://doi.org/10.1109/ICCI-CC.2016.7862094","url":null,"abstract":"Hebbian learning is one of the fundamental premises of neuroscience. The LMS (least mean square) algorithm of Widrow and Hoff is the world's most widely used learning algorithm. Hebbian learning is unsupervised. LMS learning is supervised. However, a form of LMS can be constructed to perform unsupervised learning and to implement Hebbian learning. Combining the two paradigms creates a new unsupervised learning algorithm that has practical engineering applications and provides insight into learning in living neural networks. A fundamental question is, how does learning take place in living neural networks? The learning algorithm practiced by nature at the neuron and synapse level may well be the Hebbian-LMS algorithm.","PeriodicalId":135701,"journal":{"name":"2016 IEEE 15th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC)","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129468511","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A key issue of semantics of information","authors":"L. Zadeh","doi":"10.1109/ICCI-CC.2016.7862093","DOIUrl":"https://doi.org/10.1109/ICCI-CC.2016.7862093","url":null,"abstract":"In his epoch-making work on information theory, Shannon defended information in terms of entropy. Entropy-based definitions of information relate to quantity of information, but not to its meaning. Subsequent attempts to introduce semantics into information theory have made some progress but fell short of having a capability to deal with information described in natural language. This paper is aimed that of laying information for the theory which has this capability, call it a theory of semantics information (TSI). TSI is centered on a concept which plays a key role in human intelligence — A concept whose basic importance has long been and continues to be unrecognized — The concept of a restriction is pervasive in human cognition. Restrictions underlie the remarkable human ability to reason and make rational decisions in an environment of imprecision, uncertainty and incompleteness of information. Such environments are the norm in the real-world. Such environments have the traditional logical systems that become dysfunctional. There are many applications in which semantics of information plays an important role. Among such applications are: machine translation, summarization, search and decision-making under uncertainty. Informally, a restriction on a specified (focal) variable, X, written as R (X), is a statement which is a carrier of information about the values which X can take. Typically, restrictions are described in natural language. Example. X = length of time it takes to drive from Berkeley to SF Airport; R(X) = usually it takes about 90 minutes to drive from Berkeley to SF Airport. In adverse weather it may take close to 2 hours. An important issue in TSI is computation with restrictions. TSI opens the door to modes of computation in which approximation is accepted. 
Acceptance of approximate computations takes the calculus of restrictions (CR) into uncharted territory.","PeriodicalId":135701,"journal":{"name":"2016 IEEE 15th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128756228","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
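Zadeh's drive-time example can be made concrete by representing the restriction R(X) as a possibility distribution over values of X. A triangular membership function is one common, simplified choice; the shape and 30-minute spread here are illustrative assumptions, not taken from the paper:

```python
def about(center, spread):
    """A fuzzy restriction R(X) as a triangular possibility distribution:
    fully possible at `center`, impossible beyond `center +/- spread`."""
    def possibility(x):
        return max(0.0, 1.0 - abs(x - center) / spread)
    return possibility

# R(X) = "usually it takes about 90 minutes to drive from Berkeley to SF Airport"
r = about(90, 30)
print(r(90), r(105), r(130))  # 1.0 0.5 0.0
```

Computation with restrictions then amounts to propagating such distributions through functions of X rather than computing with crisp numbers.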
{"title":"Quantum cognitive computation by CICT","authors":"R. Fiorini","doi":"10.1109/ICCI-CC.2016.7862085","DOIUrl":"https://doi.org/10.1109/ICCI-CC.2016.7862085","url":null,"abstract":"We show and discuss how computational information conservation theory (CICT) can help us to develop even competitive advanced quantum cognitive computational systems towards deep computational cognitive intelligence. CICT new awareness of a discrete HG (hyperbolic geometry) subspace (reciprocal space, RS) of coded heterogeneous hyperbolic structures, underlying the familiar Q Euclidean (direct space, DS) system surface representation can open the way to holographic information geometry (HIG) to recover lost coherence information in system description and to develop advanced quantum cognitive systems. This paper is a relevant contribution towards an effective and convenient “Science 2.0” universal computational framework to achieve deeper cognitive intelligence at your fingertips and beyond.","PeriodicalId":135701,"journal":{"name":"2016 IEEE 15th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC)","volume":"6 3","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132478153","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A feature selection framework based on supervised data clustering","authors":"Hongzhi Liu, Bin Fu, Zhengshen Jiang, Zhonghai Wu, D. Hsu","doi":"10.1109/ICCI-CC.2016.7862054","DOIUrl":"https://doi.org/10.1109/ICCI-CC.2016.7862054","url":null,"abstract":"Feature selection is an important step for data mining and machine learning to deal with the curse of dimensionality. In this paper, we propose a novel feature selection framework based on supervised data clustering. Instead of assuming there only exists low-order dependencies between features and the target variable, the proposed method directly estimates the high-dimensional mutual information between a candidate feature subset and the target variable through supervised data clustering. In addition, it can automatically determine the number of features to be selected instead of manually setting it in a prior. Experimental results show that the proposed method performs similar or better compared with state-of-the-art feature selection methods.","PeriodicalId":135701,"journal":{"name":"2016 IEEE 15th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC)","volume":"141 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132719135","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On cognitive foundations of big data science and engineering","authors":"Yingxu Wang","doi":"10.1109/ICCI-CC.2016.7862044","DOIUrl":"https://doi.org/10.1109/ICCI-CC.2016.7862044","url":null,"abstract":"Big data are one of the representative phenomena of the information era of human societies. A basic study on the cognitive foundations of big data science is presented with a coherent set of general principles and analytic methodologies for big data manipulations. It leads to a set of mathematical theories that rigorously describe the general patterns of big data across pervasive domains in sciences, engineering, and societies. A significant finding towards big data science is that big data systems in nature are a recursive n-dimensional typed hyperstructure (RNTHS). The fundamental topological property of big data system enables the inherited complexities and unprecedented challenges of big data to be formally dealt with as a set of denotational mathematical operations in big data engineering. The cognitive relationship and transformability between data, information, knowledge, and intelligence are formally revealed towards big data science.","PeriodicalId":135701,"journal":{"name":"2016 IEEE 15th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC)","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116344902","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}