Latest publications from the 2006 IEEE International Conference on Granular Computing

The intelligent control terminal with SCM based on constructive neural networks
2006 IEEE International Conference on Granular Computing Pub Date : 2006-05-10 DOI: 10.1109/GRC.2006.1635761
Chijian Zhang, H. Wang, Lin Zhang
{"title":"The intelligent control terminal with SCM based on constructive neural networks","authors":"Chijian Zhang, H. Wang, Lin Zhang","doi":"10.1109/GRC.2006.1635761","DOIUrl":"https://doi.org/10.1109/GRC.2006.1635761","url":null,"abstract":"The constructive neural networks(CNN) based on the FP(Forward Propagation) algorithms can be used to realize an intelligent machine with the SCM(single-chip microcomputer). High performance of error-toleration and error-correction can be found when it communicated with the others .The system has the ability of self-learning and expansion of knowledge on line, it can also run safely by setting the rejection schemes. Index Terms—constructive neural networks(CNN), SCM( single-chip microcomputer), intelligent terminal. I. INTRODUCTION he control objects in modern industry are mostly complex, nonlinear, gray and long-time-lag system, it is difficult to control and establish exact model. In order to get satisfactory control results, people try to construct real-time control system with neural networks, and a series of production (4,5,6) have been made. But, with the development of modern industry, people hope that the intelligent control terminal can recognize not only the existing control model, but also the newly process models. The control terminal can evolve through the circle of application, learning, and more advanced application. But the existing neural networks are difficult to learning and evolution on line. In this paper, the constructive neural networks to be used in the intelligent control terminal with the SCM and CNN can not only learn on line but also perform the error-toleration and error-correction. The system can be safely and effectively used in the filed of industrial control under severe condition. You can also see the following from this paper: (1) The CNN enable the control terminal with well self-learning function. It's learning complexity is only O(n), more less than other neural networks, it can be realized by SCM. (2) When the control terminal is used in pattern recognizing or communicating with the others, the CNN can automatically perform the error-toleration and the error-correction. CNN can perform top intelligent control.","PeriodicalId":400997,"journal":{"name":"2006 IEEE International Conference on Granular Computing","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-05-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125888307","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
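The abstract claims O(n) learning and built-in rejection but does not spell out the FP-based construction rule, so the following is only a minimal sketch of the constructive-learning idea: a classifier that adds a hidden unit (prototype) whenever the current network fails on a training sample, in a single pass over the data. The class name, the radius parameter and the prototype rule are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

class ConstructiveClassifier:
    """Toy constructive network: adds a prototype ("hidden unit") whenever the
    current units fail on a training sample. Illustrative only; the paper's
    FP-based construction rule is not specified in the abstract."""

    def __init__(self, radius=0.5):
        self.radius = radius          # coverage radius of each hidden unit
        self.centers = []             # hidden-unit centers
        self.labels = []              # class label attached to each unit

    def _predict_one(self, x):
        for c, y in zip(self.centers, self.labels):
            if np.linalg.norm(x - c) <= self.radius:
                return y
        return None                   # rejection: no unit covers x

    def fit(self, X, y):
        # Single pass over the data -> O(n) constructive learning.
        for xi, yi in zip(X, y):
            if self._predict_one(xi) != yi:
                self.centers.append(np.asarray(xi, dtype=float))
                self.labels.append(yi)
        return self

    def predict(self, X):
        return [self._predict_one(np.asarray(xi, dtype=float)) for xi in X]

# Tiny usage example
X = np.array([[0.0, 0.0], [0.1, 0.2], [3.0, 3.0], [3.2, 2.9]])
y = [0, 0, 1, 1]
clf = ConstructiveClassifier(radius=0.5).fit(X, y)
print(clf.predict([[0.05, 0.1], [3.1, 3.0], [10.0, 10.0]]))  # [0, 1, None]
```

The `None` output plays the role of a rejection scheme: inputs covered by no unit are refused rather than guessed.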
Reliable classification of childhood acute leukaemia from gene expression data using confidence machines
2006 IEEE International Conference on Granular Computing Pub Date : 2006-05-10 DOI: 10.1109/GRC.2006.1635774
T. Bellotti, Zhiyuan Luo, A. Gammerman
{"title":"Reliable classification of childhood acute leukaemia from gene expression data using confidence machines","authors":"T. Bellotti, Zhiyuan Luo, A. Gammerman","doi":"10.1109/GRC.2006.1635774","DOIUrl":"https://doi.org/10.1109/GRC.2006.1635774","url":null,"abstract":"The Confidence Machine is a recently developed algorithmic framework for making reliable decisions in the face of uncertainty. Control of predictive accuracy is achieved by allowing hedged predictions, with the possible sacrifice of precision. We use the Support Vector Machine learning algorithm to derive a decision rule for the classification of childhood acute leukaemia subtypes from a small training set of gene expression data. We then implement a Confidence Machine for the decision rule and test on an independent data set to demonstrate its error calibration properties. We show that the Confidence Machine can be used to derive reliable predictions, with control of the risk of error whilst maintaining the level of accuracy given by the Support Vector Machine, yielding useful and precise predictions of leukaemia subtypes. Predictions are reliable even in the context of training from small sample size.","PeriodicalId":400997,"journal":{"name":"2006 IEEE International Conference on Granular Computing","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-05-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126347784","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 16
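The Confidence Machine framework described here is conformal prediction: each candidate label receives a p-value from a nonconformity score, and all labels with a p-value above the significance level are output as a hedged prediction set. The sketch below is a split-conformal variant that uses scikit-learn's SVC decision value as the nonconformity score on a held-out calibration set; the exact score and the transductive details used in the paper may differ.

```python
import numpy as np
from sklearn.svm import SVC

def conformal_predict(X_train, y_train, X_cal, y_cal, x_new, epsilon=0.05):
    """Hedged prediction: return the set of labels whose p-value exceeds epsilon.
    Nonconformity score = negative signed distance to the correct side of the
    SVM decision boundary (an illustrative choice, not necessarily the paper's)."""
    clf = SVC(kernel="linear").fit(X_train, y_train)

    def score(x, y):
        margin = clf.decision_function([x])[0]     # > 0 favours class 1
        return -margin if y == 1 else margin       # large value = nonconforming

    cal_scores = np.array([score(x, y) for x, y in zip(X_cal, y_cal)])

    prediction_set = []
    for label in (0, 1):
        a_new = score(x_new, label)
        # p-value: fraction of calibration scores at least as extreme
        p = (np.sum(cal_scores >= a_new) + 1) / (len(cal_scores) + 1)
        if p > epsilon:
            prediction_set.append(label)
    return prediction_set

# Usage on synthetic two-class data
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (40, 5)), rng.normal(1, 1, (40, 5))])
y = np.array([0] * 40 + [1] * 40)
idx = rng.permutation(80)
train, cal = idx[:50], idx[50:]
print(conformal_predict(X[train], y[train], X[cal], y[cal], X[0], epsilon=0.05))
```

A singleton prediction set is a precise, confident prediction; a multi-label or empty set is the "possible sacrifice of precision" the abstract mentions, while the long-run error rate stays calibrated at roughly epsilon.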
The STP model for solving imprecise problems
2006 IEEE International Conference on Granular Computing Pub Date : 2006-05-10 DOI: 10.1109/GRC.2006.1635894
Jingtao Yao, Wei-Ning Liu
{"title":"The STP model for solving imprecise problems","authors":"Jingtao Yao, Wei-Ning Liu","doi":"10.1109/GRC.2006.1635894","DOIUrl":"https://doi.org/10.1109/GRC.2006.1635894","url":null,"abstract":"Researchers have been attracted for years to studies on solving imprecise problems. The first step for solving an imprecise problem is to clarify the problem itself. However, in many of cases, the impreciseness of a problem is due to its own nature and often leaves them unsolvable. There are at least two reasons for the impreciseness and unclearness of a problem. The first reason is that there may not be a suitable language to present the problem in an understandable and clear way. The second reason is that the problem itself is not well-definable. This is quite similar to a research question that a researcher is trying to specify. A problem may only be fully understood and specified after all solutions are available. In this paper, we introduce a problem solving approach by searching possible solutions to an imprecise problem. This is an approach to specify and solve an imprecise problem by matching the problem with its solutions. We present a Solution-To-Problem (STP) model as a new approach for imprecise problem solving. Basic notions, measures and algorithms for such a problem solving process are studied.","PeriodicalId":400997,"journal":{"name":"2006 IEEE International Conference on Granular Computing","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-05-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122466514","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 11
Object oriented modeling of protein translation system
2006 IEEE International Conference on Granular Computing Pub Date : 2006-05-10 DOI: 10.1109/GRC.2006.1635813
Zhong Huang, Xiaohua Hu
{"title":"Object oriented modeling of protein translation system","authors":"Zhong Huang, Xiaohua Hu","doi":"10.1109/GRC.2006.1635813","DOIUrl":"https://doi.org/10.1109/GRC.2006.1635813","url":null,"abstract":"Object oriented analysis and design (OOAD) has been utilized in modeling of large complex biological system in recent years. Despite its increasing popularity in software engineering OOAD has very limited application in system biology study mainly due to our incomplete understanding of the complicated molecular and cellular processes. Here we present a UML model of cell free protein translation system constructed by OOAD methodology. The major steps of in vitro protein translation process were captured and illustrated in UML diagrams. We demonstrated that the OOAD is a useful tool to build the UML model of a well defined and studied in vitro biological system for system biology study.","PeriodicalId":400997,"journal":{"name":"2006 IEEE International Conference on Granular Computing","volume":"233 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-05-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115015933","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
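The UML model itself is not reproduced in the abstract; as a rough illustration of the object-oriented view, the toy sketch below captures two actors of cell-free translation as classes. Class names, attributes and the three-codon table are assumptions for the example, not the paper's model.

```python
# Minimal object-oriented sketch of in vitro translation actors; class names,
# attributes and the tiny codon table are illustrative, not the paper's UML model.
CODON_TABLE = {"AUG": "M", "UUU": "F", "GGC": "G", "UAA": None}  # None = stop

class MRNA:
    def __init__(self, sequence):
        self.sequence = sequence

    def codons(self):
        seq = self.sequence
        return [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]

class Ribosome:
    def translate(self, mrna):
        peptide = []
        for codon in mrna.codons():
            residue = CODON_TABLE.get(codon)
            if residue is None:          # stop codon or codon outside the toy table
                break
            peptide.append(residue)
        return "".join(peptide)

print(Ribosome().translate(MRNA("AUGUUUGGCUAA")))  # -> "MFG"
```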
A Hidden Markov Model Approach to Model Protein Sequence and Structural Information: Identification of Helix-Turn-Helix DNA-Binding Motif
2006 IEEE International Conference on Granular Computing Pub Date : 2006-05-10 DOI: 10.1109/GRC.2006.1635821
Changhui Yan
{"title":"A Hidden Markov Model Approach to Model Protein Sequence and Structural Information: Identification of Helix-Turn-Helix DNA-Binding Motif","authors":"Changhui Yan","doi":"10.1109/GRC.2006.1635821","DOIUrl":"https://doi.org/10.1109/GRC.2006.1635821","url":null,"abstract":"This study presents a Hidden Markov Model (HMM) approach to model protein sequence and structure- derived information. The HMM emits both amino acid residue identity and the solvent accessibility of residues. The solvent accessibility of each residue is discretized into three states: buried (B), mediate (M) and exposed (E). A set of standard helix-turn- helix (HTH) motifs from a set of heterogeneous DNA-binding proteins was used to develop the HMM model for HTH motifs. The resulting HMM can identify HTH with a higher confidence and a higher sensitivity than the HMM that models only sequence information. It can also identify HTH motifs with structures different from the standard HTH motif.","PeriodicalId":400997,"journal":{"name":"2006 IEEE International Conference on Granular Computing","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-05-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129522154","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
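One straightforward way to let an HMM emit both residue identity and a three-state solvent accessibility, as the abstract describes, is to emit pairs from the product alphabet of 20 residues and {B, M, E}. The sketch below scores a sequence of (residue, accessibility) pairs with a scaled forward algorithm; the two-state topology and the random emission and transition values are placeholders, not the paper's trained HTH model.

```python
import numpy as np

# Toy two-state motif HMM (the paper's HTH model has many more states).
STATES = ["helix", "turn"]
RESIDUES = list("ACDEFGHIKLMNPQRSTVWY")
ACC = ["B", "M", "E"]                                 # buried / mediate / exposed
SYMBOLS = [(r, a) for r in RESIDUES for a in ACC]     # joint emission alphabet (60 symbols)

rng = np.random.default_rng(0)
start = np.array([0.6, 0.4])
trans = np.array([[0.9, 0.1],
                  [0.3, 0.7]])
# Placeholder emission matrix over the joint alphabet, one row per state.
emit = rng.dirichlet(np.ones(len(SYMBOLS)), size=len(STATES))

def forward_loglik(observations):
    """Log-likelihood of a sequence of (residue, accessibility) pairs
    under the toy HMM, via the scaled forward algorithm."""
    idx = [SYMBOLS.index(o) for o in observations]
    alpha = start * emit[:, idx[0]]
    log_scale = 0.0
    for t in idx[1:]:
        alpha = (alpha @ trans) * emit[:, t]
        s = alpha.sum()
        log_scale += np.log(s)
        alpha /= s
    return log_scale + np.log(alpha.sum())

print(forward_loglik([("A", "B"), ("L", "B"), ("G", "E")]))
```

In practice the HTH model would be trained on aligned motif instances and a candidate window would be accepted when its log-odds against a background model exceeds a threshold.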
Rough set approximations vs. measurable spaces
2006 IEEE International Conference on Granular Computing Pub Date : 2006-05-10 DOI: 10.1109/GRC.2006.1635807
Weizhi Wu, Wenxiu Zhang
{"title":"Rough set approximations VS. measurable spaces","authors":"Weizhi Wu, Wenxiu Zhang","doi":"10.1109/GRC.2006.1635807","DOIUrl":"https://doi.org/10.1109/GRC.2006.1635807","url":null,"abstract":"In this paper relationships between rough set ap- proximations and measurable spaces are examined. It is proved that the family of all definable sets in a serial crisp (fuzzy, respectively) rough set algebra forms a crisp (fuzzy respectively) algebra. For any crisp measurable space there must exist a crisp rough set algebra such that the family of all definable sets is the given crisp algebra. Also, for a fuzzy algebra generated by a crisp algebra there must exist a fuzzy rough set algebra such that the family of all definable sets is just the given fuzzy algebra.","PeriodicalId":400997,"journal":{"name":"2006 IEEE International Conference on Granular Computing","volume":"71 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-05-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129677031","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 22
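For the crisp case the objects involved are easy to make concrete: given a partition of the universe into equivalence classes, a set is definable exactly when its lower and upper approximations coincide with it, and the definable sets are precisely the unions of classes, a family closed under complement and union. A small sketch follows; the universe and partition are chosen arbitrarily for illustration.

```python
from itertools import chain, combinations

# Universe and a partition into equivalence classes (the granules of knowledge).
U = {1, 2, 3, 4, 5, 6}
classes = [frozenset({1, 2}), frozenset({3, 4, 5}), frozenset({6})]

def lower(X):
    """Lower approximation: union of classes entirely contained in X."""
    return set().union(*[c for c in classes if c <= X])

def upper(X):
    """Upper approximation: union of classes that intersect X."""
    return set().union(*[c for c in classes if c & X])

def definable_sets():
    """All unions of equivalence classes; X is definable iff lower(X) == upper(X) == X."""
    fams = chain.from_iterable(combinations(classes, r) for r in range(len(classes) + 1))
    return [set().union(*f) for f in fams]

X = {1, 2, 3}
print(lower(X))               # {1, 2}
print(upper(X))               # {1, 2, 3, 4, 5}
print(len(definable_sets()))  # 2**3 = 8 definable sets, closed under union and complement
```

With three classes there are 2^3 = 8 definable sets; this finite family is an instance of the (crisp) algebra of definable sets the theorem refers to.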
Contrast enhancement for image with non-linear gray transform and wavelet neural network
2006 IEEE International Conference on Granular Computing Pub Date : 2006-05-10 DOI: 10.1109/GRC.2006.1635892
Changjiang Zhang, Xiaodong Wang, Haoran Zhang, G. Lv
{"title":"Contrast enhancement for image with non-linear gray transform and wavelet neural network","authors":"Changjiang Zhang, Xiaodong Wang, Haoran Zhang, G. Lv","doi":"10.1109/GRC.2006.1635892","DOIUrl":"https://doi.org/10.1109/GRC.2006.1635892","url":null,"abstract":"A new contrast enhancement algorithm for image is proposed with non-linear gray transform and wavelet neural network (WNN). In-complete Beta transform (IBT) is used to obtain non-linear gray transform curve. Transform parameters are determined by simulated annealing algorithm (SA) to obtain optimal s space, a new criterion is proposed. Contrast type for original image is determined employing the new criterion. Parameters space is given respectively according to different contrast types, which shrinks parameters space greatly. Thus searching direction and selegray transform parameters. In order to avoid the expensive time for traditional contrast enhancement algorithms, which search optimal gray transform parameters in the whole parameterction of initial values of SA is guided by the new parameter space. In order to calculate IBT in the whole image, a kind of WNN is proposed to approximate the IBT. Experimental results show that the new algorithm is able to adaptively enhance the contrast for image well. the algorithm was large. Existing many enhancement algorithms' intelligence and adaptability are worse and much artificial interference is required. To solve above problems, a new algorithm employing IBT, SA and WNN is proposed. To improve optimization speed and intelligence of algorithm, a new criterion is proposed based on gray level histogram. Contrast type for original image is determined employing the new criterion. Contrast for original image is classified into seven types: particular dark (PD), medium dark (MD), medium dark slightly (MDS), medium bright slightly (MBS), medium bright (MB), particular bright (PB) and good gray level distribution (GGLD). IBT operator transforms original image to a new space. A certain objective function is used to optimize non-linear transform parameters. SA, which was given by William, is used to determine the optimal non-linear transform parameters. In order to reduce the computation burden for calculating IBT, a new kind of WNN is proposed to approximate the IBT in the whole image.","PeriodicalId":400997,"journal":{"name":"2006 IEEE International Conference on Granular Computing","volume":"145 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-05-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131648055","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
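The non-linear gray transform at the core of the method is the incomplete Beta transform; assuming the usual normalization, it corresponds to the regularized incomplete beta function that SciPy exposes as `betainc`. The sketch below applies it as a gray-level mapping with fixed shape parameters; in the paper these parameters are chosen by simulated annealing per contrast type, and the IBT is approximated by a wavelet neural network to cut computation, neither of which is shown here.

```python
import numpy as np
from scipy.special import betainc

def ibt_enhance(image, a=2.0, b=2.0):
    """Non-linear gray transform via the (regularized) incomplete Beta function.
    `a` and `b` shape the transform curve; in the paper they are optimized with
    simulated annealing per contrast type, here they are fixed for illustration."""
    img = np.asarray(image, dtype=float)
    lo, hi = img.min(), img.max()
    u = (img - lo) / (hi - lo + 1e-12)        # normalize gray levels to [0, 1]
    v = betainc(a, b, u)                      # S-shaped mapping for a, b > 1
    return (v * 255.0).astype(np.uint8)       # back to 8-bit gray levels

# Usage on a synthetic low-contrast image
img = np.clip(np.random.default_rng(1).normal(120, 10, size=(64, 64)), 0, 255)
enhanced = ibt_enhance(img, a=2.0, b=2.0)
print(img.std(), enhanced.std())              # spread of gray levels increases
```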
Granular computing based text classification
2006 IEEE International Conference on Granular Computing Pub Date : 2006-05-10 DOI: 10.1109/GRC.2006.1635803
Taorong Qiu, Xiaoqing Chen, Qing Liu, Houkuan Huang
{"title":"Granular computing based text classification","authors":"Taorong Qiu, Xiaoqing Chen, Qing Liu, Houkuan Huang","doi":"10.1109/GRC.2006.1635803","DOIUrl":"https://doi.org/10.1109/GRC.2006.1635803","url":null,"abstract":"In this paper, granular computing was introduced into text classification. First, Classification idea based on granular computing was explained, and a framework of text classification based on granular computing was presented. Secondly, some concepts such as information granule, feature granule, decision granule, association degree, core feature, effective decision granule and similarity of granules and so on, were defined. Thirdly, based on granular computing, a classifying model was built and an automatic classification algorithm was proposed. Finally, the proposed model and algorithm were illustrated by a real world example. It was shown that the proposed algorithms are useful and effective.","PeriodicalId":400997,"journal":{"name":"2006 IEEE International Conference on Granular Computing","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-05-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131228501","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 6
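The abstract names feature granules and a similarity of granules without giving formulas. One plausible reading, used purely for illustration below, treats a granule as the set of terms of a text and measures similarity by set overlap (Jaccard); the paper's actual definitions of association degree and effective decision granules may differ.

```python
def make_granule(text, vocabulary=None):
    """Feature granule: the set of (optionally vocabulary-filtered) terms of a text."""
    terms = set(text.lower().split())
    return terms & vocabulary if vocabulary else terms

def granule_similarity(g1, g2):
    """Set-overlap (Jaccard) similarity between two granules; one plausible
    reading of the paper's 'similarity of granules', not its exact measure."""
    return len(g1 & g2) / len(g1 | g2) if (g1 | g2) else 0.0

# Class granules built from labelled training texts (toy example)
class_granules = {
    "sports": make_granule("match team goal score season"),
    "finance": make_granule("market stock price trade season"),
}

def classify(text):
    """Assign the class whose granule is most similar to the text's granule."""
    g = make_granule(text)
    return max(class_granules, key=lambda c: granule_similarity(g, class_granules[c]))

print(classify("the team scored a late goal"))   # -> sports
```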
Genotype susceptibility and integrated risk factors for complex diseases
2006 IEEE International Conference on Granular Computing Pub Date : 2006-05-10 DOI: 10.1109/GRC.2006.1635910
W. Mao, D. Brinza, N. Hundewale, Stefan Gremalschi, A. Zelikovsky
{"title":"Genotype susceptibility and integrated risk factors for complex diseases","authors":"W. Mao, D. Brinza, N. Hundewale, Stefan Gremalschi, A. Zelikovsky","doi":"10.1109/GRC.2006.1635910","DOIUrl":"https://doi.org/10.1109/GRC.2006.1635910","url":null,"abstract":"Recent improvements in the accessibility of high- throughput genotyping have brought a great deal of attention to disease association and susceptibility studies. This paper explores possibility of applying discrete optimization methods to predict the genotype susceptibility for complex disease. The proposed combinatorial methods have been applied to publicly available genotype data on Crohn's disease and autoimmune disorders for predicting susceptibility to these diseases. The result of predicted status can be also viewed as an integrated risk factor. The quality of susceptibility prediction algorithm has been assessed using leave-one-out and leave-many-out tests and shown to be statistically significant based on randomization tests.The best prediction rate achieved by the prediction algorithms is 69.5% for Crohn's disease and 63.9% for autoimmune disorder. The risk rate of the corresponding integrated risk factor is 2.23 for Crohn's disease and 1.73 for autoimmune disorder.","PeriodicalId":400997,"journal":{"name":"2006 IEEE International Conference on Granular Computing","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-05-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134095190","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 12
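The reported prediction rates come from leave-one-out and leave-many-out tests. The sketch below shows a leave-one-out evaluation over a genotype matrix; the data are synthetic and a k-nearest-neighbour classifier stands in for the paper's combinatorial optimization predictor.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.neighbors import KNeighborsClassifier

# Synthetic genotype matrix: rows = individuals, columns = SNPs coded 0/1/2.
rng = np.random.default_rng(42)
X = rng.integers(0, 3, size=(60, 100))
y = rng.integers(0, 2, size=60)            # 1 = case, 0 = control (synthetic labels)

def loo_prediction_rate(X, y, make_model):
    """Leave-one-out prediction rate, the evaluation protocol named in the paper
    (its own predictor is a combinatorial optimization method; a k-NN classifier
    stands in for it here)."""
    correct = 0
    for train_idx, test_idx in LeaveOneOut().split(X):
        model = make_model().fit(X[train_idx], y[train_idx])
        correct += int(model.predict(X[test_idx])[0] == y[test_idx][0])
    return correct / len(y)

print(loo_prediction_rate(X, y, lambda: KNeighborsClassifier(n_neighbors=3)))
```

On real case/control genotypes the same loop would report the prediction rate quoted in the abstract; on these random labels it hovers around chance, as expected.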
Mining Frequent Patterns based on Compressed FP-tree without Conditional FP-tree Generation
2006 IEEE International Conference on Granular Computing Pub Date : 2006-05-10 DOI: 10.1109/GRC.2006.1635844
Fei Chen, L. Shang, Ming Li, Zhaoqian Chen, Shifu Chen
{"title":"Mining Frequent Patterns based on Compressed FP-tree without Conditional FP-tree Generation","authors":"Fei Chen, L. Shang, Ming Li, Zhaoqian Chen, Shifu Chen","doi":"10.1109/GRC.2006.1635844","DOIUrl":"https://doi.org/10.1109/GRC.2006.1635844","url":null,"abstract":"Frequent patterns mining are widely used in many practical data mining applications. Therefore, current research focuses on developing frequent patterns mining algorithms of high performances and FP-growth is proved as an important and efficient frequent patterns mining algorithm. In this paper, a new algorithm Temporary Root growth based on Compressed FP-tree, i.e. TR-CFP, is proposed. This algorithm employs a temporary root constructing thought during mining on a CFP- tree without conditional FP-tree generation. TR-CFP saves large memory space occupied by FP-tree and the cost of constructing many conditional FP-trees. Experiments show that the time and space for TR-CFP have reduced significantly compared to FP- growth mining based on FP-tree. Furthermore, TR-CFP has a particular character other than Apriori and FP-growth, that it can specially mine frequent patterns of the designated length dynamically and efficiently. In further experiments, even at a low support threshold, when the whole mining process takes a long time, special mining of TR-CFP is still very quick. This particular character may be very useful in applications.","PeriodicalId":400997,"journal":{"name":"2006 IEEE International Conference on Granular Computing","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-05-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132189900","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
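TR-CFP mines a compressed FP-tree without generating conditional FP-trees; the abstract does not detail the temporary-root procedure, so the sketch below only shows the shared starting point: building an FP-tree by inserting support-ordered transactions along common prefixes. The node layout and header table are the standard FP-growth ones, not the paper's compressed variant.

```python
from collections import Counter

class FPNode:
    def __init__(self, item, parent=None):
        self.item = item
        self.count = 0
        self.parent = parent
        self.children = {}          # item -> FPNode

def build_fp_tree(transactions, min_support):
    """Standard FP-tree construction (shared-prefix compression). TR-CFP's
    temporary-root mining on the compressed tree is not reproduced here."""
    support = Counter(item for t in transactions for item in set(t))
    frequent = {i for i, c in support.items() if c >= min_support}
    root, header = FPNode(None), {}
    for t in transactions:
        # keep frequent items, ordered by descending global support
        items = sorted((i for i in set(t) if i in frequent),
                       key=lambda i: (-support[i], i))
        node = root
        for item in items:
            child = node.children.get(item)
            if child is None:
                child = FPNode(item, parent=node)
                node.children[item] = child
                header.setdefault(item, []).append(child)
            child.count += 1
            node = child
    return root, header

transactions = [["a", "b", "c"], ["a", "b"], ["a", "c"], ["b", "c"], ["a"]]
root, header = build_fp_tree(transactions, min_support=2)
print({item: sum(n.count for n in nodes) for item, nodes in header.items()})
# per-item counts summed over the tree, e.g. {'a': 4, 'b': 3, 'c': 3}
```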