{"title":"The Axiomatic Semantics of PDFD","authors":"Xiaolei Gao, Huai-kou Miao","doi":"10.1109/FCST.2008.18","DOIUrl":"https://doi.org/10.1109/FCST.2008.18","url":null,"abstract":"The integration of formal, structured and object-oriented methodology is the focus in the field of software development methodology. SOZL (structured methodology + object-oriented methodology + Z language) is a language that attempts to integrate structured method, object-oriented method and formal method. The core of this language is an improved data flow diagram, known as predicate data flow diagram, which is designed by adding input set, output set and corresponding predicate constraints to the components of the traditional data flow diagram. In order to eliminate the ambiguity of predicate data flow diagrams and their associated textual specifications, a formalization of the syntax and semantics of predicate data flow diagrams are necessary. In this paper we use Z notation to define an abstract syntax and the related structural constraints for the predicate data flow diagram notation, and provide it with an axiomatic semantics based on the concept of data availability. Necessary proofs are given to establish important properties on the axiomatic semantics.","PeriodicalId":206207,"journal":{"name":"2008 Japan-China Joint Workshop on Frontier of Computer Science and Technology","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116868258","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A New Technique of Modeling Acoustic Echo and its Implementation Using FPGA","authors":"A. Nassar, Ashraf Mohamed Ali","doi":"10.1109/FCST.2008.37","DOIUrl":"https://doi.org/10.1109/FCST.2008.37","url":null,"abstract":"We propose an integrated acoustic echo cancellation solution based on using multiple of small adaptive filters rather than using one long adaptive filter. A new approach is proposed using the concept of decomposing the long adaptive filter into low order multiple sub-filters in which the error signals are independent on each other. The independency of the error signals exhibits the parallelism technique. A novel class of efficient and robust adaptive algorithms. It exhibits fast convergence, superior tracking capabilities of the signal statistics. The proposed algorithm is also compared with multiple sub-filters approach used for acoustic echo cancellation as the technique of decomposition of error. It is generally found that adaptive LMS algorithm with lower order has faster convergence. In most of the cases, the eigen-value spread of the auto correlation decreases as the order of the filter decreases except for white input. We discuss also the complexity of our proposed design to show how our design has a good performance. Modern (FPGAs) include the resources needed to design efficient filtering structures. The modeling of the acoustic echo path was represented by using three sub-adaptive filters of order=10 with fixed step size =0.05/3 for each adaptive filter. We use sinusoidal input signal with additive white Gaussian noise (AWGN) which has different signal-to-noise ratio (SNRs) to examine our approach. The steady state error of our proposed technique is still high as the technique of decomposition of error. This steady state error is small with respect to using one long adaptive filter and this will be obvious in our simulation results. This paper addresses also the problems of blind source separation (BSS). 
In blind source separation, signals from multiple sources arrive simultaneously at a sensor array, so that each sensor output contains a mixture of source signals. Sets of sensor outputs are processed to recover the source signals from the mixed observations. The term blind refers to the fact that specific source signal values and accurate parameter values of a mixing model are not known a priori. Application domains for the material in this paper include communications, biomedical, and sensor array signal processing. Simulations are often needed when the performance of new methods is evaluated. If the method is designed to be blind or robust, simulation studies must cover the whole range of potential random input. It follows that there is a need for advanced tools of data generation. The purpose of this thesis is to introduce a technique for the generation of correlated multivariate random data with non-Gaussian marginal distributions for blind source separation technique.The output random variables are obtained as linear combinations of independent components. The covariance matrix and the first five moments of the output variables may be freely chosen. Moreover, the output variables may be filtered in order to add autocorrelation. Th","PeriodicalId":206207,"journal":{"name":"2008 Japan-China Joint Workshop on Frontier of Computer Science and Technology","volume":"68 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117018181","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Segment Extraction Algorithm Based on Polygonal Approximation for On-Line Chinese Character Recognition","authors":"Xinqiao Lu, Xiaojuan Liu, Guoqiang Xiao, E. Song, Ping Li, Qiaoling Luo","doi":"10.1109/FCST.2008.23","DOIUrl":"https://doi.org/10.1109/FCST.2008.23","url":null,"abstract":"In this paper, a segment extraction algorithm based on polygonal approximation for on-line Chinese characters recognition (OLCCR) is presented. With this method, the point with the smallest interior angle is detected and the whole stroke is split into two adjacent curves by this point, which is called as a cut-off point or an inflexion. To each of the two curves, the same step is performed to detect the cut-off points respectively. The same operations are performed iteratively until the smallest interior angle in all the curves is larger than an appointed threshold value. All the cut-off points and the start-end points compose the stroke and every pair of adjacent points constructs a segment. Experiments proved that this method has the advantages of less computing complexity and better approximating effect then other methods. An OLCCR system with this segment extraction algorithm has achieved the speed of 20/s and the recognition rate of 97.2%.","PeriodicalId":206207,"journal":{"name":"2008 Japan-China Joint Workshop on Frontier of Computer Science and Technology","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122716842","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Privacy-Preserving Practical Convex Hulls Protocol","authors":"Youwen Zhu, Liusheng Huang, Wei Yang, Zhili Chen, Lingjun Li, Zhenshan Yu, Yonglong Luo","doi":"10.1109/FCST.2008.10","DOIUrl":"https://doi.org/10.1109/FCST.2008.10","url":null,"abstract":"Secure multi-party computation has been a hot research topic of cryptograhy for about two decades, and the convex hulls problem is a special case of it. However, the precise convex hulls will certainly expose all vertexes and even bring about unfairness. Therefore the practical approximate convex hulls are in need. In this paper, we summarize and discuss the convex hulls problem, and then we present a more effective new protocol to compute the approximate convex hulls. Furthermore, we analyze the security, communication complexity and efficiency of the protocol, and compare the new scheme with other privacy-preserving convex hulls protocols through simulated experiments. It shows that our privacy-preserving approximate convex hulls protocol is more effective than the previous privacy-preserving ones, and the new protocol is practical enough in many situations. Perfectly keeping privacy preserving and eliminating unfairness are the great advantages of our scheme.","PeriodicalId":206207,"journal":{"name":"2008 Japan-China Joint Workshop on Frontier of Computer Science and Technology","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129282209","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Design of General User Interface for Automatic Web Service Composition","authors":"Haruhiko Takada, Incheon Paik","doi":"10.1109/FCST.2008.20","DOIUrl":"https://doi.org/10.1109/FCST.2008.20","url":null,"abstract":"Automatic Web service composition (AWSC) was proposed to compose new services. Paik et al. (2007) proposed logical and physical Web service composition and ontologies for this. A useful composer needs a Web interface for general users. So a general user interface is designed and implemented in this thesis. This user interface generates some HTML form based on these ontology and additional HTN ontology.","PeriodicalId":206207,"journal":{"name":"2008 Japan-China Joint Workshop on Frontier of Computer Science and Technology","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114257993","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"VNIX: Managing Virtual Machines on Clusters","authors":"Xuanhua Shi, Haoyu Tan, Song Wu, Hai Jin","doi":"10.1109/FCST.2008.7","DOIUrl":"https://doi.org/10.1109/FCST.2008.7","url":null,"abstract":"With the development of virtualization technology, it¿s desirable to deploy virtual machines to high performance clusters used for data centers. VNIX, developed in Services Computing Technology and System lab, tries to help cluster administrators to manage a large number of virtual machines (VMs) distributed on clusters. To reduce the complexity of virtualization management, VNIX provides a whole-set of tools for monitoring, deploying, controlling, and configuring virtual machines on clusters. In addition to those basic management functions, VNIX also offers a number of specialized tools for clustering VMs. Due to the complex dynamic environment in clusters, it¿s challenging to design such tools. In this paper, we present the design of VNIX, and we describe several use cases of managing VMs in clusters with VNIX. Such use cases illustrate various ways of using VNIX to simplify the management work and to improve resource utilization.","PeriodicalId":206207,"journal":{"name":"2008 Japan-China Joint Workshop on Frontier of Computer Science and Technology","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128288059","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evolutionary Testing of Unstructured Programs Using a Testability Transformation Approach","authors":"Sheng Jiang, Yansheng Lu","doi":"10.1109/FCST.2008.21","DOIUrl":"https://doi.org/10.1109/FCST.2008.21","url":null,"abstract":"Evolutionary testing is an effective technique for automatically generating good quality test data. However, under the Node-Orient criterion, the technique is hindered by the presence of unstructured control flow within loops, this is because the control dependence is effectively ignored by the fitness function. In this paper a method of testability transformation is proposed in order to circumvent the problem, the approach is a source-to-source transformation that aims to improve the performance of evolutionary testing for unstructured programs. An experimental study is then presented, which shows the power of the approach, comparing evolutionary search with transformed and untransformed versions of two programs, the results show that our new fitness calculation rule could effectively guide evolutionary search to successsfully find the required test data at low cost.","PeriodicalId":206207,"journal":{"name":"2008 Japan-China Joint Workshop on Frontier of Computer Science and Technology","volume":"49 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133729934","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Study and Optimize the Process of Batch Small Files Replication","authors":"Liang Xiao, Q. Cao, C. Xie, Chuanwen Wu","doi":"10.1109/FCST.2008.32","DOIUrl":"https://doi.org/10.1109/FCST.2008.32","url":null,"abstract":"I/O performance is always the traditional criterion for the evaluation of storage system. Many researches have been being carried on how to improve the storage system performance, mainly focusing on the storage architecture and I/O optimization for the storage devices. In many application systems, the phenomenon of replicating batch small files between two locations widely exists and always represents poor performance in systems. This paper analyzes and optimizes replication process for batch small files in Linux file system. In local case, six algorithms are achieved by using parallel, consecutive and aggregating polices in different stages of the whole process. In network case, achieve and compress strategies are also introduced and compared with aggregating algorithm. Moreover, the average latency of basic operations in each stage of file I/O can be estimated accurately, which is helpful for future research of file system. 
The experiment shows that the algorithm of consecutive reading source files and parallel writing target files have the best performance in local replication, and aggregating algorithm also do in network replication.","PeriodicalId":206207,"journal":{"name":"2008 Japan-China Joint Workshop on Frontier of Computer Science and Technology","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132217790","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Novel Hybrid system for Large-Scale Chinese Text Classification Problem","authors":"Zhong Gao, Guanming Lu, Daquan Gu","doi":"10.1109/FCST.2008.29","DOIUrl":"https://doi.org/10.1109/FCST.2008.29","url":null,"abstract":"Most of the Chinese text classification systems are all based on the technology of bag of words (BW) which is a valid probability tool for text representation and can provide a better semantic architecture. But the weakness in classification accuracy is still unconquerable. Support vector machine (SVM) has become a popular classification tool and can be applied in the scheme, but the main disadvantages of SVM algorithms are their large memory requirement and computation time to deal with very large datasets. In this paper, we propose a hybrid system based on BW and a novel cascade SVM with feedback that can be splitting the problem into smaller subsets and training a network to assign samples of different subsets. The proposed parallel training algorithm on large-scale classification problems where multiple SVM classifiers are applied speeds up the process of training SVM and increase the classification accuracy.","PeriodicalId":206207,"journal":{"name":"2008 Japan-China Joint Workshop on Frontier of Computer Science and Technology","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132502919","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Energy Efficient Weight-Clustering Algorithm in Wireless Sensor Networks","authors":"Lu Cheng, D. Qian, Weiguo Wu","doi":"10.1109/FCST.2008.24","DOIUrl":"https://doi.org/10.1109/FCST.2008.24","url":null,"abstract":"In recent years, the wireless sensor networks (WSN) attracts increasing attention due to its bright application prospect in both military and civil fields. However, for the WSN is extremely energy limited, the traditional network routing protocols are not suitable to it. Energy conservation becomes a crucial problem in WSN routing protocol. Cluster-based routing protocols such as LEACH conserve energy by forming clusters which only cluster heads need to consume extra energy to perform data aggregation and transmit it to base station. Unfortunately, cluster formation not only dissipates lots of energy but also increases overhead. We propose an energy-efficient, weighted clustering algorithm which improves the cluster formation process of LEACH by taking residual energy, mutual position, workload balance and MAC functioning in to consideration. The algorithm is flexible and coefficients can be adjusted according to different networks. The simulation experiments demonstrate the proposed algorithm in this paper is better in performance than LEACH.","PeriodicalId":206207,"journal":{"name":"2008 Japan-China Joint Workshop on Frontier of Computer Science and Technology","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123709759","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}