{"title":"Detection of Microcalcifications in Mammograms Using Support Vector Machine","authors":"M. Sharkas, Mohamed Al-Sharkawy, D. Ragab","doi":"10.1109/EMS.2011.23","DOIUrl":"https://doi.org/10.1109/EMS.2011.23","url":null,"abstract":"For years cancer has been one of the biggest threats to human life, it is expected to become the leading cause of death over the next few decades. Early detection of breast cancer can play an important role in reducing the associated morbidity and mortality rates. Clusters of micro calcifications (MC) in the mammograms are an important early sign of breast cancer. Mammography is currently the most sensitive method to detect early breast cancer. Manual readings of mammograms may result in misdiagnosis due to human errors caused by visual fatigue. Computer aided detection systems (CAD) serve as a second opinion for radiologists. A new CAD system for the detection of MCs in mammograms is proposed. The discrete wavelet transforms (DWT), the contour let transform, and the principal component analysis (PCA) are used for feature extraction, while the support vector machine (SVM) is used for classification. The best classification rate was achieved using the DWT features. The system classifies normal and tumor tissues in addition to benign and malignant tumors. The classification rate was 100%.","PeriodicalId":131364,"journal":{"name":"2011 UKSim 5th European Symposium on Computer Modeling and Simulation","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126096519","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A New Routing Algorithm for Irregular Mesh NoCs without Virtual Channel","authors":"Mehdi Mohtashamzadeh, Ladan Momeni, Arshin Rezazadeh, S. Baghbani, Asma Mahdipoor","doi":"10.1109/EMS.2011.53","DOIUrl":"https://doi.org/10.1109/EMS.2011.53","url":null,"abstract":"Network-on-Chips (NoCs) usually use regular mesh-based topologies. Regular mesh topologies are not always efficient because of power and area constraints which should be considered in designing system-on-chips. To overcome these problems, irregular mesh NoCs are used for which the design of routing algorithms is an important issue. This paper presents a novel routing algorithm for irregular mesh-based NoCs called Anomalous Routing for Mesh (ARM). In contrast to other routing algorithms, this algorithm can be implemented on any arbitrary irregular mesh NoC and can tolerate solid irregular areas without using virtual channels. Furthermore, the proposed scheme misroutes messages both in clockwise and counter-clockwise directions to reduce channel contention on an oversized node (ON). The main idea of this algorithm is borrowed from odd-even turn model and FT-Cube algorithm. Moreover, proposed algorithm is deadlock-free and live lock-free for non-overlapping irregular areas in mesh NoC interconnection network.","PeriodicalId":131364,"journal":{"name":"2011 UKSim 5th European Symposium on Computer Modeling and Simulation","volume":"47 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127745888","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Optimizing Cloud MapReduce for Processing Stream Data Using Pipelining","authors":"Rutvik Karve, Devendra Dahiphale, Amit Chhajer","doi":"10.1109/EMS.2011.76","DOIUrl":"https://doi.org/10.1109/EMS.2011.76","url":null,"abstract":"Cloud MapReduce (CMR) is a framework for processing large data sets of batch data in cloud. The Map and Reduce phases run sequentially, one after another. This leads to: 1. Compulsory batch processing 2. No parallelization of the map and reduce phases 3. Increased delays. The current implementation is not suited for processing streaming data. We propose a novel architecture to support streaming data as input using pipelining between the Map and Reduce phases in CMR, ensuring that the output of the Map phase is made available to the Reduce phase as soon as it is produced. This 'Pipelined MapReduce' approach leads to increased parallelism between the Map and Reduce phases, thereby 1. Supporting streaming data as input 2. Reducing delays 3. Enabling the user to take 'snapshots' of the approximate output generated in a stipulated time frame. 4. Supporting cascaded MapReduce jobs. This cloud implementation is light-weight and inherently scalable.","PeriodicalId":131364,"journal":{"name":"2011 UKSim 5th European Symposium on Computer Modeling and Simulation","volume":"69 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127629249","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Intelligent Composition of Dynamic-Cost Services in Service-Oriented Architectures","authors":"G. A. Ebrahim","doi":"10.1109/EMS.2011.51","DOIUrl":"https://doi.org/10.1109/EMS.2011.51","url":null,"abstract":"Service-oriented architecture becomes a major computing practice in modern enterprise software systems. In several cases, the cost of the services could depend on the combination of the services utilized. In addition, there could be several constraints imposed on the service composition process. One of the important constraints is to impose an upper bound on the number of clients' usages of services in a period of time. Another important constraint is to restrict the number of service-providers that should be utilized. This constraint is normally needed to reduce the number of long-term relationships between the clients and the service-providers. In this paper, a new algorithm is introduced that tries to find the optimal set of services needed by the software designer to fit his computing requirements. It tries to minimize the overall incurred cost taking into consideration the dynamic-cost of the services occurred due to clients' usage patterns. In addition, it tries to minimize the number of service-providers utilized and maximize the overall QoS. In addition, it takes into account the constraints imposed on using the services. Genetic algorithms are adopted in tackling this problem, which are able to reach a near-optimal solution.","PeriodicalId":131364,"journal":{"name":"2011 UKSim 5th European Symposium on Computer Modeling and Simulation","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132728127","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Select the Appropriate Model for the Earth's Magnetic Field","authors":"K. Kianfar, Ahad Jafarpour Mahalleh, A. Moridi","doi":"10.1109/EMS.2011.59","DOIUrl":"https://doi.org/10.1109/EMS.2011.59","url":null,"abstract":"Various models have been proposed for the Earth's magnetic field. These models can be validating locally or can be cited for the entire earth. In this paper, we examined the most significant models. We also analysis and investigate the actual performance of the Earth's magnetic field.","PeriodicalId":131364,"journal":{"name":"2011 UKSim 5th European Symposium on Computer Modeling and Simulation","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114293846","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Ontology-Based Data Instantiation Using Web Service","authors":"Rashin Rezazadeh, B. Shadgar, A. Osareh, Arshin Rezazadeh","doi":"10.1109/EMS.2011.50","DOIUrl":"https://doi.org/10.1109/EMS.2011.50","url":null,"abstract":"The Semantic Web aims at creating a platform where information has its semantics and can be understood and processed by computers themselves with minimum human interference. Ontology theory and its related technology have been developed to help construct such a platform because ontology promises to encode certain levels of semantics for information and offers a set of common vocabulary for people or computer to communicate with. In this article, we introduced the open-source software called ontology instantiate. This software has been created for book ontology construction and instantiation using web services. This software helps users to instantiate ontology of book information on Amazon web site. This software also allows the user to merge another book ontology in its produced ontology and integrates them in the form unit ontology. This software for integration of these ontologies uses a wide range of similarity measures, including semantic similarity, string-based similarity and structural similarity. The tree is used for investigating the structural similarity. Dictionaries like Wikipedia, Word Net, Google and Yahoo is used for investigating semantic similarity and string-based similarity.","PeriodicalId":131364,"journal":{"name":"2011 UKSim 5th European Symposium on Computer Modeling and Simulation","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125427926","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Design and Analysis of High Performance and Low Power Matrix Filling for DNA Sequence Alignment Accelerator Using ASIC Design Flow","authors":"N. Khairudin, M. A. Haron, S. Junid, A. Halim, M. Idros, N. S. A. Razak","doi":"10.1109/EMS.2011.9","DOIUrl":"https://doi.org/10.1109/EMS.2011.9","url":null,"abstract":"Efficient sequence alignment is one of the most important and challenging activities in bioinformatics. Many algorithms have been proposed to perform and accelerate sequence alignment activities. Among them Smith-Waterman (S-W) is the most sensitive (accurate) algorithm. This paper presents a novel approach and analysis of High Performance and Low Power Matrix Filling for DNA Sequence Alignment Accelerator by using ASIC design flow. The objective of this paper is to improve the performance of the DNA sequence alignment and to optimize power reduction of the existing technique by using Smith Waterman (SW) algorithm. The scope of study is by using the matrix filling method which is in parallel implementation of the Smith-Waterman algorithm. This method provides more efficient speed up compared to the traditional sequential implementation but at the same time maintaining the level of sensitivity. The methodology of this paper is using FPGA and Synopsis. This technique is used to implement the massive parallelism. The design was developed in Verilog HDL coding and synthesized by using LINUX tools. Matrix Cells with a design area 8808.307mm2 at 40ns clock period is the best design. Thus the power required at this clock period also smaller, dynamic power 111.1415uW and leakage power 212.9538 Nw. This is a large improvement over existing designs and improves data throughput by using ASIC design flow.","PeriodicalId":131364,"journal":{"name":"2011 UKSim 5th European Symposium on Computer Modeling and Simulation","volume":"79 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126907349","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Applying Formal Verification to a Cache Coherence Protocol in TLS","authors":"X. Lai, Cong Liu, Zhiying Wang","doi":"10.1109/EMS.2011.48","DOIUrl":"https://doi.org/10.1109/EMS.2011.48","url":null,"abstract":"Current hardware implementations of TLS (thread-level speculation) in both Hydra and Renau's SESC simulator use a global component to check data dependence violations, e.g. L2 Cache or hardware list. Frequent memory accesses cause global component bottlenecks. In this paper, we propose a cache coherence protocol using a distributed data dependence violation checking mechanism for TLS. The proposed protocol extends the traditional MESI cache coherence protocol by including several methods to exceed the present limits of centralized violation checking methods. The protocol adds an invalidation vector to each private L1 cache to record threads that violate RAW data dependence. It also adds a versioning priority register that compares data versions. Added to each private L1 cache block is a snooping bit which indicates whether the thread possesses a bus snooping right for the block. The proposed protocol is much more complicated than the traditional MESI protocol and hard to be completely verified only through simulation. So we applied formal verification to the proposed cache protocol to confirm its correctness. The verification result shows that the proposed protocol will function correctly in TLS system.","PeriodicalId":131364,"journal":{"name":"2011 UKSim 5th European Symposium on Computer Modeling and Simulation","volume":"123 1-2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127003950","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Impact of ICT on Computer Applications","authors":"K. Nwizege, F. Chukwunonso, Charity Kpabeb, Shedrack Mmeah","doi":"10.1109/EMS.2011.45","DOIUrl":"https://doi.org/10.1109/EMS.2011.45","url":null,"abstract":"Information and Communication Technology (ICT) is the study of the technology used in handling information and its concepts that aids communication. It has been identified by many international development institutions as a crucial element in developing the worlds' poorest countries, by integrating them into the global economy and by making global markets more accessible. ICT covers any products that will store, retrieve, manipulate, transmits or receives information electronically in a digital form. For example, personal computers, digital television. It is the basis of all fields of computer applications. It also finds usefulness in Management, Science and Engineering. ICT has created innovative applications that have lead to making life easier in many sectors. In this paper, we have analysis the impact of ICT on computer applications and its essentials.","PeriodicalId":131364,"journal":{"name":"2011 UKSim 5th European Symposium on Computer Modeling and Simulation","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131645820","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On the TCP Flow Inter-arrival Times Dsitribution","authors":"L. Arshadi, A. Jahangir","doi":"10.1109/EMS.2011.34","DOIUrl":"https://doi.org/10.1109/EMS.2011.34","url":null,"abstract":"IP packets are known to have long range dependence and show self-similar properties. However, TCP flows-a set of related IP packets that form a TCP connection-which are considered to be generated by a large population of users and consequently mutually independent, seem to be best modeled by either Poisson processes with exponential inter-arrival times or some distributions with heavy tails such as Weibull distribution. In this paper, we show that despite the number of active nodes in a network, the inter-arrival times of TCP flows in the \"normal traffic\" conform to the Weibull distribution and any irregularity in the traffic causes deviations in the distribution of the inter-arrival times and so can be detected. This leads to a straightforward method for anomaly detection by which we are able to identify the anomalous part(s) of the traffic. We first apply the median-rank method to estimate the Weibull distribution parameters of the traffic and then check the conformity of the data against a Weibull distribution with the estimated parameters and determine whether the traffic is normal or not based on the chi-square test.","PeriodicalId":131364,"journal":{"name":"2011 UKSim 5th European Symposium on Computer Modeling and Simulation","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114933428","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}