{"title":"Two modified Otsu image segmentation methods based on Lognormal and Gamma distribution models","authors":"D. Alsaeed, A. Bouridane, A. Elzaart, R. Sammouda","doi":"10.1109/ICITES.2012.6216680","DOIUrl":"https://doi.org/10.1109/ICITES.2012.6216680","url":null,"abstract":"Otsu's method of image segmentation is one of the best methods for threshold selection. With Otsu's method an optimum threshold is found by maximizing the between-class variance; Otsu algorithm is based on the gray-level histogram which is estimated by a sum of Gaussian distributions. In some type of images, image data does not best fit in a Gaussian distribution model. The objective of this study is to develop and compare two modified versions of Otsu method, one is based on Lognormal distribution (Otsu-Lognormal), while the other is based on Gamma distribution (Otsu-Gamma); the maximum between-cluster variance is modified based on each model. The two proposed methods were applied on several images and promising experimental results were obtained. Evaluation of the resulting segmented images shows that both Otsu-Gamma method and Otsu-Lognormal yield better estimation of the optimal threshold than does the original Otsu method with Gaussian distribution (Otsu).","PeriodicalId":137864,"journal":{"name":"2012 International Conference on Information Technology and e-Services","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-03-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122497604","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An algorithm for web services composition and adaptation based on interface description","authors":"Y. Oussalah, N. Zeghib","doi":"10.1109/ICITES.2012.6216650","DOIUrl":"https://doi.org/10.1109/ICITES.2012.6216650","url":null,"abstract":"Web services play an important role in the development of distributed systems. Particularly, the possibility of composing already implemented web services in order to provide new functionalities is an interesting approach for building distributed applications and business processes. In this paper we focus on mismatches occurring during dynamic composition of web services. We propose an algorithm for dynamic and automatic composition and adaptation of web services. Our approach only uses the information that is already available in service interface descriptions. The algorithm allows programmers to define dynamic web service compositions without changing the source code.","PeriodicalId":137864,"journal":{"name":"2012 International Conference on Information Technology and e-Services","volume":"199 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-03-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123252916","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Choreography of web services and the impact of estimation of execution plan","authors":"F. Halili, A. Dika","doi":"10.1109/ICITES.2012.6216619","DOIUrl":"https://doi.org/10.1109/ICITES.2012.6216619","url":null,"abstract":"Nowadays Web Services are gaining a lot of attention as a new technology, which indicate automated information services and a collection of functions packaged in a single, self-contained entity and conducted over internet for other programs to use. They use the standardized technologies and formats/protocols that simplify the exchange and integration of large amounts of data over the Internet, and as such they make it easier to conduct work across organizations regardless of the types of operating systems, hardware/software, programming languages, and databases that are being used. This paper will express the modeling of Choreography of web services; which is the process of combining different web services with intention to create interactive applications. The interconnection of web services with databases is a valuable step, because it provides backup for data. When it comes to analyzing the performance of a specific query, one of the best methods is to view the query execution plan, which provides information on how a specific DBMS query optimizer runs a specific query. This information is very valuable when we want to find out why a specific query is running slow.","PeriodicalId":137864,"journal":{"name":"2012 International Conference on Information Technology and e-Services","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-03-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130585348","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Learning invariants using association rules technique","authors":"M. A. Souaiaia, T. Benouhiba","doi":"10.1109/ICITES.2012.6216625","DOIUrl":"https://doi.org/10.1109/ICITES.2012.6216625","url":null,"abstract":"Dynamic invariant detection is the identification of properties of programs by analyzing execution traces. Traditional dynamic invariant detectors, such as Daikon, use naive techniques based on verification of predefined invariant forms. Unfortunately, this may discard many useful knowledge such as relationship between variables. This kind of knowledge can be helpful to understand hidden dependencies in the program. In this paper, we propose to model invariant detection as a machine learning process. We intend to use learning algorithms to find out correlation between variables. We are particularly interested by association rules since they are suitable to detect such relationship. We propose an adaptation to existing learning techniques as well as some pruning algorithms in order to refine the obtained invariants. Compared to the traditional Daikon tool, our approach has successfully inferred many meaningful invariants about variables relationship.","PeriodicalId":137864,"journal":{"name":"2012 International Conference on Information Technology and e-Services","volume":"113 2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-03-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130173039","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Fast optimal thresholding based on between-class variance using mixture of log-normal distribution","authors":"Abduljawad A. Amory, A. El Zaart, Anas O. Rokabi, H. Mathkour, R. Sammouda","doi":"10.1109/ICITES.2012.6216682","DOIUrl":"https://doi.org/10.1109/ICITES.2012.6216682","url":null,"abstract":"Image thresholding is a technique used to estimate threshold values for segmenting an input image into distinct regions. The goal of image thresholding is to simplify or change the representation of an image into something that is easier to analyze and is more meaningful. The most famous image thresholding method is Otsu's global automatic image thresholding method which has been widely applied in many fields, especially those with real-time applications. In this paper we propose a new method for segmenting images based on Otsu's method by estimating the threshold using a histogram. Our method is based on between-class variance, and finds the optimal threshold using the first derivative of the BCV relation to obtain iterative equations which then produce the optimal threshold. We apply this method to SAR images, where it gives promising results compared with Otsu's method based on the Gaussian and gamma distributions.","PeriodicalId":137864,"journal":{"name":"2012 International Conference on Information Technology and e-Services","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-03-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132285880","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A heuristic approach to re-allocate data fragments in DDBSs","authors":"A. Amer, H. I. Abdalla","doi":"10.1109/ICITES.2012.6216621","DOIUrl":"https://doi.org/10.1109/ICITES.2012.6216621","url":null,"abstract":"Fragmentation of a large, global databases are performed by dividing the database horizontally, vertically or both. In order to enable distributed database systems to work efficiently, these fragments have to be allocated across the available sites in such a way that reduces communication cost. This paper presents a new efficient data re-allocation model for replicated and non-replicated constrained DDBSs by bringing a change to data access pattern over sites. This approach assumes that the distribution of fragments over network sites was initially performed according to a properly forecasted set of query frequency values that could be employed over sites. Our model proposes a plan to re-allocate data fragments based on communication costs between sites and update cost values for each fragment. The re-allocation process will be performed based on selecting the maximal update cost value for each fragment and deciding on the re-allocation accordingly. empirical results showed that the proposed technique will effectively contribute in solving dynamic fragments re-allocation problem in the context of a distributed relational database systems.","PeriodicalId":137864,"journal":{"name":"2012 International Conference on Information Technology and e-Services","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-03-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134235626","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Heart region extraction and segmentation from chest CT images using Hopfield Artificial Neural Networks","authors":"R. Sammouda, R. M. Jomaa, H. Mathkour","doi":"10.1109/ICITES.2012.6216678","DOIUrl":"https://doi.org/10.1109/ICITES.2012.6216678","url":null,"abstract":"A system for extracting and segmenting heart regions from three-dimensional (3D) CT chest images is proposed in this paper. At first, the regions of interest (ROIs) are extracted using pure basic image processing techniques applied on the 2D CT slices. Secondly, the ROIs in each slice are segmented using Hopfield Artificial Neural Networks (HANN). The segmentation results include tissues belonging to the heart and its surrounding organs. To distinguish between heart regions and the non-heart regions, a rule-based filtering approach is adopted. The system is evaluated using a database of 735 chest CT slices from 5 patients. It shows a good and accurate performance with some exceptions.","PeriodicalId":137864,"journal":{"name":"2012 International Conference on Information Technology and e-Services","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-03-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133090685","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Identity based hybrid signcryption revisited","authors":"K. Singh","doi":"10.1109/ICITES.2012.6216646","DOIUrl":"https://doi.org/10.1109/ICITES.2012.6216646","url":null,"abstract":"Signcryption is a cryptographic primitive which combines both the functions of digital signature and public key encryption logically in single step, and with a computational cost significantly less than required by the traditional signature-then-encryption approach. Identity based cryptosystem is a public key cryptosystem in which public key can be any arbitrary string. Hybrid cryptosystem combines the convenience of a public key cryptosystem with the efficiency of a symmetric cryptosystem. Dent [4] has given security models for hybrid signcryption scheme with insider security. He has given two security models for hybrid signcryption KEM: key indistinguishability and message indistinguishability model. Hybrid signcryption in identity base setting was given by Fagen Li et al. [10]. In this paper [10] only one security model key indistinguishability is considered. Our contribution in this paper is three fold: First we give new proof for IDB hybrid signcryption in Dent [4] security model. Second for the confidentiality of hybrid signcryption, we improve the bound as compared to Dent [4]. Third we also give the example of identity based hybrid signcryption based on [11].","PeriodicalId":137864,"journal":{"name":"2012 International Conference on Information Technology and e-Services","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-03-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129349063","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Using vector quantization in Automatic Speaker Verification","authors":"Djellali Hayet, L. M. Tayeb","doi":"10.1109/ICITES.2012.6216611","DOIUrl":"https://doi.org/10.1109/ICITES.2012.6216611","url":null,"abstract":"This article investigates several technique based on vector quantization (VQ) and maximum a posteriori adaptation (MAP) in Automatic Speaker Verification ASV. We propose to create multiple codebooks of Universal Background Model UBM by Vector Quantization and compare them with traditional approach in VQ, MAP adaptation and Gaussian Mixture Models.","PeriodicalId":137864,"journal":{"name":"2012 International Conference on Information Technology and e-Services","volume":"185 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-03-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116657684","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Formal analysis of PKM using scyther tool","authors":"Noudjoud Kahya, N. Ghoualmi, P. Lafourcade","doi":"10.1109/ICITES.2012.6216598","DOIUrl":"https://doi.org/10.1109/ICITES.2012.6216598","url":null,"abstract":"Owing to the natural characteristics of wireless communication, anyone can intercept or inject frames, making wireless communication much more vulnerable to attacks than its wired equivalents. In this paper we focused on the PKM protocol which provides the authorization process and secure distribution of keying data from the base station to mobile station. Concentrating on PKMv2, we give a formal analysis of this version and we found that is vulnerable to replay, DoS, Man-in-the middle attacks. We propose a new methodology to prevent the authorization protocol from such attacks by using nonce and timestamp together.","PeriodicalId":137864,"journal":{"name":"2012 International Conference on Information Technology and e-Services","volume":"89 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-03-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124745074","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}