{"title":"SLAM-ICP with a Boolean method applied on a car-like robot","authors":"M. Djehaich, H. Ziane, N. Achour, R. Tiar, N. Ouadah","doi":"10.1109/ISPS.2013.6581476","DOIUrl":"https://doi.org/10.1109/ISPS.2013.6581476","url":null,"abstract":"Scan matching is a popular way of recovering a mobile robot's motion and constitutes the basis of many localization and mapping approaches. Consequently, a variety of scan matching algorithms have been proposed in the past. All these algorithms share one common attribute: they match pairs of scans to obtain spatial relations between two robot poses. The work presented in this paper consists of the implementation of a SLAM (Simultaneous Localization and Mapping) algorithm on a car-like vehicle. Our algorithm is based on a measurement alignment method called “Iterative Closest Points” (ICP) with a binary (Boolean) weighting method; ICP finds the rigid transformation that minimizes the distance between two clouds of points. The developed algorithm (SLAM-ICP) has been implemented and tested on the mobile robot. Experimental results given at the end of this paper are compared to a classical localization technique (odometry) and to SLAM-ICP with the recursive method already implemented on the Robucar.","PeriodicalId":222438,"journal":{"name":"2013 11th International Symposium on Programming and Systems (ISPS)","volume":"75 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127343614","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
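The core of the ICP step this abstract describes is finding the rigid transformation that minimizes the distance between two point clouds. A minimal NumPy sketch of that inner step, the closed-form SVD (Kabsch) alignment for already-matched point pairs, is shown below; the paper's Boolean weighting and the iterative correspondence loop are not modeled, and all names are ours.

```python
import numpy as np

def best_rigid_transform(P, Q):
    """Least-squares rotation R and translation t mapping points P onto Q
    (the inner step of each ICP iteration), via the SVD-based Kabsch method."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)          # centroids
    H = (P - cp).T @ (Q - cq)                        # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Correction factor to avoid reflections (keep det(R) = +1).
    D = np.diag([1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Demo: recover a known 2-D rotation and translation from matched points.
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
t_true = np.array([1.0, -2.0])
P = np.random.default_rng(0).random((40, 2))
Q = P @ R_true.T + t_true
R, t = best_rigid_transform(P, Q)
```

In a full ICP loop this solve alternates with a nearest-neighbor matching step until the alignment error stops decreasing.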
{"title":"The impact of ECC's scalar multiplication on wireless sensor networks","authors":"Merad Boudia Omar Rafik, F. Mohammed","doi":"10.1109/ISPS.2013.6581488","DOIUrl":"https://doi.org/10.1109/ISPS.2013.6581488","url":null,"abstract":"The security of wireless sensor networks (WSNs) has become an attractive area of research, especially during the last few years. This is due to the large number of applications in which sensors are deployed and to their security needs. Elliptic Curve Cryptography (ECC) is a public-key approach that represents one solution, and even a serious candidate, for providing security in WSNs. Unfortunately, the execution time of its rather complex operations restricts the suitability of ECC to a limited number of applications. The most expensive operation in ECC is scalar point multiplication (SPM), and ECC-based schemes depend strongly on the performance of SPM. Side-channel attacks (SCA) on ECC exploit the information leaked during SPM execution in order to find the secret key. In this paper, we focus on scalar point multiplication, its efficiency, and its security against SCA (in particular, simple power analysis). The experimental results are obtained on the 16-bit TelosB mote.","PeriodicalId":222438,"journal":{"name":"2013 11th International Symposium on Programming and Systems (ISPS)","volume":"601 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116320393","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
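The SPM operation and the simple-power-analysis leak this abstract discusses can both be seen in the textbook double-and-add algorithm. The sketch below uses a tiny made-up curve over a small prime field purely for illustration (not a cryptographic curve, and not necessarily the variant evaluated in the paper): the extra point addition performed only on 1-bits of the scalar is exactly the data-dependent work that SPA observes.

```python
# Toy affine elliptic-curve arithmetic over a small prime field.
P_MOD = 97          # field prime (illustration only)
A, B = 2, 3         # curve y^2 = x^3 + 2x + 3 (mod 97)
O = None            # point at infinity

def ec_add(P, Q):
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return O                                   # P + (-P) = O
    if P == Q:                                     # doubling
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD)
    else:                                          # general addition
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD)
    lam %= P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    y3 = (lam * (x1 - x3) - y1) % P_MOD
    return (x3, y3)

def scalar_mult(k, P):
    """Left-to-right double-and-add. The conditional ec_add on 1-bits is the
    data-dependent work that simple power analysis can observe."""
    R = O
    for bit in bin(k)[2:]:
        R = ec_add(R, R)          # always double
        if bit == '1':
            R = ec_add(R, P)      # add only on 1-bits -> SPA leak
    return R

G = (3, 6)  # a point on the toy curve: 6^2 = 36 = 3^3 + 2*3 + 3 (mod 97)
```

Countermeasures against SPA typically replace this branch with a regular operation sequence, e.g. double-and-add-always or a Montgomery ladder.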
{"title":"Fuzzy Alpha-cuts to capture customer requirements in improving product development","authors":"F. Bencherif, L. Mouss, S. Benaicha","doi":"10.1109/ISPS.2013.6581493","DOIUrl":"https://doi.org/10.1109/ISPS.2013.6581493","url":null,"abstract":"Quality Function Deployment (QFD) is a tool for designing quality into a product and improving its competitive advantage in the market. In developing new products and projects, customer needs are received, passed around a corporate communication circle, and eventually returned to the customer in the form of the new product. First, the needs and language received from the customer are often ambiguous, imprecise, and uncertain, which skews study results and disregards the voice of the customer. Second, numerous researchers have applied fuzzy set theory to handle this uncertainty in product development, but their models usually focus only on customer requirements or on engineering characteristics; the subsequent stages of product design are rarely addressed. Third, the correlation between engineering characteristics and benchmarking analysis is disregarded in most QFD-related research, which commonly leads to delayed or failed project development. To address these three issues, the objective of this paper is to improve the accuracy of QFD and to optimize the treatment of customer requirements, so as to attenuate risks in the subsequent design phases and in the manufacturing process and to increase industrial performance. The approach is based on fuzzy set theory and alpha-cut operations, the pairwise comparison method, fuzzy ranking and clustering, and the theory of inventive problem solving (TRIZ).","PeriodicalId":222438,"journal":{"name":"2013 11th International Symposium on Programming and Systems (ISPS)","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124350144","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
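The alpha-cut operation at the core of this approach reduces a fuzzy customer rating to a crisp interval of values whose membership is at least alpha. A minimal sketch for triangular fuzzy numbers is given below; the pairwise-comparison, ranking, and TRIZ stages of the paper are not modeled, and the example numbers are invented.

```python
def alpha_cut(tri, alpha):
    """Alpha-cut of a triangular fuzzy number (a, b, c), where b is the peak:
    the crisp interval of values whose membership degree is >= alpha."""
    a, b, c = tri
    # Membership rises linearly from a to b and falls linearly from b to c,
    # so the cut endpoints interpolate toward the peak as alpha grows.
    return (a + alpha * (b - a), c - alpha * (c - b))

# A fuzzy customer rating "about 3 on a 1-5 scale", cut at alpha = 0.5:
interval = alpha_cut((1, 3, 5), 0.5)
```

At alpha = 1 the interval collapses to the peak value, and at alpha = 0 it is the full support of the fuzzy number.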
{"title":"A new method for dimensionality reduction of multi-dimensional data using Copulas","authors":"Rima Houari, A. Bounceur, Mohand Tahar Kechadi","doi":"10.1109/ISPS.2013.6581491","DOIUrl":"https://doi.org/10.1109/ISPS.2013.6581491","url":null,"abstract":"A new technique for the dimensionality reduction of multi-dimensional data is presented in this paper. This technique employs the theory of copulas to estimate the multivariate joint probability distribution without constraining the marginal distributions of the random variables that represent the dimensions of the data to specific types. A copula-based model provides a complete and scale-free description of dependence that is more suitable for modeling with well-known multivariate parametric laws. The model can readily be used to compare the dependence of random variables, by estimating the parameters of the copula, and to better expose the relationships in the data. This dependence is then used to detect redundant values and noise in order to clean the original data, reduce them (eliminating redundant attributes), and obtain representative samples of good quality. We compare the proposed approach with the singular value decomposition (SVD) technique, one of the most efficient methods in data mining.","PeriodicalId":222438,"journal":{"name":"2013 11th International Symposium on Programming and Systems (ISPS)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117194201","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
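The "scale-free description of dependence" that copulas provide starts from the probability integral transform: mapping each attribute to its ranks removes the marginal distributions, leaving only the dependence structure. The sketch below shows that first step and a rank-correlation test for redundant attributes; it is our own minimal illustration, not the paper's algorithm, and the threshold and data are invented.

```python
import numpy as np

def pseudo_observations(X):
    """Empirical probability-integral transform: map each column to (0, 1)
    rank values, removing marginal scale -- the first step of copula fitting."""
    n, d = X.shape
    U = np.empty_like(X, dtype=float)
    for j in range(d):
        U[:, j] = (np.argsort(np.argsort(X[:, j])) + 1) / (n + 1)
    return U

def redundant_pairs(X, threshold=0.95):
    """Flag attribute pairs whose correlation on the copula scale (Spearman's
    rho) exceeds the threshold -- candidates for elimination before reduction."""
    U = pseudo_observations(X)
    C = np.corrcoef(U, rowvar=False)   # Pearson on ranks = Spearman's rho
    d = C.shape[1]
    return [(i, j) for i in range(d) for j in range(i + 1, d)
            if abs(C[i, j]) >= threshold]

rng = np.random.default_rng(1)
x = rng.normal(size=200)
# Column 1 is a monotone transform of column 0 (identical ranks, hence
# redundant on the copula scale); column 2 is independent noise.
X = np.column_stack([x, np.exp(x), rng.normal(size=200)])
pairs = redundant_pairs(X)
```

Note that a monotone but nonlinear copy like `exp(x)` is invisible to plain Pearson correlation on the raw data yet has rank correlation exactly 1, which is the point of working on the copula scale.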
{"title":"Information retrieval techniques for knowledge discovery in biomedical literature","authors":"Sabrina Cherdioui, Fatiha Boubekeur","doi":"10.1109/ISPS.2013.6581479","DOIUrl":"https://doi.org/10.1109/ISPS.2013.6581479","url":null,"abstract":"This paper presents our contribution to enhancing literature-based discovery with information retrieval techniques. We propose the joint use of a flexible information retrieval model and MeSH concepts for knowledge discovery in biomedical literature. The information retrieval model filters the MEDLINE biomedical literature down to the most relevant documents, while utilizing MeSH concepts allows quick identification of candidate concepts that could potentially validate a hypothesis. We have tested our approach by replicating Swanson's first discovery, the correlation between fish oil and Raynaud's disease. The results obtained show the effectiveness of our approach.","PeriodicalId":222438,"journal":{"name":"2013 11th International Symposium on Programming and Systems (ISPS)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128546407","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
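The Swanson-style discovery referenced here follows the classic ABC model: a term A and a term C that never co-occur directly may be linked through intermediate terms B that co-occur with each of them in different documents. A minimal set-based sketch follows; the four-document corpus and its terms are invented for the demo and are far simpler than MEDLINE records with MeSH indexing.

```python
# Minimal ABC-model sketch of literature-based discovery: find intermediate
# terms B that co-occur with A in some documents and with C in others, while
# A and C never co-occur directly.
def abc_candidates(docs, a, c):
    docs = [set(d) for d in docs]
    if any(a in d and c in d for d in docs):
        return set()                    # A and C are already linked directly
    b_from_a = set().union(*(d for d in docs if a in d)) - {a}
    b_from_c = set().union(*(d for d in docs if c in d)) - {c}
    return b_from_a & b_from_c          # bridging terms = candidate hypotheses

# Toy corpus: each document is its set of indexed terms (invented).
corpus = [
    {"fish oil", "blood viscosity"},
    {"fish oil", "platelet aggregation"},
    {"raynaud", "blood viscosity"},
    {"raynaud", "vasoconstriction"},
]
candidates = abc_candidates(corpus, "fish oil", "raynaud")
```

In the paper's setting, the IR model would first narrow the document sets and MeSH concepts would play the role of the term sets, ranking the bridging candidates rather than merely intersecting them.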
{"title":"Fast simplification with sharp feature preserving for 3D point clouds","authors":"H. Benhabiles, O. Aubreton, H. Barki, Hedi Tabia","doi":"10.1109/ISPS.2013.6581492","DOIUrl":"https://doi.org/10.1109/ISPS.2013.6581492","url":null,"abstract":"This paper presents a fast point cloud simplification method that preserves sharp edge points. The method is based on the combination of clustering and coarse-to-fine simplification approaches. It first creates a coarse cloud using a clustering algorithm. Each point of the resulting coarse cloud is then assigned a weight that quantifies its importance and classifies it as either a sharp point or a simple point. Finally, both kinds of points are used to refine the coarse cloud and thus create a new simplified cloud characterized by a high density of points in sharp regions and a low density in flat regions. Experiments show that our algorithm is much faster than the most recently proposed simplification algorithm [1] that preserves sharp edge points, while still producing similar results.","PeriodicalId":222438,"journal":{"name":"2013 11th International Symposium on Programming and Systems (ISPS)","volume":"58 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130142041","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
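The first stage described in this abstract, building a coarse cloud by clustering, can be approximated with a uniform-grid decimation: bucket points into cells and keep one centroid per occupied cell. The sketch below shows only that coarse stage; the per-point importance weights and sharp-point refinement of the paper are not modeled, and the cell size and synthetic cloud are made up.

```python
import numpy as np

def grid_simplify(points, cell):
    """Cluster points into a uniform grid of the given cell size and keep one
    centroid per occupied cell -- a simple stand-in for the coarse-cloud stage
    of a clustering-based simplification."""
    keys = np.floor(points / cell).astype(int)       # integer cell coordinates
    buckets = {}
    for key, p in zip(map(tuple, keys), points):
        buckets.setdefault(key, []).append(p)
    return np.array([np.mean(b, axis=0) for b in buckets.values()])

# Synthetic dense cloud in the unit cube; cell = 0.25 gives at most 4^3 cells.
dense = np.random.default_rng(2).random((1000, 3))
coarse = grid_simplify(dense, cell=0.25)
```

A sharp-feature-preserving refinement would then add back original points in cells whose local geometry (e.g. normal variation) exceeds a threshold, yielding the density contrast between sharp and flat regions that the abstract describes.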
{"title":"Improvement of scalar multiplication time for elliptic curve cryptosystems","authors":"M. Lehsaini, M. Feham, Chifaa Tabet Hellel","doi":"10.1109/ISPS.2013.6581494","DOIUrl":"https://doi.org/10.1109/ISPS.2013.6581494","url":null,"abstract":"Sensor nodes have limited computing power and memory. They are sometimes used in applications that require rapidly sending secure data to a remote control center, and therefore need lightweight techniques to accomplish this task. In this paper, we used Elliptic Curve Cryptography (ECC) for data encryption, because ECC can create smaller and more efficient cryptographic keys than other cryptographic techniques such as RSA. We used specific algorithms to improve scalar multiplication time, at the cost of energy consumption. Moreover, we proposed a distributed scheme that further improves the data delivery time from a source node to the base station by involving neighbors in the computation. The results of experiments on TelosB motes showed a considerable improvement in data delivery time.","PeriodicalId":222438,"journal":{"name":"2013 11th International Symposium on Programming and Systems (ISPS)","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131424018","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
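One widely used structure for scalar multiplication, relevant both to regular timing and to side-channel resistance, is the Montgomery ladder: every scalar bit performs exactly one addition and one doubling in a fixed pattern. The abstract does not say which algorithms the authors used, so the sketch below is a generic illustration only, written over an arbitrary group operation and demonstrated with integer addition as a stand-in (where "scalar multiplication" is just `k * n`).

```python
def montgomery_ladder(k, P, add, zero):
    """Montgomery-ladder scalar multiplication. Invariant: R1 = R0 + P.
    Each bit performs one add and one double regardless of its value,
    which regularizes the operation sequence."""
    R0, R1 = zero, P
    for bit in bin(k)[2:]:
        if bit == '1':
            R0, R1 = add(R0, R1), add(R1, R1)
        else:
            R0, R1 = add(R0, R0), add(R0, R1)
    return R0

# Demo with the integers under addition as a stand-in group: the ladder
# computes k * n using only the group operation.
result = montgomery_ladder(13, 5, lambda a, b: a + b, 0)
```

On a real curve, `add` would be the elliptic-curve point addition and `zero` the point at infinity; X-only coordinate tricks then make each ladder step particularly cheap.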
{"title":"TOP-SKY: Top-down algorithm for computing the skycube","authors":"Samiha Brahimi, M. Kholladi, Amina Hamerelain","doi":"10.1109/ISPS.2013.6581483","DOIUrl":"https://doi.org/10.1109/ISPS.2013.6581483","url":null,"abstract":"Users would like to obtain the result of a skyline query without waiting for it to be processed; this can be achieved by retrieving a precomputed version of the query. This paper focuses on the pre-computation of the skylines of all possible nonempty subsets of a given set of dimensions, known as the skycube. We develop an efficient top-down approach called TOP-SKY for skycube computation, which derives the skyline objects of each subspace from one of its parents, adopting several techniques that help achieve better performance. In order to evaluate the effectiveness of the approach, TOP-SKY has been compared with Orion, the best algorithm to our knowledge, and with computing the cuboids of the skycube individually using the BNL algorithm.","PeriodicalId":222438,"journal":{"name":"2013 11th International Symposium on Programming and Systems (ISPS)","volume":"67 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133266683","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
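The BNL (block-nested-loops) baseline used in this comparison computes a single skyline by maintaining a window of mutually incomparable points. A minimal sketch follows, using the classic hotels example with minimize-both semantics (the example data are ours); a skycube precomputation would run such a skyline query once per nonempty dimension subset.

```python
def dominates(p, q):
    """p dominates q if p is at least as good in every dimension and strictly
    better in at least one (here: smaller is better)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def bnl_skyline(points):
    """Block-nested-loops skyline: keep a window of incomparable points,
    discarding any incoming point dominated by a window entry and evicting
    window entries dominated by the incoming point."""
    window = []
    for p in points:
        if any(dominates(w, p) for w in window):
            continue
        window = [w for w in window if not dominates(p, w)] + [p]
    return window

# Toy data: (price, distance-to-beach), both to be minimized.
hotels = [(50, 8), (40, 9), (60, 3), (30, 5), (70, 2)]
skyline = bnl_skyline(hotels)
```

Computing every cuboid this way repeats all the dominance work, which is exactly the redundancy a top-down approach like TOP-SKY avoids by deriving a subspace's skyline from one of its parent cuboids.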
{"title":"Road traffic congestion estimation with macroscopic parameters","authors":"Asmâa Ouessai, K. Mokhtar, Ouamri Abdelaziz","doi":"10.1109/ISPS.2013.6581489","DOIUrl":"https://doi.org/10.1109/ISPS.2013.6581489","url":null,"abstract":"In this paper we propose an algorithm for road traffic density estimation using macroscopic parameters extracted from a video sequence. The macroscopic parameters are estimated directly by analyzing the global motion in the video scene, without the need for motion detection and tracking methods. The extracted parameters are fed to an SVM classifier to classify the road traffic into three categories: light, medium, and heavy. The performance of the proposed algorithm is compared to that of the dynamic-texture-based road traffic classification method on the same database.","PeriodicalId":222438,"journal":{"name":"2013 11th International Symposium on Programming and Systems (ISPS)","volume":"47 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131987821","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
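To make the three-way classification step concrete, here is a deliberately simplified stand-in: a nearest-centroid rule over two macroscopic features. The paper uses an SVM on video-derived parameters; the features (mean speed, occupancy), centroid values, and labels below are entirely invented for illustration.

```python
import math

# Toy nearest-centroid classifier over macroscopic traffic features
# (mean speed in km/h, occupancy in %). Centroids are invented; a trained
# SVM would learn the decision boundaries from labeled data instead.
CENTROIDS = {
    "light":  (80.0, 10.0),
    "medium": (45.0, 35.0),
    "heavy":  (15.0, 70.0),
}

def classify(speed, occupancy):
    """Assign the traffic state whose centroid is nearest in feature space."""
    return min(CENTROIDS,
               key=lambda c: math.dist((speed, occupancy), CENTROIDS[c]))
```

The appeal of macroscopic features is visible even in this toy: classification needs only a couple of scalars per time window, not per-vehicle detection or tracking.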
{"title":"L-P2DSA: Location-based privacy-preserving detection of Sybil attacks","authors":"Kenza Mekliche, S. Moussaoui","doi":"10.1109/ISPS.2013.6581485","DOIUrl":"https://doi.org/10.1109/ISPS.2013.6581485","url":null,"abstract":"Security and privacy are two major concerns in VANETs. Regrettably, most privacy-preserving schemes are prone to Sybil attacks, in which a malicious user pretends to be multiple vehicles. In this paper we propose an approach that uses infrastructure and node localization to detect Sybil attacks. L-P2DSA is an improvement on C-P2DAP [3], as it detects Sybil attacks while reducing the load on the DMV. This is achieved through cooperation between adjacent RSUs, which determine the location of suspicious nodes and measure a distinguishability degree between the positions of these nodes. Detection in this manner does not require any vehicle to disclose its identity, thus preserving privacy. The applicability of our contribution is validated through simulation of a realistic test case.","PeriodicalId":222438,"journal":{"name":"2013 11th International Symposium on Programming and Systems (ISPS)","volume":"266 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-04-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123109634","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
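The position-distinguishability idea behind this detection can be caricatured in a few lines: pseudonymous identities whose RSU-observed positions remain within a small separation at every observation time are physically indistinguishable and therefore suspected of being one Sybil node. This is our own toy version, not the paper's metric; the threshold, identifiers, and trajectories are invented.

```python
import math

def sybil_suspects(observations, min_separation=2.0):
    """Flag pairs of identities whose observed positions stay within
    min_separation (same length unit as the positions) at every observation
    time -- a toy position-distinguishability test over RSU measurements."""
    ids = sorted(observations)
    suspects = []
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            gaps = [math.dist(p, q)
                    for p, q in zip(observations[a], observations[b])]
            if max(gaps) < min_separation:
                suspects.append((a, b))   # never distinguishable -> suspicious
    return suspects

# Toy RSU observations: per-identity position at three successive times.
obs = {
    "v1": [(0, 0), (5, 1), (10, 2)],
    "v2": [(0.5, 0.2), (5.3, 1.1), (10.2, 2.4)],   # shadows v1 -> suspicious
    "v3": [(40, 0), (35, 3), (30, 6)],
}
flagged = sybil_suspects(obs)
```

Because the test only needs positions keyed by pseudonyms, no identity disclosure is required, which matches the privacy property the abstract emphasizes.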