{"title":"Image forgery detection using QR method based on one dimensional cellular automata","authors":"Sandarbh Singh, R. Bhardwaj","doi":"10.1109/IC3.2016.7880197","DOIUrl":"https://doi.org/10.1109/IC3.2016.7880197","url":null,"abstract":"This paper tries to provide a reliable scheme for the image forgery detection. QR Decomposition is applied on the original image to extract the features of the image and then Robust Secret Key is generated by pushing those features into the cellular automata. QR Decomposition is used to decompose the image into two matrices namely orthogonal and upper triangular matrix. Robust Secret Key is used to protect the images against forgery. Our scheme is tested on images of size 512 × 512 and experimental results are compared with the existing techniques to illustrate the visual quality, robustness and reliability of the proposed scheme","PeriodicalId":294210,"journal":{"name":"2016 Ninth International Conference on Contemporary Computing (IC3)","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126041951","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"OE-LEACH: An optimized energy efficient LEACH algorithm for WSNs","authors":"S. Gambhir, Parul","doi":"10.1109/IC3.2016.7880225","DOIUrl":"https://doi.org/10.1109/IC3.2016.7880225","url":null,"abstract":"Wireless Sensor Networks (WSNs) are designed by hundreds or thousands of tiny, low cost and multifunctional sensor nodes. Each sensor node has very low battery life. Sensor nodes have finite storage capabilities, transmission and processing range and energy resources are also limited. There are many design issues in WSNs such as mobility, energy consumption, network topology, data aggregation, localization, production cost, security, network size and density etc. Routing protocols provide efficient working of the network, increase network lifetime, responsible for maintaining the routes in the network and perform reliable multi-hop communication under various conditions. LEACH (Low Energy Adaptive Clustering Protocol) is one of the hierarchical protocol in WSNs. LEACH uses TDMA MAC Protocol. During random data distribution, a number of TDMA slots are wasted. Because sensor nodes don't know either they have data to send or not, they continuously listen to the medium and this result in idle listening problem. This paper proposed OE-LEACH (An Optimized Energy Efficient LEACH Algorithm for WSNs) to enhance the performance of the LEACH Protocol, reduce time delay and energy consumption. Network Lifetime and throughput of WSN also increases. 
The Proposed method is simulated in MATLAB 2010a.","PeriodicalId":294210,"journal":{"name":"2016 Ninth International Conference on Contemporary Computing (IC3)","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126568832","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"BRC-KEP: A secure biometric and reversible CA based 2-party key exchange protocol","authors":"S. Choudhury, A. Kalwar, A. Goswami, M. Bhuyan","doi":"10.1109/IC3.2016.7880203","DOIUrl":"https://doi.org/10.1109/IC3.2016.7880203","url":null,"abstract":"Password-based authenticated key exchange (PAKE) protocol provides secure communication between two parties based on cryptographically agreed session key. But such protocols have one major disadvantage, i.e., it is vulnerable to man-in-the-middle, offline dictionary, and other different types of attacks. In this paper, we present a secure biometric and reversible cellular automaton (RCA) based 2-party key exchange protocol called BRC-KEP. The proposed protocol exploits the features of reversible CA and biometric components of an individual. BRC-KEP performs better in comparison to [6, 8, 10] considering speed and invulnerabilities to most of the known attacks. It can be applied in cloud-based data access through insecure communication channel.","PeriodicalId":294210,"journal":{"name":"2016 Ninth International Conference on Contemporary Computing (IC3)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132545229","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Feature selection using Markov clustering and maximum spanning tree in high dimensional data","authors":"Neha Bisht, Annappa Basava","doi":"10.1109/IC3.2016.7880208","DOIUrl":"https://doi.org/10.1109/IC3.2016.7880208","url":null,"abstract":"Feature selection is the most important preprocessing step for classification of high dimensional data. It reduces the load of computational cost and prediction time on classification algorithm by selecting only the salient features from the data set for learning. The main challenges while applying feature selection on high dimensional data (HDD) are: handling the relevancy, redundancy and correlation between features. The proposed algorithm works with the three main steps to overcome these issues. It focuses on filtering strategy for its effectiveness in handling the data sets with large size and high dimensions. Initially to measure the relevancy of features with respect to class, fisher score is calculated for each feature independently. Next, only relevant features are passed to the clustering algorithm to check the redundancy of features. Finally the correlation between features is calculated using maximum spanning tree and the most appropriate features are filtered out. The classification accuracy of the presented approach is validated by using C4.5, IB1 and Naive Bayes classifier. 
The proposed algorithm gives high classification accuracy when compared against the accuracies given by three different classifiers on the datasets containing features extracted from fisher score method and dataset containing all the features or full-featured dataset.","PeriodicalId":294210,"journal":{"name":"2016 Ninth International Conference on Contemporary Computing (IC3)","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133133897","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Prioritizing and optimizing risk factors in agile software development","authors":"R. Agrawal, Deepali Singh, Ashish Sharma","doi":"10.1109/IC3.2016.7880232","DOIUrl":"https://doi.org/10.1109/IC3.2016.7880232","url":null,"abstract":"To ensure success and quality of a software, early identification and prioritization of the risk is necessary. Risk impacts the cost and duration for of a software. As agile practices of software development prevail over traditional software development, so they are used in present scenario. This paper proposes an Agile based Risk Rank (AR-Rank) method for the prioritization of risk factors in agile software development. To reduce the impact of risks, the proposed method provides precedence ranking of risk factors from high to low. Therefore, the goal of proposed method is to provide minimum risk-free software on time with varying degree of flexibility. For optimization of risk factors, the Particle Swarm Optimization (PSO) is applied as an iterative approach. The proposal is compared proposed with various prevalent approaches as proposed in past.","PeriodicalId":294210,"journal":{"name":"2016 Ninth International Conference on Contemporary Computing (IC3)","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131209994","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Wheelchair control using speech recognition","authors":"P. Ghule, M. Bhalerao, Rajan H. Chile, V. G. Asutkar","doi":"10.1109/IC3.2016.7880214","DOIUrl":"https://doi.org/10.1109/IC3.2016.7880214","url":null,"abstract":"In this paper a speech controlled wheelchair for physically disabled person is developed which can be used for different languages. A speech recognition system using Mel Frequency Cepstral Coefficients (MFCC) was developed in the laptop with an interactive and user friendly GUI and the normal wheelchair was converted to an electric wheelchair by applying a gear mechanism to the wheels with DC motor attached to the gear. An Arduino Uno board is used to acquire the control signal from MATLAB and give it to relay driver circuit which intern results in the motion of wheelchair in desired direction. The speech inputs such as forward, back, left, right and stop acquired from the user and then the motion of the wheelchair made according to the respective command. The performance of MFCC in presence of noise and for different languages was studied to know the reliability of the algorithm in different condition.","PeriodicalId":294210,"journal":{"name":"2016 Ninth International Conference on Contemporary Computing (IC3)","volume":"61 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116339436","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Image forgery detection using Markov features in undecimated wavelet transform","authors":"Saurabh Agarwal, S. Chand","doi":"10.1109/IC3.2016.7880221","DOIUrl":"https://doi.org/10.1109/IC3.2016.7880221","url":null,"abstract":"Image forgery has become common due to the availability of high-quality image editing softwares. For detecting image forgery there is a need to have important features of the image. For obtaining the image features we need a suitable transform. One of the important and commonly used transform is discrete wavelet transform that can provide spatial and frequency related information of a signal. However, it provides ambiguous information due to its shift variant property. This ambiguity can be overcome using the UWT due to its shift invariance property. In forgery, some operations are applied to an image at different locations. For instance same type of details at different locations, the UWT provides the output of same nature, whereas the DWT doesn't. Due to this property, the features extracted in undecimated wavelet transform (UWT) domain provide better results in many applications like denoising, change detection, etc. In this paper, image features are extracted using the Markov model after transforming it into UWT domain. To evaluate the performance CASIA v1.0, Columbia Color and DSO-1 databases are used. The support vector machine with the linear kernel applied to separate the forged and pristine images. 
We experimentally obtain better results using the UWT transform as compared to the DWT transform on all these three databases.","PeriodicalId":294210,"journal":{"name":"2016 Ninth International Conference on Contemporary Computing (IC3)","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122426423","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An effective multi-objective workflow scheduling in cloud computing: A PSO based approach","authors":"Shubham, Rishabh Gupta, Vatsal Gajera, P. K. Jana","doi":"10.1109/IC3.2016.7880196","DOIUrl":"https://doi.org/10.1109/IC3.2016.7880196","url":null,"abstract":"Cloud computing has emerged as prominent paradigm in distributed computing which provides on-demand services to users. It involves challenging areas like workflow scheduling to decide the sequence in which the applications are to be scheduled on several computing resources. Due to NP-complete nature of workflow scheduling, finding an optimal solution is very challenging task. Thus, a meta-heuristic approach such as Particle Swarm Optimization (PSO) can be a promising technique to obtain a near-optimal solution of this problem. Several workflow scheduling algorithms have been developed in recent years but quite a few of them focuses on two or more parameters of scheduling at a time like usage cost, makespan, utilization of resource, load balancing etc. In this paper, we present a PSO based workflow scheduling which consider two such conflicting parameters i.e., makespan and resource utilization. With meticulous experiments on standard workflows we find that our proposed approach outperforms genetic algorithm based workflow scheduling in all cases achieving 100% results.","PeriodicalId":294210,"journal":{"name":"2016 Ninth International Conference on Contemporary Computing (IC3)","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132573854","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Web services based path guidance to rescue team alert system during flood","authors":"Manik Chandra, R. Niyogi","doi":"10.1109/IC3.2016.7880261","DOIUrl":"https://doi.org/10.1109/IC3.2016.7880261","url":null,"abstract":"In order to save the lives and infrastructure of a flood affected area, identification of the area and rescue team is needed. Along with a trained rescue team a quick and relevant intimation system is also needed so that the rescue operation can be carried out quickly. In this study, we propose a web based Rescue Alert System (RAS). RAS collects data from different locations of the affected site and analyses the data to detect the disastrous situation. After knowing the location of affected area, the RAS finds out shortest path from affected area to the current position of rescue team and provides the path to the rescue team. we design, DataAnalyzer, PathFinder, and PathValidator web services; the orchestration of the services is carried out using java based OpenESB framework.","PeriodicalId":294210,"journal":{"name":"2016 Ninth International Conference on Contemporary Computing (IC3)","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126187294","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Identification and removal of different noise patterns by measuring SNR value in magnetic resonance images","authors":"R. B. Yadav, Subodh Srivastava, R. Srivastava","doi":"10.1109/IC3.2016.7880212","DOIUrl":"https://doi.org/10.1109/IC3.2016.7880212","url":null,"abstract":"In MR image Rician noise is one of the prominent noise, however Gaussian and Rayleigh noise are also present. These types of noises in the MRI can be identified by measuring SNR value of image data. In the literature, there are many methods available to remove Rician noise. But little method has been reported for the removal of Rayleigh and Gaussian noise in MRI. So in this paper we concentrate on removal of Rayleigh and Gaussian noise from MRI. This method is automatically identify various type of noise present into the MRI and filters them by choosing an appropriate filter. The proposed filter consists of two terms namely data fidelity and prior. The data fidelity term i.e. likelihood term is derived from Gaussian pdf and Rayleigh pdf and a nonlinear complex diffusion (CD) based prior is used. The performance analysis and comparative study of the proposed method with other standard methods is presented for Brain Web dataset at varying noise levels in terms of MSE and SSIM. 
From the simulation results, it is observed that the proposed framework with CD based prior is performing better in comparison to other priors in consideration.","PeriodicalId":294210,"journal":{"name":"2016 Ninth International Conference on Contemporary Computing (IC3)","volume":"49 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115907535","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}