{"title":"Evolutionary approach for approximation of artificial neural network","authors":"S. Pal, Swati Vipsita, P. Patra","doi":"10.1109/IADCC.2010.5423015","DOIUrl":"https://doi.org/10.1109/IADCC.2010.5423015","url":null,"abstract":"Neural Network is an effective tool in the field of pattern recognition. The neural network classifies the pattern from the training data and recognizes if the testing data holds that pattern. The classical Back propagation (BP) algorithm is generally used to train the neural network for its simplicity. The basic drawbacks of this algorithm are its uncertainty, its long training time, and its tendency to converge to local optima rather than the global optimum. To overcome these drawbacks of the Back propagation (BP) algorithm, we use a hybrid evolutionary approach (GA-NN algorithm) to train neural networks. The aim of this algorithm is to find the optimized synaptic weights of the neural network so as to escape from local minima and overcome the drawbacks of BP. The implementation is done taking images as input in “.png” and “.tif” format.","PeriodicalId":249763,"journal":{"name":"2010 IEEE 2nd International Advance Computing Conference (IACC)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115207230","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An efficient decentralized Load Balancing Algorithm for grid","authors":"P. K. Suri, Manpreet Singh","doi":"10.1109/IADCC.2010.5423048","DOIUrl":"https://doi.org/10.1109/IADCC.2010.5423048","url":null,"abstract":"The management of resources and scheduling of computations is a challenging problem in a grid. Load balancing is essential for efficient utilization of resources and enhancing the performance of a computational grid. In this paper, we propose a decentralized grid model, as a collection of clusters. We then introduce a Dynamic Load Balancing Algorithm (DLBA) which performs intra-cluster and inter-cluster (grid) load balancing. DLBA considers the load index as well as other conventional influential parameters at each node for scheduling of tasks. Simulation results show that the proposed algorithm is feasible and improves the system performance considerably.","PeriodicalId":249763,"journal":{"name":"2010 IEEE 2nd International Advance Computing Conference (IACC)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130473639","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Script identification in a handwritten document image using texture features","authors":"P. Hiremath, J. Pujari, S. Shivashankar, V. Mouneswara","doi":"10.1109/IADCC.2010.5423028","DOIUrl":"https://doi.org/10.1109/IADCC.2010.5423028","url":null,"abstract":"Script identification for handwritten document images is an open document analysis problem. In this paper, we propose an approach to script identification for documents containing handwritten text using texture features. The texture features are extracted from the co-occurrence histograms of wavelet-decomposed images, which capture information about the relationship between each high-frequency subband and the low-frequency subband of the transformed image at the corresponding level. The correlation between subbands at the same resolution exhibits a strong relationship, indicating that this information is significant for characterizing a texture. The scheme is tested on seven Indian language scripts along with English. Our method is robust to the skew generated in the process of scanning a document and also to varying coverage of text. The experimental results demonstrate the effectiveness of the texture features in identification of handwritten scripts. Experiments are also performed considering multiple writers.","PeriodicalId":249763,"journal":{"name":"2010 IEEE 2nd International Advance Computing Conference (IACC)","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128358939","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"DCT based texture watermarking using GLCM","authors":"S. Kamble, S. Agarwal, V. Shrivastava, Vikas Maheshkar","doi":"10.1109/IADCC.2010.5423014","DOIUrl":"https://doi.org/10.1109/IADCC.2010.5423014","url":null,"abstract":"We present a novel DCT technique for digital watermarking of textured images based on the concept of the gray-level co-occurrence matrix (GLCM). We provide analysis to describe the behavior of the method in terms of correlation as a function of the offset for textured images. We compare our approach with other spatial and temporal domain watermarking techniques and demonstrate the potential for robust watermarking of textured images. Results from our extensive experiments indicate that our DCT approach is robust and secure against a wide range of image processing operations such as JPEG compression, additive noise, cropping, scaling, and rotation. The experimental results also show that the proposed scheme has good imperceptibility.","PeriodicalId":249763,"journal":{"name":"2010 IEEE 2nd International Advance Computing Conference (IACC)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127162711","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Detection and segmentation of lines and words in Gurmukhi handwritten text","authors":"Rajiv Kumar, Amardeep Singh","doi":"10.1109/IADCC.2010.5422927","DOIUrl":"https://doi.org/10.1109/IADCC.2010.5422927","url":null,"abstract":"A scanned text image is not editable: although it contains text, one cannot edit it or make any required change to the scanned document. This motivates optical character recognition (OCR), the process of recognizing a segmented part of a scanned image as a character. The overall OCR process consists of three major sub-processes: preprocessing, segmentation, and recognition. Of these, segmentation is the backbone of the OCR process and the most significant step, because if the segmentation is incorrect we cannot obtain correct results (garbage in, garbage out). Segmentation is also one of the most complex steps, and it is more difficult for handwritten documents, where only a few cues are available to guide segmentation. In this paper, we formulate an approach to segment a scanned document image. The approach initially treats the whole image as one large window. This window is then broken into smaller windows corresponding to lines; once the lines are identified, each window containing a line is used to find the words in that line, and finally the characters. For this purpose we use the concept of a variable-sized window, that is, a window whose size can be adjusted as needed. The concept was implemented and the results analyzed; after the analysis the concept was refined and tested on different documents, yielding reasonably good results.","PeriodicalId":249763,"journal":{"name":"2010 IEEE 2nd International Advance Computing Conference (IACC)","volume":"37 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122971595","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Modifications in Genetic Algorithm using additional parameters to make them computationally efficient","authors":"B. Sridharan","doi":"10.1109/IADCC.2010.5423037","DOIUrl":"https://doi.org/10.1109/IADCC.2010.5423037","url":null,"abstract":"This paper describes a novel approach towards the modification of Genetic Algorithms. The novelty of the modified Genetic Algorithm lies in the addition of a new parameter, the age of the chromosome, which determines its ability to reproduce. The concepts of dynamic population size and dynamic elitism size have also been introduced. The modified Genetic Algorithm converges to the near-optimum value at a faster rate, i.e., fewer generations are required for convergence, and due to the dynamic population size the results obtained are more accurate. Thus, the modified algorithm is observed to be computationally more efficient. The algorithm was tested on some standard functions and curves and the results were found to be highly satisfactory.","PeriodicalId":249763,"journal":{"name":"2010 IEEE 2nd International Advance Computing Conference (IACC)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122996063","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Signature replacement attack and its counter-measures","authors":"Subrata Sinha, S. Sinha","doi":"10.1109/IADCC.2010.5423006","DOIUrl":"https://doi.org/10.1109/IADCC.2010.5423006","url":null,"abstract":"A 2-tuple digital signature scheme has two elements: a message and a signature. A tampered message can be detected by decrypting the message digest, encrypted with the secret key of the signer, using the corresponding public key. On the contrary, if the signature element is replaced, this cannot be verified. We term this the signature replacement attack, hitherto not discussed in the literature. In case of a signature replacement attack, proof of origin is compromised. In this paper this attack is brought into focus for the first time. A digital signature solution resilient to the signature replacement attack is also proposed, in which a trusted central arbiter is used as an in-line TTP. However, the central arbiter becomes the main performance bottleneck. The problem is equally true for the XML signature scheme used in Web service security today. This paper therefore also proposes a solution with a BPEL process which acts as the central arbiter in the proposed special protocol.","PeriodicalId":249763,"journal":{"name":"2010 IEEE 2nd International Advance Computing Conference (IACC)","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133985021","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An advanced secure (t, n) threshold proxy signature scheme based on RSA cryptosystem for known signers","authors":"Raman Kumar, H. Verma","doi":"10.1109/IADCC.2010.5422940","DOIUrl":"https://doi.org/10.1109/IADCC.2010.5422940","url":null,"abstract":"In a (t, n) threshold proxy signature scheme based on RSA, any t or more proxy signers can cooperatively generate a proxy signature, while t-1 or fewer of them cannot. The threshold proxy signature scheme uses the RSA cryptosystem to generate the private and public keys of the signers [8]. In this article, we discuss the implementation and comparison of some threshold proxy signature schemes that are based on the RSA cryptosystem. The comparison is done on the basis of time complexity, space complexity, and communication overhead. We compare the performance of four schemes, Hwang et al. [1], Wen et al. [2], Geng et al. [3], and Fengying et al. [4], with the performance of a scheme proposed earlier by the authors of this article, and propose an advanced secure (t, n) threshold proxy signature scheme. In the proposed scheme, both the combiner and the secret share holder can verify the correctness of the information that they receive from each other. Therefore, the proposed scheme is secure and efficient against notorious conspiracy attacks.","PeriodicalId":249763,"journal":{"name":"2010 IEEE 2nd International Advance Computing Conference (IACC)","volume":"352 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124457912","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A weighted mean time Min-Min Max-Min selective scheduling strategy for independent tasks on Grid","authors":"S. Chauhan, R. Joshi","doi":"10.1109/IADCC.2010.5423047","DOIUrl":"https://doi.org/10.1109/IADCC.2010.5423047","url":null,"abstract":"With the emergence of Grid technologies, the problem of scheduling tasks in heterogeneous systems has been arousing attention. Task scheduling is an NP-complete problem [5], and it is more complicated under the Grid environment. To better use the tremendous capabilities of a Grid system, effective and efficient scheduling algorithms are needed. In this paper, we present a new heuristic scheduling strategy for independent tasks. The strategy is based on two traditional scheduling heuristics, Min-Min and Max-Min, and also considers the overall performance of machines to decide the scheduling sequence of tasks. We have evaluated our scheduling strategy within the grid simulator GridSim. We compared the results of our strategy with the existing Min-Min and Max-Min scheduling heuristics, and the results show that our strategy outperforms the existing ones in many cases.","PeriodicalId":249763,"journal":{"name":"2010 IEEE 2nd International Advance Computing Conference (IACC)","volume":"47 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128574584","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Automated analysis of gestational sac in medical image processing","authors":"V. Chakkarwar, M. Joshi, P. S. Revankar","doi":"10.1109/IADCC.2010.5422938","DOIUrl":"https://doi.org/10.1109/IADCC.2010.5422938","url":null,"abstract":"Ultrasonography is considered one of the most powerful techniques for imaging organs for an obstetrician and gynecologist. The first trimester of pregnancy is the most critical period in human existence. Evaluation of a first trimester pregnancy is usually indicated to confirm the presence, number, and location of pregnancies and the well-being of the pregnancy. The first measurable element of an early pregnancy is the gestational sac (gsac). The size of the gestational sac gives a measure of fetal age in early pregnancy, from which the estimated date of delivery (EDD) is predicted. Today, monitoring of the gestational sac is done manually, with human interaction. These methods involve multiple subjective decisions, which increase the possibility of interobserver error. Because manual measurement is tedious and time-consuming, an automated, computer-based method is desirable that gives accurate boundary detection and, consequently, an accurate diameter. Ultrasound images are characterized by speckle noise and by edge information that is weak and discontinuous; traditional edge detection techniques are therefore susceptible to spurious responses when applied to ultrasound imagery. The algorithm for finding the edges of the gsac is as follows. In the first step, contrast enhancement is applied, followed by filtering: the image is smoothed using a lowpass filter followed by a Wiener filter. The image is then segmented using thresholding, which leaves a large number of gaps due to the high intensity around the sac. These false regions are minimized by morphological reconstruction, and the boundaries are then detected using morphological operations. Knowledge-based filtering, which uses prior knowledge of the shape of the gestational sac, removes false boundaries: fragmented edges are removed first, and then the most circular shape is selected, since the sac is generally circular. Once the sac is located, its size is measured to predict the gestational age.","PeriodicalId":249763,"journal":{"name":"2010 IEEE 2nd International Advance Computing Conference (IACC)","volume":"76 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114861837","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}