{"title":"A rectification algorithm for distorted images from the inclination plane paper","authors":"Xuejing Dai, Chengqing Tang, P. Sun","doi":"10.1109/ICCDA.2010.5540856","DOIUrl":"https://doi.org/10.1109/ICCDA.2010.5540856","url":null,"abstract":"In this paper, we deal with the problem of image rectification with the images captured by digital cameras at an angle. The procedure of forensic photography requires that the film plane should be paralleled to a taken image. Another procedure must be followed when the print is located on reflexible surfaces such as vehicles, or faint marks on porous surfaces. Examinations were made of the evidential value of mark images which were received from the scene or taken deliberately at an angle out of proper perspective (i.e., the lens axis is not perpendicular to the target plane). Based on geometry of differential and projection as well as the theory of imaging optics, a robust and fast inclination rectification algorithm is proposed using perspective transform of camera lens, and validated using MATLAB.","PeriodicalId":190625,"journal":{"name":"2010 International Conference On Computer Design and Applications","volume":"69 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117325835","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An approach based on Decision Tree to agricultural land grading","authors":"Jiejun Huang, Yanbin Yuan, Wei-hong Cui, Y. Zhan","doi":"10.1109/ICCDA.2010.5541422","DOIUrl":"https://doi.org/10.1109/ICCDA.2010.5541422","url":null,"abstract":"Land resources assessment is the premise and basis for the sustainable utilization of land resources. As a classification technology, Decision Tree has already been applied wildly in the area of information classification. This paper firstly introduces the fundamental theory and characteristics of Decision tree as well as its learning process. Then Iterative Dichotomiser 3 (ID3) algorithm and Decision tree pruning algorithm are combined to construct the Decision tree for agricultural land grading with a good result. The sampling and testing result shows that the accuracy in this case reaches as high as 86%. It can be easily concluded that Decision tree is an effective way for the agriculture land grading.","PeriodicalId":190625,"journal":{"name":"2010 International Conference On Computer Design and Applications","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123556105","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A new high-level security portable system based on USB Key with fingerprint","authors":"Guodong Li, Hu Chen","doi":"10.1109/ICCDA.2010.5541189","DOIUrl":"https://doi.org/10.1109/ICCDA.2010.5541189","url":null,"abstract":"Nowadays, information security is always the most important topic. With the help of the USB Key and fingerprint, people have developed kinds of authentication solutions [1, 2, 3] to protect the confidential information against unauthorized access. After some research, we found that the above solutions have two serious problems. One is that the sensitive information is not encrypted; the other one is that the environment is insecure. In this paper, we developed a new high-level security portable system to solve these problems. The system is a combination of hardware and software. On the one hand, based on the USB Key and fingerprint scanner, we integrated a plug-and-play USB2.0 device without driver installation. On the other hand, by rebuilding the syslinux boot loader and customizing the Puppy Linux and doing some special works on the USB flash partition, we have developed and integrated a fingerprint-based authentication security portable custom-built software environment inside the USB device. With the help of the system, the users can work at a movable security environment freely, as well as the high-level protection of confidential information.","PeriodicalId":190625,"journal":{"name":"2010 International Conference On Computer Design and Applications","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122003894","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Estimating job execution time and handling missing job requirements using rough set in grid scheduling","authors":"S. Thamarai Selvi, M. Sheeba Santha Kumari, K. Prabavathi, G. Kannan","doi":"10.1109/ICCDA.2010.5541135","DOIUrl":"https://doi.org/10.1109/ICCDA.2010.5541135","url":null,"abstract":"Efficient scheduling of jobs in grid environment is a challenging task. To perform better resource utilization and proper resource allocation, the factor job runtime is essential. Accurate estimation of runtime helps to reserve resources in advance, provide user level QoS. But it is difficult to estimate the runtime of data intensive applications. Users are required to provide the runtime estimate of the job, but the user given estimates are inaccurate leading to poor scheduling. In this paper, we have used rough set techniques to analyse the history of jobs and estimate the runtime of the job. This requires maintaining a history of jobs that have executed along with their respective runtime. Our proposed rough set engine groups similar jobs and identifies the group to which the newly submitted job belongs. Based on this similar group identified, the runtime is estimated. Mostly users are not aware of resources, submitting incomplete job requirements. These missing job requirements affect data analysis. Those missing values should be accurately predicted. Missing value handler designed using rough sets fills the most probable value for missing attributes and then the runtime is estimated.","PeriodicalId":190625,"journal":{"name":"2010 International Conference On Computer Design and Applications","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122022803","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"XML-based data mining design and implementation","authors":"Chen Qi, Hou Ming","doi":"10.1109/ICCDA.2010.5540744","DOIUrl":"https://doi.org/10.1109/ICCDA.2010.5540744","url":null,"abstract":"this paper studies the basic methods and techniques of XML-based Web data mining, describes data mining classification and process, as well as the related technologies of XML. On this basis, it designs an application system of XML in Web data mining and specifically provides the systemic and functional structure of it, finally, based on the MXL technology to achieve the Web log mining and improve the main algorithm.","PeriodicalId":190625,"journal":{"name":"2010 International Conference On Computer Design and Applications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125892274","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Hierarchical fingerprint quality estimation scheme","authors":"Zia U H. Saquib, S. Soni, R. Vig","doi":"10.1109/ICCDA.2010.5540717","DOIUrl":"https://doi.org/10.1109/ICCDA.2010.5540717","url":null,"abstract":"The performance of an automatic fingerprint authentication system relies heavily on the quality of the captured fingerprint images. The error rates of such systems can significantly be decreased just by detecting and removing poor quality images right at the early stages. In this paper, an effective quality estimation scheme is proposed, which performs analysis in hierarchical fashion. This multilayered quality analysis comprises eight quality measures, three at global level (whole image)) and five at local level (block-wise), each one giving its respective quality score. The scores at their respective levels are fused into a single quality score by their corresponding fusion engines at two different levels. Fusion engines are based on the ‘Weighted Sum rule’. The scheme is also evaluated using ‘Sum rule’. The proposed scheme is experimented against fingerprint Verification Competition (FVC 2004) datasets. The experimental results show that this arrangement could correctly estimate and assign grades (good/acceptable/poor) to nearly 95% images in the dataset. Also, it is observed that this novel fusion-based hierarchical quality assessment model estimates fingerprint quality more effectively with ‘weighted sum rule’ than ‘sum rule’ and their individual counterparts.","PeriodicalId":190625,"journal":{"name":"2010 International Conference On Computer Design and Applications","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126002858","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A fast star pattern recognition algorithm based on feature vector","authors":"Qi-Shen Li, Chang-ming Zhu, Jun Guan","doi":"10.1109/ICCDA.2010.5541455","DOIUrl":"https://doi.org/10.1109/ICCDA.2010.5541455","url":null,"abstract":"Stars in a star map can be regarded as a point pattern, and we can utilize the matching of point pattern to recognize the star pattern. First, the nth Radius-Weighted-Mean Points (RWMPs) are proposed which are invariant to translation, rotation and scaling, and then, a RWMP-based feature vector is constructed which is still invariant to translation and rotation. The candidate referenced star images and their corresponding attitudes are obtained by computing the Euclidean distance between the viewed star image and each of the star images in the pattern database. The verification process is introduced to confirm the identification results. The simulation results indicates that the average identification rate of this algorithm can be enhanced 3.5% as compared to the grid algorithm at the same position noise level from 0 to 3 pixels, and the identification time of the proposed algorithm reduces to 1/5.","PeriodicalId":190625,"journal":{"name":"2010 International Conference On Computer Design and Applications","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125082366","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A culture-based study on information density of e-commerce websites","authors":"Junjie Chu, Guang Yang","doi":"10.1109/ICCDA.2010.5541492","DOIUrl":"https://doi.org/10.1109/ICCDA.2010.5541492","url":null,"abstract":"This study investigates cultural effects on computer performance of users to e-commerce websites with different information density, and ways to design appropriate interfaces to accommodate cultural effects in order to enhance computer performance for Chinese users. There are many differences in page layout and information density of Chinese and Western e-commerce websites. An experiment was conducted to examine the usability and reliability of the selected Chinese and Western e-commerce websites. Thirty Chinese participants in China participated in this study. Information search tasks with different information density websites were designed to study selected websites' usability. And questionnaires were used to research their reliability. Results indicate that, for Chinese users, higher information density website is associated with lower usability and higher reliability.","PeriodicalId":190625,"journal":{"name":"2010 International Conference On Computer Design and Applications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129699657","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Multi-parameter inverse analysis research based on Comsol Multiphysics and Matlab","authors":"Renmin Li, L. Fang, Yongfeng Deng, Songyu Liu","doi":"10.1109/ICCDA.2010.5541182","DOIUrl":"https://doi.org/10.1109/ICCDA.2010.5541182","url":null,"abstract":"This paper constructs a parameter inverse analysis model for the diffusion analysis experiments of multi-ion transport model. It transforms multi-parameter inverse analysis problem into an optimization problem for nonlinear constrained planning. The optimization principle, the sequential quadratic programming method (SQP) for optimization and its implementation method in Matlab are introduced. With Comsol Multiphysics, multi-ion transport model is analyzed positively and is edited into the objective function (File .M) that can be directly utilized by Matlab. Then the fmincon function of the nonlinear optimization of Matlab to solve and analyze the problems. The proliferation tests of NaCl salt solution in the clay are under inverse analysis to obtain macro-transport parameters. The analysis shows that this method has fast convergence and high accuracy. This method is also beneficial to the similar optimization analysis of multi-parameter inverse analysis problems and complex nonlinear multi-field coupling problems.","PeriodicalId":190625,"journal":{"name":"2010 International Conference On Computer Design and Applications","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129722124","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Study on digital processing technology of the multidimensional graph in Mechanical Design Handbook","authors":"Y. Yue, Jinye Wang, Guang Yang, B. Wei","doi":"10.1109/ICCDA.2010.5541516","DOIUrl":"https://doi.org/10.1109/ICCDA.2010.5541516","url":null,"abstract":"The graphs in Mechanical Design Handbook are mostly obtained through mathematical statistical analysis and curve fitting of the discrete experimental data, most of which have irregular or even changing constraints, but no obvious regularity. In manual design, designers can more easily obtain the relevant information from the graph by checking table or using the measuring tool to take points on the curve, but the efficiency is very low. To effectively solve this problem, we can use the computer's fast retrieval and automatic point selection and calculation to obtain the required information from the graph, but firstly we must digitize these graphs so as to convert them into the analytical expression group which computer can recognize. Digital processing technology of the multidimensional graph is explored in this paper.","PeriodicalId":190625,"journal":{"name":"2010 International Conference On Computer Design and Applications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129813165","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}