{"title":"Locality Features Encoding in Regularized Linear Representation Learning for Face Recognition","authors":"Waqas Jadoon, Haixian Zhang","doi":"10.1109/FIT.2013.42","DOIUrl":"https://doi.org/10.1109/FIT.2013.42","url":null,"abstract":"Regularized linear regression based representation techniques for face recognition (FR) have attracted considerable attention in recent years. The l1-regularized sparse representation based classification (SRC) method achieves state-of-the-art results in FR. However, several recent studies have shown that it is collaborative representation (CR), rather than the l1-regularization constraint on the representation, that plays the crucial role in the success of SRC at robust classification. In this paper, we propose a novel Robust Locality based Collaborative Representation (RLCR) method using a weighted regularized least square regression approach that incorporates the locality structure and feature variance among data elements into the linear representation. RLCR is an extension of the collaborative representation based classification (CRC) approach, a recently proposed fast alternative to SRC. The performance of CRC decreases dramatically when the feature dimension is low or the number of training samples per subject is limited; RLCR improves classification performance over the original CRC formulation. Experimental results on real-world face datasets, using both low-dimensional and high-dimensional linear feature spaces, demonstrate the effectiveness of the proposed method, which is found to be very competitive with state-of-the-art image classification methods.","PeriodicalId":179067,"journal":{"name":"2013 11th International Conference on Frontiers of Information Technology","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133469103","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Experimental Study on the Behavior of Received Signal Strength in Indoor Environment","authors":"K. Ullah, Igor Vitorio Custodio, Nadir Shah, E. Moreira","doi":"10.1109/FIT.2013.54","DOIUrl":"https://doi.org/10.1109/FIT.2013.54","url":null,"abstract":"Due to the widespread availability of WiFi networks in buildings, indoor location based systems have become a reality. To provide indoor location based services (LBS), finding the current location of a human, a computer, a mobile device, or equipment such as a small UAV (e.g., a quadcopter) is of great interest. The most prominent method for this purpose is received signal strength (RSS)-based localization from WiFi Access Points (APs) inside a building. A considerable amount of research has been carried out on estimating the current location and providing services based on different wireless technologies. On the other hand, little attention has been paid to studying and analyzing the behavior of the received signal strength itself. This is challenging because signal intensity can change very frequently due to environmental factors such as topology, temperature, and interaction with objects. In this paper, we study the behavior of WiFi signals in an indoor environment for RSS-based localization by analyzing signals from three different access points using a triangulation technique; our method is based on fingerprinting. The experimental results reveal that signal behavior changes very frequently. The results lead us to the conclusion that understanding signal behavior is important before estimating the current location and providing LBS.","PeriodicalId":179067,"journal":{"name":"2013 11th International Conference on Frontiers of Information Technology","volume":"84 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124933018","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On the Fly Test Suite Optimization with FuzzyOptimizer","authors":"A. A. Haider, A. Nadeem, S. Rafiq","doi":"10.1109/FIT.2013.26","DOIUrl":"https://doi.org/10.1109/FIT.2013.26","url":null,"abstract":"Requirements of software systems change frequently, even after the software has been developed. Regression testing is performed continuously to identify the undesired effects of these requirement changes on the already tested system. Test suites grow enormously with these changes due to the addition of new test cases for enhanced functionality. Optimizing the test suite to perform regression testing within budgetary and time restrictions is the natural choice for a tester, because a \"retest all\" test suite is uneconomical and unsuitable. Test suite optimization can be either static or on the fly. With on-the-fly optimization, the optimal suite keeps changing with the requirement changes, making it the preferable option for regression testing. Presently, only static test suite optimization approaches exist. We propose an application-specific, on-the-fly approach to the test suite optimization problem and have implemented it on an academic testing problem. We use fuzzy logic to optimize the test suite with multiple optimization objectives. Our approach successfully generates on-the-fly optimized test suites for changing requirements. In future work, we will apply this approach to considerably larger testing problems.","PeriodicalId":179067,"journal":{"name":"2013 11th International Conference on Frontiers of Information Technology","volume":"234 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114534274","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"bugMLX: Extended Software Bug Markup Language","authors":"Muhammad Bilal Bashir, Misbah Masood, Abdul Qadir, Umm-E.-Humaira, Maria Batool, Umar Abbas","doi":"10.1109/FIT.2013.47","DOIUrl":"https://doi.org/10.1109/FIT.2013.47","url":null,"abstract":"The World Wide Web (WWW) is producing an immense amount of data, making it difficult for users to manage, analyze, and extract useful information from it. A user may not be able to understand the data and concepts in a document produced by a user from a different domain. Moreover, data produced on heterogeneous systems cannot be integrated with ease. XML (Extensible Markup Language) helps in all these scenarios by letting users describe their data and concepts using tags. Data can be shared and integrated among heterogeneous systems using XML, and intelligent machines can be built to process and analyze data from XML documents. The Software bug Markup Language (bugML) is an effort to provide an XML structure for software bug information. We have extended this work and proposed additions to bugML so that bug information can be described comprehensively. We have also implemented a tool, named Auto-bugMLX, to automate our work.","PeriodicalId":179067,"journal":{"name":"2013 11th International Conference on Frontiers of Information Technology","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115185280","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Signal Processing Requirements and System Design for Sensor Arrays Using COTS Components","authors":"U. Hamid, Syed Ali Abbas","doi":"10.1109/FIT.2013.36","DOIUrl":"https://doi.org/10.1109/FIT.2013.36","url":null,"abstract":"This paper derives the signal processing requirements of the algorithms involved in sensor array systems so that an appropriate processing architecture can be designed and developed. The system design parameters taken into consideration include the number of sensors, sensor spacing, sensor array frequency, and sampling rate, along with the computational complexity of processing algorithms such as beamforming and spectral analysis. A passive sensor array configuration is presented for underwater surveillance systems, and its signal processing requirements in terms of data rates and processor loading are estimated for real-time operation. The paper presents a Commercial-off-the-Shelf (COTS) based processing system consisting of data acquisition boards, processing boards, and a display processor that meets the demands of real-time sensor array processing.","PeriodicalId":179067,"journal":{"name":"2013 11th International Conference on Frontiers of Information Technology","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129332352","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Adaptive Learning Automata for Genetic Operators Allocation Probabilities","authors":"Korejo Imtiaz Ali, K. Brohi","doi":"10.1109/FIT.2013.18","DOIUrl":"https://doi.org/10.1109/FIT.2013.18","url":null,"abstract":"Conventional genetic algorithms (GAs) use a single mutation operator for the whole population, meaning that all solutions in the population apply the same learning strategy. This may limit the adaptability of individual solutions, making complex situations difficult to handle. Different mutation operators have been suggested for GAs, but it is difficult to decide which operator should be used during the evolutionary process. In this paper, a fast learning automaton is applied in GAs to automatically choose the most suitable operator while solving the problem. Experimental results on different benchmark problems show that the proposed method achieves fast convergence and improves the performance of GAs.","PeriodicalId":179067,"journal":{"name":"2013 11th International Conference on Frontiers of Information Technology","volume":"95 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126141146","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"SOAD: Securing Oncology EMR by Anonymizing DICOM Images","authors":"Sidra Shahbaz, Asiah Mahmood, Z. Anwar","doi":"10.1109/FIT.2013.30","DOIUrl":"https://doi.org/10.1109/FIT.2013.30","url":null,"abstract":"In the prevailing healthcare industry, the demand for electronic medical records (EMRs) has increased in order to provide better healthcare to patients and convenient access to records. Healthcare providers are keen to move EMRs to the cloud. The cloud computing paradigm offers a shared environment for EMRs; however, along with its advantages it brings challenges such as the security and privacy of medical data. Our system provides appropriate management of oncology patient records and protects the privacy of patients' textual and DICOM (Digital Imaging and Communications in Medicine) image information by anonymizing them. We identify PDATA (Personal Data) and CDATA (Clinical Data) in DICOM images retrieved from a PACS (Picture Archiving and Communication System) server. Role-based policies are implemented and stored in the database, and are used for the anonymization of PDATA.","PeriodicalId":179067,"journal":{"name":"2013 11th International Conference on Frontiers of Information Technology","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122694943","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A novel approach for ensemble clustering of colon biopsy images","authors":"Saima Rathore, M. A. Iftikhar, M. Hussain, A. Jalil","doi":"10.1109/FIT.2013.12","DOIUrl":"https://doi.org/10.1109/FIT.2013.12","url":null,"abstract":"Colon cancer diagnosis based on microscopic analysis of biopsy samples is a common medical practice. However, the process is subjective, biased, and leads to interobserver variability. Further, histopathologists have to analyze many biopsy samples per day, so factors such as tiredness, experience, and workload also affect the diagnosis. These shortcomings call for a supporting system that can help histopathologists accurately determine cancer. Image segmentation is one technique that can help by efficiently segregating a colon biopsy image into its constituent regions and accurately localizing the cancer. In this work, we propose a novel colon biopsy image segmentation technique in which segmentation is posed as a classification problem. Local binary patterns (LBP), local ternary patterns (LTP), and Haralick features are extracted for each pixel of the colon biopsy images. Features are reduced using genetic algorithms and F-Score, and the reduced features are given as input to random forest, rotation forest, and rotation boost classifiers for segregating the image into normal, malignant, and connecting tissue components. The clustering performance is evaluated using segmentation accuracy and the Davies-Bouldin index (DBI). Classifier performance is also evaluated using receiver operating characteristic (ROC) curves and the area under the curve (AUC). It is observed that rotation boost in combination with F-Score segments the images better than the other classifiers.","PeriodicalId":179067,"journal":{"name":"2013 11th International Conference on Frontiers of Information Technology","volume":"6 3","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131496114","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Foreground Object Detection and Tracking for Visual Surveillance System: A Hybrid Approach","authors":"S. Oh, S. Javed, Soon Ki Jung","doi":"10.1109/FIT.2013.10","DOIUrl":"https://doi.org/10.1109/FIT.2013.10","url":null,"abstract":"Foreground detection is one of the fundamental preprocessing steps in many image processing and computer vision applications. In spite of significant efforts, however, slowly moving or temporarily stationary foregrounds remain a challenging problem. To address these problems, this paper presents a hybrid approach that combines background segmentation and long-term tracking with selective tracking and a reduced search area, allowing us to detect foreground objects robustly and effectively. Evaluation on realistic sequences from the i-LIDS dataset shows that the proposed methodology outperforms most of the state-of-the-art methods.","PeriodicalId":179067,"journal":{"name":"2013 11th International Conference on Frontiers of Information Technology","volume":"49 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115882223","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluating Security of Software Components Using Analytic Network Process","authors":"S. Nazir, Sara Shahzad, M. Nazir, Hanif ur Rehman","doi":"10.1109/FIT.2013.41","DOIUrl":"https://doi.org/10.1109/FIT.2013.41","url":null,"abstract":"The increasing use of Component Based Software Engineering (CBSE) has raised issues related to the security of software components. Several methodologies are used to evaluate the security of software components and of the base system with which they are integrated. The security characteristics of a component must be specified effectively and unambiguously. To facilitate software development, it is useful to have a method that evaluates the security of software components. The study presented here proposes the analytic network process (ANP) for component security evaluation. The method is applied using the ISO/IEC 27002 (ISO 27002) standard.","PeriodicalId":179067,"journal":{"name":"2013 11th International Conference on Frontiers of Information Technology","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122218562","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}