{"title":"Systematic literature review on search based software testing","authors":"A. B. Sultan, Samaila Musa, S. Baharom","doi":"10.15866/IRECOS.V12I5.16856","DOIUrl":"https://doi.org/10.15866/IRECOS.V12I5.16856","url":null,"abstract":"The use of random search is very poor at finding solutions when those solutions occupy a very small part of the overall search space. Test data may be found faster and more reliably if the search is given some guidance. This work is a paper that explains the application of metaheuristic techniques in search-based software testing. The paper systematically review 47 papers selected randomly from online databases and conference proceeding based on the metaheuristic search techniques that have been most widely applied to problem solving, the different fitness function used for test data selection in each of the metaheuristic technique, and the limitation in the use of each search-based technique for software testing. It was found that GA outperformed its counterparts SA, HC, GP and random search approaches in generating test data automatically, different approaches were used to make sure that test data are selected within shorter period of time and also with wider coverage of the paths based on the fitness function, and most of the limitations of the articles are the handling of complex data types, like array, object types, and branch coverage. The paper also provides areas of possible future work on the use of metaheuristic techniques in search-based software testing.","PeriodicalId":392163,"journal":{"name":"International Review on Computers and Software","volume":"72 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-09-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127016197","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Noise Radiation Evaluation from Refineries by Computer Simulation","authors":"M. Abdulkadir","doi":"10.15866/irecos.v12i5.16852","DOIUrl":"https://doi.org/10.15866/irecos.v12i5.16852","url":null,"abstract":"","PeriodicalId":392163,"journal":{"name":"International Review on Computers and Software","volume":"53 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-09-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122332678","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Adaptive image fusion scheme based on Contourlet Transform and Machine Learning","authors":"M. Malik, M. Gillani, A. Ulhaq","doi":"10.15866/IREA.V5I5.14711","DOIUrl":"https://doi.org/10.15866/IREA.V5I5.14711","url":null,"abstract":"Adaptive image fusion scheme based on the combination of contourlet transform, Kernel Principal Component Analysis (K-PCA), Support Vector Machine (SVM) and Mutual Information (MI) is proposed. Contourlet is well suited to image fusion scheme because of its properties, such as localization, multiresolution, directionality and anisotropy. K-PCA operates on low frequency subband to extract feature and SVM is applied to high frequency subbands to obtain a composite image with extended information. Moreover, Mutual Information (MI) is used to adjust the contribution of each source image in the final fused image. Performance evaluation is carried out by using recently developed metric, Image Quality Index (IQI). The proposed scheme outperforms previous approaches both subjectively and quantitatively, and this is evident from the experimental results and findings.","PeriodicalId":392163,"journal":{"name":"International Review on Computers and Software","volume":"62 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-09-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114229924","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mathematical Modeling and Experimental Evaluation of Optimal Agendas and Procedures for N-Issue Negotiation","authors":"Saidalavi Kalady, V. Govindan, A. T. Mathew","doi":"10.15866/IRECOS.V12I5.16854","DOIUrl":"https://doi.org/10.15866/IRECOS.V12I5.16854","url":null,"abstract":"","PeriodicalId":392163,"journal":{"name":"International Review on Computers and Software","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-09-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133353505","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Coding Categories Based Electrocardiogram (ECG) Lossy Compression Scheme for IoT Systems","authors":"A. Hatim, R. Latif, M. Arioua","doi":"10.15866/IRECOS.V12I4.12582","DOIUrl":"https://doi.org/10.15866/IRECOS.V12I4.12582","url":null,"abstract":"Nowadays, IoT is widely used for intelligent and distant monitoring. The IoT performances are mainly based on the wireless communication networks. This is the key stone of several applications in the medical applications like e health monitoring, vision and medical imaging. Several operations slow down such a communication systems. The most important one is the compression and decompression blocks. The paper presents a new ECG signal compressor/ decompressor. Low complexity and high accuracy are the principal characteristics of the introduced scheme. The proposed scheme is coding categories based. Low coding category and high coding category and a new frame format are defined. The new frame composition allows reaching high compression ratios. Tests are done using the physionet MIT-BIH and the PTB diagnostic databases. Over than 250 signals, with different cardiac pathologies were used for the tests. We reach a maximum compression ratio (CR) of 40 with a PRD of 0,5%. The introduced compressor outperforms the earlier techniques in the state of the art.","PeriodicalId":392163,"journal":{"name":"International Review on Computers and Software","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-07-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131663334","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Biometric Data Security Using Watermarking Based on Vector Quantization","authors":"A. Sabri, M. Ouslim","doi":"10.15866/IRECOS.V12I4.12737","DOIUrl":"https://doi.org/10.15866/IRECOS.V12I4.12737","url":null,"abstract":"In this paper, we present a new watermarking method applied to biometric data for the purpose of securing their transmission over a computer network. To perform this task, we combined the fingerprint modality with that of the iris. We proceeded by merging the two biometric signatures using the watermarking based on vector quantization. The proposed technique extracts the main characteristics from the fingerprint image to mask the binary iris code image. This iris code is obtained from the eye image after several transformations. This choice is found to be the obvious combination solution for the type of the manipulated standard images. The vector quantization method was implemented based on Voronoi diagram using a codebook generated from a new chaotic system. The robustness of this technique was extensively tested using several simulation scenarios handling adequate database images. The results show that the proposed method is robust enough against JPEG compression. Other tests covered also different simulated computer attacks of the watermarked image using several types of median, mean and Gaussian blur filters. In this case, the filter sizes are taken large enough i.e., (10 × 10). The overall obtained results are satisfactory and very encouraging.","PeriodicalId":392163,"journal":{"name":"International Review on Computers and Software","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-07-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129274786","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Detection and Demodulation of AM Signals Using Two Dimensional Spectral Surface","authors":"M. Al-Dwairi, D. Skopin","doi":"10.15866/IRECOS.V12I4.14709","DOIUrl":"https://doi.org/10.15866/IRECOS.V12I4.14709","url":null,"abstract":"This paper is designed to detect AM signals, evaluate its parameters and consequently provides demodulation procedure using the methods of Short Time Fourier Transform (STFT) and Discrete Fourier Transform (DFT) combined with two Dimensional Spectral Surface (2DSS). The proposed technique is noise robust since the energies of modulated signal and noise located in different position on 2DSS which could be discarded during the analysis. Comparing with the Hilbert Transform the proposed method can extract envelopes of different carriers transmitted in the same channel.","PeriodicalId":392163,"journal":{"name":"International Review on Computers and Software","volume":"79 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-07-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121774345","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Emergency Department Simulation: Proposed Model and Optimization","authors":"Soraia Oueida, Seifedine Kadry, Sorin Ionescu","doi":"10.15866/IRECOS.V12I4.13816","DOIUrl":"https://doi.org/10.15866/IRECOS.V12I4.13816","url":null,"abstract":"Emergency Department (ED), being a complex and critical entity of the healthcare system, is studied in this paper for several reasons. The main challenge faced by the ED is the growing number of patients who show up without any prior notice, the 24/7 operation of the ED and the open facility to any type of illness and all age categories. These challenges increased the waiting time and staff utilization rates in the ED. Therefore, patient flow is highly influenced resulting in unnecessary costs. The long waiting time is a major problem facing EDs nowadays and should be considered as a high priority in order to ensure patient satisfaction; knowing that patient LoS may also affect resource utilization rates and hospital revenue. In this study, simulation using Arena is used in order to build a realistic model for an ED at a hospital in North Lebanon. This model is then verified and validated in order to match the real system, where improvements can be suggested for a better patient flow process and management optimization. Improvements are proposed by running different simulations using Arena Process Analyzer tool and optimization is added in order to reach an optimal solution using Arena OptQuest tool.","PeriodicalId":392163,"journal":{"name":"International Review on Computers and Software","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-07-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124316630","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Novel Fault Tolerant Mechanism for Wireless Sensor Networks","authors":"Noureddine Moussa, Zakaria Hamidi-Alaoui, Abdelbaki El Belrhiti El Alaoui","doi":"10.15866/IRECOS.V12I3.12677","DOIUrl":"https://doi.org/10.15866/IRECOS.V12I3.12677","url":null,"abstract":"Failure of cluster heads in cluster-based wireless sensor networks is catastrophic since this type of nodes is responsible for collecting and aggregating data sensed by sensor nodes in order to send it to the sink node. Therefore, fault tolerance of cluster heads is an important issue in this type of networks. The existing fault tolerant mechanisms either consume considerably extra energy and time or require the use of supplementary material and software resources to detect and recover failures. In this paper, we propose a novel fault tolerant mechanism which deals with permanent and transient failures more efficiently. The performance of the proposed mechanism was tested by means of simulations and compared against the low-energy adaptive clustering hierarchy and informer homed routing protocols. Simulation results showed that our mechanism has better performance than these protocols in terms of energy and time costs needed to tolerate failures as well as the amount of data that reaches the sink.","PeriodicalId":392163,"journal":{"name":"International Review on Computers and Software","volume":"74 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-05-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125491138","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Enhanced DV-Hop for Nodes Localization in Static and Mobile Wireless Sensor Networks","authors":"Abdelali Hadir, K. Zine-dine, M. Bakhouya, J. E. Kafi, J. Gaber","doi":"10.15866/IRECOS.V12I3.12819","DOIUrl":"https://doi.org/10.15866/IRECOS.V12I3.12819","url":null,"abstract":"","PeriodicalId":392163,"journal":{"name":"International Review on Computers and Software","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-05-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124934958","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}