{"title":"Development of a non-invasive screening device of diabetic peripheral neuropathy based on the perception of micro-vibration","authors":"H. Sawada, K. Uchida, J. Danjo, Yu Nakamura","doi":"10.1109/CIBCB.2016.7758107","DOIUrl":"https://doi.org/10.1109/CIBCB.2016.7758107","url":null,"abstract":"An estimated number of 420 million people in the world had diabetes mellitus in 2014, which had become quadruple since 1980, and the number is estimated to be 700 million by 2025. Diabetes mellitus is a group of metabolic diseases, which causes high blood sugar to a person, due to the functional problems of the pancreas or the metabolism. Patients of untreated diabetes would be damaged by the high blood sugar in vessels, and this starts to destroy capillary vessels to lower the sensitivity of tactile sensations, then effects to various organs and nerve systems. Diabetic peripheral neuropathy is one of the complications of diabetes mellitus, and the authors pay attention to the decline of the sensitivity of tactile sensations in the early stage of diabetes. By using a novel micro-vibration actuator that employs a shape-memory alloy wire, we develop a non-invasive screening device of the level of diabetes based on the perception of micro-vibration patterns. Experiments are conducted in a medical clinic, and the relation between the tactile stimuli and the medical diagnosis of diabetes are examined.","PeriodicalId":368740,"journal":{"name":"2016 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB)","volume":"35 2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123292513","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mining distinctive DNA patterns from the upstream of human coding&non-coding genes via class frequency distribution","authors":"Jing-doo Wang, Wen-Ling Chan, Charles C. N. Wang, Jan-Gowth Chang, J. Tsai","doi":"10.1109/CIBCB.2016.7758114","DOIUrl":"https://doi.org/10.1109/CIBCB.2016.7758114","url":null,"abstract":"The upstream of genes are expected to contain many still unknown regulatory regions that can increase or decrease the expression of specific genes. The processes of mining distinctive patterns (region) are to extract maximal repeats (patterns) from the upstream DNA sequences of human genes, and then filter out the patterns whose class frequency distribution can fit in with that is specified by domain experts; the class frequency distribution of one pattern is the frequencies of that pattern appearing in each of classes. The computation of extracting maximal repeats and meanwhile computing their class frequency distribution can be done by a scalable approach based on a previous work via MapReduce programming model. Experimental resources include the DNA sequences extracted from the upstream 5, 000 bp DNA sequences of 49, 267 human coding&non-coding genes. The classes of human genes are divided into four classes as “non-cancer related protein-coding gene”, “oncogene”, “tumor suppressor gene” and “non-coding genes”(RNA). Experimental results show that 17 distinctive patterns selected as core patters whose length is longer than 36 bp and, appear in more than 3, 000 genes and in all of four classes. To have more specific observation, there are 22 distinctive patterns selected that appear in at least 10 genes and whose lengths are greater than 15 bp and, most of all, just happen in two classes, “oncogene” and “tumor suppressor gene”. It is very attractive and expected to extend this approach to mine for another types of distinctive patterns, e.g. biomarkers, via this approach based on class frequency distribution of selected patterns if the targeted resources of genomic sequences, containing “genotypes”, are available and each of these sequences is labeled precisely according to the features, e.g. “phenotypes”, specified by domain experts in the future.","PeriodicalId":368740,"journal":{"name":"2016 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB)","volume":"37 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115320807","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Comparing features from ECG pattern and HRV analysis for emotion recognition system","authors":"H. Ferdinando, T. Seppänen, E. Alasaarela","doi":"10.1109/CIBCB.2016.7758108","DOIUrl":"https://doi.org/10.1109/CIBCB.2016.7758108","url":null,"abstract":"We propose new features for emotion recognition from short ECG signals. The features represent the statistical distribution of dominant frequencies, calculated using spectrogram analysis of intrinsic mode function after applying the bivariate empirical mode decomposition to ECG. KNN was used to classify emotions in valence and arousal for a 3-class problem (low-medium-high). Using ECG from the Mahnob-HCI database, the average accuracies for valence and arousal were 55.8% and 59.7% respectively with 10-fold cross validation. The accuracies using features from standard Heart Rate Variability analysis were 42.6% and 47.7% for valence and arousal respectively for the 3-class problem. These features were also tested using subject-independent validation, achieving an accuracy of 59.2% for valence and 58.7% for arousal. The proposed features also showed better performance compared to features based on statistical distribution of instantaneous frequency, calculated using Hilbert transform of intrinsic mode function after applying standard empirical mode decomposition and bivariate empirical mode decomposition to ECG. We conclude that the proposed features offer a promising approach to emotion recognition based on short ECG signals. The proposed features could be potentially used also in applications in which it is important to detect quickly any changes in emotional state.","PeriodicalId":368740,"journal":{"name":"2016 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125888395","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Control strategies for intelligent adjustment of pressure in intermittent pneumatic compression systems","authors":"Quanli Qiu, Pandeng Zhang, Jia Liu","doi":"10.1109/CIBCB.2016.7758095","DOIUrl":"https://doi.org/10.1109/CIBCB.2016.7758095","url":null,"abstract":"Deep vein thrombosis (DVT) often applies to patients in a long-term supine position in bed. This usually results in slowing down the venous blood velocity and therefore causing DVT. An intermittent pneumatic compression (IPC) treatment system has been used to prevent DVT. It is thus critical to control the pattern of pressure inflation/deflation in order to enhance the venous hemodynamics. The purpose of this research is to design algorithms in real-time to control the pressure in cuffs. We proposed two different methods. In contrast with current PID control, the result shows that our control methods make working pressure stays closer to the target pressure which equals to 45 mmHg. It has minimum standard variance about 2.97 mmHg. Therefore, idea gas law has better accuracy and stability.","PeriodicalId":368740,"journal":{"name":"2016 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133957349","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Protein secondary structure prediction through a novel framework of secondary structure transition sites and new encoding schemes","authors":"Masood Zamani, S. C. Kremer","doi":"10.1109/CIBCB.2016.7758118","DOIUrl":"https://doi.org/10.1109/CIBCB.2016.7758118","url":null,"abstract":"In this paper, we propose an ab initio two-stage protein secondary structure (PSS) prediction model through a novel framework of PSS transition site prediction by using Artificial Neural Networks (ANNs) and Genetic Programming (GP). In the proposed classifier, protein sequences are encoded by new amino acid encoding schemes derived from genetic Codon mappings, Clustering and Information theory. In the first stage, sequence segments are mapped to regions in the Ramachandran map (2D-plot), and weight scores are computed by using statistical information derived from clusters. In addition, score vectors are constructed for the mapped regions using the weight scores and PSS transition sites. The score vectors have fewer dimensions compared to those of commonly used encoding schemes and protein profile. In the second stage, a two-tier classifier is employed based on an ANN and a GP method. The performance of the two-stage classifier is compared to the state-of-the-art cascaded Machine Learning methods which commonly employ ANNs. The prediction method is examined with the latest dataset of nonhomologous protein sequences, PISCES [1]. The experimental results and statistical analyses indicate a significantly higher distribution of Q3 scores, approximately 7% with p-value <; 0.001, in comparison to that of cascaded ANN architectures. PSS transition sites are valuable information about the topological property of protein sequences and incorporating the information improves the overall performance of the PSS prediction model.","PeriodicalId":368740,"journal":{"name":"2016 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB)","volume":"218 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134312407","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A* fast and scalable high-throughput sequencing data error correction via oligomers","authors":"F. Milicchio, I. Buchan, M. Prosperi","doi":"10.1109/CIBCB.2016.7758117","DOIUrl":"https://doi.org/10.1109/CIBCB.2016.7758117","url":null,"abstract":"Next-generation sequencing (NGS) technologies have superseded traditional Sanger sequencing approach in many experimental settings, given their tremendous yield and affordable cost. Nowadays it is possible to sequence any microbial organism or meta-genomic sample within hours, and to obtain a whole human genome in weeks. Nonetheless, NGS technologies are error-prone. Correcting errors is a challenge due to multiple factors, including the data sizes, the machine-specific and non-at-random characteristics of errors, and the error distributions. Errors in NGS experiments can hamper the subsequent data analysis and inference. This work proposes an error correction method based on the de Bruijn graph that permits its execution on Gigabyte-sized data sets using normal desktop/laptop computers, ideal for genome sizes in the Megabase range, e.g. bacteria. The implementation makes extensive use of hashing techniques, and implements an A* algorithm for optimal error correction, minimizing the distance between an erroneous read and its possible replacement with the Needleman-Wunsch score. Our approach outperforms other popular methods both in terms of random access memory usage and computing times.","PeriodicalId":368740,"journal":{"name":"2016 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB)","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133056628","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Epileptic seizure prediction using zero-crossings analysis of EEG wavelet detail coefficients","authors":"Sahar Elgohary, S. Eldawlatly, M. Khalil","doi":"10.1109/CIBCB.2016.7758115","DOIUrl":"https://doi.org/10.1109/CIBCB.2016.7758115","url":null,"abstract":"Predicting the occurrence of epileptic seizures can provide an enormous aid to epileptic patients. This paper introduces a novel patient-specific method for seizure prediction applied to scalp Electroencephalography (EEG) signals. The proposed method relies on the count of zero-crossings of wavelet detail coefficients of EEG signals as the major feature. This is followed by a binary classifier that discriminates between preictal and interictal states. The proposed method is practical for real-time applications given its computational efficiency as it uses an adaptive algorithm for channel selection to identify the optimum number of needed channels. Moreover, this method is robust against the variability across seizures for the same patient. Applied to data from 8 patients, the proposed method achieved high accuracy and sensitivity with an average accuracy of 94% and an average sensitivity of 96%. These results were obtained using only 10 minutes of training data as opposed to using hours of recordings typically used in traditional approaches.","PeriodicalId":368740,"journal":{"name":"2016 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB)","volume":"124 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116431758","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Parallel implementation of efficient search schemes for the inference of cancer progression models","authors":"Daniele Ramazzotti, Marco S. Nobile, P. Cazzaniga, G. Mauri, M. Antoniotti","doi":"10.1109/CIBCB.2016.7758109","DOIUrl":"https://doi.org/10.1109/CIBCB.2016.7758109","url":null,"abstract":"The emergence and development of cancer is a consequence of the accumulation over time of genomic mutations involving a specific set of genes, which provides the cancer clones with a functional selective advantage. In this work, we model the order of accumulation of such mutations during the progression, which eventually leads to the disease, by means of probabilistic graphic models, i.e., Bayesian Networks (BNs). We investigate how to perform the task of learning the structure of such BNs, according to experimental evidence, adopting a global optimization meta-heuristics. In particular, in this work we rely on Genetic Algorithms, and to strongly reduce the execution time of the inference-which can also involve multiple repetitions to collect statistically significant assessments of the data-we distribute the calculations using both multi-threading and a multi-node architecture. The results show that our approach is characterized by good accuracy and specificity; we also demonstrate its feasibility, thanks to a 84× reduction of the overall execution time with respect to a traditional sequential implementation.","PeriodicalId":368740,"journal":{"name":"2016 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129452319","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Compressed sensing denoising for segmentation of localization microscopy data","authors":"K. P. Aschenbrenner, Sebastian Butzek, C. Guthier, Matthias Krufczik, M. Hausmann, F. Bestvater, J. Hesser","doi":"10.1109/CIBCB.2016.7758097","DOIUrl":"https://doi.org/10.1109/CIBCB.2016.7758097","url":null,"abstract":"Localization microscopy (LM) allows to acquire pointillistic superresolution images of biological structures on the nanoscale. However, current structure reconstruction and segmentation approaches suffer from either exclusion of small structures or strong dependence on a-priori knowledge. We propose reconstruction methods based on compressed sensing (CS) denoising in combination with the isodata threshold for segmentation. The methods are verified on artificial test data. For the denoising, a Haar dictionary and a KSVD dictionary learning on artificial data are used. Both methods perform significantly better than the reference algorithm, a linear density filter, in terms of root-mean-square deviation from the ground truth. Furthermore, exemplary results on real LM data of irradiated cell nuclei with Heterochromatin labeling make small structures visible that are suppressed by the reference method. CS denoising demonstrates promising results for reconstruction of LM data.","PeriodicalId":368740,"journal":{"name":"2016 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128729234","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Automatic lung segmentation in chest radiographs using shadow filter and local thresholding","authors":"P. Pattrapisetwong, W. Chiracharit","doi":"10.1109/CIBCB.2016.7758113","DOIUrl":"https://doi.org/10.1109/CIBCB.2016.7758113","url":null,"abstract":"Lung segmentation is one of the essential steps in order to develop a Computer-aided Diagnosis (CAD) system for detection of some chest diseases in chest radiographs such as tuberculosis, lung cancer, atelectasis, etc. This paper proposes an unsupervised learning method for lung segmentation in chest radiographs based on shadow filter and local thresholding. The approach consists of three processes: pre-processing, initial lung field estimation and noise elimination. For the first step, the original images are resized and contrast enhanced. Then, each lung outlines are enhanced by shadow filter. The initial lung field estimation are obtained based on local thresholding, delete outer body regions, fill holes and filter regions from their property. However, noise has occurred in the result. To eliminate the noise, morphological operations techniques are used. To evaluate the performance, the proposed method was tested on a public JSRT dataset of 247 chest radiographs. The performance measures of proposed method (overlap, accuracy, sensitivity, specificity, precision, and F-score) are above 90%. The accuracy and overlap are 96.95% and 90.32% respectively with the average execution time of 18.68 s for 512 by 512 pixels resolutions. According to experimental results, our proposed method is unsupervised learning method, no training required and performed accurately.","PeriodicalId":368740,"journal":{"name":"2016 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB)","volume":"79 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116415633","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}