{"title":"Network Pharmacology-Based Analysis on the Mechanism of Drug Pair of Astragalus Membranaceus and Acorus Tatarinowii in the Treatment of Alzheimer’s Disease","authors":"Xiuting Huang, Xiujin Zhang, Xiangning Li, Haozhen Wang, Chen Lu, Ziyin Lu, Xiuli Lu","doi":"10.3233/atde210250","DOIUrl":"https://doi.org/10.3233/atde210250","url":null,"abstract":"Alzheimer’s disease (AD) is a degenerative disease of central nervous system, which seriously threatens the life and health of the elderly people. It has been for long time that Traditional Chinese medicine (TCM) treatment for AD is effective. This study analyzed the potential target and molecular mechanism of the most often used drug pair of Astragalus membranaceus and Acorus tatarinowii to treat AD by network pharmacological method. Firstly, the method was performed to screen and sort out the active ingredients with good ADME properties and drug targets of Astragalus membranaceus and Acorus tatarinowii. Then, we searched for the disease targets related to AD, followed by the construction of the “active ingredients-target-disease” network. We implemented GO enrichment analysis and KEGG pathway enrichment analysis of related overlapped target proteins, and then constructed the “target-pathway” network diagram. Finally, the above overlapped target proteins are mapped to build a PPI high-position protein interoperability network, and we conducted the network topology analysis to screen out the core targets of Astragalus membranaceus-Acorus tatarinowii drug pair in the treatment of AD. According to network pharmacology, this research predicted the potential targets of the drug pair of Astragalus membranaceus and Acorus tatarinowii in the treatment of AD, and explored that Astragalus membranaceus-Acorus tatarinowii drug pair in the treatment of AD was the overall systematic regulating action of “multiple-ingredients, multiple-target and multiple-pathway”. It affords the reference for understanding the pathogenesis of AD and exploring new therapeutic methods and drug development in the future.","PeriodicalId":386877,"journal":{"name":"Computer Methods in Medicine and Health Care","volume":"12 10","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132580652","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"EHG-Based Preterm Delivery Prediction Algorithm Driven by Transfer Learning","authors":"Yanjun Deng, Yefei Zhang, Shenguan Wu, Lihuan Shao, Xiaohong Zhang","doi":"10.3233/atde210243","DOIUrl":"https://doi.org/10.3233/atde210243","url":null,"abstract":"Preterm delivery is currently a global concern of maternal and child health, which directly affects infants’ early morbidity, and even death in several severe cases. Therefore, it is particularly important to effectively monitor the uterine contraction of perinatal pregnant women, and to make effective prediction and timely treatment for the possibility of preterm delivery. Electromyography (EHG) signal, an important measurement to predict preterm delivery in clinical practice, shows obvious consistency and correlation with the frequency and intensity of uterine contraction. This paper proposed a deep convolution neural network (DCNN) model based on transfer learning. Specifically, it is based on the VGGNet model, combined with recurrence plot (RP) analysis and transfer learning techniques such as “Fine-tune”, marked as VGGNet19-I3. Optimized with the clinical measured term-preterm EHG database, it showed good auxiliary prediction performances in 78 training and test samples, and achieved a high accuracy of 97.00% in 100 validation samples.","PeriodicalId":386877,"journal":{"name":"Computer Methods in Medicine and Health Care","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125575536","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Cuff-Less Blood Pressure Estimation from Electrocardiogram and Photoplethysmography Based on VGG19-LSTM Network","authors":"Yanan Pu, Xiaoxue Xie, Ling Xiong, Heng Zhang","doi":"10.3233/atde210241","DOIUrl":"https://doi.org/10.3233/atde210241","url":null,"abstract":"In recent years, studies have found that the hierarchical neural network with LSTM network has higher accuracy than another feature engineering. Therefore, this paper first tries to build a multi-stage blood pressure estimation model through VGG19 and LSTM network. Based on the time node of the R wave peak in the QRS waveform in ECG, VGG19 is used to extract various higher-dimensional and rich life characteristics in the PPG signal segment by heartbeat as the unit and focus on processing the dynamics of SBP and DBP Correlation, finally use the LSTM model to extract the time dependence of the vital signs. Results: Experiments show that compared with similar multi-stage models, this model has higher accuracy. The performance of this method meets the Advancement of Medical Instrumentation (AAMI) standard and reaches the A level of the British Hypertension Society (BHS) standard. The average error and standard deviation of the estimated value of SBP were 1.7350 4.9606 mmHg, and the average error and standard deviation of the estimated value of DBP were 0.7839 2.7700 mmHg, respectively.","PeriodicalId":386877,"journal":{"name":"Computer Methods in Medicine and Health Care","volume":"60 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130278381","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Impact Measurement of a Collaborative Pathology Network and Its Digital Support","authors":"Paolo Locatelli, F. Cirilli, Fabio Chiodini","doi":"10.3233/atde210240","DOIUrl":"https://doi.org/10.3233/atde210240","url":null,"abstract":"The possibility to access healthcare fairly and equally among all the patients can be enhanced with the development of collaborative networks. To achieve their goals and exchange relevant information, they must be combined with a proper digital support. Several works dealing with this aspect can be found in literature; however, works defining a general methodological approach to design a digital solution for a collaborative network were not found. In addition to this, to assess the impact of a pathology network and its digital support, and ensure quality improvement as well as proper clinical outcomes, a suitable panel of key performance indicators (KPIs) should be designed. This paper describes a methodology to design a digital support of a collaborative pathology network, together with a set of KPIs to assess the impact of the pathology network and its digital solution. This approach was specifically applied for the Italian Rare Cancer Network in the context of the project “Italian Rare Cancer Network: Process monitoring and System Impact Assessment”.","PeriodicalId":386877,"journal":{"name":"Computer Methods in Medicine and Health Care","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114721126","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Recognition of Child Congenital Heart Disease Using Cardiac Cycle Segment of Electrocardiogram","authors":"Yunmei Du, Canhui Huang, Shuai Huang, Huiying Liang","doi":"10.3233/atde210249","DOIUrl":"https://doi.org/10.3233/atde210249","url":null,"abstract":"The results of previous studies showed that ECG could detect CHD in children with a detection rate of 76.43%. Although this result is better than the traditional CHD screening method, the sensitivity still needs to be improved if it is to be popularized clinically. Based on the previous ECG recording data, this study selects the more representative cardiac cycle segments to identify CHD, in order to achieve better screening effect. Firstly, better cardiac cycle segment data were extracted from ECG records of each patient. The final data set contains 72626 patients and each patient has a 9-lead ECG segment with duration of about one second. Then we trained a RoR network to identify the underlying patients with CHD using 62626 samples in the dataset. When tested on an independent set of 10000 patients, the network model yielded values for the sensitivity, specificity, and accuracy of 0.93, 86.3%, 85.7%, and 85.7% respectively. It can be seen that extracting more effective cardiac cycle fragments can significantly improve the sensitivity of CHD screening on the basis of ensuring better specificity, so as to find more potential patients with congenital heart disease.","PeriodicalId":386877,"journal":{"name":"Computer Methods in Medicine and Health Care","volume":"72 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130651906","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Dual Features Extraction Network for Image Super-Resolution","authors":"Guosheng Zhao, Kun Wang","doi":"10.3233/atde210239","DOIUrl":"https://doi.org/10.3233/atde210239","url":null,"abstract":"With the development of deep convolutional neural network, recent research on single image super-resolution (SISR) has achieved great achievements. In particular, the networks, which fully utilize features, achieve a better performance. In this paper, we propose an image super-resolution dual features extraction network (SRDFN). Our method uses the dual features extraction blocks (DFBs) to extract and combine low-resolution features, with less noise but less detail, and high-resolution features, with more detail but more noise. The output of DFB contains the advantages of low- and high-resolution features, with more detail and less noise. Moreover, due to that the number of DFB and channels can be set by weighting accuracy against size of model, SRDFN can be designed according to actual situation. The experimental results demonstrate that the proposed SRDFN performs well in comparison with the state-of-the-art methods.","PeriodicalId":386877,"journal":{"name":"Computer Methods in Medicine and Health Care","volume":"119 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122459060","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Network Pharmacology Study to Explore Mechanism of the Drug Pair of Astragalus-Saposhnikoviae Radix in the Treatment of Allergic Rhinitis","authors":"Chen-Xin Lu, Limin Ma, Haozhen Wang, Xiuting Huang, Xiujin Zhang, Ziyin Lu, Xiuli Lu","doi":"10.3233/atde210242","DOIUrl":"https://doi.org/10.3233/atde210242","url":null,"abstract":"Allergic rhinitis (AR) has now become one of the major diseases affecting people’s lives, and Traditional Chinese medicine (TCM) always has good efficacy in clinical treatment. In the present study, we analyzed the most frequently used drug pair of Astragalus-Saposhnikoviae Radix (SR) in prescriptions for the treatment of allergic rhinitis by network pharmacology to reveal the modern pharmacological mechanisms of drug prevention and treatment of the disease. Firstly, the 38 active ingredients with good ADME properties from the Astragalus-SR drug pair were collected from the database, and the collated drug targets of Astragalus and SR and the targets of allergic rhinitis were mapped against each other by the network visualization software Cytoscape, followed by the establishment of a “drug active ingredient-target-disease” network diagram and the construction of a high-confidence protein-protein interaction network. Then, the common targets obtained from the disease and drug active ingredients were imported by R language for GO enrichment analysis and KEGG pathway enrichment analysis. The KEGG pathways associated with the targets of Astragalus and SR for the treatment of allergic rhinitis obtained from R enrichment analysis were imported into Cytoscape, and the CytoNCA plug-in was loaded to construct a “target-pathway” network map, and the core target wogonin (FN1) was screened. These evidences suggest that the drug pair of Astragalus-SR works in a multi-component, multi-target and integrated modulation manner for the treatment of allergic rhinitis, which provides an important basis for the treatment of allergic rhinitis.","PeriodicalId":386877,"journal":{"name":"Computer Methods in Medicine and Health Care","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122842394","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Automatic Generalization of Residential Areas Based on “Paradigm” Theory and Big Data","authors":"Tian Duo, Peng Zhang","doi":"10.3233/atde210244","DOIUrl":"https://doi.org/10.3233/atde210244","url":null,"abstract":"“Paradigm” theory is an important ideological and practical tool for scientific research. The research means and methods of Geographic Information Science follow the laws of four paradigms. Automatic cartographic generalization is not only the key link of map making, but also a recognized difficult and hot issue. Based on large-scale map data and deep learning technology, an automatic cartographic generalization problem-solving model is proposed in this paper. According to the key and difficult problems faced by residential area selection and simplification, residential area selection models and simplification models based on big data and deep learning are constructed respectively, which provides new ideas and schemes to solve the key and difficult problems of residential area selection and simplification.","PeriodicalId":386877,"journal":{"name":"Computer Methods in Medicine and Health Care","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123754918","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Application of Artificial Intelligence in the Prevention, Diagnosis and Treatment of Alzheimer’s Disease: New Hope for Dealing with Aging in China","authors":"Hangtian Wang, Guofu Wang","doi":"10.3233/atde210248","DOIUrl":"https://doi.org/10.3233/atde210248","url":null,"abstract":"Alzheimer’s disease (AD) has become a major issue around world, including China. The two major challenges for AD are the difficulty in early detection and poor treatment outcomes. Over the past decades, artificial intelligence (AI) was more and more widely used in the prevention, diagnosis and treatment of AD, which might be helpful to deal with the aging of population in China. Here, after a systematic literature searching on three English databases (MEDLINE, EMBASE, the Cochrane library), we briefly reviewed recent progress on the utilization of AI in the susceptibility analysis, diagnosis and management of AD. However, it is still in its infancy. More researches should be performed to improve the prognosis of patients with AD in the future.","PeriodicalId":386877,"journal":{"name":"Computer Methods in Medicine and Health Care","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125317284","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Research on Design of Fog Computing Optimization Model for Medical Big Data","authors":"Baoling Qin","doi":"10.3233/atde210247","DOIUrl":"https://doi.org/10.3233/atde210247","url":null,"abstract":"Targeted at the current issues of communication delay, data congestion, and data redundancy in cloud computing for medical big data, a fog computing optimization model is designed, namely an intelligent front-end architecture of fog computing. It uses the network structure characteristics of fog computing and “decentralized and local” mind-sets to tackle the current medical IoT network’s narrow bandwidth, information congestion, heavy computing burden on cloud services, insufficient storage space, and poor data security and confidentiality. The model is composed of fog computing, deep learning, and big data technology. By full use of the advantages of WiFi and user mobile devices in the medical area, it can optimize the internal technology of the model, with the help of classification methods based on big data mining and deep learning algorithms based on artificial intelligence, and automatically process case diagnosis, multi-source heterogeneous data mining, and medical records. It will also improve the accuracy of medical diagnosis and the efficiency of multi-source heterogeneous data processing while reducing network delay and power consumption, ensuring patient data privacy and safety, reducing data redundancy, and reducing cloud overload. The response speed and network bandwidth of the system have been greatly optimized in the process, which improves the quality of medical information service.","PeriodicalId":386877,"journal":{"name":"Computer Methods in Medicine and Health Care","volume":"204 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129353119","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}