{"title":"Application of Intelligent Medical Self-Test Management","authors":"Chein-Ting Chen, Sheng-Huang Kuo","doi":"10.1109/ECBIOS57802.2023.10218467","DOIUrl":"https://doi.org/10.1109/ECBIOS57802.2023.10218467","url":null,"abstract":"COVID-19 has spread all over the world since 2019 with the number of confirmed cases of 670 million in January 2023 and the number of confirmed cases of 9.66 million in Taiwan. It resulted in 16,000 deaths (Taiwan Centers for Disease Control, 2023). Therefore, the concept and knowledge of preventive medicine, how to prevent, self-examine, and be self-aware, has become a critical topic. Therefore, taking management as the starting point, we collected relevant information from literature and expert interviews and established a set of “intelligent medical self-test management applications” on the Android system. When the symptoms of physical pains are input, the mobile phone intelligently self-detects possible diseases and provides a reference for medical treatment at the hospital located near the mobile phone's location. The self-detection function and early medical treatment are allowed with the system to protect and maintain the health of people.","PeriodicalId":334600,"journal":{"name":"2023 IEEE 5th Eurasia Conference on Biomedical Engineering, Healthcare and Sustainability (ECBIOS)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-06-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114970583","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Deep Learning-Assisted Lung Cancer Diagnosis from Histopathology Images","authors":"Chun-Cheng Peng, Jiawei Wu","doi":"10.1109/ECBIOS57802.2023.10218594","DOIUrl":"https://doi.org/10.1109/ECBIOS57802.2023.10218594","url":null,"abstract":"Early detection plays a critical role in enhancing patient survival rates as lung cancer continues to pose a significant global health challenge and remains one of the primary contributors to cancer-related mortality. Deep learning techniques are promising as they assist doctors in disease diagnosis, especially in medical imaging. In this research, we employed a dataset comprising histopathology images of lung cancer and colon cancer from Kaggle. The data encompassed five distinct categories of the tissues of the lung and colon. To classify the images, we used a deep learning methodology that leveraged the pre-trained neural network known as AlexNet. After fine-tuning the proposed model by substituting the last fully-connected layer, we optimized the parameters using the SGDM optimizer. As a result, the overall accuracy of the method reached 99.46%. Across all considered categories, the lung benign group performed best with 100% in terms of accuracy. The overall accuracy of this research surpassed that of three previously published journal papers and six conference papers, effectively proving the remarkable capability of deep learning in accurately classifying lung cancer images. In conclusion, this research result underscores the potential of deep learning in supporting medical professionals to diagnose lung cancer.","PeriodicalId":334600,"journal":{"name":"2023 IEEE 5th Eurasia Conference on Biomedical Engineering, Healthcare and Sustainability (ECBIOS)","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-06-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131246712","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Acquisition and Remote Transfer of Operative Field View During Open Surgery","authors":"Malek Anabtawi, Jhasketan Padhan, A. Al-Ansari, Zhigang Deng, Elias Yaacoub, A. Mohammed, N. Navkar","doi":"10.1109/ECBIOS57802.2023.10218549","DOIUrl":"https://doi.org/10.1109/ECBIOS57802.2023.10218549","url":null,"abstract":"Telementoring in open surgery has emerged as an alternative to on-site teaching. To ensure effective communication between the mentee and the mentor, the view of surgical operations in the operating room needs to be shared remotely in real-time. This study is carried out to propose a multi-threaded system that takes input from multi-color/depth cameras, aligns and creates a single point cloud representing the operative field in three-dimension, transfers it over the network, and reconstructs it for visualization for mentors. The system's performance is measured in terms of execution time of each threaded unit, and latency for transfer of multi-color/depth data over the network. The achieved registration accuracy is benchmarked for multiple cameras. The results show the potential usage of the proposed system for near real-time telementoring during open surgeries.","PeriodicalId":334600,"journal":{"name":"2023 IEEE 5th Eurasia Conference on Biomedical Engineering, Healthcare and Sustainability (ECBIOS)","volume":"48 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-06-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121959143","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Design of Emote Education Platform for Traditional Chinese Medicine Pharmacology Based on VR Technology","authors":"Chuying Wang, Yajie Zhang, Lijing Li, Yumei Li","doi":"10.1109/ECBIOS57802.2023.10218433","DOIUrl":"https://doi.org/10.1109/ECBIOS57802.2023.10218433","url":null,"abstract":"Virtual reality (VR) is one of the notable technologies. Using visual, auditory, and other functions, consumers are provided with an immersive experience. Therefore, we investigated the use of the combination of VR and teaching and explored learners' management. A remote education system was proposed for the pharmacology of traditional Chinese medicine. Teaching with VR surpassed traditional teaching methods and virtualized teaching in classrooms. It allowed learners to communicate directly and increase attractiveness and interactivity. It also made the teaching process simpler and broke through spatial limitations greatly by improving classroom effectiveness. This model had a positive impact on the improvement of traditional teaching, the transformation of teaching models, and the enhancement of new teaching models.","PeriodicalId":334600,"journal":{"name":"2023 IEEE 5th Eurasia Conference on Biomedical Engineering, Healthcare and Sustainability (ECBIOS)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-06-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122922044","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Facial Image Emotion Recognition Based on Convolutional Neural Networks and Haar Classifiers","authors":"J. Yeh, Wei-Tse Hung, Chia-Chen Chang, Ting-Hao Wang","doi":"10.1109/ECBIOS57802.2023.10218544","DOIUrl":"https://doi.org/10.1109/ECBIOS57802.2023.10218544","url":null,"abstract":"Facial expression shows the richest human expression and mainly conveys emotions and social signals. In recent years, the development of artificial intelligence technology and sufficient data have broken through previous limitations, opening up the development of intelligent emotion recognition. In this study, emotion recognition is conducted by a deep learning model with multiple layers to describe global features of facial emotions with facial images as input data and neural networks to learn facial features such as eyebrows, eyes, and mouth. The proposed model objectively and quickly presents emotional results, making it applicable to customer service feedback, judgment basis for medical personnel, fatigue driving detection, and more. The model uses facial images as input into a Haar classifier to remove the background of the image and focus on capturing the facial region. Based on the Convolution Neural Network (CNN) and the FER-2013 (Facial Expression Recognition 2013) test dataset. After the user inputs the facial image, the system's prediction accuracy increased by 7.83% compared to the baseline system, effectively improving the accuracy of facial emotion recognition.","PeriodicalId":334600,"journal":{"name":"2023 IEEE 5th Eurasia Conference on Biomedical Engineering, Healthcare and Sustainability (ECBIOS)","volume":"102 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-06-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116215114","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Use of Nonlinear Analysis Methods for Evaluating IMU Data of Bilateral Jump Landing Tasks","authors":"J. Hejda, T. Sugiarto, P. Volf, Yi-Jia Lin, P. Kutílek, W. Hsu, Marek Sokol, Jia-Lin Wu, Lýdie Leová, Yah-Shiun Jiang, Yong-Jie Deng","doi":"10.1109/ECBIOS57802.2023.10218375","DOIUrl":"https://doi.org/10.1109/ECBIOS57802.2023.10218375","url":null,"abstract":"The use of nonlinear analysis methods provides new information when evaluating linear acceleration and angular velocity from a system with Inertial Measurement Unit (IMU) recording. This information is used as additional input to improve the estimation of the angular displacements in a neural network model. The measurements were performed on 24 participants (18 males and 6 females of an average age of $22.6pm 2.6$ years old, average height of $172.6pm 10.3$ cm, and an average weight of $72.2pm 16.02 text{kg})$ during bilateral jump landing tasks. In order to assess the differences between IMU estimated angle and the gold standard, data obtained from Qualysis optical Mocap (Qualisys AB, Göteborg, Sweden) and Delsys inertial measurement systems (Delsys Inc., Boston, MA, USA) were used for measurements during bilateral jump landing tasks. A total of 8 IMU sensors were placed on the sternum, L5, bilateral thighs, shanks, and foot. The thigh and shank sensors were placed on the middle of each thigh and shank along the anterior-posterior axis (middle thigh and middle shank) while the foot sensors were placed on the dorsal surface of the foot. Thirty retroreflective markers were placed on the pelvis and bilateral thigh, shanks, and foot to form a 7-linkage lower extremity model. Static calibration on each of the participants was performed during standing with anatomical position to define the neutral joint angle at bilateral hip, knee, and ankle. For quantification purposes, the Hurst exponent, Lyapunov exponent, approximate entropy, and multiscale sample entropy were used. The results suggest that when evaluating the placement of IMU on the shank and thigh to determine the knee angle, the Hurst exponent is capable of best distinguishing individual axes based on linear acceleration and angular velocity.","PeriodicalId":334600,"journal":{"name":"2023 IEEE 5th Eurasia Conference on Biomedical Engineering, Healthcare and Sustainability (ECBIOS)","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-06-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126941300","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Detection of Bradycardia in Preterm Infants by Using ECG and Respiratory Signals","authors":"Ting-Kai Hsu, Hangzhang Cheng, S. Yao","doi":"10.1109/ECBIOS57802.2023.10218577","DOIUrl":"https://doi.org/10.1109/ECBIOS57802.2023.10218577","url":null,"abstract":"This study is conducted to detect bradycardia in preterm infants. The signal of electrocardiogram (ECG) has been widely applied to the detection. We propose to consider respiratory signals to improve the accuracy of classification. The machine learning model AutoML is used for feature selection and classification. The training data include ECG and respiratory signals. The target is to determine whether symptoms of bradycardia occur in preterm infants. Through the experimental results, the classification by analyzing the features of the ECG together with the respiratory signal showed an average accuracy of 79.2% which was better than 75.32% by using ECG only and 62.08% by using a respiratory signal only. The comparison result shows that in addition to ECG, respiratory signals are important in the detection of bradycardia.","PeriodicalId":334600,"journal":{"name":"2023 IEEE 5th Eurasia Conference on Biomedical Engineering, Healthcare and Sustainability (ECBIOS)","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-06-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125817581","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Prediction of Retina Damage in Optical Coherence Tomography Image Using Xception Architecture Model","authors":"Minh Thanh Do, Hoang Nhut Huynh, Trung Nghia Tran, T. Hoang","doi":"10.1109/ECBIOS57802.2023.10218586","DOIUrl":"https://doi.org/10.1109/ECBIOS57802.2023.10218586","url":null,"abstract":"One of the most vital human organs is the retina. Most of the data is gathered via our eyesight. Thus, maintaining good eye health is crucial for happy and healthy life and eyes. Unfortunately, hazardous eye conditions, including choroidal neovascularization (CNV), Drusen, and diabetic macular edema (DME), directly harm the retina. They are typically discovered late, making it frequently impossible to cure and restore vision. Significant vision loss or even complete blindness may result from this. Ophthalmologists can view the inner structure of the retina using the sophisticated medical imaging technology known as noninvasive retinal optical coherent tomography (OCT), which relies on the visual reflection of the tissues inside the eye. However, in practice, errors continue to occur in diagnosing illnesses in general and eye ailments in particular. Thus, we develop a deep learning model to help physicians diagnose CNV, Drusen, and DME more correctly and lessen medical examination and treatment mistakes. About 8,000 images from the Large Dataset of Labeled Optical Coherence Tomography (OCT) Images were used to train with the Xception's architecture model. The result of this classification study for three types of DME, CNV, and Drusen diseases showed an accuracy of 93%.","PeriodicalId":334600,"journal":{"name":"2023 IEEE 5th Eurasia Conference on Biomedical Engineering, Healthcare and Sustainability (ECBIOS)","volume":"138 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-06-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126745521","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Unleashing Potential of Employees through Artificial Intelligence","authors":"Aishwarya Gowda A G, Hui-Kai Su, W. Kuo","doi":"10.1109/ECBIOS57802.2023.10218636","DOIUrl":"https://doi.org/10.1109/ECBIOS57802.2023.10218636","url":null,"abstract":"Artificial intelligence (AI) is rapidly transforming various industries, including the labor force. The application of AI in the workplace has the potential to improve employees' productivity by automating repetitive and mundane tasks, providing personalized recommendations, and augmenting decision-making capabilities. We explore the impact of AI on employee productivity by examining the theoretical underpinnings of productivity and how it can be measured. Furthermore, we propose a model that integrates AI features to increase employee productivity. To illustrate the effectiveness of AI in increasing productivity, the proposed model integrates AI features such as natural language processing and machine learning algorithms. The model uses AI to automate repetitive tasks, provide personalized recommendations, and augment decision-making capabilities. The model is tested on a sample of employees, and the results demonstrate that AI significantly increases productivity.","PeriodicalId":334600,"journal":{"name":"2023 IEEE 5th Eurasia Conference on Biomedical Engineering, Healthcare and Sustainability (ECBIOS)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-06-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129417015","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}