Pervasive and Mobile Computing: Latest Articles

Efficient and secure heterogeneous online/offline signcryption for wireless body area network
IF 4.3, CAS Zone 3, Computer Science
Pervasive and Mobile Computing, Pub Date: 2024-02-10, DOI: 10.1016/j.pmcj.2024.101893
Huihui Zhu, Chunhua Jin, Yongliang Xu, Guanhua Chen, Liqing Chen
{"title":"Efficient and secure heterogeneous online/offline signcryption for wireless body area network","authors":"Huihui Zhu,&nbsp;Chunhua Jin,&nbsp;Yongliang Xu,&nbsp;Guanhua Chen,&nbsp;Liqing Chen","doi":"10.1016/j.pmcj.2024.101893","DOIUrl":"10.1016/j.pmcj.2024.101893","url":null,"abstract":"<div><p>As a special Internet of Things (IoT) application, the wireless body area network (WBAN) has gained widespread attention by medical institutions. However, existing schemes for WBAN data transmission lack heterogeneity support across certificateless cryptosystem (CLC) and public key infrastructure (PKI), resulting in issues like key escrow or complicated certificate management. In addition, for performance reasons, conventional signcryption protocols are unsuitable for WBAN applications. To address these gaps and enable secure and efficient sensitive data transmission from WBAN sensors to hospital servers, we design a heterogeneous online/offline signcryption scheme. Our scheme enables patients’ sensors implanted or worn to encrypt sensitive data in CLC and send it to the hospital server in PKI system. The CLC avoids key escrow issue while the PKI increases scalability. We minimize the online computational cost of WBAN sensors by dividing signcryption into offline and online phases, with time-consuming operations in the offline phase. Furthermore, we formally prove the security of our scheme and evaluate its performance. Results show our scheme has advantages in supporting heterogeneity across CLC and PKI with low computational costs, making it uniquely suitable for the protection of data privacy in WBAN applications compared to existing protocols.</p></div>","PeriodicalId":49005,"journal":{"name":"Pervasive and Mobile Computing","volume":null,"pages":null},"PeriodicalIF":4.3,"publicationDate":"2024-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139893128","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
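The scheme itself is not given in the abstract; as a rough, hypothetical illustration of the online/offline split it describes (precompute the expensive group operations before the sensor reading exists, leaving only cheap arithmetic for the online phase), here is a toy Schnorr-style signing sketch in Python. The group parameters, helper names, and message are invented for illustration and are not the authors' heterogeneous CLC-to-PKI signcryption construction.

```python
import hashlib
import secrets

# Toy Schnorr-style parameters (deliberately tiny, for illustration only).
P = 2039          # prime, P = 2*Q + 1
Q = 1019          # prime order of the subgroup generated by G
G = 4             # generator of the order-Q subgroup of Z_P*

def h(r: int, msg: bytes) -> int:
    """Hash the commitment and message to a challenge in Z_Q."""
    digest = hashlib.sha256(str(r).encode() + msg).digest()
    return int.from_bytes(digest, "big") % Q

def keygen():
    x = secrets.randbelow(Q - 1) + 1     # private key
    return x, pow(G, x, P)               # (private, public)

def offline_phase():
    """Expensive part: done before the message (sensor reading) exists."""
    k = secrets.randbelow(Q - 1) + 1
    return k, pow(G, k, P)               # one-time token (k, r)

def online_phase(x, token, msg: bytes):
    """Cheap part: one hash and one modular multiply-add per message."""
    k, r = token
    e = h(r, msg)
    s = (k + x * e) % Q
    return e, s

def verify(y, msg: bytes, sig) -> bool:
    e, s = sig
    r = (pow(G, s, P) * pow(y, (-e) % Q, P)) % P
    return h(r, msg) == e

x, y = keygen()
token = offline_phase()                  # precomputed while the sensor is idle
sig = online_phase(x, token, b"heart-rate:72bpm")
print(verify(y, b"heart-rate:72bpm", sig))   # True
```

The point of the pattern is that `online_phase` contains no modular exponentiation, which is what makes it attractive for constrained body sensors.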
PSO based Amorphous algorithm to reduce localization error in Wireless Sensor Network
IF 4.3, CAS Zone 3, Computer Science
Pervasive and Mobile Computing, Pub Date: 2024-02-10, DOI: 10.1016/j.pmcj.2024.101890
Pujasuman Tripathy, P.M. Khilar
{"title":"PSO based Amorphous algorithm to reduce localization error in Wireless Sensor Network","authors":"Pujasuman Tripathy,&nbsp;P.M. Khilar","doi":"10.1016/j.pmcj.2024.101890","DOIUrl":"10.1016/j.pmcj.2024.101890","url":null,"abstract":"<div><p>In recent years, localizing or identifying the position of unknown sensor nodes has become an essential problem in Wireless Sensor Networks (WSN). The improvement in localization accuracy leads to obtaining the exact location of the dumb node. Among all localization algorithms, Amorphous localization is highly suggested for usage in many application domains due to its simplicity, viability, low cost, and zero additional hardware requirements. Position estimation of the dumb node in the Amorphous algorithm considers three different practical scenarios, such as the position of dumb nodes falling within the range of anchor nodes, the position of the dumb node being in the opposite direction of the anchor node, and the position of the dumb node not within the range of anchor node. However, the localization error generated by the Amorphous algorithm is high. To address the limitations of Amorphous algorithm we have proposed a PSO based Amorphous algorithm. The proposed work reduces the average hop size of anchor nodes and reduces the localization error. The simulation results demonstrate that, in comparison to other existing Amorphous algorithms, the proposed PSO based Amorphous localization algorithm produced a superior performance in terms of MAE, MSE and RMSE.</p></div>","PeriodicalId":49005,"journal":{"name":"Pervasive and Mobile Computing","volume":null,"pages":null},"PeriodicalIF":4.3,"publicationDate":"2024-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139822193","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
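As a hedged sketch of the general approach (not the paper's algorithm), the snippet below runs a plain global-best PSO that searches for the coordinates minimizing the mismatch between distance estimates to anchors (e.g., hop count times estimated hop size in an Amorphous-style scheme) and the distances implied by a candidate position. The anchor layout, noise level, and PSO parameters are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Known anchor positions and noisy distance estimates to the unknown node.
anchors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
true_pos = np.array([37.0, 62.0])
est_dist = np.linalg.norm(anchors - true_pos, axis=1) + rng.normal(0, 2.0, len(anchors))

def fitness(p):
    """Sum of squared errors between candidate-to-anchor distances and the estimates."""
    return np.sum((np.linalg.norm(anchors - p, axis=1) - est_dist) ** 2)

# Plain global-best PSO over the 2-D deployment area.
n_particles, n_iter = 30, 200
w, c1, c2 = 0.7, 1.5, 1.5
pos = rng.uniform(0, 100, (n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 100)
    vals = np.array([fitness(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("estimated position:", gbest, "true position:", true_pos)
```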
TODO: Task Offloading Decision Optimizer for the efficient provision of offloading schemes
IF 4.3, CAS Zone 3, Computer Science
Pervasive and Mobile Computing, Pub Date: 2024-02-10, DOI: 10.1016/j.pmcj.2024.101892
Shilin Chen, Xingwang Wang, Yafeng Sun
{"title":"TODO: Task Offloading Decision Optimizer for the efficient provision of offloading schemes","authors":"Shilin Chen ,&nbsp;Xingwang Wang ,&nbsp;Yafeng Sun","doi":"10.1016/j.pmcj.2024.101892","DOIUrl":"10.1016/j.pmcj.2024.101892","url":null,"abstract":"<div><p>As the volume of data stored on local devices increases, users turn to edge devices to help with processing tasks. Developing offloading schemes is challenging due to the varying configurations of edge devices and user preferences. While traditional methods provide schemes for offloading in various scenarios, they face unavoidable challenges, including the requirement to manage device workloads in real-time, significant computational costs, and the difficulty of balancing multi-objectives in offloading schemes. To solve these problems, we propose the Task Offloading Decision Optimizer, which offers efficient multi-objective offloading schemes that consider real-time device workload and user preference. The proposed offloading scheme contains three goals: reducing task execution time, decreasing device energy consumption, and lowering rental costs. It comprises two essential parts: Scheme Maker and Scheme Assistor. Scheme Maker utilizes deep reinforcement learning, optimizes the internal architecture, and enhances the performance of the operation. It optimizes buffer storage to generate dependable multi-objective offloading schemes considering real-time environmental conditions. Scheme Assistor utilizes the data in the Scheme Maker buffer to enhance efficiency by reducing computational costs. Extensive experiments have proved that the proposed framework efficiently provides offloading schemes considering the real-time conditions of the devices and the users, and it offers offloading schemes that enhance task completion rate by 50%. Compared to the baseline, the task execution time is reduced by 12%, and the device energy consumption is reduced by 11.1%.</p></div>","PeriodicalId":49005,"journal":{"name":"Pervasive and Mobile Computing","volume":null,"pages":null},"PeriodicalIF":4.3,"publicationDate":"2024-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139812680","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
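The abstract's DRL-based Scheme Maker is not reproduced here; purely as an illustration of the three-objective trade-off it mentions (execution time, device energy, rental cost), the following sketch scalarizes invented per-target estimates with user-preference weights and picks the cheapest candidate. All names, numbers, and weights are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    exec_time_s: float     # estimated task execution time on this device
    energy_j: float        # estimated energy the local device spends (tx + idle wait)
    rental_cost: float     # monetary cost of renting the edge/cloud resource

def weighted_cost(t: Target, prefs=(0.5, 0.3, 0.2), norms=(10.0, 5.0, 1.0)) -> float:
    """Scalarize the three objectives with user-preference weights.
    `norms` rescale each objective to a comparable range (assumed values)."""
    wt, we, wc = prefs
    return (wt * t.exec_time_s / norms[0]
            + we * t.energy_j / norms[1]
            + wc * t.rental_cost / norms[2])

candidates = [
    Target("local",       exec_time_s=8.0, energy_j=4.5, rental_cost=0.0),
    Target("edge-node-1", exec_time_s=2.5, energy_j=1.2, rental_cost=0.4),
    Target("cloud",       exec_time_s=1.8, energy_j=1.0, rental_cost=0.9),
]

best = min(candidates, key=weighted_cost)
print("offload decision:", best.name)
```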
Reinforcement learning-based load balancing for heavy traffic Internet of Things
IF 4.3, CAS Zone 3, Computer Science
Pervasive and Mobile Computing, Pub Date: 2024-02-08, DOI: 10.1016/j.pmcj.2024.101891
Jianjun Lei, Jie Liu
{"title":"Reinforcement learning-based load balancing for heavy traffic Internet of Things","authors":"Jianjun Lei,&nbsp;Jie Liu","doi":"10.1016/j.pmcj.2024.101891","DOIUrl":"https://doi.org/10.1016/j.pmcj.2024.101891","url":null,"abstract":"<div><p>Aiming to large-scale data transmission requirements of resource-constrained IoT (Internet of Things) devices, the routing protocol for low power lossy network (RPL) is expected to handle the load imbalance and high energy consumption in heavy traffic scenarios. This paper proposes a novel <strong>R</strong>PL routing optimization <strong>A</strong>lgorithm based on deep <strong>R</strong>einforcement <strong>L</strong>earning (referred to as RARL), which employs the centralized training and decentralized execution architecture. Hence, the RARL can provide the intelligent parent selection policy for all nodes while improving the training efficiency of deep reinforcement learning (DRL) model. Furthermore, we integrate a new local observation into the RARL by exploiting multiple routing metrics and design a comprehensive reward function for enhancing the load-balance and energy efficiency. Meanwhile, we also optimize the Trickle timer mechanism for adaptively controlling the delivery of DIO messages, which further improves the interaction efficiency with environment of DRL model. Extensive simulation experiments are conducted to evaluate the effectiveness of RARL under various scenarios. Compared with some existing methods, the simulation results demonstrate the significant performance of RARL in terms of network lifetime, queue loss ratio, and packet reception ratio.</p></div>","PeriodicalId":49005,"journal":{"name":"Pervasive and Mobile Computing","volume":null,"pages":null},"PeriodicalIF":4.3,"publicationDate":"2024-02-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139748870","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
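As a simplified, assumed illustration of learning a parent-selection policy from load and energy metrics (not the RARL architecture, observation design, or reward function from the paper), the following bandit-style sketch scores candidate RPL parents with an invented reward combining queue utilization, residual energy, and ETX:

```python
import random

# Candidate parents described by a few routing metrics a node can observe locally.
# Values are illustrative; the actual observation/reward design is the paper's.
parents = {
    "P1": {"queue_util": 0.8, "residual_energy": 0.6, "etx": 1.4},
    "P2": {"queue_util": 0.3, "residual_energy": 0.9, "etx": 1.7},
    "P3": {"queue_util": 0.5, "residual_energy": 0.4, "etx": 1.2},
}

def reward(m, alpha=0.4, beta=0.4, gamma=0.2):
    """Higher reward for low queue utilization (load balance), high residual
    energy, and low link cost (ETX)."""
    return alpha * (1 - m["queue_util"]) + beta * m["residual_energy"] + gamma / m["etx"]

q = {p: 0.0 for p in parents}            # tabular values, one action per parent
eps, lr = 0.1, 0.2

for step in range(500):
    if random.random() < eps:            # epsilon-greedy exploration
        choice = random.choice(list(parents))
    else:
        choice = max(q, key=q.get)
    r = reward(parents[choice]) + random.gauss(0, 0.02)   # noisy observation
    q[choice] += lr * (r - q[choice])    # bandit-style update (no next state here)

print("learned preference:", max(q, key=q.get), q)
```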
A novel IoT trust model leveraging fully distributed behavioral fingerprinting and secure delegation
IF 4.3, CAS Zone 3, Computer Science
Pervasive and Mobile Computing, Pub Date: 2024-02-08, DOI: 10.1016/j.pmcj.2024.101889
Marco Arazzi, Serena Nicolazzo, Antonino Nocera
{"title":"A novel IoT trust model leveraging fully distributed behavioral fingerprinting and secure delegation","authors":"Marco Arazzi ,&nbsp;Serena Nicolazzo ,&nbsp;Antonino Nocera","doi":"10.1016/j.pmcj.2024.101889","DOIUrl":"https://doi.org/10.1016/j.pmcj.2024.101889","url":null,"abstract":"<div><p>The pervasiveness and high number of Internet of Things (IoT) applications in people’s daily lives make this context a very critical attack surface for cyber threats. The high heterogeneity of involved entities, both in terms of hardware and software characteristics, does not allow the definition of uniform, global, and efficient security solutions. Therefore, researchers have started to investigate novel mechanisms, in which a super node (a gateway, a hub, or a router) analyzes the interactions of the target node with other peers in the network, to detect possible anomalies. The most recent of these strategies base such an analysis on the modeling of the fingerprint of a node behavior in an IoT; nevertheless, existing solutions do not cope with the fully distributed nature of the referring scenario.</p><p>In this paper, we try to provide a contribution in this setting, by designing a novel and fully distributed trust model exploiting point-to-point devices’ behavioral fingerprints, a distributed consensus mechanism, and Blockchain technology. In our solution we tackle the non-trivial issue of equipping smart things with a secure mechanism to evaluate, also through their neighbors, the trustworthiness of an object in the network before interacting with it. Beyond the detailed description of our framework, we also illustrate the security model associated with it and the tests carried out to evaluate its correctness and performance.</p></div>","PeriodicalId":49005,"journal":{"name":"Pervasive and Mobile Computing","volume":null,"pages":null},"PeriodicalIF":4.3,"publicationDate":"2024-02-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S1574119224000154/pdfft?md5=e7b2906244cfb05dbee063203a65f60e&pid=1-s2.0-S1574119224000154-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139738296","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
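A minimal sketch, under assumed weights and thresholds, of the general pattern of blending a node's own fingerprint-based score with neighbors' opinions before deciding to interact; it does not reflect the paper's consensus protocol or its Blockchain layer.

```python
def combined_trust(direct_score, neighbor_scores, w_direct=0.6, w_indirect=0.4):
    """Blend a node's own fingerprint-based score for a target with the average
    opinion reported by its neighbors. All weights and thresholds are illustrative."""
    indirect = sum(neighbor_scores) / len(neighbor_scores) if neighbor_scores else direct_score
    return w_direct * direct_score + w_indirect * indirect

def should_interact(direct_score, neighbor_scores, threshold=0.7):
    return combined_trust(direct_score, neighbor_scores) >= threshold

# Example: our fingerprint model rates the target 0.82, three neighbors report their views.
print(should_interact(0.82, [0.75, 0.9, 0.55]))   # True
print(should_interact(0.40, [0.35, 0.50, 0.45]))  # False
```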
Optimization of mobility sampling in dynamic networks using predictive wavelet analysis
IF 4.3, CAS Zone 3, Computer Science
Pervasive and Mobile Computing, Pub Date: 2024-02-01, DOI: 10.1016/j.pmcj.2024.101887
Peppino Fazio, Miralem Mehic, Floriano De Rango, Mauro Tropea, Miroslav Voznak
{"title":"Optimization of mobility sampling in dynamic networks using predictive wavelet analysis","authors":"Peppino Fazio ,&nbsp;Miralem Mehic ,&nbsp;Floriano De Rango ,&nbsp;Mauro Tropea ,&nbsp;Miroslav Voznak","doi":"10.1016/j.pmcj.2024.101887","DOIUrl":"10.1016/j.pmcj.2024.101887","url":null,"abstract":"<div><p>In the last decade, the investigation of mobility features has gained enormous significance in many scenarios as a result of the significant diffusion and deployment of mobile devices covered by high-speed technologies (e.g., 5G). Many contributions in the literature have attempted to discover mobility properties, but most studies are based on the time features of the mobility process. No study has yet considered the effects of setting a proper sampling frequency (generally set to 1 s), in order to avoid information loss. Following our previous works, we propose a novel predictive spectral approach for mobility sampling based on the concept of a predictive wavelet. With this method, the choice of sampling frequency is governed by the current spectral components of the mobility process and derived from an analysis of future, predicted components. To assess whether our proposal may yield a helpful method, we conducted several simulation campaigns to test sampling accuracy and obtained results that confirmed our expectations.</p></div>","PeriodicalId":49005,"journal":{"name":"Pervasive and Mobile Computing","volume":null,"pages":null},"PeriodicalIF":4.3,"publicationDate":"2024-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S1574119224000130/pdfft?md5=4ee4cc1275b8ee0647dbfa8fed17e7b2&pid=1-s2.0-S1574119224000130-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139669507","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
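As a rough stand-in for the paper's predictive wavelet analysis, the sketch below uses a plain FFT on a recent window of a mobility signal to estimate the bandwidth holding most of the spectral energy and derives a sampling rate slightly above the corresponding Nyquist rate; the signal, energy threshold, and margin are invented.

```python
import numpy as np

def choose_sampling_rate(signal, fs, energy_keep=0.99, margin=2.5):
    """Pick a new sampling rate from the spectrum of a recent mobility window.

    Find the smallest frequency band holding `energy_keep` of the spectral
    energy and sample at `margin` times that bandwidth (>= 2 for Nyquist).
    This is a plain FFT stand-in for the wavelet-based analysis in the paper.
    """
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal))) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    cumulative = np.cumsum(spectrum) / np.sum(spectrum)
    f_max = freqs[np.searchsorted(cumulative, energy_keep)]
    return max(margin * f_max, fs / len(signal))   # never drop below a small floor

# Example: a node whose speed oscillates at ~0.2 Hz, sampled at the usual 1 Hz default.
fs = 1.0
t = np.arange(0, 600, 1.0 / fs)
speed = 1.5 + 0.5 * np.sin(2 * np.pi * 0.2 * t)
print("suggested sampling rate (Hz):", choose_sampling_rate(speed, fs))
```

In this toy case the suggested rate is about 0.5 Hz, i.e., the node could be sampled half as often as the 1 s default without losing the dominant spectral content.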
B2auth: A contextual fine-grained behavioral biometric authentication framework for real-world deployment
IF 4.3, CAS Zone 3, Computer Science
Pervasive and Mobile Computing, Pub Date: 2024-01-24, DOI: 10.1016/j.pmcj.2024.101888
Ahmed Mahfouz, Ahmed Hamdy, Mohamed Alaa Eldin, Tarek M. Mahmoud
{"title":"B2auth: A contextual fine-grained behavioral biometric authentication framework for real-world deployment","authors":"Ahmed Mahfouz ,&nbsp;Ahmed Hamdy ,&nbsp;Mohamed Alaa Eldin ,&nbsp;Tarek M. Mahmoud","doi":"10.1016/j.pmcj.2024.101888","DOIUrl":"10.1016/j.pmcj.2024.101888","url":null,"abstract":"<div><p>Several behavioral biometric authentication frameworks have been proposed to authenticate smartphone users based on the analysis of sensors and services. These authentication frameworks verify the user identity by extracting a set of behavioral traits such as touch, sensors and keystroke dynamics, and use machine learning and deep learning techniques to develop the authentication models. Unfortunately, it is not clear how these frameworks perform in the real world deployment and most of the experiments in the literature have been conducted with cooperative users in a controlled environment. In this paper, we present a novel behavioral biometric authentication framework, called B2auth, designed specifically for smartphone users. The framework leverages raw data collected from touchscreen on smartphone to extract behavioral traits for authentication. A Multilayer Perceptron (MLP) neural network is employed to develop authentication models. Unlike many existing experiments conducted in controlled environments with cooperative users, we focused on real-world deployment scenarios, collecting data from 60 participants using smartphones in an uncontrolled environment. The framework achieves promising results in differentiating the legitimate owner and an attacker across various app contexts, showcasing its potential in practical use cases. By utilizing minimalist data collection and cloud-based model generation, the B2auth framework offers an efficient and effective approach to behavioral biometric authentication for smartphones.</p></div>","PeriodicalId":49005,"journal":{"name":"Pervasive and Mobile Computing","volume":null,"pages":null},"PeriodicalIF":4.3,"publicationDate":"2024-01-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139578059","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
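A hedged sketch of the modeling step only: training a scikit-learn MLP on synthetic per-stroke touch features to separate an owner from other users. The feature set, data, and network size are assumptions, not the B2auth feature extraction or deployment pipeline.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Synthetic per-stroke features (duration, length, mean pressure, mean velocity).
# Real features would come from the touchscreen traces described in the paper.
def strokes(n, offset):
    return np.column_stack([
        rng.normal(0.30 + offset, 0.05, n),       # stroke duration (s)
        rng.normal(400 + 400 * offset, 60, n),    # trajectory length (px)
        rng.normal(0.55 + offset, 0.08, n),       # mean pressure
        rng.normal(900 + 900 * offset, 120, n),   # mean velocity (px/s)
    ])

owner, others = strokes(300, 0.0), strokes(300, 0.12)
X = np.vstack([owner, others])
y = np.hstack([np.ones(300), np.zeros(300)])      # 1 = legitimate owner

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_tr)

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
clf.fit(scaler.transform(X_tr), y_tr)
print("accept/reject accuracy:", clf.score(scaler.transform(X_te), y_te))
```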
Computation and communication efficient approach for federated learning based urban sensing applications against inference attacks
IF 4.3, CAS Zone 3, Computer Science
Pervasive and Mobile Computing, Pub Date: 2024-01-09, DOI: 10.1016/j.pmcj.2024.101875
Ayshika Kapoor, Dheeraj Kumar
{"title":"Computation and communication efficient approach for federated learning based urban sensing applications against inference attacks","authors":"Ayshika Kapoor,&nbsp;Dheeraj Kumar","doi":"10.1016/j.pmcj.2024.101875","DOIUrl":"https://doi.org/10.1016/j.pmcj.2024.101875","url":null,"abstract":"<div><p><span><span>Federated learning based participatory sensing has gained much attention lately for the vital task of urban sensing due to privacy and security issues in conventional </span>machine learning<span><span><span>. However, inference attacks by the honest-but-curious application server or a </span>malicious adversary<span> can leak the personal attributes of the participants, such as their home and workplace locations, routines, and habits. Approaches proposed in the literature to prevent such information leakage, such as secure multi-party computation and </span></span>homomorphic encryption<span>, are infeasible for urban sensing applications owing to high communication and computation costs due to multiple rounds of communication between the user and the server. Moreover, for effective modeling of urban sensing phenomenon, the application model needs to be updated frequently — every few minutes or hours, resulting in periodic data-intensive updates by the participants, which severely strains the already limited resources of their mobile devices<span>. This paper proposes a novel low-cost privacy-preserving framework for enhanced protection against the inference of participants’ personal and private attributes from the data leaked through inference attacks. We propose a novel approach of </span></span></span></span><em>strategically</em><span> leaking selected location traces by providing computation and communication-light direct (local) model updates, whereas the rest of the model updates (when the user is at sensitive locations) are provided using secure multi-party computation. We propose two new methods based on spatiotemporal entropy and Kullback–Leibler divergence for automatically deciding which model updates need to be sent through secure multi-party computation and which can be sent directly. The proposed approach significantly reduces the computation and communication overhead for participants compared to the fully secure multi-party computation protocols. It provides enhanced protection against the deduction of personal attributes from inferred location traces compared to the direct model updates by confusing the application server or malicious adversary while inferring personal attributes from location traces. Numerical experiments on the popular Geolife GPS trajectories dataset validate our proposed approach by reducing the computation and communication requirements by the participants significantly and, at the same time, enhancing privacy by decreasing the number of inferred sensitive and private locations of participants.</span></p></div>","PeriodicalId":49005,"journal":{"name":"Pervasive and Mobile Computing","volume":null,"pages":null},"PeriodicalIF":4.3,"publicationDate":"2024-01-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139433384","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
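As an assumed illustration of the Kullback-Leibler-divergence-based routing decision described in the abstract (the spatiotemporal-entropy variant and the paper's actual thresholds are not reproduced), the sketch below compares the location-visit distribution of the current window against a baseline profile and falls back to the secure multi-party computation path when the divergence is large:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for discrete distributions over the same location cells."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def route_update(window_visits, baseline_visits, threshold=0.5):
    """Decide how to send this round's model update.

    If the visit distribution of the current window diverges strongly from the
    coarse baseline (i.e., it is distinctive and could reveal sensitive places),
    fall back to the costly secure multi-party computation path.
    """
    d = kl_divergence(window_visits, baseline_visits)
    return ("smpc" if d > threshold else "direct"), d

# Visit counts over 5 coarse location cells (illustrative numbers).
baseline = [30, 25, 20, 15, 10]          # user's typical, non-sensitive profile
routine  = [28, 27, 19, 16, 10]          # a window that looks like the baseline
unusual  = [2, 1, 3, 4, 90]              # a window dominated by one rare cell

print(route_update(routine, baseline))   # ('direct', small divergence)
print(route_update(unusual, baseline))   # ('smpc', large divergence)
```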
Sensor-based agitation prediction in institutionalized people with dementia: A systematic review
IF 4.3, CAS Zone 3, Computer Science
Pervasive and Mobile Computing, Pub Date: 2024-01-07, DOI: 10.1016/j.pmcj.2024.101876
Jan Kleine Deters, Sarah Janus, Jair A. Lima Silva, Heinrich J. Wörtche, Sytse U. Zuidema
{"title":"Sensor-based agitation prediction in institutionalized people with dementia A systematic review","authors":"Jan Kleine Deters ,&nbsp;Sarah Janus ,&nbsp;Jair A. Lima Silva ,&nbsp;Heinrich J. Wörtche ,&nbsp;Sytse U. Zuidema","doi":"10.1016/j.pmcj.2024.101876","DOIUrl":"10.1016/j.pmcj.2024.101876","url":null,"abstract":"<div><p>Early detection of agitation in individuals with dementia can lead to timely interventions, preventing the worsening of situations and enhancing their quality of life. The emergence of multi-modal sensing and advances in artificial intelligence make it feasible to explore and apply technology for this goal. We conducted a literature review to understand the current technical developments and challenges of its integration in caregiving institutions. Our systematic review used the Pubmed and IEEE scientific databases, considering studies from 2017 onwards. We included studies focusing on linking sensor data to vocal and/or physical manifestations of agitation. Out of 1622 identified studies, 12 were selected for the final review. Analysis was conducted on study design, technology, decisional data, and data analytics. We identified a gap in the standardized semantic representation of both behavioral descriptions and system event generation configurations. This research highlighted initiatives that leverage existing information in a caregiver's routine, such as correlating electronic health records with sensor data. As predictive systems become more integrated into caregiving routines, false positive reduction needs to be addressed as those will discourage their adoption. Therefore, to ensure adaptive predictive capacity and personalized system re-configuration, we suggest future work to evaluate a framework that incorporates a human-in-the-loop approach for detecting and predicting agitation.</p></div>","PeriodicalId":49005,"journal":{"name":"Pervasive and Mobile Computing","volume":null,"pages":null},"PeriodicalIF":4.3,"publicationDate":"2024-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S1574119224000026/pdfft?md5=f6a8397ac6e02acc44ce653a7d1a2e87&pid=1-s2.0-S1574119224000026-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139456701","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
A Transfer Learning and Explainable Solution to Detect mpox from Smartphones images
IF 4.3, CAS Zone 3, Computer Science
Pervasive and Mobile Computing, Pub Date: 2024-01-06, DOI: 10.1016/j.pmcj.2023.101874
Mattia Giovanni Campana, Marco Colussi, Franca Delmastro, Sergio Mascetti, Elena Pagani
{"title":"A Transfer Learning and Explainable Solution to Detect mpox from Smartphones images","authors":"Mattia Giovanni Campana ,&nbsp;Marco Colussi ,&nbsp;Franca Delmastro ,&nbsp;Sergio Mascetti ,&nbsp;Elena Pagani","doi":"10.1016/j.pmcj.2023.101874","DOIUrl":"10.1016/j.pmcj.2023.101874","url":null,"abstract":"<div><p>Monkeypox (mpox) virus has become a “public health emergency of international concern” in the last few months, as declared by the World Health Organization, especially for low-income countries. A symptom of mpox infection is the appearance of rashes and skin eruptions, which can lead people to seek medical advice. A technology that might help perform a preliminary screening based on the aspect of skin lesions is the use of Machine Learning<span><span> for image classification. However, to make this technology suitable on a large scale, it should be usable directly on people </span>mobile devices, with a possible notification to a remote medical expert.</span></p><p><span>In this work, we investigate the adoption of Deep Learning<span> to detect mpox from skin lesion images derived from smartphone cameras. The proposal leverages Transfer Learning to cope with the scarce availability of mpox image datasets. As a first step, a homogeneous, unpolluted, dataset was produced by manual selection and preprocessing of available image data, publicly released for research purposes. Subsequently, we compared multiple </span></span>Convolutional Neural Networks<span> (CNNs) using a rigorous 10-fold stratified cross-validation approach and we conducted an analysis to evaluate the models’ fairness towards different skin tones. The best models have been then optimized through quantization for use on mobile devices; measures of classification quality, memory footprint<span>, and processing times validated the feasibility of our proposal. The most favorable outcomes have been achieved by MobileNetV3Large, attaining an F-1 score of 0.928 in the binary task and 0.879 in the multi-class task. Furthermore, the application of quantization led to a reduction in the model size to less than one-third, while simultaneously decreasing the inference time from 0.016 to 0.014 s, with only a marginal loss of 0.004 in F-1 score. Additionally, the use of eXplainable AI has been investigated as a suitable instrument to both technically and clinically validate classification outcomes.</span></span></p></div>","PeriodicalId":49005,"journal":{"name":"Pervasive and Mobile Computing","volume":null,"pages":null},"PeriodicalIF":4.3,"publicationDate":"2024-01-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139372790","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
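A minimal Keras sketch, under assumed image size, class count, and data layout, of the transfer-learning plus post-training-quantization pipeline the abstract outlines (frozen MobileNetV3Large backbone, small classification head, TFLite dynamic-range quantization). It is not the authors' released code, and the training call is left commented out because no dataset is bundled here.

```python
import tensorflow as tf

IMG_SIZE = (224, 224)
NUM_CLASSES = 2          # binary task: mpox vs. other skin lesions (placeholder)

# Frozen MobileNetV3Large backbone pretrained on ImageNet, plus a small head.
base = tf.keras.applications.MobileNetV3Large(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False

inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
x = base(inputs, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
x = tf.keras.layers.Dropout(0.2)(x)
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# train_ds / val_ds would be tf.data pipelines built from the curated image set,
# e.g. via tf.keras.utils.image_dataset_from_directory("data/mpox", ...).
# model.fit(train_ds, validation_data=val_ds, epochs=20)

# Post-training quantization for on-device (smartphone) inference.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
with open("mpox_classifier.tflite", "wb") as f:
    f.write(tflite_model)
```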