Jahnavi Dasari, Telugu Sai Joshith, Duddupudi Daya Lokesh, Sanjipogu Sandeep Kumar, Ganesh Kumar Mahato, Swarnendu Kumar Chakraborty
{"title":"使用联邦学习和同态重加密保护医疗诊断敏感数据的隐私","authors":"Jahnavi Dasari, Telugu Sai Joshith, Duddupudi Daya Lokesh, Sanjipogu Sandeep Kumar, Ganesh Kumar Mahato, Swarnendu Kumar Chakraborty","doi":"10.1109/CONIT59222.2023.10205836","DOIUrl":null,"url":null,"abstract":"Federated learning is an emerging technique that allows multiple participants to train a shared model without exchanging their private data. This approach is particularly useful for IoT applications, where data is often collected locally and stored in distributed devices like edge nodes. Privacy concerns arise when using IoT devices to record medical data for distributed learning. Techniques such as differential privacy and federated learning can help ensure data security while preserving privacy. Encryption and secure multi-party computation can also be used to securely share and compute data. By training a model on a combination of data from these devices. In the medical research, federated learning can be used to train models on data from multiple healthcare devices such as wearables, smart medical sensors, and electronic health records. By training models on local data, healthcare providers can improve the accuracy of diagnosis and treatment while protecting patient privacy. Security analysis can involve evaluating the potential vulnerabilities and risks of the system and identifying measures to protect against them. 
Experimental results can involve testing the performance of the system in terms of accuracy, convergence speed, and other metrics, and comparing it to other methods.","PeriodicalId":377623,"journal":{"name":"2023 3rd International Conference on Intelligent Technologies (CONIT)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Privacy-Preserving sensitive data on Medical diagnosis using Federated Learning and Homomorphic Re-encryption\",\"authors\":\"Jahnavi Dasari, Telugu Sai Joshith, Duddupudi Daya Lokesh, Sanjipogu Sandeep Kumar, Ganesh Kumar Mahato, Swarnendu Kumar Chakraborty\",\"doi\":\"10.1109/CONIT59222.2023.10205836\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Federated learning is an emerging technique that allows multiple participants to train a shared model without exchanging their private data. This approach is particularly useful for IoT applications, where data is often collected locally and stored in distributed devices like edge nodes. Privacy concerns arise when using IoT devices to record medical data for distributed learning. Techniques such as differential privacy and federated learning can help ensure data security while preserving privacy. Encryption and secure multi-party computation can also be used to securely share and compute data. By training a model on a combination of data from these devices. In the medical research, federated learning can be used to train models on data from multiple healthcare devices such as wearables, smart medical sensors, and electronic health records. By training models on local data, healthcare providers can improve the accuracy of diagnosis and treatment while protecting patient privacy. Security analysis can involve evaluating the potential vulnerabilities and risks of the system and identifying measures to protect against them. 
Experimental results can involve testing the performance of the system in terms of accuracy, convergence speed, and other metrics, and comparing it to other methods.\",\"PeriodicalId\":377623,\"journal\":{\"name\":\"2023 3rd International Conference on Intelligent Technologies (CONIT)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-06-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 3rd International Conference on Intelligent Technologies (CONIT)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CONIT59222.2023.10205836\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 3rd International Conference on Intelligent Technologies (CONIT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CONIT59222.2023.10205836","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Privacy-Preserving Sensitive Data on Medical Diagnosis Using Federated Learning and Homomorphic Re-encryption
Federated learning is an emerging technique that allows multiple participants to train a shared model without exchanging their private data. The approach is particularly useful for IoT applications, where data is collected locally and stored on distributed devices such as edge nodes. Privacy concerns arise when IoT devices record medical data for distributed learning. Techniques such as differential privacy and federated learning can help ensure data security while preserving privacy, and encryption and secure multi-party computation can additionally be used to share and compute on data securely. In medical research, federated learning can train models on a combination of data from multiple healthcare devices, such as wearables, smart medical sensors, and electronic health records, without pooling that data centrally. By training models on local data, healthcare providers can improve the accuracy of diagnosis and treatment while protecting patient privacy. Security analysis involves evaluating the system's potential vulnerabilities and risks and identifying measures to protect against them. Experimental results involve testing the system's performance in terms of accuracy, convergence speed, and other metrics, and comparing it with other methods.
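To make the federated setup concrete, the sketch below shows generic federated averaging (FedAvg-style) on a toy one-parameter regression: each client fits y ≈ w·x on its own data and only the resulting weights, never the raw samples, are sent to the server for averaging. This is an illustrative sketch of the general technique, not the paper's implementation; all function names and the toy datasets are invented for the example.

```python
# Illustrative federated-averaging sketch (not the paper's implementation).
# Raw data never leaves a client; only locally trained weights are shared.

def local_train(w, data, lr=0.01, epochs=20):
    """One client's local update for y ≈ w * x under squared loss."""
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def fed_avg(global_w, client_datasets, rounds=10):
    """Server loop: broadcast the global weight, collect local updates,
    and aggregate them by simple (unweighted) averaging."""
    for _ in range(rounds):
        local_ws = [local_train(global_w, d) for d in client_datasets]
        global_w = sum(local_ws) / len(local_ws)
    return global_w

# Three clients each privately hold samples of the same ground truth y = 3x.
clients = [
    [(1.0, 3.0), (2.0, 6.0)],
    [(3.0, 9.0)],
    [(0.5, 1.5), (4.0, 12.0)],
]
w = fed_avg(0.0, clients)  # converges toward the shared solution w = 3
```

In a real deployment the aggregation step would be weighted by client dataset size, and (as in this paper's setting) the transmitted updates would be encrypted rather than sent in the clear.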
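The abstract also notes that encryption and secure multi-party computation can protect the shared updates themselves. One minimal secure-aggregation idea, sketched below under a semi-honest-server assumption, is pairwise additive masking: each pair of clients agrees on a random mask that one adds and the other subtracts, so the server learns only the sum of the updates, never any individual value. This is a generic illustration (function names invented), not the homomorphic re-encryption scheme of the paper.

```python
import random

def secure_aggregate(client_values, modulus=2**31):
    """Toy additive-masking secure aggregation.

    For each client pair (i, j), a shared random mask r is added by i and
    subtracted by j, so all masks cancel in the aggregate and the server
    sees only masked per-client values plus the true (modular) sum."""
    n = len(client_values)
    masks = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            r = random.randrange(modulus)
            masks[i][j] = r    # client i adds the shared mask
            masks[j][i] = -r   # client j subtracts the same mask
    masked = [(client_values[i] + sum(masks[i])) % modulus for i in range(n)]
    return sum(masked) % modulus

vals = [5, 11, 7]
total = secure_aggregate(vals)  # equals sum(vals) = 23, masks cancel
```

Additively homomorphic encryption (e.g. Paillier-style schemes) achieves a similar effect cryptographically, letting the server sum ciphertexts without ever decrypting individual updates.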