{"title":"基于曲线显著性和深度卷积神经网络的眼底图像糖尿病视网膜病变分类","authors":"V. T. H. Tuyet, N. T. Binh, D. T. Tin","doi":"10.48084/etasr.4679","DOIUrl":null,"url":null,"abstract":"Retinal vessel images give a wide range of the abnormal pixels of patients. Therefore, classifying the diseases depending on fundus images is a popular approach. This paper proposes a new method to classify diabetic retinopathy in retinal blood vessel images based on curvelet saliency for segmentation. Our approach includes three periods: pre-processing of the quality of input images, calculating the saliency map based on curvelet coefficients, and classifying VGG16. To evaluate the results of the proposed method STARE and HRF datasets are used for testing with the Jaccard Index. The accuracy of the proposed method is about 98.42% and 97.96% with STARE and HRF datasets respectively.","PeriodicalId":11826,"journal":{"name":"Engineering, Technology & Applied Science Research","volume":"21 1","pages":""},"PeriodicalIF":1.5000,"publicationDate":"2022-02-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Improving the Curvelet Saliency and Deep Convolutional Neural Networks for Diabetic Retinopathy Classification in Fundus Images\",\"authors\":\"V. T. H. Tuyet, N. T. Binh, D. T. Tin\",\"doi\":\"10.48084/etasr.4679\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Retinal vessel images give a wide range of the abnormal pixels of patients. Therefore, classifying the diseases depending on fundus images is a popular approach. This paper proposes a new method to classify diabetic retinopathy in retinal blood vessel images based on curvelet saliency for segmentation. Our approach includes three periods: pre-processing of the quality of input images, calculating the saliency map based on curvelet coefficients, and classifying VGG16. To evaluate the results of the proposed method STARE and HRF datasets are used for testing with the Jaccard Index. The accuracy of the proposed method is about 98.42% and 97.96% with STARE and HRF datasets respectively.\",\"PeriodicalId\":11826,\"journal\":{\"name\":\"Engineering, Technology & Applied Science Research\",\"volume\":\"21 1\",\"pages\":\"\"},\"PeriodicalIF\":1.5000,\"publicationDate\":\"2022-02-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Engineering, Technology & Applied Science Research\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.48084/etasr.4679\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"0\",\"JCRName\":\"ENGINEERING, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Engineering, Technology & Applied Science Research","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.48084/etasr.4679","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"0","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
Improving the Curvelet Saliency and Deep Convolutional Neural Networks for Diabetic Retinopathy Classification in Fundus Images
Retinal vessel images reveal a wide range of abnormal pixels in patients, so classifying diseases from fundus images is a popular approach. This paper proposes a new method for classifying diabetic retinopathy in retinal blood vessel images, using curvelet saliency for segmentation. The approach comprises three stages: pre-processing to improve the quality of the input images, computing a saliency map from the curvelet coefficients, and classification with VGG16. The proposed method is evaluated on the STARE and HRF datasets using the Jaccard Index. Its accuracy is about 98.42% on STARE and 97.96% on HRF.
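A minimal Python sketch of the three-stage pipeline described above is given below, for illustration only. The CLAHE pre-processing step, the gradient-magnitude saliency map (a simple stand-in for the paper's curvelet-coefficient saliency), the helper names (preprocess, saliency_map, build_classifier, jaccard_index), and the 224×224 input size are assumptions, not details taken from the paper.

```python
# Sketch of the three-stage pipeline: (1) quality pre-processing,
# (2) saliency map, (3) VGG16 classification, plus Jaccard evaluation.
# Assumptions: CLAHE as the pre-processing step and gradient magnitude
# as a stand-in for the curvelet-based saliency used in the paper.
import numpy as np
import cv2
import tensorflow as tf
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models


def preprocess(image_bgr: np.ndarray) -> np.ndarray:
    """Stage 1: enhance input quality (CLAHE on the green channel)."""
    green = image_bgr[:, :, 1]
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(green)


def saliency_map(channel: np.ndarray) -> np.ndarray:
    """Stage 2: normalized gradient-magnitude saliency map
    (stand-in for the curvelet-coefficient saliency)."""
    gx = cv2.Sobel(channel, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(channel, cv2.CV_32F, 0, 1, ksize=3)
    mag = cv2.magnitude(gx, gy)
    return mag / (mag.max() + 1e-8)


def build_classifier(num_classes: int = 2) -> tf.keras.Model:
    """Stage 3: VGG16 backbone with a small classification head."""
    base = VGG16(weights="imagenet", include_top=False,
                 input_shape=(224, 224, 3))
    base.trainable = False  # fine-tune only the head in this sketch
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(256, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model


def jaccard_index(pred_mask: np.ndarray, true_mask: np.ndarray) -> float:
    """Evaluation: Jaccard Index (intersection over union) of binary masks."""
    pred, true = pred_mask.astype(bool), true_mask.astype(bool)
    union = np.logical_or(pred, true).sum()
    return float(np.logical_and(pred, true).sum() / union) if union else 1.0
```

In this sketch the saliency map would be thresholded into a vessel segmentation mask and scored against the ground truth with jaccard_index, while the VGG16 head is trained separately on labeled fundus images; the paper's actual segmentation and training details are not reproduced here.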