BahurApp: Development And Implementation Of Coral Bleaching Monitoring Application Using Convolutional Neural Network
Mari Grace Corruz, Emil Filipina, Maria Julia Santiago, Sheila Mae Uy, Cristian Lazana, A. Bandala
2021 IEEE 13th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment, and Management (HNICEM), 28 November 2021
DOI: 10.1109/HNICEM54116.2021.9731907
This study aims to improve the accuracy of coral bleaching monitoring through the development and implementation of a mobile application that classifies bleached coral images from non-bleached images using a convolutional neural network. Monitoring the reefs is significant for determining the extent of damage, the current state of Philippine coral reefs, and possible reefs of hope. The system uses a Convolutional Neural Network (CNN) to classify the bleaching severity of corals and currently runs on Android phones from release 4.0 up to 11. The researchers found that at least 3,000 images are needed to train the CNN of the proposed coral bleaching application to achieve at least 90% accuracy, and that capture settings of 0.92 MP, -1 EV, and ISO 1600 produce 93% accuracy. Seawater salinity and turbidity tests showed that salinity of 1.000-060 g/cm3 and turbidity produced with 500-1000 grams of sand do not have a substantial effect on the proposed system's accuracy. The GPS used in the proposed system is 95% accurate. Finally, the researchers recommend continuous improvement of the dataset to produce better results in the future.
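To make the approach concrete, the following is a minimal sketch of a binary bleached/non-bleached coral image classifier with on-device export, assuming a TensorFlow/Keras stack and a hypothetical directory of labelled images named "coral_images". The paper does not disclose its architecture, input resolution, or training pipeline, so every layer choice, hyperparameter, and path below is an illustrative assumption, not the authors' implementation.

```python
# Sketch only: small CNN for bleached vs non-bleached coral classification.
# All sizes, paths, and hyperparameters are assumptions for illustration.
import tensorflow as tf

IMG_SIZE = (128, 128)   # assumed input resolution
BATCH_SIZE = 32

def build_model() -> tf.keras.Model:
    """Small CNN: three conv/pool stages followed by a sigmoid output."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=IMG_SIZE + (3,)),
        tf.keras.layers.Rescaling(1.0 / 255),            # normalise pixel values
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),   # bleached vs non-bleached
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Hypothetical dataset layout: coral_images/bleached/*.jpg, coral_images/non_bleached/*.jpg
    train_ds = tf.keras.utils.image_dataset_from_directory(
        "coral_images", validation_split=0.2, subset="training", seed=42,
        image_size=IMG_SIZE, batch_size=BATCH_SIZE, label_mode="binary")
    val_ds = tf.keras.utils.image_dataset_from_directory(
        "coral_images", validation_split=0.2, subset="validation", seed=42,
        image_size=IMG_SIZE, batch_size=BATCH_SIZE, label_mode="binary")

    model = build_model()
    model.fit(train_ds, validation_data=val_ds, epochs=10)

    # Convert to TensorFlow Lite for on-device inference, since the paper targets
    # an Android application; the exact deployment path used there is not stated.
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    with open("coral_bleaching_classifier.tflite", "wb") as f:
        f.write(converter.convert())
```

The exported .tflite file could then be bundled with an Android app and run through the TensorFlow Lite interpreter, which is one common way to deploy such a classifier on phones from Android 4.x through 11; whether BahurApp follows this exact route is not specified in the abstract.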