Owen Tamin, E. Moung, J. Dargham, Farashazillah Yahya, S. Omatu, L. Angeline
2022 IEEE International Conference on Artificial Intelligence in Engineering and Technology (IICAIET), published 2022-09-13
DOI: 10.1109/IICAIET55139.2022.9936771
A Comparison of RGB and RGNIR Color Spaces for Plastic Waste Detection Using The YOLOv5 Architecture
Plastic waste is a serious environmental issue that damages human health, wildlife, and habitats. Many researchers have proposed solutions to this problem; one of the most efficient is to apply machine learning to detect plastic waste in common areas. Deep learning is a powerful machine learning approach in which an object detector automatically learns image features for object recognition tasks. This paper therefore applies a recent object detection model, YOLOv5m, to develop a plastic waste detection model. Two plastic waste datasets, consisting of red, green, and blue (RGB) and red, green, and near-infrared (RGNIR) images, are used to train the proposed model, and its performance is evaluated with 10-fold cross-validation on both datasets. The model achieves its best validation and testing results on the RGNIR dataset, with average mAP@0.5:0.95 values of 69.39% and 69.45%, respectively. These results indicate that near-infrared information can be a valuable feature representation for machine learning, opening further opportunities such as automated plastic detection for the robotics and waste management industries.
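The 10-fold evaluation protocol described above can be sketched as follows. This is a minimal illustration, not the authors' code: `train_and_eval` is a hypothetical stand-in for training YOLOv5m on a fold's training split and returning its mAP@0.5:0.95 on the held-out split, and the reported 69.39%/69.45% figures would correspond to the mean returned over the ten folds.

```python
from statistics import mean

def kfold_indices(n, folds=10):
    """Split indices 0..n-1 into `folds` contiguous (train, val) index pairs."""
    base, extra = divmod(n, folds)
    splits, start = [], 0
    for f in range(folds):
        size = base + (1 if f < extra else 0)
        val = list(range(start, start + size))
        train = [i for i in range(n) if i < start or i >= start + size]
        splits.append((train, val))
        start += size
    return splits

def cross_validate(items, train_and_eval, folds=10):
    """Average a metric (e.g. mAP@0.5:0.95) over k-fold splits.

    `train_and_eval(train_items, val_items)` is assumed to train a detector
    on the training split and return a scalar score on the validation split.
    """
    scores = [
        train_and_eval([items[i] for i in tr], [items[i] for i in va])
        for tr, va in kfold_indices(len(items), folds)
    ]
    return mean(scores)
```

In practice the same split indices would be reused for both the RGB and RGNIR datasets, so that the two color spaces are compared on identical image partitions.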