{"title":"Classification of Material Type from Optical Coherence Tomography Images Using Deep Learning","authors":"M. Sabuncu, Hakan Ozdemir","doi":"10.1155/2021/2520679","DOIUrl":null,"url":null,"abstract":"Classification of material type is crucial in the recycling industry since good quality recycling depends on the successful sorting of various materials. In textiles, the most commonly used fiber material types are wool, cotton, and polyester. When recycling fabrics, it is critical to identify and sort various fiber types quickly and correctly. The standard method of determining fabric fiber material type is the burn test followed by a microscopic examination. This traditional method is destructive, tedious, and slow since it involves cutting, burning, and examining the yarn of the fabric. We demonstrate that the identification procedure can be done nondestructively using optical coherence tomography (OCT) and deep learning. The OCT image scans of fabrics that are composed of different fiber material types such as wool, cotton, and polyester are used to train a deep neural network. We present the results of the created deep learning models’ capability to classify fabric fiber material types. We conclude that fiber material types can be identified nondestructively with high precision and recall by OCT imaging and deep learning. Because classification of material type can be performed by OCT and deep learning, this novel technique can be employed in recycling plants in sorting wool, cotton, and polyester fabrics automatically.","PeriodicalId":55995,"journal":{"name":"International Journal of Optics","volume":" ","pages":""},"PeriodicalIF":1.8000,"publicationDate":"2021-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Optics","FirstCategoryId":"101","ListUrlMain":"https://doi.org/10.1155/2021/2520679","RegionNum":4,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"OPTICS","Score":null,"Total":0}
Citations: 3
Abstract
Classification of material type is crucial in the recycling industry, since good quality recycling depends on the successful sorting of various materials. In textiles, the most commonly used fiber material types are wool, cotton, and polyester. When recycling fabrics, it is critical to identify and sort the various fiber types quickly and correctly. The standard method of determining fabric fiber material type is the burn test followed by a microscopic examination. This traditional method is destructive, tedious, and slow, since it involves cutting, burning, and examining the yarn of the fabric. We demonstrate that the identification procedure can be performed nondestructively using optical coherence tomography (OCT) and deep learning. OCT image scans of fabrics composed of different fiber material types, such as wool, cotton, and polyester, are used to train a deep neural network. We present results demonstrating the capability of the trained deep learning models to classify fabric fiber material types. We conclude that fiber material types can be identified nondestructively, with high precision and recall, by OCT imaging and deep learning. Because classification of material type can be performed by OCT and deep learning, this novel technique can be employed in recycling plants to sort wool, cotton, and polyester fabrics automatically.
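The abstract describes training a deep neural network on OCT image scans of wool, cotton, and polyester fabrics and evaluating it with precision and recall. The following is a minimal sketch of such a three-class OCT image classifier in Keras; the architecture, image size, directory layout (`oct_scans/{wool,cotton,polyester}`), and training settings are illustrative assumptions, not the authors' exact model.

```python
# Hypothetical sketch of the kind of pipeline described in the abstract:
# a small CNN trained on grayscale OCT B-scan images to classify three
# fabric fiber types (wool, cotton, polyester). All hyperparameters and
# the directory layout are assumptions for illustration only.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 3          # wool, cotton, polyester
IMG_SIZE = (224, 224)    # assumed input size for the OCT scans

def build_model():
    model = models.Sequential([
        layers.Input(shape=(*IMG_SIZE, 1)),   # OCT scans treated as single-channel
        layers.Rescaling(1.0 / 255),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(
        optimizer="adam",
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model

# Assumed directory layout: oct_scans/{wool,cotton,polyester}/*.png
train_ds = tf.keras.utils.image_dataset_from_directory(
    "oct_scans", color_mode="grayscale", image_size=IMG_SIZE,
    validation_split=0.2, subset="training", seed=42,
)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "oct_scans", color_mode="grayscale", image_size=IMG_SIZE,
    validation_split=0.2, subset="validation", seed=42,
)

model = build_model()
model.fit(train_ds, validation_data=val_ds, epochs=20)
```

After training, per-class precision and recall, the metrics highlighted in the abstract, could be computed on a held-out test set, for example with `sklearn.metrics.classification_report` applied to the model's predicted labels.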
Journal Description
International Journal of Optics publishes papers on the nature of light, its properties and behaviours, and its interaction with matter. The journal considers both fundamental and highly applied studies, especially those that promise technological solutions for the next generation of systems and devices. As well as original research, International Journal of Optics also publishes focused review articles that examine the state of the art, identify emerging trends, and suggest future directions for developing fields.