{"title":"A scattering transform combination with local binary pattern for texture classification","authors":"Vu-Lam Nguyen, Ngoc-Son Vu, P. Gosselin","doi":"10.1109/CBMI.2016.7500238","DOIUrl":null,"url":null,"abstract":"In this paper, we propose a combined feature approach which takes full advantages of local structure information and the more global one for improving texture image classification results. In this way, Local Binary Pattern is used for extracting local features, whilst the Scattering Transform feature plays the role of a global descriptor. Intensive experiments conducted on many texture benchmarks such as ALOT, CUReT, KTH-TIPS2-a, KTH-TIPS2b, and OUTEX show that the combined method outweigh each one which stands alone in term of classification accuracy. Also, our method outperforms many others, whilst it is comparable to state of the art on the experimented datasets.","PeriodicalId":356608,"journal":{"name":"2016 14th International Workshop on Content-Based Multimedia Indexing (CBMI)","volume":"38 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-06-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 14th International Workshop on Content-Based Multimedia Indexing (CBMI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CBMI.2016.7500238","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 6
Abstract
In this paper, we propose a combined feature approach which takes full advantage of both local structure information and more global information to improve texture image classification results. Local Binary Pattern is used to extract local features, whilst the Scattering Transform feature plays the role of a global descriptor. Extensive experiments conducted on several texture benchmarks, including ALOT, CUReT, KTH-TIPS2-a, KTH-TIPS2-b, and OUTEX, show that the combined method outperforms each descriptor used alone in terms of classification accuracy. Our method also outperforms many other approaches and is comparable to the state of the art on the tested datasets.
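The following is a minimal sketch of the kind of local/global feature combination the abstract describes: an LBP histogram as the local descriptor, spatially averaged scattering coefficients as the global one, and the two concatenated before classification. The library choices (scikit-image, kymatio, scikit-learn), parameter values (P, R, J), and the linear SVM are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of LBP + scattering-transform feature combination for
# texture classification. Libraries and parameters are assumptions, not the
# pipeline used in the paper.
import numpy as np
from skimage.feature import local_binary_pattern
from kymatio.numpy import Scattering2D
from sklearn.svm import LinearSVC


def lbp_histogram(image, P=8, R=1):
    """Local descriptor: normalized histogram of uniform LBP codes."""
    codes = local_binary_pattern(image, P, R, method="uniform")
    n_bins = P + 2  # the 'uniform' mapping yields P + 2 distinct codes
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
    return hist


def scattering_feature(image, J=2):
    """Global descriptor: spatially averaged 2nd-order scattering coefficients."""
    scattering = Scattering2D(J=J, shape=image.shape)  # could be built once per image size
    coeffs = scattering(image.astype(np.float32))      # shape (channels, H/2^J, W/2^J)
    return coeffs.mean(axis=(-2, -1))                  # average over spatial positions


def combined_feature(image):
    """Concatenate the local (LBP) and global (scattering) descriptors."""
    return np.concatenate([lbp_histogram(image), scattering_feature(image)])


# Usage on a toy set of same-sized grayscale texture patches:
# X_train = np.stack([combined_feature(img) for img in train_images])
# clf = LinearSVC().fit(X_train, train_labels)
# X_test = np.stack([combined_feature(img) for img in test_images])
# accuracy = clf.score(X_test, test_labels)
```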