{"title":"Multi-modal broad learning for material recognition","authors":"Zhaoxin Wang, Huaping Liu, Xinying Xu, Fuchun Sun","doi":"10.1049/ccs2.12004","DOIUrl":null,"url":null,"abstract":"Joint Fund of Science & Technology Department of Liaoning Province and State Key Laboratory of Robotics, China, Grant/Award Number: 2020‐KF‐ 22‐06 Abstract Material recognition plays an important role in the interaction between robots and the external environment. For example, household service robots need to replace humans in the home environment to complete housework, so they need to interact with daily necessities and obtain their material performance. Images provide rich visual information about objects; however, it is often difficult to apply when objects are not visually distinct. In addition, tactile signals can be used to capture multiple characteristics of objects, such as texture, roughness, softness, and friction, which provides another crucial way for perception. How to effectively integrate multi‐modal information is an urgent problem to be addressed. Therefore, a multi‐modal material recognition framework CFBRL‐KCCA for target recognition tasks is proposed in the paper. The preliminary features of each model are extracted by cascading broad learning, which is combined with the kernel canonical correlation learning, considering the differences among different models of heterogeneous data. Finally, the open dataset of household objects is evaluated. The results demonstrate that the proposed fusion algorithm provides an effective strategy for material recognition.","PeriodicalId":33652,"journal":{"name":"Cognitive Computation and Systems","volume":"3 2","pages":"123-130"},"PeriodicalIF":1.2000,"publicationDate":"2021-04-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ietresearch.onlinelibrary.wiley.com/doi/epdf/10.1049/ccs2.12004","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cognitive Computation and Systems","FirstCategoryId":"1085","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1049/ccs2.12004","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Cited by: 3
Abstract
Funding: Joint Fund of the Science & Technology Department of Liaoning Province and the State Key Laboratory of Robotics, China, Grant/Award Number: 2020-KF-22-06

Material recognition plays an important role in the interaction between robots and the external environment. For example, household service robots are expected to take over housework in the home environment, so they need to interact with everyday objects and perceive their material properties. Images provide rich visual information about objects, but vision alone is often insufficient when objects are not visually distinct. Tactile signals, in contrast, capture multiple object characteristics such as texture, roughness, softness, and friction, and thus provide another crucial channel for perception. How to effectively integrate such multi-modal information is an urgent problem to be addressed. This paper therefore proposes CFBRL-KCCA, a multi-modal material recognition framework for target recognition tasks. Preliminary features of each modality are extracted by cascaded broad learning and then combined through kernel canonical correlation analysis, which takes into account the differences among the heterogeneous data of different modalities. Finally, the framework is evaluated on an open dataset of household objects. The results demonstrate that the proposed fusion algorithm provides an effective strategy for material recognition.
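The fusion step described in the abstract, projecting per-modality features into a shared subspace via kernel canonical correlation analysis before classification, can be sketched as below. This is a minimal illustrative sketch, not the authors' CFBRL-KCCA implementation: the regularised dual formulation of KCCA, the RBF kernels, and all names (`kcca`, `visual_feats`, `tactile_feats`, the `reg` parameter) are assumptions introduced here for illustration.

```python
# Sketch: kernel CCA fusion of two modalities' features (e.g. features
# produced separately from images and tactile signals). Assumed, not the
# paper's exact CFBRL-KCCA procedure.
import numpy as np
from scipy.linalg import solve
from sklearn.metrics.pairwise import rbf_kernel


def center_kernel(K):
    """Double-centre a kernel matrix in feature space."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H


def kcca(Kx, Ky, n_components=10, reg=1e-3):
    """Regularised kernel CCA; returns dual coefficients for both views.

    Solves (Kx + rI)^-1 Ky (Ky + rI)^-1 Kx alpha = rho^2 alpha and derives
    the paired coefficients beta for the second view.
    """
    n = Kx.shape[0]
    I = np.eye(n)
    M = solve(Kx + reg * I, Ky) @ solve(Ky + reg * I, Kx)
    vals, vecs = np.linalg.eig(M)
    order = np.argsort(-vals.real)[:n_components]
    alpha = vecs[:, order].real
    beta = solve(Ky + reg * I, Kx @ alpha)  # paired coefficients, view 2
    return alpha, beta


# Hypothetical per-modality feature matrices (rows = paired samples).
rng = np.random.default_rng(0)
visual_feats = rng.standard_normal((200, 64))   # e.g. image features
tactile_feats = rng.standard_normal((200, 32))  # e.g. tactile features

Kx = center_kernel(rbf_kernel(visual_feats, gamma=0.05))
Ky = center_kernel(rbf_kernel(tactile_feats, gamma=0.05))
alpha, beta = kcca(Kx, Ky, n_components=10)

# Project each modality into the correlated subspace and concatenate; the
# fused representation would then feed a standard classifier.
fused = np.hstack([Kx @ alpha, Ky @ beta])
```

In this sketch the fused representation is simply the concatenation of the two projected views; any off-the-shelf classifier trained on `fused` would play the role of the final recognition stage.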