{"title":"Discovering Conceptual Knowledge with Analytic Ontology Templates for Articulated Objects","authors":"Jianhua Sun, Yuxuan Li, Longfei Xu, Jiude Wei, Liang Chai, Cewu Lu","doi":"arxiv-2409.11702","DOIUrl":null,"url":null,"abstract":"Human cognition can leverage fundamental conceptual knowledge, like geometric\nand kinematic ones, to appropriately perceive, comprehend and interact with\nnovel objects. Motivated by this finding, we aim to endow machine intelligence\nwith an analogous capability through performing at the conceptual level, in\norder to understand and then interact with articulated objects, especially for\nthose in novel categories, which is challenging due to the intricate geometric\nstructures and diverse joint types of articulated objects. To achieve this\ngoal, we propose Analytic Ontology Template (AOT), a parameterized and\ndifferentiable program description of generalized conceptual ontologies. A\nbaseline approach called AOTNet driven by AOTs is designed accordingly to equip\nintelligent agents with these generalized concepts, and then empower the agents\nto effectively discover the conceptual knowledge on the structure and\naffordance of articulated objects. The AOT-driven approach yields benefits in\nthree key perspectives: i) enabling concept-level understanding of articulated\nobjects without relying on any real training data, ii) providing analytic\nstructure information, and iii) introducing rich affordance information\nindicating proper ways of interaction. We conduct exhaustive experiments and\nthe results demonstrate the superiority of our approach in understanding and\nthen interacting with articulated objects.","PeriodicalId":501031,"journal":{"name":"arXiv - CS - Robotics","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Robotics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.11702","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Human cognition can leverage fundamental conceptual knowledge, such as geometric and kinematic knowledge, to appropriately perceive, comprehend, and interact with novel objects. Motivated by this finding, we aim to endow machine intelligence with an analogous capability by operating at the conceptual level, in order to understand and then interact with articulated objects, especially those in novel categories, which is challenging due to the intricate geometric structures and diverse joint types of articulated objects. To achieve this goal, we propose the Analytic Ontology Template (AOT), a parameterized and differentiable program description of generalized conceptual ontologies. A baseline approach called AOTNet, driven by AOTs, is designed accordingly to equip intelligent agents with these generalized concepts and then empower them to effectively discover conceptual knowledge about the structure and affordance of articulated objects. The AOT-driven approach yields benefits from three key perspectives: i) it enables concept-level understanding of articulated objects without relying on any real training data, ii) it provides analytic structure information, and iii) it introduces rich affordance information indicating proper ways of interaction. We conduct extensive experiments, and the results demonstrate the superiority of our approach in understanding and then interacting with articulated objects.
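To make the idea of a "parameterized and differentiable program description" concrete, below is a minimal, hypothetical sketch of what such a template could look like for a single revolute-door concept. The class name (DoorTemplate), parameter names (width, height, angle), and the gradient-descent fitting loop are illustrative assumptions and are not the paper's actual AOT formulation; the sketch only shows how analytic geometry derived from a few continuous parameters stays differentiable, so the template can be fitted to observations.

import torch


class DoorTemplate:
    """Toy 'revolute door' concept (illustrative, not the paper's AOT).

    The geometry is an analytic, differentiable function of a handful of
    continuous parameters, so the template can be fitted to observed
    geometry by gradient descent.
    """

    def __init__(self):
        self.width = torch.tensor(0.4, requires_grad=True)   # door width (m)
        self.height = torch.tensor(1.0, requires_grad=True)  # door height (m)
        self.angle = torch.tensor(0.3, requires_grad=True)   # joint state (rad)

    def corners(self) -> torch.Tensor:
        """Four outer corners of the door panel, hinged at the origin and
        rotated about the vertical (z) axis by the joint angle."""
        zero = torch.zeros(())
        # Corners in the door's local frame (hinge line along +z at the origin).
        local = torch.stack([
            torch.stack([zero, zero, zero]),
            torch.stack([self.width, zero, zero]),
            torch.stack([self.width, zero, self.height]),
            torch.stack([zero, zero, self.height]),
        ])  # shape (4, 3)
        # Rotation about z by the joint angle; rows are rotated as p' = p @ R^T.
        c, s = torch.cos(self.angle), torch.sin(self.angle)
        rot_z = torch.stack([
            torch.stack([c, -s, zero]),
            torch.stack([s, c, zero]),
            torch.stack([zero, zero, torch.ones(())]),
        ])
        return local @ rot_z.T  # differentiable w.r.t. width, height, angle

    def fit(self, observed: torch.Tensor, steps: int = 200, lr: float = 1e-2) -> float:
        """Recover template parameters from observed corner points (4, 3)."""
        opt = torch.optim.Adam([self.width, self.height, self.angle], lr=lr)
        loss = torch.zeros(())
        for _ in range(steps):
            opt.zero_grad()
            loss = torch.nn.functional.mse_loss(self.corners(), observed)
            loss.backward()
            opt.step()
        return loss.item()

In this toy setting, fitting the template to a point set both explains the structure (part extents, hinge axis, joint state) and exposes where and how to interact (e.g., pull near the edge opposite the hinge), which is the kind of analytic structure and affordance information the abstract attributes to the AOT-driven approach.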