Symbolic Reasoning in Latent Space: Classical Planning as an Example

Masataro Asai, Hiroshi Kajino, A. Fukunaga, Christian Muise
{"title":"潜在空间中的符号推理:以经典规划为例","authors":"Masataro Asai, Hiroshi Kajino, A. Fukunaga, Christian Muise","doi":"10.3233/faia210349","DOIUrl":null,"url":null,"abstract":"Symbolic systems require hand-coded symbolic representation as input, resulting in a knowledge acquisition bottleneck. Meanwhile, although deep learning has achieved significant success in many fields, the knowledge is encoded in a subsymbolic representation which is incompatible with symbolic systems. To address the gap between the two fields, one has to solve Symbol Grounding problem: The question of how a machine can generate symbols automatically. We discuss our recent work called Latplan, an unsupervised architecture combining deep learning and classical planning. Given only an unlabeled set of image pairs showing a subset of transitions allowed in the environment (training inputs), Latplan learns a complete propositional PDDL action model of the environment. Later, when a pair of images representing the initial and the goal states (planning inputs) is given, Latplan finds a plan to the goal state in a symbolic latent space and returns a visualized plan execution. We discuss several key ideas that made Latplan possible which would hopefully extend to many other symbolic paradigms outside classical planning.","PeriodicalId":250200,"journal":{"name":"Neuro-Symbolic Artificial Intelligence","volume":"42 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Symbolic Reasoning in Latent Space: Classical Planning as an Example\",\"authors\":\"Masataro Asai, Hiroshi Kajino, A. Fukunaga, Christian Muise\",\"doi\":\"10.3233/faia210349\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Symbolic systems require hand-coded symbolic representation as input, resulting in a knowledge acquisition bottleneck. Meanwhile, although deep learning has achieved significant success in many fields, the knowledge is encoded in a subsymbolic representation which is incompatible with symbolic systems. To address the gap between the two fields, one has to solve Symbol Grounding problem: The question of how a machine can generate symbols automatically. We discuss our recent work called Latplan, an unsupervised architecture combining deep learning and classical planning. Given only an unlabeled set of image pairs showing a subset of transitions allowed in the environment (training inputs), Latplan learns a complete propositional PDDL action model of the environment. Later, when a pair of images representing the initial and the goal states (planning inputs) is given, Latplan finds a plan to the goal state in a symbolic latent space and returns a visualized plan execution. 
We discuss several key ideas that made Latplan possible which would hopefully extend to many other symbolic paradigms outside classical planning.\",\"PeriodicalId\":250200,\"journal\":{\"name\":\"Neuro-Symbolic Artificial Intelligence\",\"volume\":\"42 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-12-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neuro-Symbolic Artificial Intelligence\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3233/faia210349\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neuro-Symbolic Artificial Intelligence","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3233/faia210349","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Symbolic systems require a hand-coded symbolic representation as input, resulting in a knowledge acquisition bottleneck. Meanwhile, although deep learning has achieved significant success in many fields, the knowledge it acquires is encoded in a subsymbolic representation that is incompatible with symbolic systems. To bridge the gap between the two fields, one has to solve the Symbol Grounding problem: the question of how a machine can generate symbols automatically. We discuss our recent work, Latplan, an unsupervised architecture combining deep learning and classical planning. Given only an unlabeled set of image pairs showing a subset of the transitions allowed in the environment (training inputs), Latplan learns a complete propositional PDDL action model of the environment. Later, when a pair of images representing the initial and the goal states (planning inputs) is given, Latplan finds a plan to the goal state in a symbolic latent space and returns a visualized plan execution. We discuss several key ideas that made Latplan possible and that will hopefully extend to many other symbolic paradigms beyond classical planning.
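
The abstract describes a two-phase pipeline: Latplan first learns a complete propositional PDDL action model from unlabeled image pairs, and then, given images of the initial and goal states, searches for a plan in the symbolic latent space. The following is a minimal, self-contained sketch of what search over such a propositional action model looks like; it is not Latplan's code, the 3-bit latent space and hand-written action set are invented purely for illustration, and breadth-first search stands in for an off-the-shelf classical planner.

```python
# Minimal illustration (not Latplan code) of planning over a propositional
# action model: states are sets of latent propositions, actions are
# STRIPS-style triples (preconditions, add effects, delete effects).
from collections import deque

# A toy "learned" action model over a 3-bit latent space.  In Latplan these
# propositions and actions would be extracted from image data; here they are
# hand-written only to show the search step.
ACTIONS = {
    "a0": ({"z0"},       {"z1"}, {"z0"}),   # (pre, add, delete)
    "a1": ({"z1"},       {"z2"}, set()),
    "a2": ({"z1", "z2"}, {"z0"}, {"z1"}),
}

def successors(state):
    """Yield (action_name, next_state) for every applicable action."""
    for name, (pre, add, dele) in ACTIONS.items():
        if pre <= state:
            yield name, frozenset((state - dele) | add)

def plan(init, goal):
    """Breadth-first search for a shortest action sequence reaching the goal."""
    frontier = deque([(frozenset(init), [])])
    visited = {frozenset(init)}
    while frontier:
        state, path = frontier.popleft()
        if frozenset(goal) <= state:
            return path
        for name, nxt in successors(state):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append((nxt, path + [name]))
    return None

if __name__ == "__main__":
    # Initial latent state {z0}; the goal requires propositions z0 and z2.
    print(plan({"z0"}, {"z0", "z2"}))   # -> ['a0', 'a1', 'a2']
```

In Latplan itself, the learned propositions and actions are expressed as a PDDL action model so that an off-the-shelf classical planner can perform this search over the encoded initial and goal states, and decoding the latent states along the returned plan produces the visualized plan execution described in the abstract.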