EuclidNet: Deep Visual Reasoning for Constructible Problems in Geometry

M. Wong, Xintong Qi, C. Tan
{"title":"EuclidNet: Deep Visual Reasoning for Constructible Problems in Geometry","authors":"M. Wong, Xintong Qi, C. Tan","doi":"10.54364/aaiml.2023.1152","DOIUrl":null,"url":null,"abstract":"In this paper, we present a visual reasoning framework driven by deep learning for solving constructible problems in geometry that is useful for automated geometry theorem proving. Constructible problems in geometry often ask for the sequence of straightedge-and-compass constructions to construct a given goal given some initial setup. Our EuclidNet framework leverages the neural network architecture Mask R-CNN to extract the visual features from the initial setup and goal configuration with extra points of intersection, and then generate possible construction steps as intermediary data models that are used as feedback in the training process for further refinement of the construction step sequence. This process is repeated recursively until either a solution is found, in which case we backtrack the path for a step-by-step construction guide, or the problem is identified as unsolvable. Our EuclidNet framework is validated on the problem set of Euclidea with an average of 75% accuracy without prior knowledge and complex Japanese Sangaku geometry problems, demonstrating its capacity to leverage backtracking for deep visual reasoning of challenging problems.","PeriodicalId":373878,"journal":{"name":"Adv. Artif. Intell. Mach. Learn.","volume":"41 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Adv. Artif. Intell. Mach. Learn.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.54364/aaiml.2023.1152","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

In this paper, we present a deep-learning-driven visual reasoning framework for solving constructible problems in geometry, which is useful for automated geometry theorem proving. Constructible problems ask for a sequence of straightedge-and-compass constructions that reaches a given goal configuration from some initial setup. Our EuclidNet framework leverages the Mask R-CNN architecture to extract visual features from the initial setup and the goal configuration (augmented with extra points of intersection), and then generates candidate construction steps as intermediary data models that serve as feedback during training to further refine the construction step sequence. This process repeats recursively until either a solution is found, in which case we backtrack the path to produce a step-by-step construction guide, or the problem is identified as unsolvable. EuclidNet is validated on the Euclidea problem set, achieving 75% average accuracy without prior knowledge, and on complex Japanese Sangaku geometry problems, demonstrating its capacity to leverage backtracking for deep visual reasoning on challenging problems.
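To make the recursive search-and-backtrack procedure concrete, below is a minimal sketch of a depth-limited backtracking search over construction steps. The proposal function is a stand-in for the paper's Mask R-CNN pipeline; the names `propose_steps`, `apply_step`, and `goal_reached`, along with the state and step types, are assumptions introduced for illustration, not the authors' actual API.

```python
# Minimal sketch of the backtracking loop described in the abstract.
# All names below are hypothetical stand-ins for the paper's components.

from typing import Callable, Hashable, Iterable, List, Optional

State = Hashable  # hashable encoding of a partial construction
Step = str        # e.g. "circle(A, B)" or "line(A, B)"


def solve(
    state: State,
    goal_reached: Callable[[State], bool],
    propose_steps: Callable[[State], Iterable[Step]],  # hypothetical: ranked steps from the vision model
    apply_step: Callable[[State, Step], State],
    max_depth: int,
    _seen: Optional[set] = None,
) -> Optional[List[Step]]:
    """Depth-limited backtracking search: try model-ranked steps, recurse,
    and unwind when a branch fails. Returns the step sequence on success,
    or None if no solution is found within max_depth."""
    if goal_reached(state):
        return []
    if max_depth == 0:
        return None
    seen = _seen if _seen is not None else {state}
    for step in propose_steps(state):      # candidates, best-ranked first
        nxt = apply_step(state, step)
        if nxt in seen:                    # avoid revisiting configurations
            continue
        seen.add(nxt)
        tail = solve(nxt, goal_reached, propose_steps, apply_step,
                     max_depth - 1, seen)
        if tail is not None:               # success: backtrack the path
            return [step] + tail
    return None                            # branch exhausted: backtrack


if __name__ == "__main__":
    # Toy demo on a trivial domain (counting to 3), just to exercise the search.
    plan = solve(
        state=0,
        goal_reached=lambda s: s == 3,
        propose_steps=lambda s: ["inc"],
        apply_step=lambda s, step: s + 1,
        max_depth=5,
    )
    print(plan)  # ['inc', 'inc', 'inc']
```

On success the recursion unwinds, accumulating the chosen steps into the step-by-step construction guide; an exhausted search within the depth bound corresponds to the "identified as unsolvable" outcome in the abstract.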