GESTURAL DESIGN - HAND TRACKING FOR DIGITAL DRAWING

Jordan A. Kanter, Kamil Quinteros
{"title":"GESTURAL DESIGN - HAND TRACKING FOR DIGITAL DRAWING","authors":"Jordan A. Kanter, Kamil Quinteros","doi":"10.54729/2789-8547.1198","DOIUrl":null,"url":null,"abstract":"Abstract Computational design is increasingly interested in the active feedback between the user/designer and the digital space. Often, our initial instinct as designers comes from a gesture, a movement of the hands that gets translated into sketches and 3D models via the tools available to us. While the physical realm allows for muscle memory, tactile feedback, and creative output via movement, digital design often negates the body of the designer as it sequesters us into a screen-mouse-hand relationship. Moreover, current CAD software tools often reinforce this standardization, further limiting the potential of physical bodily gestures as a vehicle for architectural form-making. Seeking new opportunities for a gestural interface, this research explores how Machine Learning and parametric design tools can be used to translate active movements and gestural actions into rich and complex digital models without the need of specialized equipment. In this paper, we present an open-source and economically accessible methodology for designers to translate hand movements into the digital world, implementing the MediaPipe Hands tracking library. In developing this workflow, this research explores opportunities to create more direct, vital links between expressive gesture and architectural form, with an emphasis on creating platforms that are accessible not only to design experts, but also the broader public.","PeriodicalId":113089,"journal":{"name":"Architecture and Planning Journal (APJ)","volume":"104 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-03-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Architecture and Planning Journal (APJ)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.54729/2789-8547.1198","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Computational design is increasingly interested in the active feedback between the user/designer and the digital space. Often, our initial instinct as designers comes from a gesture, a movement of the hands that gets translated into sketches and 3D models via the tools available to us. While the physical realm allows for muscle memory, tactile feedback, and creative output via movement, digital design often negates the body of the designer, sequestering us into a screen-mouse-hand relationship. Moreover, current CAD software tools often reinforce this standardization, further limiting the potential of physical bodily gestures as a vehicle for architectural form-making. Seeking new opportunities for a gestural interface, this research explores how machine learning and parametric design tools can be used to translate active movements and gestural actions into rich and complex digital models without the need for specialized equipment. In this paper, we present an open-source and economically accessible methodology for designers to translate hand movements into the digital world, implementing the MediaPipe Hands tracking library. In developing this workflow, this research explores opportunities to create more direct, vital links between expressive gesture and architectural form, with an emphasis on creating platforms that are accessible not only to design experts, but also to the broader public.
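
The full paper is not reproduced on this page, but the capture step the abstract describes, reading hand-landmark coordinates from an ordinary webcam with the MediaPipe Hands library, can be sketched as follows. This is an illustrative Python sketch of the public MediaPipe Hands API, not the authors' implementation; the single-hand setting, confidence thresholds, and the choice of the index-fingertip landmark are assumptions made for the example.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

cap = cv2.VideoCapture(0)  # default webcam; no specialized equipment required
with mp_hands.Hands(max_num_hands=1,
                    min_detection_confidence=0.7,
                    min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV delivers BGR frames
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            # Each detected hand yields 21 landmarks, with x/y normalized to
            # the image and z expressed relative to the wrist depth.
            tip = results.multi_hand_landmarks[0].landmark[
                mp_hands.HandLandmark.INDEX_FINGER_TIP]
            print(f"index fingertip: x={tip.x:.3f} y={tip.y:.3f} z={tip.z:.3f}")
        if cv2.waitKey(1) & 0xFF == 27:  # press Esc to stop
            break
cap.release()
```

In a workflow like the one described, the streamed landmark coordinates would then be forwarded to a parametric design environment (for example over a local socket) to drive the digital model; the transport and the modeling side are outside the scope of this sketch.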