An Overview of Hand Gesture Languages for Autonomous UAV Handling

Fotini Patrona, Ioannis Mademlis, I. Pitas
DOI: 10.1109/AIRPHARO52252.2021.9571027
Published in: 2021 Aerial Robotic Systems Physically Interacting with the Environment (AIRPHARO)
Publication date: 2021-10-01
Citations: 4

Abstract

Camera-equipped Unmanned Aerial Vehicles (UAVs, or drones) have revolutionized several application domains, with the steadily increasing degree of cognitive autonomy in commercial drones paving the way for unprecedented robotization of daily life. Dynamic cooperation between UAVs and human collaborators is typically necessary during a mission, a fact that has led to various solutions for high-level UAV-operator interaction. Hand gestures are an effective way of facilitating such remote drone handling, giving rise to new gesture languages for visual communication between operators and autonomous UAVs. This paper reviews all available languages that could be used or have been created for this purpose, as well as relevant gesture recognition datasets for training machine learning models. Moreover, a novel, generic, base gesture language for handling camera-equipped UAVs is proposed, along with a corresponding, large-scale, publicly available video dataset. The presented language can easily and consistently be extended in the future to more specific scenarios/profiles, tailored to particular application domains and/or additional UAV equipment (e.g., aerial manipulators/arms). Finally, we evaluate: a) the performance of state-of-the-art gesture recognition algorithms on the proposed dataset, in a quantitative and objective manner, and b) the intuitiveness, effectiveness and completeness of the proposed gesture language, in a qualitative and subjective manner.
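To make concrete how a recognized gesture might drive a UAV in systems like those surveyed here, the following is a minimal, hypothetical sketch of a dispatch layer that maps classifier outputs to high-level flight commands. The gesture names, command names, and confidence threshold are illustrative assumptions, not the vocabulary of the paper's proposed language.

```python
# Hypothetical sketch: routing hand-gesture classifier outputs to
# high-level UAV commands. All gesture/command names are assumptions
# for illustration, not the paper's proposed gesture language.

GESTURE_TO_COMMAND = {
    "palm_up": "ascend",
    "palm_down": "descend",
    "point_left": "move_left",
    "point_right": "move_right",
    "fist": "hover",
    "wave": "return_home",
}

def dispatch(gesture: str, confidence: float, threshold: float = 0.8) -> str:
    """Map a (gesture label, confidence) prediction to a UAV command.

    Low-confidence or unknown predictions fall back to a safe default,
    so the drone does not react to spurious detections.
    """
    if confidence < threshold or gesture not in GESTURE_TO_COMMAND:
        return "hover"  # safe default for ambiguous input
    return GESTURE_TO_COMMAND[gesture]

print(dispatch("palm_up", 0.95))  # confident, known gesture
print(dispatch("wave", 0.40))     # below threshold: safe default
```

In practice the confidence gate would be tuned per deployment, and temporal smoothing (e.g., requiring the same label over several consecutive frames) is commonly added before issuing a command.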