{"title":"Ready-Aim-Fly !基于免提人脸的无人机三维轨迹控制HRI","authors":"Jake Bruce, Jacob M. Perron, R. Vaughan","doi":"10.1109/CRV.2017.39","DOIUrl":null,"url":null,"abstract":"We present a novel user interface for aiming andlaunching flying robots on user-defined trajectories. The methodrequires no user instrumentation and is easy to learn by analogyto a slingshot. With a few minutes of practice users can sendrobots along a desired 3D trajectory and place them in 3D space, including at high altitude and beyond line-of-sight. With the robot hovering in front of the user, the robot tracksthe user's face to estimate its relative pose. The azimuth, elevationand distance of this pose control the parameters of the robot'ssubsequent trajectory. The user triggers the robot to fly thetrajectory by making a distinct pre-trained facial expression. Wepropose three different trajectory types for different applications:straight-line, parabola, and circling. We also describe a simple training/startup interaction to selecta trajectory type and train the aiming and triggering faces. Inreal-world experiments we demonstrate and evaluate the method. We also show that the face-recognition system is resistant to inputfrom unauthorized users.","PeriodicalId":308760,"journal":{"name":"2017 14th Conference on Computer and Robot Vision (CRV)","volume":"38 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"Ready—Aim—Fly! Hands-Free Face-Based HRI for 3D Trajectory Control of UAVs\",\"authors\":\"Jake Bruce, Jacob M. Perron, R. Vaughan\",\"doi\":\"10.1109/CRV.2017.39\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We present a novel user interface for aiming andlaunching flying robots on user-defined trajectories. The methodrequires no user instrumentation and is easy to learn by analogyto a slingshot. With a few minutes of practice users can sendrobots along a desired 3D trajectory and place them in 3D space, including at high altitude and beyond line-of-sight. With the robot hovering in front of the user, the robot tracksthe user's face to estimate its relative pose. The azimuth, elevationand distance of this pose control the parameters of the robot'ssubsequent trajectory. The user triggers the robot to fly thetrajectory by making a distinct pre-trained facial expression. Wepropose three different trajectory types for different applications:straight-line, parabola, and circling. We also describe a simple training/startup interaction to selecta trajectory type and train the aiming and triggering faces. Inreal-world experiments we demonstrate and evaluate the method. 
We also show that the face-recognition system is resistant to inputfrom unauthorized users.\",\"PeriodicalId\":308760,\"journal\":{\"name\":\"2017 14th Conference on Computer and Robot Vision (CRV)\",\"volume\":\"38 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-05-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 14th Conference on Computer and Robot Vision (CRV)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CRV.2017.39\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 14th Conference on Computer and Robot Vision (CRV)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CRV.2017.39","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Ready—Aim—Fly! Hands-Free Face-Based HRI for 3D Trajectory Control of UAVs
We present a novel user interface for aiming and launching flying robots on user-defined trajectories. The method requires no user instrumentation and is easy to learn by analogy to a slingshot. With a few minutes of practice, users can send robots along a desired 3D trajectory and place them in 3D space, including at high altitude and beyond line-of-sight. While hovering in front of the user, the robot tracks the user's face to estimate its relative pose. The azimuth, elevation, and distance of this pose control the parameters of the robot's subsequent trajectory. The user triggers the robot to fly the trajectory by making a distinct pre-trained facial expression. We propose three different trajectory types for different applications: straight-line, parabola, and circling. We also describe a simple training/startup interaction to select a trajectory type and train the aiming and triggering faces. In real-world experiments we demonstrate and evaluate the method. We also show that the face-recognition system is resistant to input from unauthorized users.
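The abstract does not give the exact mapping from face pose to trajectory parameters, so the following is only a minimal sketch of the slingshot analogy for the straight-line case: the face's azimuth and elevation set the launch direction (opposite the face's bearing), and the face's distance from the hovering robot scales the travel distance. The function name, the `gain` parameter, and the linear distance scaling are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def face_pose_to_line_trajectory(face_position, gain=2.0):
    """Map the user's face position (relative to the hovering robot) to a
    straight-line launch trajectory, by analogy to drawing back a slingshot.

    face_position: (x, y, z) of the face in the robot's body frame
    (hypothetical input; the paper's own pose estimator is not reproduced).
    gain: assumed linear scaling from slingshot "draw" to travel distance.
    """
    p = np.asarray(face_position, dtype=float)
    distance = np.linalg.norm(p)              # slingshot draw length
    azimuth = np.arctan2(p[1], p[0])          # horizontal aiming angle
    elevation = np.arcsin(p[2] / distance)    # vertical aiming angle

    # Fly away from the face: the launch direction is opposite the face's
    # bearing, and travel distance grows with how far back the user "draws".
    direction = -p / distance
    goal = direction * gain * distance        # goal point in robot body frame
    return goal, azimuth, elevation

# Example: a face 1.5 m in front of the robot, slightly below and to the left.
goal, az, el = face_pose_to_line_trajectory([1.4, -0.4, -0.3])
print(goal, np.degrees(az), np.degrees(el))
```

The parabola and circling trajectory types would presumably reuse the same aimed direction and draw distance to seed their own parameters, but the abstract leaves those mappings unspecified.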