No, Your Other Left! Language Children Use To Direct Robots

Deanna Kocher, L. Sarmiento, Samantha Heller, Yupei Yang, T. Kushnir, K. Green

2020 Joint IEEE 10th International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob), published 2020-10-26
DOI: 10.1109/ICDL-EpiRob48136.2020.9278108
We present an analysis of how children between 4 and 9 years old give directions to a robot. Thirty-eight children in this age range played a direction-giving game with a virtual robot and with their caregiver. We considered two viewpoints (aerial and in-person) and three affordances (non-humanoid robot, caregiver with eyes closed, and caregiver with eyes open). We report the frequency of the commands children used, the complexity of those commands, and the navigation styles children adopted at different ages. We found that pointing and gesturing decreased with age, while "left-right" directions and the use of distances increased with age. From this, we make several recommendations for robot design that would enable a robot to successfully follow directions from children of different ages and help advance children's direction giving.