Yifan Wu, Timothy R. N. Holder, Marc Foster, Evan Williams, M. Enomoto, B. Lascelles, A. Bozkurt, David L. Roberts
DOI: 10.1145/3565995.3566033
Proceedings of the Ninth International Conference on Animal-Computer Interaction, published 2022-12-05
Spatial and Temporal Analytic Pipeline for Evaluation of Potential Guide Dogs Using Location and Behavior Data
Training guide dogs for visually impaired people is a resource-intensive task for guide dog schools. The task is further complicated by a lack of capabilities to objectively measure and analyze candidate guide dogs’ temperaments, as the dogs are placed with volunteer raisers away from the schools for months during the raising process. In this work, we demonstrate a preliminary data analysis workflow that provides detailed information about candidate guide dogs’ day-to-day physical exercise levels and gait activities, using objective environmental and behavioral data collected from a wearable, collar-based Internet of Things device. We trained and tested machine learning models to analyze different gait types, including walking, pacing, trotting, and a mixture of walking and trotting. By analyzing the data both spatially and temporally, we generate a location and behavior summary for each candidate dog that provides insight for guide dog training experts, so that they can more accurately and comprehensively evaluate the candidate’s prospects of future success. The preliminary analysis revealed movement patterns for different location types that reflected the behaviors of candidate guide dogs.
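To make the gait-classification step of such a pipeline concrete, the following is a minimal, hypothetical sketch of windowed feature extraction from collar accelerometer data followed by a rule-based classifier. The window representation, the two features (mean and standard deviation of acceleration magnitude), and all thresholds are illustrative assumptions standing in for the trained machine learning models described in the abstract; they are not the authors' published method.

```python
# Hypothetical sketch of gait-type classification from collar-worn
# accelerometer windows. Features and thresholds are illustrative
# assumptions, not the paper's trained models.
import math
from statistics import mean, stdev


def extract_features(window):
    """Compute mean and standard deviation of acceleration magnitude
    over one window of (x, y, z) accelerometer samples."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    return mean(mags), stdev(mags)


def classify_gait(window):
    """Threshold-based stand-in for a trained gait classifier.
    Higher variability in acceleration magnitude is taken here as a
    proxy for more vigorous gaits (an assumed, simplified mapping)."""
    _, s = extract_features(window)
    if s < 0.5:
        return "walking"        # low variability: slow, even gait
    if s < 1.5:
        return "pacing"
    if s < 3.0:
        return "trotting"
    return "walk/trot mix"      # high, uneven variability
```

In a full pipeline, each classified window would be timestamped and joined with GPS-derived location type (e.g., home, park, street) to build the kind of spatial and temporal behavior summary the abstract describes.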