AgriAcT: Agricultural Activity Training using multimedia and wearable sensing

Somya Sharma, B. Jagyasi, Jabal Raval, Prashant A. Patil
{"title":"AgriAcT:使用多媒体和可穿戴传感技术的农业活动培训","authors":"Somya Sharma, B. Jagyasi, Jabal Raval, Prashant A. Patil","doi":"10.1109/PERCOMW.2015.7134078","DOIUrl":null,"url":null,"abstract":"There has been immense work in past on the human activities detection and context recognition using the wearable sensing technologies. However, a more challenging problem of providing training on the activities to the users with the help of wearable sensors has not been adequately attempted. Specially, in the agriculture applications, an appropriate training to the farmers on performing the agricultural activities would result in the sustainable agriculture practices for achieving higher and better quality yield. In this paper, a novel first-of-a-kind, multimedia and wearable sensors based Agricultural Activity Training (AgriAcT) system has been proposed for the dissemination of agricultural technologies to the remotely located farmers. In the proposed system, a training video of an expert farmer performing an activity is captured along with the gesture data obtained from the wearable motion sensors from the expert's body while the activity is being performed. A trainee farmer, can learn a selected activity by watching the multimedia content of the expert performing that activity on the mobile phone and subsequently perform the activity by wearing the required motion sensors. We present a novel K-Nearest Neighbor based Agriculture Activity Performance Score (KAAPS) engine to generate an Activity performance score (AcT-Score) which suggest how efficiently the activity had been performed by the trainee as compared to the expert's performance. The exhaustive experimental results by collecting data from eight experts and ten trainees for two different activities are used to present the inferences on the impact made by the Act-Score on the performance of the trainee farmers.","PeriodicalId":180959,"journal":{"name":"2015 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops)","volume":"233 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"AgriAcT: Agricultural Activity Training using multimedia and wearable sensing\",\"authors\":\"Somya Sharma, B. Jagyasi, Jabal Raval, Prashant A. Patil\",\"doi\":\"10.1109/PERCOMW.2015.7134078\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"There has been immense work in past on the human activities detection and context recognition using the wearable sensing technologies. However, a more challenging problem of providing training on the activities to the users with the help of wearable sensors has not been adequately attempted. Specially, in the agriculture applications, an appropriate training to the farmers on performing the agricultural activities would result in the sustainable agriculture practices for achieving higher and better quality yield. In this paper, a novel first-of-a-kind, multimedia and wearable sensors based Agricultural Activity Training (AgriAcT) system has been proposed for the dissemination of agricultural technologies to the remotely located farmers. In the proposed system, a training video of an expert farmer performing an activity is captured along with the gesture data obtained from the wearable motion sensors from the expert's body while the activity is being performed. 
A trainee farmer, can learn a selected activity by watching the multimedia content of the expert performing that activity on the mobile phone and subsequently perform the activity by wearing the required motion sensors. We present a novel K-Nearest Neighbor based Agriculture Activity Performance Score (KAAPS) engine to generate an Activity performance score (AcT-Score) which suggest how efficiently the activity had been performed by the trainee as compared to the expert's performance. The exhaustive experimental results by collecting data from eight experts and ten trainees for two different activities are used to present the inferences on the impact made by the Act-Score on the performance of the trainee farmers.\",\"PeriodicalId\":180959,\"journal\":{\"name\":\"2015 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops)\",\"volume\":\"233 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-03-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2015 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/PERCOMW.2015.7134078\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/PERCOMW.2015.7134078","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 9

Abstract

There has been immense work in the past on human activity detection and context recognition using wearable sensing technologies. However, the more challenging problem of training users to perform activities with the help of wearable sensors has not been adequately attempted. In particular, in agricultural applications, appropriate training of farmers in performing agricultural activities would lead to sustainable practices that achieve higher and better-quality yield. In this paper, a novel, first-of-its-kind Agricultural Activity Training (AgriAcT) system based on multimedia and wearable sensors is proposed for disseminating agricultural techniques to remotely located farmers. In the proposed system, a training video of an expert farmer performing an activity is captured along with gesture data obtained from wearable motion sensors on the expert's body while the activity is being performed. A trainee farmer can learn a selected activity by watching the multimedia content of the expert performing that activity on a mobile phone, and can subsequently perform the activity while wearing the required motion sensors. We present a novel K-Nearest Neighbor based Agriculture Activity Performance Score (KAAPS) engine that generates an Activity performance score (AcT-Score), which indicates how efficiently the trainee performed the activity compared to the expert's performance. Exhaustive experimental results, obtained by collecting data from eight experts and ten trainees for two different activities, are used to draw inferences on the impact of the AcT-Score on the performance of the trainee farmers.
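The abstract only names the KAAPS engine without detailing its computation. The sketch below shows one plausible way a K-Nearest Neighbor comparison between expert and trainee motion-sensor streams could produce an AcT-Score-style number; the windowed statistics, distance metric, score mapping, and the function names (window_features, act_score) are illustrative assumptions, not the authors' published algorithm.

```python
# Hypothetical sketch of a KNN-based activity-performance score in the spirit of
# the paper's KAAPS/AcT-Score idea. Feature choice, distance metric, and the
# score mapping are assumptions for illustration only.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def window_features(signal, window=128, step=64):
    """Split a (samples, axes) motion-sensor stream into fixed-size windows and
    summarize each window with simple per-axis statistics."""
    feats = []
    for start in range(0, len(signal) - window + 1, step):
        w = signal[start:start + window]
        feats.append(np.concatenate([
            w.mean(axis=0),                       # average posture per axis
            w.std(axis=0),                        # movement intensity per axis
            np.abs(np.diff(w, axis=0)).mean(axis=0)  # jerkiness per axis
        ]))
    return np.asarray(feats)

def act_score(expert_signal, trainee_signal, k=3):
    """Score the trainee against the expert: average distance from each trainee
    window to its k nearest expert windows, mapped to the range (0, 100]."""
    expert_feats = window_features(expert_signal)
    trainee_feats = window_features(trainee_signal)
    nn = NearestNeighbors(n_neighbors=min(k, len(expert_feats))).fit(expert_feats)
    dists, _ = nn.kneighbors(trainee_feats)
    mean_dist = dists.mean()
    return 100.0 / (1.0 + mean_dist)  # identical gestures give a score near 100

# Example with synthetic 3-axis accelerometer data; the real system would feed in
# readings from the wearable motion sensors worn by the expert and the trainee.
rng = np.random.default_rng(0)
expert = np.sin(np.linspace(0, 20, 1000))[:, None] + 0.05 * rng.normal(size=(1000, 3))
trainee = np.sin(np.linspace(0, 20, 1000))[:, None] + 0.30 * rng.normal(size=(1000, 3))
print(f"AcT-Score (sketch): {act_score(expert, trainee):.1f}")
```

Under this assumed scheme, a trainee whose sensor traces closely track the expert's would score near 100, while noisier or divergent gestures would lower the score, which matches the abstract's description of the AcT-Score as a measure of how efficiently the trainee performed the activity relative to the expert.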