{"title":"基于手势传感器的虚拟学习系统融合","authors":"M. Weise, Raphael Zender, U. Lucke","doi":"10.1109/ICALT.2015.47","DOIUrl":null,"url":null,"abstract":"The combination of body gesture recognition and virtual reality opens a broad field of possibilities for practical and activity-oriented learning scenarios based on virtual simulations. Many educational projects address this potential by using specific gesture sensors to interact with virtual artefacts in 3D learning environments. Unfortunately, most of these projects are stand-alone developments for dedicated use cases. A systematic approach for sensor integration is currently missing. This paper presents the development and use of a framework for a systematic integration and even fusion of different gesture sensors for innovative, mixed-reality learning scenarios. Thus, developers of learning applications will be enabled to create interactive virtual environments without the time-consuming familiarization with each sensor hardware and SDK. This leads to a reduced time-to-market as well as a higher transferability.","PeriodicalId":170914,"journal":{"name":"2015 IEEE 15th International Conference on Advanced Learning Technologies","volume":"149 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Systematic Fusion of Gesture Sensors for Practical Learning in Virtual Environments\",\"authors\":\"M. Weise, Raphael Zender, U. Lucke\",\"doi\":\"10.1109/ICALT.2015.47\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The combination of body gesture recognition and virtual reality opens a broad field of possibilities for practical and activity-oriented learning scenarios based on virtual simulations. Many educational projects address this potential by using specific gesture sensors to interact with virtual artefacts in 3D learning environments. Unfortunately, most of these projects are stand-alone developments for dedicated use cases. A systematic approach for sensor integration is currently missing. This paper presents the development and use of a framework for a systematic integration and even fusion of different gesture sensors for innovative, mixed-reality learning scenarios. Thus, developers of learning applications will be enabled to create interactive virtual environments without the time-consuming familiarization with each sensor hardware and SDK. 
This leads to a reduced time-to-market as well as a higher transferability.\",\"PeriodicalId\":170914,\"journal\":{\"name\":\"2015 IEEE 15th International Conference on Advanced Learning Technologies\",\"volume\":\"149 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-07-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2015 IEEE 15th International Conference on Advanced Learning Technologies\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICALT.2015.47\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 IEEE 15th International Conference on Advanced Learning Technologies","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICALT.2015.47","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Systematic Fusion of Gesture Sensors for Practical Learning in Virtual Environments
The combination of body gesture recognition and virtual reality opens a broad field of possibilities for practical and activity-oriented learning scenarios based on virtual simulations. Many educational projects address this potential by using specific gesture sensors to interact with virtual artefacts in 3D learning environments. Unfortunately, most of these projects are stand-alone developments for dedicated use cases. A systematic approach for sensor integration is currently missing. This paper presents the development and use of a framework for a systematic integration and even fusion of different gesture sensors for innovative, mixed-reality learning scenarios. Thus, developers of learning applications will be enabled to create interactive virtual environments without the time-consuming familiarization with each sensor hardware and SDK. This leads to a reduced time-to-market as well as a higher transferability.
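The abstract describes a framework that hides individual sensor SDKs behind a common abstraction and fuses their gesture events for the learning application. The paper itself contains no code; the following is a minimal, hypothetical Python sketch of that idea, in which all names (GestureEvent, GestureSensor, SensorFusion, FakeSensor) and the confidence-based fusion strategy are invented for illustration and are not taken from the authors' framework.

# Hypothetical sketch of a sensor-abstraction and fusion layer; not the
# authors' implementation. Names and the fusion strategy are assumptions.
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class GestureEvent:
    """A sensor-independent gesture observation."""
    gesture: str       # e.g. "swipe_left", "grab"
    confidence: float  # 0.0 .. 1.0, as reported or estimated per sensor
    source: str        # which sensor produced the event


class GestureSensor(ABC):
    """Adapter interface that each concrete sensor SDK is wrapped behind."""

    @abstractmethod
    def poll(self) -> List[GestureEvent]:
        """Return gesture events recognized since the last poll."""


class SensorFusion:
    """Merges events from several sensors and notifies the application.

    Fusion here is just confidence-weighted de-duplication per gesture;
    a real framework could plug in more elaborate strategies.
    """

    def __init__(self, sensors: List[GestureSensor]):
        self.sensors = sensors
        self.listeners: List[Callable[[GestureEvent], None]] = []

    def on_gesture(self, listener: Callable[[GestureEvent], None]) -> None:
        self.listeners.append(listener)

    def update(self) -> None:
        # Keep only the highest-confidence event per gesture name.
        best: Dict[str, GestureEvent] = {}
        for sensor in self.sensors:
            for event in sensor.poll():
                current = best.get(event.gesture)
                if current is None or event.confidence > current.confidence:
                    best[event.gesture] = event
        for event in best.values():
            for listener in self.listeners:
                listener(event)


class FakeSensor(GestureSensor):
    """Stand-in for a concrete SDK adapter (e.g. for Kinect or Leap Motion)."""

    def __init__(self, name: str, events: List[GestureEvent]):
        self.name, self._events = name, events

    def poll(self) -> List[GestureEvent]:
        events, self._events = self._events, []
        return events


if __name__ == "__main__":
    fusion = SensorFusion([
        FakeSensor("kinect", [GestureEvent("grab", 0.6, "kinect")]),
        FakeSensor("leap", [GestureEvent("grab", 0.9, "leap")]),
    ])
    fusion.on_gesture(lambda e: print(f"{e.gesture} from {e.source} ({e.confidence})"))
    fusion.update()  # prints: grab from leap (0.9)

In such a design the learning application registers a single gesture callback against the fusion layer, so adding or swapping a sensor only requires a new adapter class rather than changes to the application itself, which matches the transferability argument made in the abstract.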