Kevin Ponto, Ryan Kimmel, Joe Kohlmann, Aaron Bartholomew, Robert G Radwin
Proceedings. IEEE Symposium on 3D User Interfaces, vol. 2012, pp. 85-88. Published 2012-01-01. DOI: 10.1109/3DUI.2012.6184189. Citation count: 13.
Virtual Exertions: a user interface combining visual information, kinesthetics and biofeedback for virtual object manipulation.
Virtual Reality environments can present users with rich visual representations of simulated environments. However, the means of interacting with these illusions are generally unnatural, in the sense that they do not match how humans grasp and move objects in the physical world. We demonstrate a system that enables users to interact with virtual objects through natural body movements by combining visual information, kinesthetics and biofeedback from electromyograms (EMG). Our method allows virtual objects to be grasped, moved and dropped through muscle exertion classification calibrated against physical-world masses. We show that users can consistently reproduce these calibrated exertions, allowing them to interface with objects in a novel way.
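The abstract describes grasping, holding, and dropping virtual objects by classifying muscle exertion against thresholds calibrated on physical-world masses. The paper itself publishes no code, so the Python sketch below is only a hypothetical illustration of that idea: a per-mass threshold is calibrated from rectified EMG recorded while holding a real object, and live EMG amplitude is then compared against it to decide grasp versus drop. All function names, the hysteresis margin, and the sample values are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of mass-calibrated EMG exertion classification.
# Not the paper's actual method; names and numbers are illustrative.

def calibrate_threshold(rectified_emg):
    """Mean rectified EMG amplitude recorded while holding a real mass."""
    return sum(abs(s) for s in rectified_emg) / len(rectified_emg)

def exertion_sufficient(emg_amplitude, threshold, margin=0.8):
    """True if the current exertion is strong enough to hold the object.

    The 0.8 margin is an assumed tolerance so users need not reproduce
    the calibrated exertion exactly.
    """
    return emg_amplitude >= margin * threshold

class VirtualObject:
    """Tracks whether a virtual object is currently held."""

    def __init__(self, hold_threshold):
        self.hold_threshold = hold_threshold
        self.held = False

    def update(self, emg_amplitude):
        # Grasp when exertion rises above the calibrated level;
        # drop when it falls below.
        self.held = exertion_sufficient(emg_amplitude, self.hold_threshold)
        return self.held

# Calibrate on rectified EMG sampled while holding a 1 kg physical mass.
threshold = calibrate_threshold([0.4, 0.5, 0.6, 0.5])  # mean = 0.5
obj = VirtualObject(threshold)
```

In use, `obj.update(0.6)` would report the object as grasped (0.6 ≥ 0.8 × 0.5), while `obj.update(0.1)` would release it; a real system would first band-pass filter and rectify the raw EMG signal before computing amplitudes.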