{"title":"移动环境下多模态用户界面的适应性","authors":"R. Kernchen, K. Moessner, R. Tafazolli","doi":"10.1109/ISADS.2005.1452112","DOIUrl":null,"url":null,"abstract":"Multimodality is a fact of human communication, increasingly our ways to communicate change and humans do interact with machines (be it a mundane ATM transaction, the calling of an automised call center, or the setting/disarming of a residential alarm system). However, these interactions are mostly limited to single input and output schemes, thus loosing a lot of additional information a human communication partner would sense, Multimodality was perceived to exactly tackle this point. This paper describes a framework and approach to operate multimodal interaction mechanisms in both the fixed as well as the mobile environments. The paper describes a scheme that facilitates the dynamic binding and release of user-interface devices (such as screens, keyboards, etc.) to support multimodal interactions in mobile environments and to enable the user to 'make use' of any possible user interface device available (and allowed), thus supporting the individuals changing communication environment. The principles and basic functionality of an adaptive multimodal human interface-device binding engine are outlined.","PeriodicalId":120577,"journal":{"name":"Proceedings Autonomous Decentralized Systems, 2005. ISADS 2005.","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2005-04-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Adaptivity for multimodal user interfaces in mobile situations\",\"authors\":\"R. Kernchen, K. Moessner, R. 
Tafazolli\",\"doi\":\"10.1109/ISADS.2005.1452112\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Multimodality is a fact of human communication, increasingly our ways to communicate change and humans do interact with machines (be it a mundane ATM transaction, the calling of an automised call center, or the setting/disarming of a residential alarm system). However, these interactions are mostly limited to single input and output schemes, thus loosing a lot of additional information a human communication partner would sense, Multimodality was perceived to exactly tackle this point. This paper describes a framework and approach to operate multimodal interaction mechanisms in both the fixed as well as the mobile environments. The paper describes a scheme that facilitates the dynamic binding and release of user-interface devices (such as screens, keyboards, etc.) to support multimodal interactions in mobile environments and to enable the user to 'make use' of any possible user interface device available (and allowed), thus supporting the individuals changing communication environment. The principles and basic functionality of an adaptive multimodal human interface-device binding engine are outlined.\",\"PeriodicalId\":120577,\"journal\":{\"name\":\"Proceedings Autonomous Decentralized Systems, 2005. ISADS 2005.\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2005-04-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings Autonomous Decentralized Systems, 2005. 
ISADS 2005.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISADS.2005.1452112\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings Autonomous Decentralized Systems, 2005. ISADS 2005.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISADS.2005.1452112","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Adaptivity for multimodal user interfaces in mobile situations
Multimodality is a fact of human communication. Our ways of communicating are increasingly changing, and humans now routinely interact with machines (be it a mundane ATM transaction, a call to an automated call centre, or the setting/disarming of a residential alarm system). However, these interactions are mostly limited to single input and output schemes, thus losing much of the additional information a human communication partner would sense; multimodality was conceived to tackle exactly this point. This paper describes a framework and approach for operating multimodal interaction mechanisms in both fixed and mobile environments. It presents a scheme that facilitates the dynamic binding and release of user-interface devices (such as screens, keyboards, etc.) to support multimodal interactions in mobile environments and to enable the user to 'make use' of any user-interface device that is available (and allowed), thus supporting the individual's changing communication environment. The principles and basic functionality of an adaptive multimodal human interface-device binding engine are outlined.
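The dynamic binding and release of user-interface devices described in the abstract can be sketched roughly as follows. This is a minimal illustration only: the class names, method names, and modality labels are hypothetical and are not taken from the authors' actual binding engine.

```python
from dataclasses import dataclass


@dataclass
class UIDevice:
    """A user-interface device advertising the modalities it supports."""
    name: str
    modalities: set        # e.g. {"visual-out"} or {"text-in", "audio-in"}
    allowed: bool = True   # policy flag: is the user permitted to use it?


class BindingEngine:
    """Hypothetical sketch of an adaptive device-binding engine: devices
    appear and disappear as the user's environment changes, and the engine
    binds each modality to an available, allowed device, releasing and
    rebinding when a bound device vanishes."""

    def __init__(self):
        self.available = {}   # device name -> UIDevice
        self.bindings = {}    # modality -> bound device name

    def device_appears(self, device):
        """A device becomes reachable (e.g. the user walks into a room)."""
        self.available[device.name] = device
        self._rebind()

    def device_disappears(self, name):
        """A device leaves; release any bindings that pointed at it."""
        self.available.pop(name, None)
        self.bindings = {m: d for m, d in self.bindings.items() if d != name}
        self._rebind()

    def _rebind(self):
        """Bind every currently unserved modality to an allowed device."""
        for device in self.available.values():
            if not device.allowed:
                continue
            for modality in device.modalities:
                # keep an existing binding; only fill unserved modalities
                self.bindings.setdefault(modality, device.name)
```

A usage sketch: if a phone screen is bound for visual output and the user moves away from it, the engine releases that binding and rebinds the modality to another available display, which matches the paper's goal of letting the user exploit any permitted device in a changing environment.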