{"title":"Fusion in multimodal interactive systems: an HMM-based algorithm for user-induced adaptation","authors":"Bruno Dumas, B. Signer, D. Lalanne","doi":"10.1145/2305484.2305490","DOIUrl":"https://doi.org/10.1145/2305484.2305490","url":null,"abstract":"Multimodal interfaces have shown to be ideal candidates for interactive systems that adapt to a user either automatically or based on user-defined rules. However, user-based adaptation demands for the corresponding advanced software architectures and algorithms. We present a novel multimodal fusion algorithm for the development of adaptive interactive systems which is based on hidden Markov models (HMM). In order to select relevant modalities at the semantic level, the algorithm is linked to temporal relationship properties. The presented algorithm has been evaluated in three use cases from which we were able to identify the main challenges involved in developing adaptive multimodal interfaces.","PeriodicalId":163033,"journal":{"name":"Engineering Interactive Computing System","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134225203","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Using ontologies to reason about the usability of interactive medical devices in multiple situations of use","authors":"Judy Bowen, A. Hinze","doi":"10.1145/2305484.2305525","DOIUrl":"https://doi.org/10.1145/2305484.2305525","url":null,"abstract":"Formally modelling interactive software systems and devices allows us to prove properties of correctness about such devices, and thus ensure effectiveness of their use. It also enables us to consider interaction properties such as usability and consistency between the interface and system functionality. Interactive modal devices, that have a fixed interface but whose behaviour is dependent on the mode of the device, can be similarly modelled. Such devices always behave in the same way (i.e. have the same functionality and interaction possibilities) irrespective of how, or where, they are used. However, a user's interaction with such devices may vary according to the physical location or environment in which they are situated (we refer to this as a system's context and usage situation). In this paper we look at a particular example of a safety-critical system, that of a modal interactive medical syringe pump, which is used in multiple situations. We consider how ontologies can be used to reason about the effects of different situations on the use of such devices.","PeriodicalId":163033,"journal":{"name":"Engineering Interactive Computing System","volume":"119 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133979059","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Distributed interaction","authors":"J. Bardram","doi":"10.1145/2305484.2305487","DOIUrl":"https://doi.org/10.1145/2305484.2305487","url":null,"abstract":"The personal computer as used by most people still to a large degree follows an interaction and technological design dating back to Allan Kay's Dynabook and the Xerox Star. This implies that interaction is confined to a single device with a single keyboard/mouse/display hardware configuration sitting on a desk, and personal rather than collaborative work is in focus.\u0000 The challenges of \"moving the computer beyond the desktop\" are being addressed within different research fields. For example, Ubiquitous Computing (Ubicomp) investigates how computing can be embedded in everyday life; Computer Supported Cooperative Work (CSCW) researches collaborative interaction; and many researchers in the CHI and EICS community explores basic infrastructure and technologies for handling multiple devices and displays in e.g. smart room setups.\u0000 In this talk, I will present our approach to these challenges. Specifically, I will introduce the term of \"distributed interaction,\" which is a research agenda focusing on researching theory, conceptual frameworks, interaction design, user interfaces, and infrastructure that allow interaction with computers to be distributed along three dimension: Devices -- computers should not be viewed as single device but as (inter)networked devices. Hence, interaction is not confined to one device, but should encompass multiple devices. Space -- computers are distributed in space and time, and are not confined to one setting. This includes mobility, but more importantly that devices are to be found in all sorts of odd settings where they need to adapt to, and collaborate with, their surroundings, including other devices, people, interaction devices, etc. People -- computers are to a large degree the primary way of collaboration in distributed organizations. Hence, a lot has changed since the personal computer was designed for small office collaboration and there is a need for incorporating support for global interaction as a fundamental mechanism in the computing platforms.\u0000 I will present our current approach for supporting distributed interaction called \"activity-based computing\" (ABC). Based on a strong theoretical foothold in Activity Theory, ABC provides a conceptual framework, interaction design, user interface, and a distributed programming and runtime infrastructure for distributed interaction. I will present ABC and show how it has been applied in building support for clinical work in hospitals and for smart space technology.","PeriodicalId":163033,"journal":{"name":"Engineering Interactive Computing System","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131705289","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"What can model-based UI design offer to end-user software engineering?","authors":"Anke Dittmar, Alfonso García Frey, Sophie Dupuy-Chessa","doi":"10.1145/2305484.2305515","DOIUrl":"https://doi.org/10.1145/2305484.2305515","url":null,"abstract":"End-User Programming enables end users to create their own programs. This can be accomplished in different ways, where one of them is by appropriation or reconfiguration of existing software. However, there is a trade-off between end users' 'situated design' and quality design which is addressed in End-User Software Engineering. This paper investigates how methods and techniques from Model-Based UI Design can contribute to End-User Software Engineering. Applying the concept of Extra-UI, the paper describes a Model-Based approach that allows to extend core applications in a way that some of the underlying models and assumptions become manipulable by end users. The approach is discussed through a running example.","PeriodicalId":163033,"journal":{"name":"Engineering Interactive Computing System","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125667787","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A logical framework for multi-device user interfaces","authors":"F. Paternò, C. Santoro","doi":"10.1145/2305484.2305494","DOIUrl":"https://doi.org/10.1145/2305484.2305494","url":null,"abstract":"In this paper, we present a framework for describing various design dimensions that can help in better understanding the features provided by tools and applications for multi-device environments. We indicate the possible options for each dimension, and also discuss how various research proposals in the area are located in our framework. The final discussion also points out important areas for future research.","PeriodicalId":163033,"journal":{"name":"Engineering Interactive Computing System","volume":"49 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125270797","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"jQMultiTouch: lightweight toolkit and development framework for multi-touch/multi-device web interfaces","authors":"Michael Nebeling, M. Norrie","doi":"10.1145/2305484.2305497","DOIUrl":"https://doi.org/10.1145/2305484.2305497","url":null,"abstract":"Application developers currently have to deal with the increased proliferation of new touch devices and the diversity in terms of both the native platform support for common gesture-based interactions and touch input sensing and processing techniques, in particular, for custom multi-touch behaviours. This paper presents jQMultiTouch - a lightweight web toolkit and development framework for multi-touch interfaces that can run on many different devices and platforms. jQMultiTouch is inspired from the popular jQuery toolkit for implementing interfaces in a device-independent way based on client-side web technologies. Similar to jQuery, the framework resolves cross-browser compatibility issues and implementation differences between device platforms by providing a uniform method for the specification of multi-touch interface elements and associated behaviours that seamlessly translate to browser-specific code. At the core of jQMultiTouch is a novel input stream query language for filtering and processing touch event data based on an extensible set of match predicates and aggregate functions. We demonstrate design simplicity for developers along several example applications and discuss performance, scalability and portability of the framework.","PeriodicalId":163033,"journal":{"name":"Engineering Interactive Computing System","volume":"99 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125553621","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Specifying and running rich graphical components with Loa","authors":"Olivier Beaudoux, Mickael Clavreul, Arnaud Blouin, Mengqian Yang, Olivier Barais, J. Jézéquel","doi":"10.1145/2305484.2305513","DOIUrl":"https://doi.org/10.1145/2305484.2305513","url":null,"abstract":"Interactive system designs often require the use of rich graphical components whose capabilities go beyond the set of widgets provided by GUI toolkits. The implementation of such rich graphical components require a high programming effort that GUI toolkits do not alleviate. In this paper, we propose the Loa framework that allows both the specification of rich graphical components and their integration within running interactive applications. We illustrate the specification and integration with the Loa framework as part of a global process for the design of interactive systems.","PeriodicalId":163033,"journal":{"name":"Engineering Interactive Computing System","volume":"115 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115252343","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"GRIP: get better results from interactive prototypes","authors":"J. V. D. Bergh, D. Sahni, Mieke Haesen, K. Luyten, K. Coninx","doi":"10.1145/1996461.1996508","DOIUrl":"https://doi.org/10.1145/1996461.1996508","url":null,"abstract":"Prototypes are often used to clarify and evaluate design alternatives for a graphical user interface. They help stakeholders to decide on different aspects by making them visible and concrete. This is a highly iterative process in which the prototypes evolve into a design artifact that is close enough to the envisioned result to be implemented. People with different roles are involved in prototyping. Our claim is that integrated or inter-operable tools help design information propagate among people while prototyping and making the transition more accurately into the software development phase.\u0000 We make a first step towards such a solution by offering a framework, GRIP, in which such a tool should fit. We conducted a preliminary evaluation of the framework by using it to classify existing tools for prototyping and implementing a limited prototyping tool, GRIP-it, which can be integrated into the overall process.","PeriodicalId":163033,"journal":{"name":"Engineering Interactive Computing System","volume":"337 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-06-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123417113","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"PageSpark: an E-magazine reader with enhanced reading experiences on handheld devices","authors":"Jiajian Chen, Jun Xiao, Jian Fan, Eamonn O'Brien-Strain","doi":"10.1145/1996461.1996510","DOIUrl":"https://doi.org/10.1145/1996461.1996510","url":null,"abstract":"In this paper we present PageSpark, a system that automatically converts static magazine content to interactive and engaging reading apps on handheld reading devices. PageSpark enhances the reading experience in three general aspects: page layout reorganization, page element interactions and page transitions. We explored and implemented several design variations in each aspect with the prototype running on the iPad. Participants from our initial user study showed strong interest of using PageSpark over existing magazine reading apps.","PeriodicalId":163033,"journal":{"name":"Engineering Interactive Computing System","volume":"53 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-06-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126826386","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Second workshop on engineering patterns for multi-touch interfaces","authors":"K. Luyten, D. Vanacken, M. Weiss, Jan O. Borchers, Miguel A. Nacenta","doi":"10.1145/1996461.1996553","DOIUrl":"https://doi.org/10.1145/1996461.1996553","url":null,"abstract":"Multi-touch gained a lot of interest in the last couple of years and the increased availability of multi-touch enabled hardware boosted its development. However, the current diversity of hardware, toolkits, and tools for creating multi-touch interfaces has its downsides: there is only little reusable material and no generally accepted body of knowledge when it comes to the development of multi-touch interfaces. This workshop is the second workshop on this topic and the workshop goal remains unchanged: to seek a consensus on methods, approaches, toolkits, and tools that aid in the engineering of multi-touch interfaces and transcend the differences in available platforms. The patterns mentioned in the title indicate that we are aiming to create a reusable body of knowledge.","PeriodicalId":163033,"journal":{"name":"Engineering Interactive Computing System","volume":"39 10","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-06-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131892829","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}