{"title":"Direct and indirect multi-touch interaction on a wall display","authors":"Jérémie Gilliot, Géry Casiez, Nicolas Roussel","doi":"10.1145/2670444.2670445","DOIUrl":"https://doi.org/10.1145/2670444.2670445","url":null,"abstract":"Multi-touch wall displays allow users to take advantage of co-located (direct) interaction on very large surfaces. However, interacting with content beyond arm's reach requires body movements, introducing fatigue and impacting performance. Interacting with distant content using a pointer can alleviate these problems, but introduces legibility issues and loses the benefits of multi-touch interaction. We introduce WallPad, a widget designed to quickly access remote content on wall displays while addressing legibility issues and supporting direct multi-touch interaction. After briefly describing how we supported multi-touch interaction on a wall display, we present the WallPad widget and explain how it supports direct, indirect and de-localized direct interaction.","PeriodicalId":131420,"journal":{"name":"Interaction Homme-Machine","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132796400","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"SuperVision: spatial control of connected objects in smart-home","authors":"Sarthak Ghosh, G. Bailly, Robin Despouys, É. Lecolinet, R. Sharrock","doi":"10.1145/2670444.2670471","DOIUrl":"https://doi.org/10.1145/2670444.2670471","url":null,"abstract":"In this paper, we propose SuperVision, a novel interaction technique for controlling distant connected objects in smart homes. Users point at an object with their remote control to visualize its state and select its functionalities. To achieve this goal, 1) we present a novel remote control augmented with a video projector and a slider; 2) we introduce a visualization allowing users to see through walls in order to control objects in the line of sight as well as objects in other rooms; 3) we describe applications relying on this interaction technique.","PeriodicalId":131420,"journal":{"name":"Interaction Homme-Machine","volume":"95 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130723948","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A fault-tolerant architecture for resilient interactive systems","authors":"Camille Fayollas, Philippe A. Palanque, J. Fabre, D. Navarre, Eric Barboni, Martin Cronel, Y. Déléris","doi":"10.1145/2670444.2670462","DOIUrl":"https://doi.org/10.1145/2670444.2670462","url":null,"abstract":"Research contributions to improving interactive systems reliability have, so far, mainly focused on preventing fault occurrence by removing software bugs at development time. However, interactive systems complexity is so high that whatever efforts are deployed at development time, faults and failures occur at operation time. Root causes of such failures may be transient hardware faults or (when systems are used in the high atmosphere) so-called \"natural faults\" triggered by alpha particles in processors or neutrons from cosmic radiation. This paper proposes an exhaustive identification of the faults to be handled in order to improve interactive systems reliability. As no research has so far been carried out in the field of interactive systems to detect and remove natural faults, this paper proposes a software architecture providing fault-tolerance mechanisms dedicated to interactive systems. More precisely, the paper describes how such an architecture addresses the various components of interactive applications, namely widgets, user applications and the window manager. These concepts are demonstrated through a case study from the domain of interactive cockpits of large civil aircraft.","PeriodicalId":131420,"journal":{"name":"Interaction Homme-Machine","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116170608","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An anthropomorphic lamp for the communication of emotions","authors":"Leonardo Angelini, M. Caon, D. Lalanne, Omar Abou Khaled, E. Mugellini","doi":"10.1145/2670444.2670472","DOIUrl":"https://doi.org/10.1145/2670444.2670472","url":null,"abstract":"This article presents the design of a lamp able to represent and collect users' emotional states through multimodal interaction, based on tangible gestures on the user's side and on colors and facial expressions on the lamp's side. In particular, the lamp benefits from an anthropomorphic form and behavior that make the interaction more natural. Two application scenarios are presented, as well as the implementation details of one of them.","PeriodicalId":131420,"journal":{"name":"Interaction Homme-Machine","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124820773","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A system for user task monitoring and assistance in ambient intelligent settings","authors":"Asma Gharsellaoui, Y. Bellik, Christophe Jacquet","doi":"10.1145/2670444.2670451","DOIUrl":"https://doi.org/10.1145/2670444.2670451","url":null,"abstract":"Existing task models are generally static (not used at runtime) and are used for the design or predictive evaluation of interactive systems. We propose to use the task model at runtime, in order to monitor user actions, check that users have not made any mistakes and give help when needed. We present a task model suitable for ambient environments that dynamically assigns states to tasks at runtime. We also describe a monitoring and assistance system that uses our dynamic task model. Finally, we present a validation of our system through a simulation that shows how interactions with the task model at runtime result in a dynamic system capable of providing assistance to users while they carry out their daily tasks.","PeriodicalId":131420,"journal":{"name":"Interaction Homme-Machine","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121333379","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Gaze-based interaction: evaluation of progressive feedback","authors":"V. Nguyen, F. Jambon, Gaëlle Calvary","doi":"10.1145/2670444.2670463","DOIUrl":"https://doi.org/10.1145/2670444.2670463","url":null,"abstract":"In monomodal approaches, eye tracking for gaze-based interaction suffers from a tight coupling between perception and action: distinguishing user action from user perception of information is almost impossible. This paper proposes the concept of progressive feedback to release this coupling. First experiments confirm that gaze-based interaction can be credible in some contexts of use. Moreover, progressive feedback appears to be potentially valuable.","PeriodicalId":131420,"journal":{"name":"Interaction Homme-Machine","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132796847","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Adaptive hand-tracked system for 3D authoring","authors":"A. Héloir, Fabrizio Nunnari, C. Kolski","doi":"10.1145/2670444.2670456","DOIUrl":"https://doi.org/10.1145/2670444.2670456","url":null,"abstract":"We present the interaction design and the component architecture of an adaptive authoring system based on a consumer-range 3D input device. We claim that this system can help both novice and experienced users perform authoring tasks in a 3D authoring environment. The system uses a keyboardless, self-adaptive interaction controller built upon a rule-based system that learns and infers the user's behavior/condition on the fly according to her actions, rearranging rules when necessary and suggesting breaks to avoid performance drops caused by fatigue or the so-called gorilla-arm effect.","PeriodicalId":131420,"journal":{"name":"Interaction Homme-Machine","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122709017","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Predictive usability evaluation: aligning HCI and software engineering practices","authors":"","doi":"10.1145/2670444.2670467","DOIUrl":"https://doi.org/10.1145/2670444.2670467","url":null,"abstract":"Can we - software developers, usability experts, user interface designers - predict usability from early user interface (UI) design artifacts and models? Can we define predictive measures to evaluate usability without a concrete UI? These questions seemed natural to us, since UI modeling (task, user, concepts, etc.) has been largely explored in recent years for the automatic generation of final UIs. To answer those questions we propose a model-based predictive usability evaluation approach that uses a set of usability measures. These measures are the essence of a framework we are developing for usability prediction. Initial empirical studies were performed to support this approach. This paper presents the fundamental basis on top of which we have developed this approach.","PeriodicalId":131420,"journal":{"name":"Interaction Homme-Machine","volume":"130 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121499487","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Gesture-based interaction for Strip'TIC, a tangible space for air traffic controllers","authors":"Y. Gauthier, Joran Marcy, David Duprat, Alexis Paoleschi, C. Letondal, Rémi Lesbordes, Jean-Luc Vinot, Christophe Hurter","doi":"10.1145/2670444.2670457","DOIUrl":"https://doi.org/10.1145/2670444.2670457","url":null,"abstract":"In this paper, we explore gesture-based interactions in a mixed interactive system for air traffic controllers. This exploration builds on an analysis of controller gestures that we observed in a control tower and in a simulation centre. In our design, we focus on gesture-based interaction for the virtual objects associated with the physical objects.","PeriodicalId":131420,"journal":{"name":"Interaction Homme-Machine","volume":"536 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133389772","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Hybrid BCI for palliation of severe motor disability","authors":"Alban Duprès, J. Rouillard, F. Cabestaing","doi":"10.1145/2670444.2670466","DOIUrl":"https://doi.org/10.1145/2670444.2670466","url":null,"abstract":"This article presents work in progress on a hybrid brain-computer interface (hBCI). Our goal is the palliation of severe motor disability for patients suffering from Duchenne muscular dystrophy (DMD). An hBCI adds several control channels to a pure brain-computer interface (BCI). In our case, we associate a motor imagery based BCI with an EMG (electromyography) control channel and distal movement sensors. The idea is to detect a finger movement at three levels of the motor command: cerebral, motor and distal. Data from these levels are merged taking into account the state of the patient, in order to adapt the system to the changing nature of his/her disease and to his/her high fatigability during the day.","PeriodicalId":131420,"journal":{"name":"Interaction Homme-Machine","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126064938","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}