{"title":"Augmenting the input space of portable displays using add-on hall-sensor grid","authors":"Rong-Hao Liang","doi":"10.1145/2508468.2508470","DOIUrl":"https://doi.org/10.1145/2508468.2508470","url":null,"abstract":"Since handheld and wearable displays are highly mobile, they enable various applications that enrich our daily life. In addition to displaying high-fidelity information, these devices also support natural and effective user interaction by exploiting various embedded sensors. Nonetheless, the set of built-in sensors has limitations, so add-on sensor technologies are needed. This work exploits magnetism as an additional channel of user input. The author first explains the reasons for developing the add-on magnetic-field sensing technology based on neodymium magnets and an analog Hall-sensor grid. The augmented input space is then showcased through two instances. 1) For handheld displays, the sensor extends object tracking to the near-surface 3D space when simply attached to the back of the device. 2) For wearable displays, the sensor enables private, rich-haptic 2D input when worn on the user's fingernail. Limitations and possible research directions of this approach are highlighted at the end of the paper.","PeriodicalId":196872,"journal":{"name":"Adjunct Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133650207","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Ambient surface: enhancing interface capabilities of mobile objects aided by ambient environment","authors":"Taik-Heon Rhee, Minkyu Jung, Sungwook Baek, Hyun-Jin Kim, Sungbin Kuk, Seonghoon Kang, Hark-Joon Kim","doi":"10.1145/2508468.2516910","DOIUrl":"https://doi.org/10.1145/2508468.2516910","url":null,"abstract":"We introduce Ambient Surface, an interactive ambient system for enhancing the interface capabilities of mobile devices placed on an ordinary surface. Object information and the user's interactions are captured by 2D/3D cameras, and appropriate feedback images are projected onto the surface. With the help of the ambient system, we can not only provide a wider screen for mobile devices with limited screen sizes, but also allow analog objects to interact dynamically with users. We believe that this demo will help interaction designers draw new inspiration for utilizing mobile objects together with the ambient environment.","PeriodicalId":196872,"journal":{"name":"Adjunct Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130156220","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"User created tangible controls using ForceForm: a dynamically deformable interactive surface","authors":"Jessica Tsimeris, D. Stevenson, Matt Adcock, Tom Gedeon, Michael Broughton","doi":"10.1145/2508468.2514727","DOIUrl":"https://doi.org/10.1145/2508468.2514727","url":null,"abstract":"Touch surfaces are common devices, but they are often uniformly flat and provide little flexibility beyond changing the visual information communicated to the user via software. Furthermore, controls for interaction are not tangible and are usually specified and placed by the user interface designer. Using ForceForm, a dynamically deformable interactive surface, the user is able to directly sculpt the surface to create tangible controls with force-feedback properties. These controls can be made according to the user's specifications and can then be relinquished when no longer needed. We describe this method of interaction, provide an implementation of a slider, and present ideas for further controls.","PeriodicalId":196872,"journal":{"name":"Adjunct Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114301833","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"BoardLab: PCB as an interface to EDA software","authors":"Pragun Goyal, Harshit Agrawal, J. Paradiso, P. Maes","doi":"10.1145/2508468.2514936","DOIUrl":"https://doi.org/10.1145/2508468.2514936","url":null,"abstract":"The tools used to work with Printed Circuit Boards (PCBs), for example soldering irons, multimeters, and oscilloscopes, involve working directly with the board and its components. However, the Electronic Design Automation (EDA) software used to query a PCB's design data requires a keyboard and a mouse. These different interfaces make it difficult to connect both kinds of operations in a workflow. Further, the measurements made by tools like a multimeter have to be manually interpreted in the context of the board schematics. We propose to reduce the cognitive load of this disconnect by introducing a handheld probe that allows direct interaction with the PCB for just-in-time information on board schematics, component datasheets, and source code. The probe also doubles as a voltmeter and annotates the board schematics with voltage measurements.","PeriodicalId":196872,"journal":{"name":"Adjunct Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114478230","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Obake: interactions on a 2.5D elastic display","authors":"Dhairya Dand, R. Hemsley","doi":"10.1145/2508468.2514734","DOIUrl":"https://doi.org/10.1145/2508468.2514734","url":null,"abstract":"In this poster we present an interaction language for the manipulation of an elastic deformable 2.5D display. We discuss a range of gestures to interact and directly deform the surface. To demonstrate these affordances and the associated interactions, we present a scenario of a topographic data viewer using this prototype system.","PeriodicalId":196872,"journal":{"name":"Adjunct Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121694894","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The nudging technique: input method without fine-grained pointing by pushing a segment","authors":"Shota Yamanaka, Homei Miyashita","doi":"10.1145/2508468.2514927","DOIUrl":"https://doi.org/10.1145/2508468.2514927","url":null,"abstract":"The Nudging Technique is a new manipulation paradigm for GUIs. With traditional techniques, the user sometimes has to perform a fine-grained operation (e.g., pointing at the edge of a window to resize it). When the user makes a mistake in pointing, problems such as accidentally switching the foreground window may arise. The nudging technique relieves the user from fine pointing before dragging; the user just moves the cursor to a target and then pushes it. Visual and acoustic feedback also helps the user's operation. We describe two application examples: window resizing and spreadsheet cell resizing systems.","PeriodicalId":196872,"journal":{"name":"Adjunct Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121773897","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"eyeCan: affordable and versatile gaze interaction","authors":"Sang-won Leigh","doi":"10.1145/2508468.2514719","DOIUrl":"https://doi.org/10.1145/2508468.2514719","url":null,"abstract":"We present eyeCan, a software system that enables rich, sophisticated, and still usable gaze interaction with low-cost gaze tracking setups. This practical system was created to drastically lower the hurdle of gaze interaction by offering easy-to-use gaze gestures and by reducing the cost of entry through the use of low-precision gaze trackers. Our system effectively compensates for noise from tracking sensors and involuntary eye movements, boosting both precision and speed in cursor control. We also explored and defined a variety of possible gaze gestures. By combining eyelid actions and gaze-direction cues, our system provides a rich set of gaze events and therefore enables sophisticated applications, e.g., playing video games or navigating street view.","PeriodicalId":196872,"journal":{"name":"Adjunct Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125565638","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"DDMixer2.5D: drag and drop to mix 2.5D video objects","authors":"Tatsuya Kurihara, Makoto Okabe, R. Onai","doi":"10.1145/2508468.2514714","DOIUrl":"https://doi.org/10.1145/2508468.2514714","url":null,"abstract":"We propose a 2.5D video editing system called DDMixer2.5D. 2.5D video contains not only color channels but also a depth channel, which can be recorded easily using recently available depth sensors such as the Microsoft Kinect. Our system employs this depth channel to allow a user to quickly and easily edit video objects using simple drag-and-drop gestures. For example, a user can copy a video object of a dancing figure from one video to another simply by dragging and dropping with a finger on the touch screen of a mobile phone. In addition, the user can drag to adjust the 3D position in the new video so that contact between the foot and the floor is preserved, and the size of the body is automatically adjusted according to the depth. DDMixer2.5D also offers other functions required for practical use, including object removal, editing the 3D camera path, and creating anaglyph 3D video, as well as a timeline interface.","PeriodicalId":196872,"journal":{"name":"Adjunct Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117130485","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Haptic props: semi-actuated tangible props for haptic interaction on the surface","authors":"Dimitar Valkov, Andreas Mantler, K. Hinrichs","doi":"10.1145/2508468.2514736","DOIUrl":"https://doi.org/10.1145/2508468.2514736","url":null,"abstract":"While multiple methods to extend the expressiveness of tangible interaction have been proposed, e.g., self-motion, stacking, and transparency, providing haptic feedback through the tangible prop itself has rarely been considered. In this poster we present a semi-actuated, nano-powered tangible prop that can provide programmable friction for interaction with a tabletop setup. We conducted a preliminary user study evaluating users' acceptance of the device and their ability to detect changes in the programmed level of friction, and obtained promising results.","PeriodicalId":196872,"journal":{"name":"Adjunct Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121821379","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"PUCs: detecting transparent, passive untouched capacitive widgets on unmodified multi-touch displays","authors":"Simon Voelker, Kosuke Nakajima, Christian Thoresen, Yuichi Itoh, Kjell Ivar Øvergård, Jan O. Borchers","doi":"10.1145/2508468.2514926","DOIUrl":"https://doi.org/10.1145/2508468.2514926","url":null,"abstract":"Capacitive multi-touch displays are not typically designed to detect passive objects placed on them. In fact, these systems usually contain filters to actively reject such input data. We present a technical analysis of this problem and introduce Passive Untouched Capacitive Widgets (PUCs). Unlike previous approaches, PUCs do not require power, can be made entirely transparent, and do not require internal electrical or software modifications. Most importantly, they are detected reliably even when no user is touching them.","PeriodicalId":196872,"journal":{"name":"Adjunct Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2013-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128315730","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}