{"title":"Session details: Session 4B: Human-Robot Interaction","authors":"Walter S. Lasecki","doi":"10.1145/3368376","DOIUrl":"https://doi.org/10.1145/3368376","url":null,"abstract":"","PeriodicalId":431403,"journal":{"name":"Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123716705","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Accurate and Low-Latency Sensing of Touch Contact on Any Surface with Finger-Worn IMU Sensor","authors":"Yizheng Gu, Chun Yu, Zhipeng Li, Weiqi Li, Shuchang Xu, Xiaoying Wei, Yuanchun Shi","doi":"10.1145/3332165.3347947","DOIUrl":"https://doi.org/10.1145/3332165.3347947","url":null,"abstract":"Head-mounted Mixed Reality (MR) systems enable touch interaction on any physical surface. However, optical methods (i.e., with cameras on the headset) have difficulty in determining the touch contact accurately. We show that a finger ring with Inertial Measurement Unit (IMU) can substantially improve the accuracy of contact sensing from 84.74% to 98.61% (f1 score), with a low latency of 10 ms. We tested different ring wearing positions and tapping postures (e.g., with different fingers and parts). Results show that an IMU-based ring worn on the proximal phalanx of the index finger can accurately sense touch contact of most usable tapping postures. Participants preferred wearing a ring for better user experience. Our approach can be used in combination with the optical touch sensing to provide robust and low-latency contact detection.","PeriodicalId":431403,"journal":{"name":"Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129808290","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Morphlour: Personalized Flour-based Morphing Food Induced by Dehydration or Hydration Method","authors":"Ye Tao, Youngwook Do, Humphrey Yang, Yi-Chin Lee, Guanyun Wang, Catherine Mondoa, Jianxun Cui, Wen Wang, Lining Yao","doi":"10.1145/3332165.3347949","DOIUrl":"https://doi.org/10.1145/3332165.3347949","url":null,"abstract":"In this paper, we explore personalized morphing food that enhances traditional food with new HCI capabilities, rather than replacing the chef and authentic ingredients (e.g. flour) with an autonomous machine and heterogeneous mixtures (e.g. gel). Thus, we contribute a unique transformation mechanism of kneaded and sheeted flour-based dough, with an integrated design strategy for morphing food during two general cooking methods: dehydration (e.g. baking) or hydration (e.g. water boiling). We also enrich the design space of morphing food by demonstrating several applications. We end by discussing hybrid cooking between human and a design tool that we developed to ensure accuracy while preserving customizability for morphing food.","PeriodicalId":431403,"journal":{"name":"Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123906994","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Soft Inkjet Circuits: Rapid Multi-Material Fabrication of Soft Circuits using a Commodity Inkjet Printer","authors":"Arshad Khan, J. Roo, T. Kraus, Jürgen Steimle","doi":"10.1145/3332165.3347892","DOIUrl":"https://doi.org/10.1145/3332165.3347892","url":null,"abstract":"Despite the increasing popularity of soft interactive devices, their fabrication remains complex and time consuming. We contribute a process for rapid do-it-yourself fabrication of soft circuits using a conventional desktop inkjet printer. It supports inkjet printing of circuits that are stretchable, ultrathin, high resolution, and integrated with a wide variety of materials used for prototyping. We introduce multi-ink functional printing on a desktop printer for realizing multi-material devices, including conductive and isolating inks. We further present DIY techniques to enhance compatibility between inks and substrates and the circuits' elasticity. This enables circuits on a wide set of materials including temporary tattoo paper, textiles, and thermoplastic. Four application cases demonstrate versatile uses for realizing stretchable devices, e-textiles, body-based and re-shapeable interfaces.","PeriodicalId":431403,"journal":{"name":"Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121264409","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"TipText: Eyes-Free Text Entry on a Fingertip Keyboard","authors":"Zheer Xu, Pui Chung Wong, Jun Gong, Te-Yen Wu, A. Nittala, Xiaojun Bi, Jürgen Steimle, Hongbo Fu, Kening Zhu, Xing-Dong Yang","doi":"10.1145/3332165.3347865","DOIUrl":"https://doi.org/10.1145/3332165.3347865","url":null,"abstract":"In this paper, we propose and investigate a new text entry technique using micro thumb-tip gestures. Our technique features a miniature QWERTY keyboard residing invisibly on the first segment of the user's index finger. Text entry can be carried out using the thumb-tip to tap the tip of the index finger. The keyboard layout was optimized for eyes-free input by utilizing a spatial model reflecting the users' natural spatial awareness of key locations on the index finger. We present our approach of designing and optimizing the keyboard layout through a series of user studies and computer simulated text entry tests over 1,146,484 possibilities in the design space. The outcome is a 2×3 grid with the letters highly confining to the alphabetic and spatial arrangement of QWERTY. Our user evaluation showed that participants achieved an average text entry speed of 11.9 WPM and were able to type as fast as 13.3 WPM towards the end of the experiment.","PeriodicalId":431403,"journal":{"name":"Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology","volume":"79 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114312254","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Session details: Session 7A: Text","authors":"Parmit K. Chilana","doi":"10.1145/3368381","DOIUrl":"https://doi.org/10.1145/3368381","url":null,"abstract":"","PeriodicalId":431403,"journal":{"name":"Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology","volume":"50 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121625256","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Robiot: A Design Tool for Actuating Everyday Objects with Automatically Generated 3D Printable Mechanisms","authors":"Jiahao Li, Jeeeun Kim, Xiang 'Anthony' Chen","doi":"10.1145/3332165.3347894","DOIUrl":"https://doi.org/10.1145/3332165.3347894","url":null,"abstract":"Users can now easily communicate digital information with an Internet of Things; in contrast, there remains a lack of support to automate physical tasks that involve legacy static objects, e.g. adjusting a desk lamp's angle for optimal brightness, turning on/off a manual faucet when washing dishes, sliding a window to maintain a preferred indoor temperature. Automating these simple physical tasks has the potential to improve people's quality of life, which is particularly important for people with a disability or in situational impairment. We present Robiot -- a design tool for generating mechanisms that can be attached to, motorized, and actuating legacy static objects to perform simple physical tasks. Users only need to take a short video manipulating an object to demonstrate an intended physical behavior. Robiot then extracts requisite parameters and automatically generates 3D models of the enabling actuation mechanisms by performing a scene and motion analysis of the 2D video in alignment with the object's 3D model. In an hour-long design session, six participants used Robiot to actuate seven everyday objects, imbuing them with the robotic capability to automate various physical tasks.","PeriodicalId":431403,"journal":{"name":"Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology","volume":"261 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124413969","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"StackMold: Rapid Prototyping of Functional Multi-Material Objects with Selective Levels of Surface Details","authors":"Tom Valkeneers, D. Leen, Daniel Ashbrook, Raf Ramakers","doi":"10.1145/3332165.3347915","DOIUrl":"https://doi.org/10.1145/3332165.3347915","url":null,"abstract":"We present StackMold, a DIY molding technique to prototype multi-material and multi-colored objects with embedded electronics. The key concept of our approach is a novel multi-stage mold buildup in which casting operations are interleaved with the assembly of the mold to form independent compartments for casting different materials. To build multi-stage molds, we contribute novel algorithms that computationally design and optimize the mold and casting procedure. By default, the multi-stage mold is fabricated in slices using a laser cutter. For regions that require more surface detail, a high-fidelity 3D-printed mold subsection can be incorporated. StackMold is an integrated end-to-end system, supporting all stages of the process: it provides a UI to specify material and detail regions of a 3D object; it generates fabrication files for the molds; and it produces a step-by-step casting instruction manual.","PeriodicalId":431403,"journal":{"name":"Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114839610","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"View-Dependent Video Textures for 360° Video","authors":"Sean J. Liu, Maneesh Agrawala, S. DiVerdi, Aaron Hertzmann","doi":"10.1145/3332165.3347887","DOIUrl":"https://doi.org/10.1145/3332165.3347887","url":null,"abstract":"A major concern for filmmakers creating 360° video is ensuring that the viewer does not miss important narrative elements because they are looking in the wrong direction. This paper introduces gated clips which do not play the video past a gate time until a filmmaker-defined viewer gaze condition is met, such as looking at a specific region of interest (ROI). Until the condition is met, we seamlessly loop video playback using view-dependent video textures, a new variant of standard video textures that adapt the looping behavior to the portion of the scene that is within the viewer's field of view. We use our desktop GUI to edit live action and computer animated 360° videos. In a user study with casual viewers, participants prefer our looping videos over the standard versions and are able to successfully see all of the looping videos' ROIs without fear of missing important narrative content.","PeriodicalId":431403,"journal":{"name":"Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115122841","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Tactlets","authors":"Daniel Groeger, Martin Feick, A. Withana, Jürgen Steimle","doi":"10.1145/3332165.3347937","DOIUrl":"https://doi.org/10.1145/3332165.3347937","url":null,"abstract":"Rapid prototyping of haptic output on 3D objects promises to enable a more widespread use of the tactile channel for ubiquitous, tangible, and wearable computing. Existing prototyping approaches, however, have limited tactile output capabilities, require advanced skills for design and fabrication, or are incompatible with curved object geometries. In this paper, we present a novel digital fabrication approach for printing custom, high-resolution controls for electro-tactile output with integrated touch sensing on interactive objects. It supports curved geometries of everyday objects. We contribute a design tool for modeling, testing, and refining tactile input and output at a high level of abstraction, based on parameterized electro-tactile controls. We further contribute an inventory of 10 parametric Tactlet controls that integrate sensing of user input with real-time electro-tactile feedback. We present two approaches for printing Tactlets on 3D objects, using conductive inkjet printing or FDM 3D printing. Empirical results from a psychophysical study and findings from two practical application cases confirm the functionality and practical feasibility of the Tactlets approach.","PeriodicalId":431403,"journal":{"name":"Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology","volume":"2014 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114646668","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}