Unconventional Uses of Structural Compliance in Adaptive Hands
Che-Ming Chang, Lucas Gerez, Nathan Elangovan, Agisilaos G. Zisimatos, Minas Liarokapis
2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), October 2019. DOI: 10.1109/RO-MAN46459.2019.8956340

Abstract: Adaptive robot hands are typically created by introducing structural compliance either in their joints (e.g., flexure joints) or in their finger-pads. In this paper, we present a series of alternative uses of structural compliance for the development of simple, adaptive, compliant and/or underactuated robot grippers and hands that can efficiently and robustly execute a variety of grasping and dexterous, in-hand manipulation tasks. The proposed designs use only one actuator per finger to control multiple degrees of freedom, and they retain the superior grasping capabilities of adaptive grasping mechanisms even under significant object pose or other environmental uncertainties. More specifically, we introduce, discuss, and evaluate: a) the concept of compliance-adjustable motions that can be predetermined by tuning the in-series compliance of the tendon routing system and by appropriately selecting the imposed tendon loads; b) a design paradigm of pre-shaped, compliant robot fingers that conform to the object geometry; and c) a hyper-adaptive finger-pad design that maximizes the area of the contact patches between the hand and the object, which also maximizes grasp stability. The proposed hands use mechanical adaptability to facilitate and simplify, by design, the efficient execution of robust grasping and dexterous in-hand manipulation tasks.

{"title":"AppGAN: Generative Adversarial Networks for Generating Robot Approach Behaviors into Small Groups of People","authors":"Fangkai Yang, Christopher E. Peters","doi":"10.1109/RO-MAN46459.2019.8956425","DOIUrl":"https://doi.org/10.1109/RO-MAN46459.2019.8956425","url":null,"abstract":"Robots that navigate to approach free-standing conversational groups should do so in a safe and socially acceptable manner. This is challenging since it not only requires the robot to plot trajectories that avoid collisions with members of the group, but also to do so without making those in the group feel uncomfortable, for example, by moving too close to them or approaching them from behind. Previous trajectory prediction models focus primarily on formations of walking pedestrians, and those models that do consider approach behaviours into free-standing conversational groups typically have handcrafted features and are only evaluated via simulation methods, limiting their effectiveness. In this paper, we propose AppGAN, a novel trajectory prediction model capable of generating trajectories into free-standing conversational groups trained on a dataset of safe and socially acceptable paths. We evaluate the performance of our model with state-of-the-art trajectory prediction methods on a semi-synthetic dataset. We show that our model outperforms baselines by taking advantage of the GAN framework and our novel group interaction module.","PeriodicalId":286478,"journal":{"name":"2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)","volume":"52 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134068033","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An Exploratory Study on Proxemics Preferences of Humans in Accordance with Attributes of Service Robots
S. Samarakoon, M. Muthugala, A. Jayasekara, M. R. Elara
2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), October 2019. DOI: 10.1109/RO-MAN46459.2019.8956297

Abstract: Service robots that possess social interactive capabilities are vital to meeting the demand in emerging domains of robotic applications. A service robot frequently needs to interact with users when performing service tasks, and user comfort depends on human-robot proxemics during these interactions. Hence, a service robot should be capable of maintaining proxemics that improve user comfort. Users' proxemics preferences may depend on diverse attributes of a robot, such as emotional state, noise level, and physical appearance, so it is vital to better understand which of a robot's attributes influence human-robot proxemics behavior. This paper presents an exploratory study analyzing the effects of a robot's attributes (facial and vocal emotions, level of internal noise, and physical appearance) on human-robot proxemics preferences. Four sub-studies were conducted to gather the required proxemics data, and the gathered data were analyzed through statistical tests. The test statistics reveal that facial and vocal emotions, internal noise level, and the physical appearance of a robot all have significant effects on the proxemics preferences of humans. The outcomes of this exploratory study should be useful in designing and developing human-robot proxemics strategies for service robots that enhance social interaction.

{"title":"Eyes on you: field study of robot vendor using human-like eye component “Akagachi”","authors":"Kotaro Hayashi, Yasunori Toshimitsu","doi":"10.1109/RO-MAN46459.2019.8956362","DOIUrl":"https://doi.org/10.1109/RO-MAN46459.2019.8956362","url":null,"abstract":"Eye gaze is an important non-verbal behavior for communication robots as it serves as the onset of communication. Existing communication robots have various eyes because design choices for an appropriate eye have yet to be determined, so many robots are designed on the basis of individual designers’ ideas. Thus, this study focuses on human-like eye gaze in a real environment. We developed an independent human-like eye gaze component called Akagachi for various robots and conducted an observational field study by implementing it to a vendor robot called Reika. We conducted a field study in a theme park where Reika sells soft-serve ice cream in a food stall and analyzed the behaviors of 984 visitors. Our results indicate that Reika elicits significantly more interaction from people with eye gaze than without it.","PeriodicalId":286478,"journal":{"name":"2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)","volume":"120 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131472229","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Human Prediction for the Natural Instruction of Handovers in Human Robot Collaboration","authors":"Jens Lambrecht, Sebastian Nimpsch","doi":"10.1109/RO-MAN46459.2019.8956379","DOIUrl":"https://doi.org/10.1109/RO-MAN46459.2019.8956379","url":null,"abstract":"Human robot collaboration is aspiring to establish hybrid work environments in accordance with specific strengths of humans and robots. We present an approach of flexibly integrating robotic handover assistance into collaborative assembly tasks through the use of natural communication. For flexibly instructed handovers, we implement recent Convolutional Neural Networks in terms of object detection and grasping of arbitrary objects based on an RGB-D camera equipped to a robot following the eye-in-hand principle. In order to increase fluency and efficiency of the overall assembly process, we investigate the human ability to instruct the robot predictively with voice commands. We conduct a user study quantitatively and qualitatively evaluating the predictive instruction in order to achieve just-in-time handovers of tools needed for following subtasks. We compare our predictive strategy with a pure manual assembly having all tools in direct reach and a stepby-step reactive handover. The results reveal that the human is able to predict the handover comparable to algorithmbased predictors. Nevertheless, human prediction does not rely on extensive prior knowledge and is thus suitable for more flexible usage. However, the cognitive workload for the worker is increased compared to manual or reactive assembly.","PeriodicalId":286478,"journal":{"name":"2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)","volume":"65 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132764036","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Common Social Distance Scale for Robots and Humans*","authors":"J. Banks, Autumn P. Edwards","doi":"10.1109/RO-MAN46459.2019.8956316","DOIUrl":"https://doi.org/10.1109/RO-MAN46459.2019.8956316","url":null,"abstract":"From keeping robots as in-home helpers to banning their presence or functions, a person’s willingness to engage in variably intimate interactions are signals of social distance: the degree of felt understanding of and intimacy with an individual or group that characterizes pre-social and social connections. To date, social distance has been examined through surrogate metrics not actually representing the construct (e.g., self-disclosure or physical proximity). To address this gap between operations and measurement, this project details a four-stage social distance scale development project, inclusive of systematic item pool-generation, candidate item ratings for laypersons thinking about social distance, testing of candidate items via scalogram and initial validity analyses, and final testing for cumulative structure and predictive validity. The final metric yields a 15-item (18, counting applications with a ‘none’ option), three-dimension scale for physical distance, relational distance, and conversational distance.","PeriodicalId":286478,"journal":{"name":"2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133239841","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Improv with Robots: Creativity, Inspiration, Co-Performance
Jesse J. Rond, Alan Sanchez, Jaden Berger, H. Knight
2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), October 2019. DOI: 10.1109/RO-MAN46459.2019.8956410

Abstract: Improvisational actors are adept at creative exploration within a set of boundaries. These boundaries come from each scene having "games" that establish the rules of play. In this paper, we introduce a game that allows an expressive-motion robot to collaboratively develop a narrative with an improviser. When testing this game with eight improv performers, our team explored two research questions: (1) Can a simple robot be a creative partner to a human improviser? and (2) Can improvisers expand our understanding of robot expressive motion? After conducting 16 scenes and 40 motion demonstrations, we found that performers viewed our robot as a supportive teammate who positively inspired the scene's direction. The improvisers also provided insightful perspectives on robot motion, which led us to create a movement categorization scheme based on their various interpretations. We discuss our lessons learned, show the benefits of merging social robotics with improvisational theater, and hope this will encourage further exploration of this cross-disciplinary intersection.

User Interface Tradeoffs for Remote Deictic Gesturing
Naomi T. Fitter, Youngseok Joung, Zijian Hu, Marton Demeter, M. Matarić
2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), October 2019. DOI: 10.1109/RO-MAN46459.2019.8956354

Abstract: Telepresence robots can help to connect people by providing videoconferencing and navigation abilities in faraway environments. Despite this potential, current commercial telepresence robots lack certain nonverbal expressive abilities that are important for permitting the operator to communicate effectively in the remote environment. To help improve the utility of telepresence robots, we added an expressive, non-manipulating arm to our custom telepresence robot system and developed three user interfaces to control deictic gesturing by the arm: onscreen, dial-based, and skeleton tracking methods. A usability study helped us to evaluate user presence feelings, task load, preferences, and opinions while performing deictic gestures with the robot arm during a mock order packing task. The majority of participants preferred the dial-based method of controlling the robot, and survey responses revealed differences in physical demand and effort level across user interfaces. These results can guide robotics researchers interested in extending the nonverbal communication abilities of telepresence robots.

{"title":"Intention Detection and Gait Recognition (IDGR) System for Gait Assessment: A Pilot Study","authors":"Yogesh Singh, Manan Kher, V. Vashista","doi":"10.1109/RO-MAN46459.2019.8956299","DOIUrl":"https://doi.org/10.1109/RO-MAN46459.2019.8956299","url":null,"abstract":"Gait abnormality is the most significant symptom in the neurologically affected patients. To improve their quality of life, it is important to complement and further enhance the existing qualitative gait analysis protocol with a technically sound quantitative paradigm. In this paper, we present a pilot study and the development of a wearable intention detection and gait recognition (IDGR) system. This system comprises a well-established integrated network of microcontrollers and sensors which acts as a diagnostic tool for gait correction. IDGR system provides real-time feedback of the temporal gait parameter on a user interface. Furthermore, this system classifies the subject’s intention - standing still, walking or ascending the stairs using simple logic inherent to an individual’s walking style. It offers reliable tools for functional assessment of the patient’s progress by measuring physical parameters. We conducted an experiment on a healthy participant as a validation of our approach and proof-of-concept.","PeriodicalId":286478,"journal":{"name":"2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)","volume":"156 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115743423","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Persuasive ChairBots: A (Mostly) Robot-Recruited Experiment","authors":"Abhijeet Agnihotri, H. Knight","doi":"10.1109/RO-MAN46459.2019.8956262","DOIUrl":"https://doi.org/10.1109/RO-MAN46459.2019.8956262","url":null,"abstract":"Robot furniture is a growing area of robotics research, as people easily anthropomorphize these simple robots and they fit in easily to many human environments. Could they also be of service in recruiting people to play chess? Prior work has found motion gestures to aid in persuasion, but this work has mostly occurred in in-lab studies and has not yet been applied to robot furniture. This paper assessed the efficacy of four motion strategies in persuading passerbyers to participate in a ChairBot Chess Tournament, which consisted of a table with a chessboard and two ChairBots – one for the white team, and another for the black team. The study occurred over a six-week period, seeking passersby to play chess in the atrium of our Computer Science building for an hour each Friday. Forward-Back motion was the most effective strategy in getting people to come to the table and play chess, while Spinning was the worst. Overall, people found the ChairBots to be friendly and somewhat dog-like. In-the-wild studies are challenging, but produce data that is highly likely to be replicable in future versions of the system. The results also support the potential of future robots to recruit participants to activities that they might already enjoy.","PeriodicalId":286478,"journal":{"name":"2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115794847","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}