{"title":"Living and Interacting with Robots: Engaging Users in the Development of a Mobile Robot","authors":"Valerie Varney, Christoph Henke, D. Janssen","doi":"10.5772/intechopen.90112","DOIUrl":"https://doi.org/10.5772/intechopen.90112","url":null,"abstract":"Mobile robots such as Aldebaran’s humanoid Pepper are currently finding their way into society. Many research projects already try to match humanoid robots with humans by letting them assist, e.g., in geriatric care, or simply for company or entertainment. However, many of these projects face acceptance issues that come with a new type of interaction between humans and robots. These issues partly originate from different types of robot locomotion, limited human-like behaviour, and limited functionality in general. At the same time, animal-type robots—quadrupeds such as Boston Dynamics’ WildCat—and underactuated robots are on the rise and present social scientists with new challenges such as the uncanny valley. The possible positive aspects of these unusual cooperations and interactions, however, are mostly pushed into the background. This paper describes the approach of a project at a research institution in Germany that aims to develop a setting for human–robot interaction and collaboration that engages the designated users throughout the whole process.","PeriodicalId":411781,"journal":{"name":"Becoming Human with Humanoid - From Physical Interaction to Social Intelligence","volume":"68 6","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-02-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114011394","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Electromechanical Analysis (MEMS) of a Capacitive Pressure Sensor of a Neuromate Robot Probe","authors":"Hacene Ameddah","doi":"10.5772/INTECHOPEN.88946","DOIUrl":"https://doi.org/10.5772/INTECHOPEN.88946","url":null,"abstract":"The domain of medicine, especially neurosurgery, is strongly interested in the integration of robots into many procedures. In this work, we are interested in the Neuromate robot. The latter uses the procedure of stereotaxic surgery, but with better planning, greater precision, and simpler execution. The Neuromate robot notably allows registration with intraoperative images (ventriculographies, and especially angiographies) in order to refine the planning. In this chapter, we focus on the contact force measurement system required for the effectiveness of the stimulation between the robot probe and the patient’s head, and thus for the safety of the patient. A force sensor is integrated upstream of the wrist; the pressure sensor is part of a silicon matrix bonded to a metal plate at 70°C. The study was carried out in the COMSOL Multiphysics software, ideally suited for the simulation of microelectromechanical systems (MEMS) applications. A stationary electromechanical study showed that, when the pressure difference across the membrane was 25 kPa, the deflection was, as expected, greatest at the center of the membrane. The proposed sensor structure is a suitable choice for MEMS capacitive pressure sensors.","PeriodicalId":411781,"journal":{"name":"Becoming Human with Humanoid - From Physical Interaction to Social Intelligence","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124392766","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Physical Interaction and Control of Robotic Systems Using Hardware-in-the-Loop Simulation","authors":"Senthil K. Perumal, S. Ganesan","doi":"10.5772/INTECHOPEN.85251","DOIUrl":"https://doi.org/10.5772/INTECHOPEN.85251","url":null,"abstract":"Robotic systems used in industries and other complex applications require huge investment, and testing them under robust conditions is highly challenging. Such systems can be controlled and tested with ease using the hardware-in-the-loop (HIL) simulation technique, which saves a lot of time and resources. The chapter deals with the various methods of interaction between robotic systems and physical environments using tactile, force, and vision sensors. It also discusses the use of the hardware-in-the-loop technique for testing grasp and task control algorithms on models of robotic systems. The chapter further elaborates on the hardware and software platforms used to implement the control algorithms for physical interaction. Finally, the chapter concludes with a case study of an HIL implementation of the control algorithms on a Texas Instruments (TI) C2000 microcontroller, interacting with a model of Kuka’s youBot mobile manipulator. The mathematical model is developed using MATLAB software, and the virtual animation setup of the robot is developed using the Virtual Robot Experimentation Platform (V-REP) robot simulator. By actuating Kuka’s youBot mobile manipulator in the V-REP tool, a tracking accuracy of 92% is observed for physical interaction and object handling tasks.","PeriodicalId":411781,"journal":{"name":"Becoming Human with Humanoid - From Physical Interaction to Social Intelligence","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115869287","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Emoji as a Proxy of Emotional Communication","authors":"G. Santamaría-Bonfil, Orlando Grabiel Toledano López","doi":"10.5772/intechopen.88636","DOIUrl":"https://doi.org/10.5772/intechopen.88636","url":null,"abstract":"Nowadays, emoji play a fundamental role in computer-mediated human communication, allowing it to convey body language, objects, symbols, or ideas in text messages using Unicode-standardized pictographs and logographs. Emoji allow people to express emotions and their personalities more “authentically” by increasing the semantic content of visual messages. The relationship between language, emoji, and emotions is now being studied by several disciplines such as linguistics, psychology, natural language processing (NLP), and machine learning (ML). In particular, the last two are employed for the automatic detection of emotions and personality traits, for building emoji sentiment lexicons, and for endowing artificial agents with the ability to express emotions through emoji. In this chapter, we introduce the concept of emoji and review the main challenges in using them as a proxy of language and emotions, the ML and NLP techniques used for the classification and detection of emotions using emoji, and new trends in exploiting discovered emotional patterns for robotic emotional communication.","PeriodicalId":411781,"journal":{"name":"Becoming Human with Humanoid - From Physical Interaction to Social Intelligence","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-09-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127077582","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Motion Generation during Vocalized Emotional Expressions and Evaluation in Android Robots","authors":"C. Ishi","doi":"10.5772/INTECHOPEN.88457","DOIUrl":"https://doi.org/10.5772/INTECHOPEN.88457","url":null,"abstract":"Vocalized emotional expressions such as laughter and surprise often occur in natural dialogue interactions and are important factors to be considered in order to achieve smooth robot-mediated communication. Miscommunication may be caused if there is a mismatch between audio and visual modalities, especially in android robots, which have a highly humanlike appearance. In this chapter, motion generation methods are introduced for laughter and vocalized surprise events, based on analysis results of human behaviors during dialogue interactions. The effectiveness of controlling different modalities of the face, head, and upper body (eyebrow raising, eyelid widening/narrowing, lip corner/cheek raising, eye blinking, head motion, and torso motion control) and different motion control levels are evaluated using an android robot. Subjective experiments indicate the importance of each modality in the perception of motion naturalness (humanlikeness) and the degree of emotional expression.","PeriodicalId":411781,"journal":{"name":"Becoming Human with Humanoid - From Physical Interaction to Social Intelligence","volume":"45 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125679087","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Computer Simulation of Human-Robot Collaboration in the Context of Industry Revolution 4.0","authors":"Y. Rizal","doi":"10.5772/INTECHOPEN.88335","DOIUrl":"https://doi.org/10.5772/INTECHOPEN.88335","url":null,"abstract":"This chapter presents the essential role of robot simulation for industrial robots, in particular collaborative robots. We begin by discussing robot utilization in industry, which includes mobile robots, arm robots, and humanoid robots. The author emphasizes the application of collaborative robots with regard to Industry 4.0. Then, we present how collaborative robot utilization in industry can be achieved through computer simulation by means of virtual robots in simulated environments. The robot simulation presented here is based on the Open Dynamics Engine (ODE) using anyKode Marilou. The author surveys the use of dynamic simulations in collaborative robot applications toward Industry 4.0. Given the challenging problems related to humanoid robots as collaborative robots and to behavior in human-robot collaboration, robot simulation may open up opportunities in collaborative robotics research in the context of Industry 4.0. As developing a real collaborative robot is still expensive and time-consuming, and access to commercial collaborative robots is relatively limited, robot simulation can be an option for collaborative robotics research and education.","PeriodicalId":411781,"journal":{"name":"Becoming Human with Humanoid - From Physical Interaction to Social Intelligence","volume":"55 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-08-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117311126","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Toward Dynamic Manipulation of Flexible Objects by High-Speed Robot System: From Static to Dynamic","authors":"Y. Yamakawa, Shouren Huang, A. Namiki, M. Ishikawa","doi":"10.5772/INTECHOPEN.82521","DOIUrl":"https://doi.org/10.5772/INTECHOPEN.82521","url":null,"abstract":"This chapter explains dynamic manipulation of flexible objects, where the target objects to be manipulated include rope, ribbon, cloth, pizza dough, and so on. Previously, flexible object manipulation has been performed in a static or quasi-static state. As a result, the manipulation time is long, and the efficiency of the manipulation is insufficient. In order to solve these problems, we propose a novel control strategy and motion planning for achieving flexible object manipulation at high speed. The proposed strategy simplifies the flexible object dynamics. Moreover, we implemented a high-speed vision system and high-speed image processing to improve the success rate by adjusting the robot trajectory. Using this strategy, motion planning, and high-speed visual feedback, we demonstrated several tasks, including dynamic manipulation and knotting of a rope, generating a ribbon shape, dynamic folding of cloth, rope insertion, and pizza dough rotation, and we show experimental results obtained with the high-speed robot system.","PeriodicalId":411781,"journal":{"name":"Becoming Human with Humanoid - From Physical Interaction to Social Intelligence","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-04-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133928630","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}