2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN): Latest Publications

Learning Sequential Human-Robot Interaction Tasks from Demonstrations: The Role of Temporal Reasoning
2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) | Pub Date: 2019-10-01 | DOI: 10.1109/RO-MAN46459.2019.8956346
Estuardo Carpio, Madison Clark-Turner, M. Begum
Abstract: There are many human-robot interaction (HRI) tasks that are highly structured and follow a certain temporal sequence. Learning such tasks from demonstrations requires understanding the underlying rules governing the interactions. This involves identifying and generalizing the key spatial and temporal features of the task and capturing the high-level relationships among them. Despite its crucial role in sequential task learning, temporal reasoning is often ignored in existing learning from demonstration (LFD) research. This paper proposes a holistic LFD framework that learns the underlying temporal structure of sequential HRI tasks. The proposed Temporal-Reasoning-based LFD (TR-LFD) framework relies on an automated spatial reasoning layer to identify and generalize relevant spatial features, and a temporal reasoning layer to analyze and learn the high-level temporal structure of an HRI task. We evaluate the performance of this framework by learning a well-explored task in HRI research: robot-mediated autism intervention. The source code for this implementation is available at https://github.com/AssistiveRoboticsUNH/TR-LFD.
Citations: 1
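The abstract describes a two-layer design: a spatial reasoning layer that extracts per-frame features and a temporal reasoning layer that learns the high-level structure of the sequence. As a rough illustration only (a minimal sketch, not the authors' TR-LFD code from the linked repository; the layer sizes, feature dimension, and action count are all assumed), a per-frame spatial encoder feeding an LSTM that predicts the next action in a demonstrated sequence could look like this:

```python
# Illustrative sketch only (not the TR-LFD implementation): a spatial encoder feeds
# per-frame features to a temporal model that predicts the next high-level action
# in a demonstrated HRI sequence. All dimensions below are assumptions.
import torch
import torch.nn as nn

class SequentialTaskModel(nn.Module):
    def __init__(self, feat_dim=128, hidden_dim=64, n_actions=4):
        super().__init__()
        # Stand-in for the "spatial reasoning" layer: project raw per-frame features.
        self.spatial = nn.Sequential(nn.Linear(feat_dim, hidden_dim), nn.ReLU())
        # Stand-in for the "temporal reasoning" layer: an LSTM over the frame sequence.
        self.temporal = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, n_actions)  # logits for the next action

    def forward(self, frames):                 # frames: (batch, time, feat_dim)
        z = self.spatial(frames)               # (batch, time, hidden_dim)
        h, _ = self.temporal(z)                # (batch, time, hidden_dim)
        return self.head(h[:, -1])             # predict the action following the sequence

model = SequentialTaskModel()
demo = torch.randn(2, 30, 128)                 # two demonstrations, 30 frames each
print(model(demo).shape)                       # torch.Size([2, 4])
```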
Brand Recognition with Partial Visible Image in the Bottle Random Picking Task based on Inception V3
2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) | Pub Date: 2019-10-01 | DOI: 10.1109/RO-MAN46459.2019.8956374
Chen Zhu, T. Matsumaru
Abstract: In the task of picking randomly ordered drinking PET bottles by brand, overlapping bottles and varying viewing angles lead to low brand-recognition accuracy. In this paper, we aim to increase brand-recognition accuracy and to find out how the overlapping rate affects it. Using a stepping motor and a transparent fixture, training images were taken automatically around each bottle over 360 degrees to cover the possible viewing angles. The images were then augmented with random cropping and rotation to simulate the overlapping and rotation that occur in a real application. Using the automatically constructed dataset, an Inception V3 network, transfer-learned from ImageNet, was trained for brand recognition. By generating a random mask with a specified overlapping rate on the original image, the Inception V3 achieves 80% accuracy when 45% of the object in the image is visible, and 86% accuracy when the overlapping rate is lower than 30%.
Citations: 1
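For readers unfamiliar with the training setup described above, the sketch below shows the two ingredients in general form: an ImageNet-pretrained Inception V3 with a new classification head, and a random-occlusion helper that masks a chosen fraction of the image area to mimic overlapping bottles. This is not the authors' code; the 10-brand class count, the frozen backbone, and the rectangle-shaped mask are assumptions for illustration.

```python
# Illustrative sketch: transfer-learn an ImageNet-pretrained Inception V3 for
# bottle-brand recognition, plus an occlusion helper for data augmentation.
import numpy as np
import tensorflow as tf

def random_occlusion(img, overlap_rate):
    """Zero out a random rectangle covering roughly `overlap_rate` of the image
    area (capped at the image width), applied when generating training images."""
    h, w, _ = img.shape
    area = overlap_rate * h * w
    rh = np.random.randint(1, h)
    rw = min(w, max(1, int(area / rh)))
    y = np.random.randint(0, h - rh + 1)
    x = np.random.randint(0, w - rw + 1)
    out = img.copy()
    out[y:y + rh, x:x + rw, :] = 0.0
    return out

base = tf.keras.applications.InceptionV3(weights="imagenet", include_top=False,
                                         input_shape=(299, 299, 3))
base.trainable = False                      # keep the ImageNet features frozen
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),   # assumed: 10 bottle brands
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```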
Towards automatic visual fault detection in highly expressive human-like animatronic faces with soft skin
2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) | Pub Date: 2019-10-01 | DOI: 10.1109/RO-MAN46459.2019.8956418
Ralf Mayet, J. Diprose, A. Pandey
Abstract: Designing reliable, humanoid social robots with highly expressive human-like faces is a challenging problem. Their construction requires complex mechanical assemblies, large numbers of actuators and involves soft material. When deploying these robots in the field they face problems of wear and tear and mechanical abuse. Mechanical defects of such faces can be hard to analyze automatically or by manual visual inspection. We propose a method of automatic visual calibration and actuator fault detection for complex animatronic faces. We use our approach to scan three expressive animatronic faces, and analyze the data. Our findings indicate that our approach is able to detect faulty actuators even when they contribute to the overall expression of the face only marginally, and are hard to spot visually.
Citations: 3
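The general idea of visually scanning a face one actuator at a time can be sketched as follows. This is a minimal sketch, not the paper's method: the camera and actuator interfaces are hypothetical placeholders, and the per-actuator change thresholds are assumed to come from a prior calibration pass.

```python
# Minimal sketch of per-actuator visual fault detection: exercise each actuator in
# isolation and flag it when it barely changes the observed face image.
import numpy as np

def frame_difference(before, after):
    """Mean absolute per-pixel difference between two grayscale frames."""
    return float(np.mean(np.abs(after.astype(np.float32) - before.astype(np.float32))))

def detect_faulty_actuators(capture_frame, move_actuator, n_actuators, thresholds):
    """capture_frame() -> ndarray image; move_actuator(i, pos) drives actuator i
    (both are hypothetical placeholders). thresholds[i] is the minimum expected
    image change for actuator i, learned during a calibration scan."""
    faulty = []
    for i in range(n_actuators):
        move_actuator(i, 0.0)               # neutral pose
        before = capture_frame()
        move_actuator(i, 1.0)               # full deflection of this actuator only
        after = capture_frame()
        if frame_difference(before, after) < thresholds[i]:
            faulty.append(i)                # too little visible motion -> suspect fault
        move_actuator(i, 0.0)
    return faulty
```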
Health Counseling by Robots: Modalities for Breastfeeding Promotion
2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) | Pub Date: 2019-10-01 | DOI: 10.1109/RO-MAN46459.2019.8956342
Prasanth Murali, Teresa K. O'Leary, Ameneh Shamekhi, T. Bickmore
Abstract: Conversational humanoid robots are being increasingly used for health education and counseling. Prior research provides mixed indications regarding the best modalities to use for these systems, including user inputs spanning completely constrained multiple choice options vs. unconstrained speech, and embodiments of humanoid robots vs. virtual agents, especially for potentially sensitive health topics such as breastfeeding. We report results from an experiment comparing five different interface modalities, finding that all result in significant increases in user knowledge and intent to adhere to recommendations, with few differences among them. Users are equally satisfied with constrained (multiple choice) touch screen input and unconstrained speech input, but are relatively unsatisfied with constrained speech input. Women find conversational robots are an effective, safe, and non-judgmental medium for obtaining information about breastfeeding.
Citations: 3
Mood Estimation as a Social Profile Predictor in an Autonomous, Multi-Session, Emotional Support Robot for Children
2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) | Pub Date: 2019-10-01 | DOI: 10.1109/RO-MAN46459.2019.8956460
Edwinn Gamborino, Hsiu-Ping Yueh, Weijane Lin, Su-Ling Yeh, L. Fu
Abstract: In this work, we created an end-to-end autonomous robotic platform to give emotional support to children in long-term, multi-session interactions. Using a mood estimation algorithm based on visual cues of the user's behavior, namely facial expressions and body posture, a multidimensional model predicts a qualitative measure of the subject's affective state. Using a novel Interactive Reinforcement Learning algorithm, the robot learns the user's social profile over several sessions, adjusting its behavior to match their preferences. Although the robot is completely autonomous, a third party can optionally provide feedback to the robot through an additional UI to accelerate its learning of the user's preferences. To validate the proposed methodology, we evaluated the impact of the robot on elementary-school-aged children in a long-term, multi-session interaction setting. Our findings show that, using this methodology, the robot is able to learn the social profile of the users over a number of sessions, with or without external feedback, and to maintain the user in a positive mood, as shown by the consistently positive rewards received by the robot under the proposed learning algorithm.
Citations: 8
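To make the interactive-learning loop concrete, here is a minimal sketch of one plausible shape for it, not the paper's algorithm: a bandit-style learner whose reward is the estimated mood change after a robot behavior, optionally blended with third-party feedback from the extra UI. The behavior names, learning rate, exploration rate, and 50/50 blending weight are all assumptions.

```python
# Illustrative sketch of interactive preference learning from mood estimates
# plus optional human feedback (all specifics are assumed, not from the paper).
import random

class SocialProfileLearner:
    def __init__(self, behaviors, lr=0.1, epsilon=0.2):
        self.values = {b: 0.0 for b in behaviors}    # preference estimate per behavior
        self.lr, self.epsilon = lr, epsilon

    def choose(self):
        if random.random() < self.epsilon:            # explore
            return random.choice(list(self.values))
        return max(self.values, key=self.values.get)  # exploit the current profile

    def update(self, behavior, mood_delta, human_feedback=None):
        # Reward is the observed mood change; blend in external feedback when given.
        reward = mood_delta if human_feedback is None else 0.5 * mood_delta + 0.5 * human_feedback
        self.values[behavior] += self.lr * (reward - self.values[behavior])

learner = SocialProfileLearner(["tell_joke", "play_game", "show_picture"])
b = learner.choose()
learner.update(b, mood_delta=+0.3, human_feedback=+1.0)   # mood improved, observer agreed
```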
Evaluation of an Industrial Robotic Assistant in an Ecological Environment
2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) | Pub Date: 2019-10-01 | DOI: 10.1109/RO-MAN46459.2019.8956399
Baptiste Busch, G. Deacon, Duncan Russell, A. Billard, Giuseppe Cotugno, Mahdi Khoramshahi, Grigorios Skaltsas, Dario Turchi, L. Urbano, Mirko Wächter, You Zhou, T. Asfour
Abstract: Social robotic assistants have been widely studied and deployed as telepresence tools or caregivers. Evaluating their design and impact on the people interacting with them is of prime importance. In this research, we evaluate the usability and impact of ARMAR-6, an industrial robotic assistant for maintenance tasks. For this evaluation, we have used a modified System Usability Scale (SUS) to assess the general usability of the robotic system and the Godspeed questionnaire series for the subjective perception of the coworker. We have also recorded the subjects’ gaze fixation patterns and analyzed how they differ when working with the robot compared to a human partner.
Citations: 7
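The paper uses a modified SUS, so the following is only the conventional baseline scoring formula for reference, not the variant used in the study: ten items rated 1 to 5, odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is scaled by 2.5 to a 0-100 score.

```python
# Standard SUS scoring for reference (the paper's modified SUS may differ).
def sus_score(responses):
    """responses: ten Likert ratings in 1..5, in questionnaire order."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum((r - 1) if i % 2 == 0 else (5 - r)   # index 0,2,4,... are odd-numbered items
                for i, r in enumerate(responses))
    return total * 2.5                                # scaled to 0..100

print(sus_score([4, 2, 4, 2, 5, 1, 4, 2, 4, 1]))      # 82.5
```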
Collaborative Transportation of Cable-Suspended Payload using Two Quadcopters with Human in the loop
2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) | Pub Date: 2019-10-01 | DOI: 10.1109/RO-MAN46459.2019.8956380
Pratik Prajapati, Sagar Parekh, V. Vashista
Abstract: We study the problem of collaborative transportation of a cable-suspended payload using two quadcopters. While previous works on transportation using quadcopters emphasize autonomous control and the generation of complex trajectories, in this paper a master-slave strategy is implemented in which the master quadcopter is controlled by a human and the slave quadcopter tries to stabilize the oscillations of the payload. The system of two quadcopters with a cable-suspended payload is under-actuated with coupled dynamics, and hence manual control is difficult. We use Lagrangian mechanics on a manifold to derive the equations of motion and apply variation-based linearization to linearize the system. We design a Lyapunov-based controller to minimize the oscillations of the payload during transportation, making manual control of the master quadcopter easier.
Citations: 1
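To give some intuition for why a swing-damping acceleration command works, here is a toy planar simplification, not the paper's manifold formulation or its variation-based linearization: a payload hanging from a moving pivot obeys phi_dd = -(g/l)*sin(phi) - (x_dd/l)*cos(phi), and commanding the pivot acceleration x_dd = kd*l*phi_dot*cos(phi) makes the swing energy V = 0.5*l*phi_dot**2 + g*(1 - cos(phi)) non-increasing, since V_dot = -kd*l*(phi_dot*cos(phi))**2. The gain, cable length, and time step below are arbitrary assumptions.

```python
# Toy planar swing-damping simulation (illustrative only, not the paper's controller).
import math

g, l, kd, dt = 9.81, 1.0, 1.5, 0.002
phi, phi_dot = 0.5, 0.0                      # initial swing of about 29 degrees
for step in range(5000):                     # 10 s of simulation
    x_dd = kd * l * phi_dot * math.cos(phi)  # damping acceleration for the stabilizing side
    phi_dd = -(g / l) * math.sin(phi) - (x_dd / l) * math.cos(phi)
    phi_dot += phi_dd * dt                   # semi-implicit Euler step
    phi += phi_dot * dt
print(round(phi, 4), round(phi_dot, 4))      # swing decays toward (0, 0)
```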
Hospital Receptionist Robot v2: Design for Enhancing Verbal Interaction with Social Skills
2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) | Pub Date: 2019-10-01 | DOI: 10.1109/RO-MAN46459.2019.8956300
H. Ahn, Wesley Yep, Jongyoon Lim, B. Ahn, D. Johanson, E. Hwang, Min Ho Lee, E. Broadbent, B. MacDonald
Abstract: This paper presents a new version of a robot receptionist system for healthcare facility environments. Our HealthBots system consists of three subsystems: a receptionist robot system, a nurse assistant robot system, and a medical server. The first version of our receptionist robot interacts with people at the hospital reception and gives instructions verbally, but it cannot understand what people say, so it uses a touch screen to collect their responses. In this paper, we design a receptionist robot that recognizes human faces as well as speech, which enhances the robot's verbal interaction skills. In addition, we design a reaction generation engine to generate appropriate reactive motions and speech. Moreover, we study which social skills, such as friendliness and attention, are important for a hospital receptionist robot to enhance social interaction. We implemented perception, decision-making, and reaction modules in our HealthBots architecture and conducted two case studies to identify essential social skills for hospital receptionist robots.
Citations: 12
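As a sketch of what a reaction generation engine of this kind might look like at its simplest (illustrative only; the intent names, motions, and utterances below are hypothetical, not from the HealthBots system), a perceived intent plus face-recognition status can be mapped to a paired motion and utterance with a safe fallback:

```python
# Illustrative reaction-generation dispatcher (all names are hypothetical examples).
REACTIONS = {
    ("greeting", "known_face"):   ("wave",       "Welcome back! How can I help you today?"),
    ("greeting", "unknown_face"): ("bow",        "Hello, welcome to the hospital. May I have your name?"),
    ("ask_direction", None):      ("point_left", "The outpatient clinic is down the hall to the left."),
}

def generate_reaction(intent, face_status=None):
    """Return a (motion, speech) pair for a recognized intent, with a fallback."""
    reaction = REACTIONS.get((intent, face_status)) or REACTIONS.get((intent, None))
    if reaction is None:
        return ("nod", "I'm sorry, could you please repeat that?")
    return reaction

print(generate_reaction("greeting", "known_face"))
```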
Robot Analytics: What Do Human-Robot Interaction Traces Tell Us About Learning?
2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) | Pub Date: 2019-10-01 | DOI: 10.1109/RO-MAN46459.2019.8956465
Jauwairia Nasir, Utku Norman, W. Johal, Jennifer K. Olsen, Sina Shahmoradi, P. Dillenbourg
Abstract: In this paper, we propose that the data generated by educational robots can be better used by applying learning analytics methods and techniques, which can lead to a deeper understanding of the learners’ apprehension and behavior, refined guidelines for roboticists, and improved interventions by teachers. As a step towards this, we propose analyzing behavior and task performance at the team and/or individual level by coupling robot data with data from conventional assessment methods such as quizzes. Classifying learners/teams in the behavioral feature space with respect to task performance gives insight into the behavior patterns relevant for high performance, which can be backed by feature ranking. As a use case, we present an open-ended learning activity using tangible haptic-enabled Cellulo robots in a classroom-level setting. The pilot study, spanning approximately an hour, was conducted with 25 children aged 11-12, working in teams of two. A linear separation is observed between the high- and low-performing teams, where two of the behavioral features, namely the number of distinct attempts and the number of visits to the destination, are found to be important. Although the pilot study in its current form has limitations, e.g., its low sample size, it highlights the potential of learning analytics in educational robotics.
Citations: 12
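The classification-plus-feature-ranking step described in the abstract can be illustrated with a minimal sketch (synthetic numbers, not the study's data): fit a linear classifier separating high- from low-performing teams in a behavioral feature space and rank features by the magnitude of their standardized weights. The first two feature names come from the abstract; "idle_time_s" is a hypothetical extra feature added for illustration.

```python
# Minimal sketch of linear separation and feature ranking on synthetic team data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

feature_names = ["distinct_attempts", "visits_to_destination", "idle_time_s"]
X = np.array([[12, 3, 40], [15, 4, 35], [14, 5, 30],      # high-performing teams
              [5, 1, 90], [6, 2, 110], [4, 1, 80]])       # low-performing teams
y = np.array([1, 1, 1, 0, 0, 0])                          # 1 = high performance

Xs = StandardScaler().fit_transform(X)                    # standardize so weights are comparable
clf = LogisticRegression().fit(Xs, y)
ranking = sorted(zip(feature_names, np.abs(clf.coef_[0])), key=lambda t: -t[1])
print(ranking)          # larger weight -> feature more relevant to the separation
```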
A Robot-Mediated Assessment of Tinetti Balance scale for Sarcopenia Evaluation in Frail Elderly*
2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) | Pub Date: 2019-10-01 | DOI: 10.1109/RO-MAN46459.2019.8956439
L. Fiorini, G. D’Onofrio, E. Rovini, Alessandra Sorrentino, Luigi Coviello, Raffaele Limosani, Daniele Sancarlo, F. Cavallo
Abstract: The aging society is characterized by a high prevalence of sarcopenia, which is considered one of the most common health problems of the elderly population. Sarcopenia is due to the age-related loss of muscle mass and muscle strength. Recent literature highlights that the Tinetti Balance Assessment (TBA) scale is used to assess sarcopenia in elderly people. In this context, this article proposes a model for sarcopenia assessment that provides a quantitative assessment of TBA gait motor parameters by means of a cloud robotics approach. The proposed system is composed of cloud resources, an assistive robot (ASTRO), and two wearable inertial sensors. In particular, data from the two inertial sensors (accelerometers and gyroscopes) placed on the patient’s feet and data from ASTRO's laser sensor (position in the environment) were analyzed and combined to derive a set of motor features corresponding to the TBA gait domains. The system was preliminarily tested at the hospital of “Fondazione Casa Sollievo della Sofferenza” in Italy. The preliminary results suggest that the extracted set of features is able to describe the motor performance. In the future, these parameters could be used to support clinicians in the assessment of sarcopenia, to monitor motor parameters over time, and to propose personalized care plans.
Citations: 4
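The kind of motor feature extraction mentioned above can be illustrated with a small sketch (synthetic signal, assumed 100 Hz sampling; not the TBA feature set derived in the paper): detect steps as peaks in the foot-accelerometer magnitude and derive simple gait features such as step count, mean stride time, and cadence.

```python
# Illustrative gait-feature extraction from a synthetic foot-accelerometer signal.
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                                           # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
# Fake signal with step-like bursts at roughly 0.9 Hz (stands in for |acc| of one foot).
acc_mag = 1.0 + 0.8 * np.maximum(0, np.sin(2 * np.pi * 0.9 * t)) ** 4

peaks, _ = find_peaks(acc_mag, height=1.3, distance=int(0.5 * fs))   # >= 0.5 s apart
step_times = peaks / fs
features = {
    "step_count": len(peaks),
    # Same-foot peak-to-peak interval approximates stride time for this sensor placement.
    "mean_stride_time_s": float(np.mean(np.diff(step_times))) if len(peaks) > 1 else None,
    "cadence_steps_per_min": 60.0 * len(peaks) / t[-1],
}
print(features)
```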