2020 IEEE-RAS 20th International Conference on Humanoid Robots (Humanoids) — Latest Publications

Whole-body walking pattern using pelvis-rotation for long stride and arm swing for yaw angular momentum compensation
Beomyeong Park, Myeong-Ju Kim, E. Sung, Junhyung Kim, Jaeheung Park
Pub Date: 2021-07-19 | DOI: 10.1109/HUMANOIDS47582.2021.9555794
Abstract: A long stride can enable a humanoid robot to achieve fast and stable walking. For a long stride, the kinematics of the robot should be fully utilized, and walking with pelvic rotation is one solution. Pelvis-rotation walking requires a rotational pelvis trajectory that accounts for the robot's kinematic limitations. When the robot walks with a long stride while rotating the pelvis, the yaw momentum may be larger than when it walks with the pelvis fixed. This momentum is caused by the rotation of the pelvis and the leg motion, and hence walking with pelvic rotation may become unstable. In this paper, we propose controlling the lower body of the robot as a redundant system comprising the leg joints and a waist joint for walking with pelvic rotation. We also propose the position of the base frame used to implement this redundant system for the lower body of the robot. In addition, a quadratic programming (QP) controller is formulated that enables arm swing for yaw momentum compensation while controlling the lower body. The feasibility of the proposed control method was verified in a simulation and an experiment of walking with a long stride while rotating the pelvis using the QP controller and compensating the yaw momentum by means of arm swing.
Citations: 6
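The momentum-compensation idea in the abstract above reduces, in its simplest form, to a zero-total-yaw-momentum condition. A minimal sketch, assuming a single lumped yaw inertia for the swinging arms (`arm_swing_rate` and `arm_inertia_z` are hypothetical names; the paper formulates this inside a whole-body QP, not as a closed-form rule):

```python
def arm_swing_rate(leg_yaw_momentum, arm_inertia_z):
    """Arm swing angular rate that cancels the yaw momentum generated by
    the legs and rotating pelvis: L_legs + I_arm * qdot_arm = 0."""
    return -leg_yaw_momentum / arm_inertia_z

# Example: legs and pelvis generate 0.6 kg*m^2/s of yaw momentum;
# lumped arm yaw inertia is 0.3 kg*m^2 (illustrative values).
rate = arm_swing_rate(0.6, 0.3)  # arms counter-rotate at -2.0 rad/s
```

The sign flip is the whole point: the arms must rotate opposite to the net yaw momentum of the lower body.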
A Human-Aware Method to Plan Complex Cooperative and Autonomous Tasks using Behavior Trees
Fabio Fusaro, Edoardo Lamon, E. Momi, A. Ajoudani
Pub Date: 2021-07-19 | DOI: 10.1109/HUMANOIDS47582.2021.9555683
Abstract: This paper proposes a novel human-aware method that generates robot plans for autonomous and human-robot cooperative tasks in industrial environments. We modify the standard Behavior Tree (BT) formulation to take action-related costs into account, and design suitable metrics and cost functions to account for cooperation with a worker, considering human availability, decisions, and ergonomics. The developed approach allows the robot to adapt its plan online to the human partner by choosing the tasks that minimize the execution costs. Through simulations, we first tuned the weights of the cost function for a realistic scenario. Subsequently, the developed method was validated through a proof-of-concept experiment representing the boxing of four different objects. The results show that the proposed cost-based BTs, along with the defined costs, enable the robot to react online and plan new tasks according to dynamic changes of the environment in terms of human presence and intentions. Our results indicate that the proposed solution has high potential to increase robot reactivity and flexibility while optimizing the decision-making process according to human actions.
Citations: 12
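The cost-based selection described above can be sketched as a BT selector node that ticks its cheapest currently-feasible child. This is a simplified stand-in for the paper's formulation; the class names and the infinite-cost convention for unavailable actions are assumptions:

```python
import math

class Action:
    """Leaf behavior with a state-dependent execution cost."""
    def __init__(self, name, cost_fn, run_fn):
        self.name, self.cost_fn, self.run_fn = name, cost_fn, run_fn

class CostSelector:
    """Selector that ticks the child with minimal cost; a cost of
    infinity marks an action as currently infeasible (e.g. the human
    partner is unavailable for a cooperative task)."""
    def __init__(self, children):
        self.children = children

    def tick(self, state):
        feasible = [c for c in self.children if math.isfinite(c.cost_fn(state))]
        if not feasible:
            return "FAILURE"
        best = min(feasible, key=lambda c: c.cost_fn(state))
        return best.run_fn(state)

# Cooperative handover is cheap when the human is available, infeasible otherwise;
# boxing the object alone is always possible but costlier.
coop = Action("handover", lambda s: 1.0 if s["human_available"] else math.inf,
              lambda s: "handover")
solo = Action("box_alone", lambda s: 3.0, lambda s: "box_alone")
planner = CostSelector([coop, solo])
```

Re-evaluating the costs at every tick is what lets the plan adapt online as the human's availability changes.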
Footstep and Timing Adaptation for Humanoid Robots Utilizing Pre-computation of Capture Regions
Y. Tazaki
Pub Date: 2021-07-19 | DOI: 10.1109/HUMANOIDS47582.2021.9555675
Abstract: This study proposes a real-time footstep and timing adaptation mechanism for humanoid robots that can be integrated into a conventional walking pattern generator and increases the robustness of walking against disturbances. To meet the strict real-time constraints of humanoid robot control, the proposed method computes viable capture basins in the design phase. This pre-computed data can be used at runtime to modify the foot placement, the timing of landing, and the center-of-mass movement in response to applied disturbances at small computational cost. The performance of the proposed method is evaluated in simulation experiments.
Citations: 1
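The offline/online split described above can be illustrated with the instantaneous capture point of the linear inverted pendulum model, tabulated over a grid of center-of-mass states. The paper pre-computes richer viable capture basins; this lookup table is a simplified sketch, and the function and variable names are assumptions:

```python
import math

def capture_point(x, xdot, z0, g=9.81):
    """Instantaneous capture point of the linear inverted pendulum:
    x_cp = x + xdot / omega, with natural frequency omega = sqrt(g / z0)."""
    omega = math.sqrt(g / z0)
    return x + xdot / omega

def precompute_table(xs, xdots, z0):
    """Offline phase: tabulate capture points over a grid of CoM states
    so the runtime controller only performs a cheap lookup when a
    disturbance is detected."""
    return {(x, v): capture_point(x, v, z0) for x in xs for v in xdots}

# Design-phase computation for a 0.8 m pendulum height.
table = precompute_table([0.0, 0.1, 0.2], [-0.5, 0.0, 0.5], z0=0.8)
```

At runtime, a disturbed CoM state is matched against the table to pick a reachable foothold, instead of solving the dynamics online.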
Vision for Prosthesis Control Using Unsupervised Labeling of Training Data
Vijeth Rai, David Boe, E. Rombokas
Pub Date: 2021-07-19 | DOI: 10.1109/HUMANOIDS47582.2021.9555789
Abstract: Transitioning from one activity to another is one of the key challenges of prosthetic control. Unlike body sensors (EMG, mechanical), vision sensors offer a preview of the environment ahead of desired and future movements. This could be employed to anticipate and trigger transitions in a prosthesis and provide a smooth user experience. A significant bottleneck in using vision sensors has been the acquisition of large labeled training datasets. Labeling the terrain in thousands of images is labor-intensive; it would be ideal to simply collect visual data for long periods without needing to label each frame. Toward that goal, we apply an unsupervised learning method to generate mode labels for kinematic gait cycles in the training data. We then use these labels with images from the same training data to train a vision classifier. The classifier predicts the target mode an average of 2.2 seconds before the kinematic changes. We report 96.6% overall and 99.5% steady-state mode classification accuracy. These results are comparable to studies using manually labeled data. This method, however, has the potential to scale dramatically without requiring additional labeling.
Citations: 3
Detection of Collaboration and Collision Events during Contact Task Execution
Felix Franzel, Thomas Eiband, Dongheui Lee
Pub Date: 2021-07-19 | DOI: 10.1109/HUMANOIDS47582.2021.9555677
Abstract: This work introduces a contact event pipeline to distinguish task contact from human-robot interaction and collision during task execution. The increasing need for close-proximity physical human-robot interaction (pHRI) in the private, healthcare, and industrial sectors demands new safety solutions. One of the most important issues for safe collaboration is the robust recognition and classification of contacts between human and robot. A solution is designed that enables simple task teaching and accurate contact monitoring during task execution. Besides an external force-torque sensor, only proprioceptive data is used for the contact evaluation. An approach based on demonstrated task knowledge and the offset resulting from human interaction distinguishes contact events from normal execution via a contact event detector. A contact type classifier, implemented as a Support Vector Machine, is trained on the identified events. The system is set up to quickly identify contact incidents and enable appropriate robot reactions. An offline evaluation is conducted with data recorded from intended and unintended contacts, as well as examples of task contacts such as object manipulation and environmental interactions. The system's performance and its high responsiveness are evaluated in different experiments, including a real-world task.
Citations: 3
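The detection step described above, comparing the sensed wrench against the profile recorded during demonstration, can be sketched in one dimension. The real pipeline uses 6-D wrenches and an SVM for the subsequent classification; the function name and the plain threshold rule here are assumptions:

```python
def detect_events(measured, baseline, threshold):
    """Return time indices where the sensed external force deviates from
    the demonstrated task baseline by more than the threshold, i.e.
    candidate contact events (interaction or collision) rather than
    expected task contact."""
    return [t for t, (m, b) in enumerate(zip(measured, baseline))
            if abs(m - b) > threshold]

# Demonstrated task contact produces ~5 N around samples 2-3; the spike
# at sample 1 is an unexpected contact event (illustrative values).
baseline = [0.0, 0.0, 5.0, 5.0, 0.0]
measured = [0.0, 8.0, 5.2, 4.9, 0.1]
events = detect_events(measured, baseline, threshold=1.0)
```

Because the baseline encodes the expected task contact, a heavy manipulation force is not flagged, while a light but unexpected touch is.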
Gait Percent Estimation during Walking and Running using Sagittal Shank or Thigh Angles
M. Eslamy, A. Schilling
Pub Date: 2021-07-19 | DOI: 10.1109/HUMANOIDS47582.2021.9555673
Abstract: In this work we analyzed the relationship between the shank and thigh angles (separately) and gait cycle progression to develop a novel approach for gait percent estimation. To do so, the angles were integrated. Our findings show that the integral of the shank or thigh angle behaves monotonically and can therefore approximate the gait percent within a gait cycle through a one-to-one relationship. For all individuals, speeds, and gaits, a quasi-linear relationship was found between the shank and thigh angle integrals and the gait percents. Average R² values close to one and average RMS errors of less than 2.2 were achieved. The proposed approach was investigated for 21 subjects, 10 speeds, and two gaits (walking and running), and can potentially be used for human motion analysis as well as motion planning of assistive devices.
Citations: 0
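The mapping described in the abstract, from the integral of a sagittal segment angle to gait percent, can be sketched as follows. Monotonicity of the integral over the cycle is assumed here, as reported in the paper; the function name is not from the paper:

```python
def gait_percent(angles, dt):
    """Map sagittal shank (or thigh) angles over one gait cycle to gait
    percent: take the cumulative trapezoidal integral of the angle, then
    normalize the (assumed monotonic) integral to the 0-100% range."""
    integral = [0.0]
    for i in range(1, len(angles)):
        integral.append(integral[-1] + 0.5 * (angles[i] + angles[i - 1]) * dt)
    lo, hi = integral[0], integral[-1]
    return [100.0 * (v - lo) / (hi - lo) for v in integral]
```

Because the integral is monotonic, each value corresponds to exactly one point in the cycle, which is what makes the one-to-one gait-percent lookup possible.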
SURENA IV: Towards a Cost-effective Full-size Humanoid Robot for Real-world Scenarios
A. Yousefi-Koma, B. Maleki, Hessam Maleki, A. Amani, M. Bazrafshani, Hossein Keshavarz, Ala Iranmanesh, A. Yazdanpanah, H. Alai, Sahel Salehi, Mahyar Ashkvari, Milad Mousavi, M. Shafiee-Ashtiani
Pub Date: 2021-07-19 | DOI: 10.1109/HUMANOIDS47582.2021.9555686
Abstract: This paper describes the hardware, software framework, and experimental testing of the SURENA IV humanoid robotics platform. SURENA IV has 43 degrees of freedom (DoFs), including seven DoFs for each arm, six for each hand, and six for each leg, with a height of 170 cm, a mass of 68 kg, and morphological and mass properties similar to those of an average adult human. SURENA IV aims to realize a cost-effective and anthropomorphic humanoid robot for real-world scenarios. To this end, we demonstrate a locomotion framework based on a novel and inexpensive predictive foot sensor that enables walking despite a 7 cm foot position error caused by accumulated deflection of the links and connections (the robot was manufactured with tools available at universities). Thanks to this sensor, the robot can walk over unknown obstacles without any force feedback, by online adaptation of foot height and orientation. Moreover, the arm and hand of the robot have been designed to grasp objects of different stiffnesses and geometries, enabling the robot to perform drilling, visual servoing of a moving object, and writing its name on a whiteboard.
Citations: 5
An Experimental Validation and Comparison of Reaching Motion Models for Unconstrained Handovers: Towards Generating Humanlike Motions for Human-Robot Handovers
Wesley P. Chan, T. Tran, Sara Sheikholeslami, E. Croft
Pub Date: 2021-07-19 | DOI: 10.1109/HUMANOIDS47582.2021.9555779
Abstract: The Minimum Jerk motion model has long been cited in the literature for human point-to-point reaching motions in single-person tasks. While it has been demonstrated that applying minimum-jerk-like trajectories to robot reaching motions in the joint-action task of human-robot handovers allows a robot giver to be perceived as more careful, safe, and skilled, it has not been verified whether human reaching motions in handovers follow the Minimum Jerk model. To experimentally test and verify motion models for human reaches in handovers, we examined human reaching motions in unconstrained handovers (where the person is allowed to move their whole body) and fitted them against 1) the Minimum Jerk model, 2) its variation, the Decoupled Minimum Jerk model, and 3) the recently proposed Elliptical (Conic) model. Results showed that the Conic model fits unconstrained human handover reaching motions best. Furthermore, we discovered that, unlike constrained single-person reaching motions, which have been found to be elliptical, there is a split between elliptical and hyperbolic conic types. We expect our results will help guide the generation of more humanlike reaching motions for human-robot handover tasks.
Citations: 4
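For reference, the classical Minimum Jerk model tested in the abstract above has a closed-form position profile for point-to-point reaches. This states the standard model only, not the paper's fitting procedure or its Conic model:

```python
def minimum_jerk(x0, xf, duration, t):
    """Minimum Jerk position at time t for a reach from x0 to xf over the
    given duration: x(t) = x0 + (xf - x0)(10 tau^3 - 15 tau^4 + 6 tau^5),
    tau = t / duration. Velocity and acceleration vanish at both endpoints."""
    tau = t / duration
    return x0 + (xf - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)

# Sample a 1 m reach over 1 s at 25% intervals.
samples = [minimum_jerk(0.0, 1.0, 1.0, t / 4.0) for t in range(5)]
```

The symmetric S-shaped profile (slow start, fast middle, slow finish) is what makes minimum-jerk-like robot reaches read as smooth and careful to human receivers.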
The KIT Bimanual Manipulation Dataset
F. Krebs, Andre Meixner, Isabel Patzer, T. Asfour
Pub Date: 2021-07-19 | DOI: 10.1109/HUMANOIDS47582.2021.9555788
Abstract: Learning models of bimanual manipulation tasks from human demonstration requires capturing human body and hand motions, as well as the objects involved in the demonstration, to provide all the information needed for learning manipulation task models on the symbolic and subsymbolic levels. We provide a new multi-modal dataset of bimanual manipulation actions consisting of accurate human whole-body motion data, the full configuration of both hands, and the 6D poses and trajectories of all objects involved in the task. The data is collected using five different sensor systems: a motion capture system, two data gloves, three RGB-D cameras, a head-mounted egocentric camera, and three inertial measurement units (IMUs). The dataset includes 12 actions of bimanual daily household activities performed by two healthy subjects, with a large number of intra-action variations and three repetitions of each action variation, resulting in 588 recorded demonstrations. A total of 21 household items are used to perform the various actions. In addition to the data collection, we developed tools and methods for the standardized representation and organization of multi-modal sensor data in large-scale human motion databases. We extended our Master Motor Map (MMM) framework to allow the mapping of collected demonstrations to a reference model of the human body, as well as the segmentation and annotation of recorded manipulation tasks. The dataset includes raw sensor data, normalized data in the MMM format, and annotations, and is made publicly available in the KIT Whole-Body Human Motion Database.
Citations: 13
Spatial calibration of whole-body artificial skin on a humanoid robot: comparing self-contact, 3D reconstruction, and CAD-based calibration
Lukas Rustler, Bohumila Potočná, Michal Polic, K. Štěpánová, M. Hoffmann
Pub Date: 2021-07-19 | DOI: 10.1109/HUMANOIDS47582.2021.9555806
Abstract: Robots largely lacked the sense of touch for decades. As artificial sensitive skins covering large areas of robot bodies start to appear, the positions of the sensors on the robot body must be known for them to be useful to the machines. In this work, a Nao humanoid robot was retrofitted with pressure-sensitive skin on the head, torso, and arms. We experimentally compare the accuracy and effort associated with the following skin spatial calibration approaches and their combinations: (i) combining CAD models and the skin layout in 2D, (ii) 3D reconstruction from images, and (iii) using robot kinematics to calibrate the skin by self-contact. To acquire the 3D positions of taxels on individual skin parts, methods (i) and (ii) were similarly laborious, but 3D reconstruction was more accurate. To align these 3D point clouds with the robot kinematics, two variants of self-contact were employed: skin-on-skin and the use of a custom end effector (finger). In combination with the 3D reconstruction data, mean calibration errors below the radius of individual sensors (2 mm) were achieved. Significant perturbations of more than 100 torso taxel positions could be corrected using self-contact calibration, reaching approximately 3 mm mean error. This work is not a proof of concept but a deployment of the approaches at scale: the outcome is an actual spatial calibration of all 970 taxels on the robot body. As the different calibration approaches are evaluated in isolation as well as in different combinations, this work provides a guideline applicable to the spatial calibration of different sensor arrays.
Citations: 6