2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct): Latest Publications

AR Circuit Constructor: Combining Electricity Building Blocks and Augmented Reality for Analogy-Driven Learning and Experimentation
2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) Pub Date : 2020-11-01 DOI: 10.1109/ISMAR-Adjunct51615.2020.00019
Tobias Kreienbühl, Richard Wetzel, Naomi Burgess, A. M. Schmid, Dorothee Brovelli
{"title":"AR Circuit Constructor: Combining Electricity Building Blocks and Augmented Reality for Analogy-Driven Learning and Experimentation","authors":"Tobias Kreienbühl, Richard Wetzel, Naomi Burgess, A. M. Schmid, Dorothee Brovelli","doi":"10.1109/ISMAR-Adjunct51615.2020.00019","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00019","url":null,"abstract":"We present AR Circuit Constructor (ARCC), an augmented reality application to explore and inspect electric circuits for use in educational settings. Learners use tangible electricity building blocks to construct a working electric circuit. Then, they can use a tablet device for exploring the circuit in an augmented reality visualization. Learners can switch between three distinct conceptual analogies: bicycle chain, water pipes, and waterfalls. Through experimentation with different circuit configurations, learners explore different properties of electricity to ultimately improve their understanding of it. We describe the development of our application, including a qualitative user study with a group of STEM teachers. The latter allowed us to gain insights into the qualities required for such an application before it can ultimately be deployed in a classroom setting.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"229 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122117893","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
LCR-SMPL: Toward Real-time Human Detection and 3D Reconstruction from a Single RGB Image
2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) Pub Date : 2020-11-01 DOI: 10.1109/ISMAR-Adjunct51615.2020.00062
E. Peña-Tapia, Ryo Hachiuma, Antoine Pasquali, H. Saito
{"title":"LCR-SMPL: Toward Real-time Human Detection and 3D Reconstruction from a Single RGB Image","authors":"E. Peña-Tapia, Ryo Hachiuma, Antoine Pasquali, H. Saito","doi":"10.1109/ISMAR-Adjunct51615.2020.00062","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00062","url":null,"abstract":"This paper presents a novel method for simultaneous human detection and 3D shape reconstruction from a single RGB image. It offers a low-cost alternative to existing motion capture solutions, allowing to reconstruct realistic human 3D shapes and poses by leveraging the speed of an object-detection based architecture and the extended applicability of a parametric human mesh model. Evaluation results using a synthetic dataset show that our approach is on-par with conventional 3D reconstruction methods in terms of accuracy, and outperforms them in terms of inference speed, particularly in the case of multi-person images.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117218727","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
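The method pairs an object-detection-style architecture with a parametric human mesh model (SMPL). As a rough, self-contained illustration of what the parametric model contributes, the sketch below shows SMPL's linear shape-blending step with randomly generated stand-in arrays; the array shapes follow the public SMPL model, but the function and data here are illustrative assumptions, not the authors' code.

```python
import numpy as np

# Illustrative stand-in for the SMPL shape-blending step (not the LCR-SMPL
# authors' code). Array shapes follow the public SMPL model: 6890 vertices,
# 10 shape coefficients.

def shape_blend(v_template, shapedirs, betas):
    """Add the linear shape blend shapes to the template mesh.

    v_template : (6890, 3) rest-pose template vertices
    shapedirs  : (6890, 3, 10) shape blend-shape basis
    betas      : (10,) shape coefficients regressed per detected person
    """
    # Contract the 10 shape coefficients against the blend-shape basis.
    return v_template + np.einsum('vck,k->vc', shapedirs, betas)

# Random stand-in data so the sketch runs without the real model file.
rng = np.random.default_rng(0)
v_template = rng.standard_normal((6890, 3))
shapedirs = rng.standard_normal((6890, 3, 10)) * 0.01
betas = rng.standard_normal(10)

print(shape_blend(v_template, shapedirs, betas).shape)  # (6890, 3)
```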
Exploring Virtual Environments by Visually Impaired Using a Mixed Reality Cane Without Visual Feedback
2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) Pub Date : 2020-11-01 DOI: 10.1109/ISMAR-Adjunct51615.2020.00028
Lei Zhang, Klevin Wu, Bin Yang, Hao Tang, Zhigang Zhu
{"title":"Exploring Virtual Environments by Visually Impaired Using a Mixed Reality Cane Without Visual Feedback","authors":"Lei Zhang, Klevin Wu, Bin Yang, Hao Tang, Zhigang Zhu","doi":"10.1109/ISMAR-Adjunct51615.2020.00028","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00028","url":null,"abstract":"Though virtual reality (VR) has been advanced to certain levels of maturity in recent years, the general public, especially the population of the blind and visually impaired (BVI), still cannot enjoy the benefit provided by VR. Current VR accessibility applications have been developed either on expensive head-mounted displays or with extra accessories and mechanisms, which are either not accessible or inconvenient for BVI individuals. In this paper, we present a mobile VR app that enables BVI users to access a virtual environment on an iPhone in order to build their skills of perception and recognition of the virtual environment and the virtual objects in the environment. The app uses the iPhone on a selfie stick to simulate a long cane in VR, and applies Augmented Reality (AR) techniques to track the iPhone’s real-time poses in an empty space of the real world, which is then synchronized to the long cane in the VR environment. Due to the use of mixed reality (the integration of VR & AR), we call it the Mixed Reality cane (MR Cane), which provides BVI users auditory and vibrotactile feedback whenever the virtual cane comes in contact with objects in VR. Thus, the MR Cane allows BVI individuals to interact with the virtual objects and identify approximate sizes and locations of the objects in the virtual environment. We performed preliminary user studies with blind-folded participants to investigate the effectiveness of the proposed mobile approach and the results indicate that the proposed MR Cane could be effective to help BVI individuals in understanding the interaction with virtual objects and exploring 3D virtual environments. The MR Cane concept can be extended to new applications of navigation, training and entertainment for BVI individuals without more significant efforts.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"63 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126117956","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 8
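The core interaction loop described here is: track the phone's real-world pose with AR, mirror that pose onto the virtual cane, and trigger auditory and vibrotactile feedback whenever the cane contacts a virtual object. The sketch below illustrates that loop only; `get_phone_pose`, `play_audio`, `vibrate`, and the scene-object interface are hypothetical placeholders, not the app's ARKit-based implementation.

```python
import time

# Hypothetical placeholders standing in for the app's ARKit pose tracking and
# its audio/haptic calls; only the loop structure is illustrated here.

class VirtualCane:
    def __init__(self, length_m=1.2):
        self.length_m = length_m
        self.tip_position = None

    def set_tip(self, phone_pose):
        # Mirror the tracked phone pose onto the tip of the virtual long cane.
        self.tip_position = phone_pose

    def touched_objects(self, scene):
        # Virtual objects whose bounds the cane tip currently intersects.
        return [obj for obj in scene if obj.contains(self.tip_position)]

def feedback_loop(scene, get_phone_pose, play_audio, vibrate, hz=60):
    """Per-frame loop: track the phone, move the cane, give audio + vibration on contact."""
    cane = VirtualCane()
    while True:
        cane.set_tip(get_phone_pose())       # real-time AR pose of the iPhone
        for obj in cane.touched_objects(scene):
            play_audio(obj.contact_sound)    # auditory feedback
            vibrate(intensity=0.8)           # vibrotactile feedback
        time.sleep(1.0 / hz)
```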
Stencil Marker: Designing Partially Transparent Markers for Stacking Augmented Reality Objects
2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) Pub Date : 2020-11-01 DOI: 10.1109/ISMAR-Adjunct51615.2020.00073
Xuan Zhang, Jonathan Lundgren, Yoya Mesaki, Yuichi Hiroi, Yuta Itoh
{"title":"Stencil Marker: Designing Partially Transparent Markers for Stacking Augmented Reality Objects","authors":"Xuan Zhang, Jonathan Lundgren, Yoya Mesaki, Yuichi Hiroi, Yuta Itoh","doi":"10.1109/ISMAR-Adjunct51615.2020.00073","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00073","url":null,"abstract":"We propose a transparent colored AR marker that allows 3D objects to be stacked in space. Conventional AR markers make it difficult to display multiple objects in the same position in space, or to manipulate the order or rotation of objects. The proposed transparent colored markers are designed to detect the order and rotation direction of each marker in the stack from the observed image, based on mathematical constraints. We describe these constraints to design markers, the implementation to detect its stacking order and rotation of each marker, and a proof-of-concept application Totem Poles. We also discuss the limitations of the current prototype and possible research directions.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128622512","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
Industrial Augmented Reality: 3D-Content Editor for Augmented Reality Maintenance Worker Support System
2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) Pub Date : 2020-11-01 DOI: 10.1109/ISMAR-Adjunct51615.2020.00060
Mario Lorenz, Sebastian Knopp, Jisu Kim, Philipp Klimant
{"title":"Industrial Augmented Reality: 3D-Content Editor for Augmented Reality Maintenance Worker Support System","authors":"Mario Lorenz, Sebastian Knopp, Jisu Kim, Philipp Klimant","doi":"10.1109/ISMAR-Adjunct51615.2020.00060","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00060","url":null,"abstract":"Supporting maintenance with 3D object enhanced instruction is one of the key applications of Augmented Reality (AR) in industry. For the breakthrough of AR in maintenance, it is important that the technicians themselves can create AR-instructions and perform the challenging task of placing 3D objects as they know best how to perform a task and what necessary information needs to be displayed. For this challenge, a 3D-content editor is being presented wherein a first step the 3D objects can roughly be placed using a 2D image of the machine, therefore, limiting the time required to access the machine. In a second step, the positions of the 3D objects can be fine-tuned at the machine site using live footage. The key challenges were to develop an easily accessible UI that requires no prior knowledge of AR content creation in a tool that works both with live footage and images and is usable with a touch screen and keyboard/mouse. The 3D-content editor was qualitatively assessed by technicians revealing its general applicability, but also the requirement for a lot of time to gain the necessary experience for positioning 3D objects.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127006373","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 6
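The two-step authoring workflow (rough placement on a 2D image of the machine, then on-site fine-tuning against live footage) suggests that each instruction step must carry both a coarse and a refined placement. The sketch below is a minimal, assumed data structure for that idea; the field names and file formats are illustrative, not the system's actual schema.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class InstructionStep:
    """One AR maintenance instruction, authored in two stages."""
    text: str
    model_file: str                       # 3D object shown for this step
    rough_image_uv: Tuple[float, float]   # stage 1: placement on a 2D photo of the machine
    refined_pose: Optional[List[float]] = None  # stage 2: 4x4 pose (row-major), tuned on site

# Stage 1: rough placement done away from the machine, on a 2D image.
step = InstructionStep(
    text="Loosen the spindle cover screws",
    model_file="screwdriver.glb",
    rough_image_uv=(0.42, 0.61),
)

# Stage 2: at the machine, the technician fine-tunes the pose against live footage.
step.refined_pose = [1, 0, 0, 0.10,
                     0, 1, 0, 0.25,
                     0, 0, 1, 0.80,
                     0, 0, 0, 1.00]
print(step.refined_pose is not None)  # True once the fine-tuned pose is stored
```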
An Exploratory Study for Designing Social Experience of Watching VR Movies Based on Audience’s Voice Comments
2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) Pub Date : 2020-11-01 DOI: 10.1109/ISMAR-Adjunct51615.2020.00049
Shuo Yan, Wenli Jiang, Menghan Xiong, Xukun Shen
{"title":"An Exploratory Study for Designing Social Experience of Watching VR Movies Based on Audience’s Voice Comments","authors":"Shuo Yan, Wenli Jiang, Menghan Xiong, Xukun Shen","doi":"10.1109/ISMAR-Adjunct51615.2020.00049","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00049","url":null,"abstract":"Social experience is important when audience are watching movies. Virtual reality (VR) movies engage audience through immersive environment and interactive narrative. However, VR headsets restrict audience to an individual experience, which disrupt the potential for shared social realities. In our study, we propose an approach to design an asynchronous social experience that allows the participant to receive other audiences’ voice comments (such as their opinions, impressions or emotional reactions) in VR movies. We measured the participants’ feedback on their engagement levels, recall abilities and social presence. The results showed that in VR-Voice Comment (VR-VC) movie, the audience’s voice comments could affect participant’s engagement and the recall of information in the scenes. The participants obtained social awareness and enjoyment at the same time. A few of them were worried mainly because of the potential auditory clutter that resulted from unpredictable voice comments. We discuss the design implications for this and directions for future research. Overall, we observe a positive tendency in watching VR-VC movie, which could be adapted for future VR movie experience.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"74 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126275213","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
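Asynchronous delivery means previously recorded voice comments are played back when the participant's movie clock passes each comment's timestamp. The sketch below shows one simple way to index and fetch due comments; the comment format and the `play_clip` call mentioned in the final comment are assumptions for illustration.

```python
import bisect

# Stored comments from earlier viewers: (movie_time_seconds, audio_clip_path).
comments = [
    (12.5, "comments/impression_01.wav"),
    (47.0, "comments/reaction_02.wav"),
    (110.2, "comments/opinion_03.wav"),
]
comments.sort()
times = [t for t, _ in comments]

def due_comments(prev_time, current_time):
    """Return clips whose timestamps fall inside (prev_time, current_time]."""
    lo = bisect.bisect_right(times, prev_time)
    hi = bisect.bisect_right(times, current_time)
    return [clip for _, clip in comments[lo:hi]]

print(due_comments(0.0, 15.0))    # ['comments/impression_01.wav']
# In the playback loop, called once per frame with the movie clock:
#   for clip in due_comments(last_t, t): play_clip(clip)   # play_clip is hypothetical
```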
3D human model creation on a serverless environment
2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) Pub Date : 2020-11-01 DOI: 10.1109/ISMAR-Adjunct51615.2020.00044
Peter Fasogbon, Yu You, Emre B. Aksu
{"title":"3D human model creation on a serverless environment","authors":"Peter Fasogbon, Yu You, Emre B. Aksu","doi":"10.1109/ISMAR-Adjunct51615.2020.00044","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00044","url":null,"abstract":"The creation of realistic 3D human model is traditionally timeconsuming and cumbersome, and is typically done by professionals. In recent years computer vision technologies can assist in generating human models from controlled environments, we demonstrate a different but easy capturing scenario with less constraints on the subject or the environmental setup. The reconstruction process for 3D human model consists of various intermediate process such as semantic human segmentation, human skeletal keypoint detection, and texture generation. In order to achieve easy, scalable, and flexible deployment to different cloud environments, we have chosen the serverless architecture to offload some common service functionalities to the cloud infrastructure but focused on the core task,which is the reconstruction itself. The event-driven serverless architecture eases the building of such multimedia web services with minimal coding efforts, but simply defines the APIs and declares the APIs with correspondent lambda functions. The proposed approach in this paper allow anyone with a mobile phone to generate 3D models easily and quickly in the scale of few 2-3 minutes, rather than hours.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126289340","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
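The abstract describes declaring APIs backed by lambda functions, with each reconstruction stage (segmentation, keypoint detection, texture generation) exposed as an event-driven service. The sketch below shows what one such handler could look like using the standard AWS Lambda Python entry point; the payload fields and the `run_segmentation` stub are assumptions, since the paper does not give its actual API schema.

```python
import base64
import json

def run_segmentation(image_bytes):
    # Placeholder for the semantic human-segmentation stage; a deployed
    # service would invoke the actual model here.
    return {"mask": "<binary mask omitted>"}

def lambda_handler(event, context):
    """Event-driven entry point for one reconstruction stage.

    Expects a JSON body with a base64-encoded frame, runs the stage, and
    returns the result so the next stage in the pipeline can consume it.
    """
    body = json.loads(event.get("body") or "{}")
    image_bytes = base64.b64decode(body.get("frame", ""))
    result = run_segmentation(image_bytes)
    return {
        "statusCode": 200,
        "body": json.dumps({"stage": "segmentation", "result": result}),
    }
```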
A Virtual Morris Water Maze to Study Neurodegenerative Disorders
2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) Pub Date : 2020-11-01 DOI: 10.1109/ISMAR-Adjunct51615.2020.00048
Daniel Roth, Christian Felix Purps, W. Neumann
{"title":"A Virtual Morris Water Maze to Study Neurodegenarative Disorders","authors":"Daniel Roth, Christian Felix Purps, W. Neumann","doi":"10.1109/ISMAR-Adjunct51615.2020.00048","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00048","url":null,"abstract":"Navigation is a crucial cognitive skill that allows humans and animals to move from one place to another without getting lost. In neurological patients this skill can be impaired, when neural structures that form the brain networks important for spatial learning and navigation are impaired. Thus, spatial navigation represents an important measure of cognitive health that is impossible to test in a clinical examination, due to lack of space in examination rooms. Consequently, spatial navigation is largely neglected in the clinical assessment of neurological, neurosurgical and psychiatric patients. Virtual reality represents a unique opportunity to develop a systematic assessment of spatial navigation for diagnosis and therapeutic monitoring of millions of patients presenting with cognitive decline in the clinical routine. Therefore, we have adapted a classical spatial navigation paradigm that was developed for animal research, the \"Morris Water Maze\" as an openly available Virtual Reality (VR) application, that allows objective quantification of navigational skills in humans. This tool may be used in the future to aid the assessment of the human navigation system in health and neurological disease.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115038556","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
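In the classical Morris Water Maze paradigm, navigational skill is commonly quantified by measures such as escape latency and path length to the hidden platform. The sketch below computes those two measures from a logged trajectory; the log format and sampling rate are assumptions, not details from this paper's VR application.

```python
import math

def water_maze_measures(trajectory, sample_dt=0.02):
    """Escape latency (s) and path length (m) from a logged navigation path.

    trajectory : list of (x, z) positions sampled every `sample_dt` seconds,
                 ending when the participant reaches the hidden platform.
    """
    latency = (len(trajectory) - 1) * sample_dt
    path_length = sum(math.dist(a, b) for a, b in zip(trajectory, trajectory[1:]))
    return latency, path_length

# Toy trajectory: a straight 1 m path covered in 1 s, sampled at 50 Hz.
traj = [(0.02 * i, 0.0) for i in range(51)]
print(water_maze_measures(traj))  # approximately (1.0, 1.0)
```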
Intention to use an interactive AR app for engineering education
2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) Pub Date : 2020-11-01 DOI: 10.1109/ISMAR-Adjunct51615.2020.00033
Alejandro Álvarez-Marín, J. Velázquez‐Iturbide, M. Castillo-Vergara
{"title":"Intention to use an interactive AR app for engineering education","authors":"Alejandro Álvarez-Marín, J. Velázquez‐Iturbide, M. Castillo-Vergara","doi":"10.1109/ISMAR-Adjunct51615.2020.00033","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00033","url":null,"abstract":"Augmented reality (AR) has been incorporated into educational processes in various subjects to improve academic performance. One of these areas is the field of electronics since students often have difficulty understanding electricity. An interactive AR app on electrical circuits was developed. The app allows the manipulation of circuit elements, computes the voltage and amperage values using the loop method, and applies Kirchhoff's voltage law. This research aims to determine the intention of using the AR app by students. It also looks to determine if it is conditioned by how the survey is applied (online or face-to-face) or students' gender. The results show that the app is well evaluated on the intention of use by students. Regarding how the survey is applied, the attitude towards using does not present significant differences. In contrast, the students who carried out the online survey presented a higher behavioral intention to use than those who participated in the guided laboratory. Regarding gender, women showed a higher attitude toward using and behavioral intention to use this technology than men.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125724000","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 8
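The app computes voltages and currents with the loop (mesh) method and Kirchhoff's voltage law. As a concrete worked example of that calculation, the sketch below solves a two-mesh resistive circuit as a linear system; the component values are made up for illustration and are not taken from the app.

```python
import numpy as np

# Two-mesh resistive circuit: a 12 V source drives mesh 1 through R1;
# R3 is shared between mesh 1 and mesh 2; R2 lies only in mesh 2.
V, R1, R2, R3 = 12.0, 4.0, 6.0, 2.0

# Kirchhoff's voltage law around each mesh (clockwise mesh currents i1, i2):
#   mesh 1: (R1 + R3) * i1 - R3 * i2 = V
#   mesh 2: -R3 * i1 + (R2 + R3) * i2 = 0
A = np.array([[R1 + R3, -R3],
              [-R3,      R2 + R3]])
b = np.array([V, 0.0])

i1, i2 = np.linalg.solve(A, b)
print(f"mesh currents: i1 = {i1:.3f} A, i2 = {i2:.3f} A")   # 2.182 A, 0.545 A
print(f"voltage across shared R3: {R3 * (i1 - i2):.3f} V")  # 3.273 V
```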
Modeling Emotions for Training in Immersive Simulations (METIS): A Cross-Platform Virtual Classroom Study
2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) Pub Date : 2020-11-01 DOI: 10.1109/ISMAR-Adjunct51615.2020.00036
A. Delamarre, C. Lisetti, Cédric Buche
{"title":"Modeling Emotions for Training in Immersive Simulations (METIS): A Cross-Platform Virtual Classroom Study","authors":"A. Delamarre, C. Lisetti, Cédric Buche","doi":"10.1109/ISMAR-Adjunct51615.2020.00036","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00036","url":null,"abstract":"Virtual training environments (VTEs) using immersive technology have been able to successfully provide training for technical skills. Combined with recent advances in virtual social agent technologies and in affective computing, VTEs can now also support the training of social skills. Research looking at the effects of different immersive technologies on users’ experience (UX) can provide important insights about their impact on user’s engagement with the technology, sense presence and co-presence. However, current studies do not address whether emotions displayed by virtual agents provide the same level of UX across different virtual reality (VR) platforms. In this study, we considered a virtual classroom simulator built for desktop computer, and adapted for an immersive VR platform (CAVE). Users interact with virtual animated disruptive students able to display facial expressions, to help them practice their classroom behavior management skills. We assessed effects of the VR platforms and of the display of facial expressions on presence, co-presence, engagement, and believability. Results indicate that users were engaged, found the virtual students believable and felt presence and co-presence for both VR platforms. We also observed an interaction effects of facial expressions and VR platforms for co-presence (p = .018 < .05).","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"221 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128624438","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2