2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR): Latest Publications

Social Presence and Cooperation in Large-Scale Multi-User Virtual Reality - The Relevance of Social Interdependence for Location-Based Environments
2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2018-03-18 DOI: 10.1109/VR.2018.8446575
C. Wienrich, Kristina Schindler, Nina Döllinger, Simon Kock, O. Traupe
Abstract: Introduction. An increasing number of location-based entertainment centers offer the possibility of entering multi-user virtual reality (VR) scenarios. Until now, neither the cognition and emotions of users nor the team experience have been scientifically evaluated in such an application. The present study investigated the gain of positive social interdependence while experiencing an adventure on the Immersive Deck of Illusion Walk (Berlin, Germany). Method. The preliminary version of the company's VR group adventure was enriched with a task establishing social interdependence (IDP condition). The impact of IDP on social presence and cooperation (i.e., mutual importance) was evaluated relative to a control task without interdependence (nIDP condition). Results. Social IDP increased social presence and cooperation among participants. Additionally, behavioral involvement (part of presence), certain aspects of the adventure experience, and the affective evaluation during the experience were positively influenced by IDP. Discussion. The present study showed that interdependence can substantially enhance social presence and cooperation (i.e., mutual importance) in a VR setting already characterized by social co-experience. It thus revealed one design option (social IDP) for improving the experience, particularly the social experience, of location-based entertainment. Conclusion. The present research addressed one goal of location-based VR hosts, namely to scientifically establish design principles for social and collective adventures, by supporting the impact of "collectively mastering an adventurous challenge". In addition, our evaluation demonstrated that the multi-modal tracking, the free movement, and the multi-user features enabled natural interaction with other users and the environment, and thereby engendered a comfortable social experience.
Citations: 36
Impact of Alignment Point Distance Distribution on SPAAM Calibration of Optical See-Through Head-Mounted Displays
2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2018-03-18 DOI: 10.1109/VR.2018.8446429
Kenneth R. Moser, M. S. Arefin, J. Swan
Abstract: The use of Optical See-Through Head-Mounted Displays (OST-HMDs) for presenting Augmented Reality experiences has become more common, due to the increasing availability of lower-cost head-worn device options. Despite this growth, commercially available OST hardware remains devoid of the integrated eye-tracking cameras necessary for automatically calibrating user-specific view parameters, leaving manual calibration methods as the most consistently viable option across display types. The Single Point Active Alignment Method (SPAAM) is currently the most-cited manual calibration technique, due to the relaxation of user constraints with respect to allowable motion during the calibration process. This work presents the first formal study directly investigating the effects that alignment point distribution imposes on SPAAM calibration accuracy and precision. A user experiment, employing a single expert user, is presented, in which SPAAM calibrations are performed under each of five conditions. Four of the conditions cross alignment distance (arm length, room scale) with user pose (sitting, standing). The fifth condition is a control condition, in which the user is replaced with a rigidly mounted camera; the control condition removes the effect of noise from uncontrollable postural sway. The final experimental results show no significant impact on calibration due to user pose (sitting, standing). The control condition also did not differ from the user-produced calibration results, suggesting that postural sway was not a significant factor. However, both the user and control conditions show significant improvement using arm's-length alignment points over room-scale alignments, with an order of magnitude difference in eye location estimate error between conditions.
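The abstract only names SPAAM; for readers unfamiliar with it, the following minimal sketch shows the standard computation behind single-point alignment calibration: 3D world points that the user aligned with 2D on-screen reticle positions are stacked into a direct linear transform and solved for a 3x4 projection matrix. The NumPy-based code and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def spaam_calibration(world_pts, screen_pts):
    """Estimate a 3x4 projection matrix from 3D-2D alignment pairs (DLT sketch).

    world_pts:  (N, 3) alignment-point positions in tracker coordinates.
    screen_pts: (N, 2) on-screen reticle positions the user aligned with.
    At least 6 correspondences are needed; more, well-distributed points
    (the topic of this paper) generally improve accuracy and precision.
    """
    rows = []
    for (x, y, z), (u, v) in zip(world_pts, screen_pts):
        rows.append([x, y, z, 1, 0, 0, 0, 0, -u * x, -u * y, -u * z, -u])
        rows.append([0, 0, 0, 0, x, y, z, 1, -v * x, -v * y, -v * z, -v])
    a = np.asarray(rows, dtype=float)
    # The projection matrix (up to scale) is the right singular vector
    # associated with the smallest singular value of the stacked system.
    _, _, vt = np.linalg.svd(a)
    return vt[-1].reshape(3, 4)

def project(g, world_pt):
    """Map a 3D point to screen coordinates with the calibrated matrix."""
    p = g @ np.append(np.asarray(world_pt, dtype=float), 1.0)
    return p[:2] / p[2]
```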
Citations: 2
A Method of View-Dependent Stereoscopic Projection on Curved Screen
2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2018-03-18 DOI: 10.1109/VR.2018.8446222
Juan Liu, Hanchao Li, Lu Zhao, Siwei Zhao, Guowen Qi, Yulong Bian, Xiangxu Meng, Chenglei Yang
Abstract: In this paper, we present a method of view-dependent stereoscopic projection on a curved screen. It allows the user to walk around with a correct perspective view of the virtual scene, consistent with his or her location. To solve the problem of distortion and drift of virtual objects when projecting view-dependent scene images onto a curved screen, we apply a dynamic parallax adjustment to the stereoscopic images according to the viewpoint. A user evaluation shows that the proposed approach is effective in improving the visual experience.
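The abstract does not spell out the parallax-adjustment rule, so the sketch below only illustrates the general idea of view-dependent parallax control: the stereo baseline used for rendering is attenuated as the tracked viewer approaches the screen, which is one plausible way to curb distortion and drift for near viewpoints. The attenuation rule, parameter names, and default values are assumptions for illustration, not the authors' method.

```python
import numpy as np

def stereo_eye_positions(head_pos, view_dir, base_ipd=0.064, ref_dist=2.0):
    """Hypothetical dynamic-parallax rule: shrink the effective eye separation
    (and thus on-screen parallax) as the tracked head gets closer to the
    curved screen, whose centre is assumed to sit at the origin.

    head_pos: tracked head position in metres.
    view_dir: unit viewing direction used to build the stereo baseline.
    Returns (left_eye_pos, right_eye_pos) for rendering the two views.
    """
    head_pos = np.asarray(head_pos, dtype=float)
    view_dir = np.asarray(view_dir, dtype=float)
    dist = np.linalg.norm(head_pos)               # distance to screen centre
    scale = np.clip(dist / ref_dist, 0.5, 1.0)    # assumed attenuation rule
    ipd = base_ipd * scale
    right = np.cross(view_dir, np.array([0.0, 1.0, 0.0]))
    right /= np.linalg.norm(right)
    return head_pos - right * ipd / 2.0, head_pos + right * ipd / 2.0
```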
Citations: 3
A Realtime Virtual Grasping System for Manipulating Complex Objects
2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2018-03-18 DOI: 10.1109/VR.2018.8446538
Hao Tian, Changbo Wang, Xinyu Zhang
Abstract: With the introduction of new VR/AR devices, realistic and fast interaction within virtual environments becomes more and more appealing. However, the challenge is to make interactions with virtual objects accurately reflect interactions with physical objects in real time. In this paper, we present a virtual grasping system for multi-fingered hands manipulating complex objects. Human-like grasping postures and realistic grasping motions guarantee a physically plausible appearance for hand grasping. Our system does not require any pre-captured motion data and is fast enough to allow real-time interaction during virtual grasping of complex objects.
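The abstract does not describe how grasps are detected or synthesized; as a generic illustration of the kind of physics-based check such a system might perform each frame, the sketch below flags a hand pose as a plausible grasp when two finger contacts press on the object from roughly opposite directions (an antipodal-style test). This heuristic and its parameters are assumptions, not the authors' algorithm.

```python
import numpy as np

def is_plausible_grasp(contacts, min_opposition=0.7):
    """Rough antipodal check over per-finger contacts.

    contacts: list of (contact_position, inward_unit_normal) pairs reported
              by collision detection between finger phalanges and the object.
    Returns True when at least one pair of contact normals is nearly opposed,
    i.e. the fingers squeeze the object rather than merely brushing it.
    """
    normals = [np.asarray(n, dtype=float) for _, n in contacts]
    for i in range(len(normals)):
        for j in range(i + 1, len(normals)):
            if np.dot(normals[i], normals[j]) < -min_opposition:
                return True
    return False
```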
Citations: 3
Toward Intuitive 3D User Interfaces for Climbing, Flying and Stacking
2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2018-03-18 DOI: 10.1109/VR.2018.8446047
Antonin Bernardin, G. Cortes, Rebecca Fribourg, Tiffany Luong, Florian Nouviale, Hakim Si-Mohammed
Abstract: In this paper, we propose 3D user interfaces (3DUI) that are adapted to specific Virtual Reality (VR) tasks: climbing a ladder using a puppet metaphor, piloting a drone thanks to a 3D virtual compass, and stacking 3D objects with physics-based manipulation and time control. These metaphors have been designed to provide the user with an intuitive, playful and efficient way to perform each task.
Citations: 1
Enhancing the Stiffness Perception of Tangible Objects in Mixed Reality Using Wearable Haptics
2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2018-03-18 DOI: 10.1109/VR.2018.8446280
Xavier de Tinguy, C. Pacchierotti, M. Marchal, A. Lécuyer
Abstract: This paper studies the combination of tangible objects and wearable haptics for improving the display of stiffness sensations in virtual environments. Tangible objects let users feel the general shape of objects, but they are often passive or unable to simulate several varying mechanical properties. Wearable haptic devices are portable and unobtrusive interfaces able to generate varying tactile sensations, but they often fail at providing convincing stiff contacts and distributed shape sensations. We propose to combine these two approaches in virtual and augmented reality (VR/AR), making it possible to arbitrarily augment the perceived stiffness of real/tangible objects by providing timely tactile stimuli at the fingers. We developed a proof of concept that simulates varying elasticity/stiffness sensations when interacting with tangible objects, using wearable tactile modules at the fingertips. We carried out a user study showing that wearable haptic stimulation can effectively alter the perceived stiffness of real objects, even when the tactile stimuli are not delivered at the contact point. We illustrate our approach both in VR and AR, within several use cases and different tangible settings, such as touching surfaces, pressing buttons and pistons, or holding an object. Taken together, our results pave the way for novel haptic sensations in VR/AR by better exploiting the multiple ways of providing simple, unobtrusive, and low-cost haptic displays.
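The abstract states only that timely tactile stimuli at the fingers alter perceived stiffness; the sketch below shows one simple rendering rule consistent with that idea, in which a wearable fingertip actuator is driven in proportion to how far the finger presses into the tangible object, scaled by the stiffness to be simulated. The mapping and all parameter names are assumptions for illustration, not the authors' device control law.

```python
def fingertip_actuator_command(press_depth_mm, virtual_stiffness, max_travel_mm=3.0):
    """Hypothetical stiffness-augmentation mapping.

    press_depth_mm:    estimated depth by which the finger presses the tangible object.
    virtual_stiffness: 0.0 (softest rendered) to 1.0 (stiffest rendered).
    Returns the commanded actuator displacement, clamped to its travel range,
    so harder virtual materials produce stronger skin indentation earlier.
    """
    displacement = press_depth_mm * virtual_stiffness
    return max(0.0, min(displacement, max_travel_mm))
```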
Citations: 32
Investigating a Sparse Peripheral Display in a Head-Mounted Display for VR Locomotion
2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2018-03-18 DOI: 10.1109/VR.2018.8446345
Abraham M. Hashemian, Alexandra Kitson, Thinh Nguyen-Vo, Hrvoje Benko, W. Stuerzlinger, B. Riecke
Abstract: Head-Mounted Displays (HMDs) provide immersive experiences for virtual reality. However, their field of view (FOV) is still relatively small compared to that of the human eye, which adding sparse peripheral displays (SPDs) could address. We designed a new SPD, SparseLightVR2, which increases the HMD's FOV to 180° horizontally. We evaluated SparseLightVR2 in a study (N=29) comparing three conditions: 1) no SPD, where the peripheral display (PD) was inactive; 2) extended SPD, where the PD provided visual cues consistent with and extending the HMD's main screen; and 3) counter-vection SPD, where the PD's visuals were flipped horizontally during VR travel to provide optic flow in the opposite direction of travel. Participants experienced passive motion along a linear path and reported introspective measures such as the sensation of self-motion. Results showed that, compared to no SPD, both the extended and counter-vection SPDs provided a more natural experience of motion, while the extended SPD also enhanced vection intensity and the believability of movement. Yet visually induced motion sickness (VIMS) was not affected by display condition. To investigate the reason behind these non-significant results, we conducted a follow-up study in which users increased the peripheral counter-vection visuals on the central HMD screen until they nulled out vection. Our results suggest that extending HMDs through SPDs enhances vection, naturalness, and believability of movement without enhancing VIMS, but that reversed SPD motion cues might not be strong enough to reduce vection and VIMS.
Citations: 3
Visually-Induced Motion Sickness Reduction via Static and Dynamic Rest Frames
2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2018-03-18 DOI: 10.1109/VR.2018.8446210
Zekun Cao, J. Jerald, Regis Kopper
Abstract: Visually induced motion sickness (VIMS), also known as cybersickness, is a major challenge for widespread Virtual Reality (VR) adoption. VIMS can be reduced in different ways, for example by using high-quality tracking systems and reducing the user's field of view. However, there is no universal solution for all situations, and a wide variety of techniques is needed so that developers can choose the most appropriate option for their needs. One way to reduce VIMS is through the use of rest frames: portions of the virtual environment that remain fixed in relation to the real world and do not move as the user virtually moves. We report the results of two multi-day within-subjects studies with 44 subjects who used virtual travel to navigate the environment. In the first study, we investigated the influence of static rest frames with fixed opacity on user comfort. For the second study, we present an enhanced version of rest frames that we call dynamic rest frames, where the opacity of the rest frame changes in response to visually perceived motion as users traverse the virtual environment. Results show that a virtual environment with a static or dynamic rest frame allowed users to travel through more waypoints before stopping due to discomfort than a virtual environment without a rest frame. Further, a virtual environment with a static rest frame also resulted in higher real-time reported comfort than when there was no rest frame.
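The abstract says the dynamic rest frame's opacity changes with visually perceived motion but gives no formula; the sketch below shows one straightforward mapping consistent with that description, fading the world-fixed rest frame in as virtual travel speed rises and out when the user is still. The linear ramp and its parameters are assumptions for illustration, not the authors' exact implementation.

```python
def rest_frame_opacity(virtual_speed, full_opacity_speed=5.0, idle_opacity=0.0):
    """Hypothetical dynamic-rest-frame rule.

    virtual_speed:      current virtual travel speed (m/s) driving optic flow.
    full_opacity_speed: speed at which the rest frame becomes fully opaque.
    idle_opacity:       opacity shown while the user is not virtually moving.
    Returns an opacity value in [idle_opacity, 1.0].
    """
    t = min(max(virtual_speed, 0.0) / full_opacity_speed, 1.0)
    return idle_opacity + (1.0 - idle_opacity) * t
```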
Citations: 76
Immersive Exploration of OSGi-Based Software Systems in Virtual Reality
2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2018-03-18 DOI: 10.1109/VR.2018.8446057
Martin Mišiak, D. Seider, Sascha Zur, Arnulph Fuhrmann, A. Schreiber
Abstract: We present an approach for exploring OSGi-based software systems in virtual reality. We employ an island metaphor that represents every module as a distinct island. The resulting island system is displayed in the confines of a virtual table, where users can explore the software visualization on multiple levels of granularity by performing intuitive navigational tasks. Our approach allows users to get a first overview of the complexity of an OSGi-based software system by interactively exploring its modules as well as the dependencies between them.
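To make the island metaphor concrete, the sketch below shows one possible data mapping from OSGi bundles and their dependencies to islands and bridges before any rendering takes place. The class and field names are illustrative assumptions; the abstract does not describe the authors' data model.

```python
from dataclasses import dataclass, field

@dataclass
class Island:
    """One OSGi bundle rendered as an island on the virtual table."""
    bundle_name: str
    packages: list = field(default_factory=list)    # regions within the island
    bridges_to: set = field(default_factory=set)    # bundles this one depends on

def build_island_system(bundles, dependencies):
    """bundles:      {bundle_name: [package_name, ...]}
    dependencies: iterable of (importing_bundle, exporting_bundle) pairs.
    Returns {bundle_name: Island}, ready to be laid out and rendered."""
    islands = {name: Island(name, list(pkgs)) for name, pkgs in bundles.items()}
    for importer, exporter in dependencies:
        islands[importer].bridges_to.add(exporter)
    return islands
```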
Citations: 13
Towards Joint Attention Training for Children with ASD - a VR Game Approach and Eye Gaze Exploration
2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2018-03-18 DOI: 10.1109/VR.2018.8446242
Chao Mei, Bushra Tasnim Zahed, L. Mason, J. Quarles
Abstract: Joint attention is critical to the education and development of a child. Deficits in joint attention are considered by many researchers to be an early predictor of Autism Spectrum Disorder (ASD) in children, and training of joint attention has been a significant topic in ASD intervention and education research. We propose a novel joint attention training approach that uses a Customizable Virtual Human (CVH) in a Virtual Reality (VR) game. Previous work has shown that CVHs can help users with ASD improve their performance in hand-eye coordination, motivate them to play longer, and improve the user experience in a training game. Based on these benefits, we hypothesize that CVHs may also be beneficial in training joint attention for users with ASD. To test our hypothesis, we developed a CVH with customizable facial features in an educational game, Imagination Drums, and conducted a user study with adolescents with high-functioning ASD to investigate the effects of CVHs. We collected users' eye-gaze data and task performance during the game to evaluate the users' joint attention with CVHs and the effectiveness of CVHs compared with Non-Customizable Virtual Humans (NCVHs). The study results showed that the CVH made participants gaze less at the area irrelevant to the game's storyline (i.e., the background) but, surprisingly, also provided evidence that participants reacted more slowly to the CVH's joint attention bids compared with the NCVH. Overall, the study reveals insights into how users with ASD interact with CVHs and how these interactions affect joint attention.
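The abstract mentions comparing how much participants gazed at the storyline-relevant areas versus the background; the sketch below illustrates a typical dwell-time computation over rectangular areas of interest (AOIs) that such an eye-gaze analysis could use. The AOI representation, sampling rate, and function name are assumptions, not the authors' analysis pipeline.

```python
def aoi_dwell_times(gaze_samples, aois, sample_dt=1.0 / 60.0):
    """Total gaze dwell time per rectangular area of interest.

    gaze_samples: iterable of (x, y) gaze points in screen coordinates.
    aois:         {name: (x_min, y_min, x_max, y_max)} rectangles, e.g.
                  {"virtual_human_face": (...), "background": (...)}.
    sample_dt:    time represented by each gaze sample, in seconds.
    Returns {name: seconds_of_gaze_inside_that_AOI}.
    """
    dwell = {name: 0.0 for name in aois}
    for x, y in gaze_samples:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                dwell[name] += sample_dt
    return dwell
```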
Citations: 20